EP2753951A1 - Interaction with a three-dimensional virtual scenario

Interaction with a three-dimensional virtual scenario

Info

Publication number
EP2753951A1
Authority
EP
European Patent Office
Prior art keywords
selection
virtual
scenario
dimensional
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP12780399.7A
Other languages
German (de)
English (en)
French (fr)
Inventor
Leonhard Vogelmeier
David Wittmann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airbus Defence and Space GmbH
Original Assignee
EADS Deutschland GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by EADS Deutschland GmbH filed Critical EADS Deutschland GmbH
Publication of EP2753951A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/04 Display arrangements
    • G01S7/06 Cathode-ray tube displays or other two dimensional or three-dimensional displays
    • G01S7/20 Stereoscopic displays; Three-dimensional displays; Pseudo-three-dimensional displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, by projecting aerial or floating images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays

Definitions

  • The invention relates to display devices for a three-dimensional virtual scenario.
  • In particular, the invention relates to a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects, to a workstation device for monitoring a three-dimensional virtual scenario and for interacting with a three-dimensional virtual scenario, and to a method for selecting objects in a three-dimensional virtual scenario.
  • A display device for a three-dimensional virtual scenario, for selecting objects in the virtual scenario with feedback upon successful selection of an object, comprises a presentation unit for displaying the virtual scenario and a touch unit for touch-controlled selection of an object in the virtual scenario.
  • The touch unit is arranged in a display area of the virtual scenario.
  • The presentation unit can be based on stereoscopic visualization techniques, which are particularly suitable for the evaluation of three-dimensional scenarios. These techniques require convergence, i.e. the alignment of the viewer's eye axes, and accommodation, i.e. the adjustment of the refractive power of the lens of the observer's eyes.
  • In natural vision, convergence and accommodation are coupled, and this coupling must be overridden when viewing a three-dimensional virtual scenario: the eye is focused on an imaging unit, but the eye axes must align with the virtual objects, which appear in front of or behind the imaging unit in the virtual three-dimensional scenario.
  • Decoupling convergence and accommodation strains the human eye and can lead to fatigue, and even to headache and nausea, in a viewer of a three-dimensional virtual scene.
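  • The size of this mismatch can be illustrated with standard viewing geometry (this worked example is not part of the patent): the eyes accommodate to the distance of the display surface while their axes converge on the distance of the virtual object. A minimal sketch, assuming a typical interpupillary distance of 64 mm:

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.064) -> float:
    """Angle between the two eye axes when fixating a point at the given
    distance; ipd_m is the interpupillary distance (assumed 64 mm)."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

screen = 0.60   # eyes accommodate to the display surface at 0.6 m
virtual = 0.35  # eye axes converge on a virtual object floating at 0.35 m
print(f"accommodation plane: {vergence_angle_deg(screen):.1f} deg")
print(f"convergence target:  {vergence_angle_deg(virtual):.1f} deg")
# The difference between the two angles is the decoupling described above.
```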
  • The conflict between convergence and accommodation also arises when an operator interacts directly with objects of the virtual scenario, for example with his hand, because the real position of the hand overlaps with the virtual objects.
  • In this case the conflict between accommodation and convergence can be intensified.
  • Interaction with a three-dimensional virtual scenario may require wearing special gloves.
  • These gloves make it possible, on the one hand, to determine the position of the user's hands and, on the other hand, to trigger a corresponding vibration when virtual objects are touched.
  • The position of the hand is usually determined with an optical detection system.
  • When interacting with a virtual scenario, a user typically moves his hands in the space in front of him. The weight of the arms and the additional weight of the gloves can limit the time of use, since the user may show signs of fatigue early on.
  • Particularly in the field of airspace surveillance or aviation, there are situations in which two kinds of information are needed to build a good understanding of the current airspace situation and its future evolution: on the one hand, a global view of the overall situation and, on the other hand, a detailed view of the elements relevant to a potential conflict situation.
  • An air traffic controller, for example, who has to resolve a conflict situation between two aircraft, must have both representations available at the same time.
  • Perspective displays for the representation of a spatially perceived scenario allow a graphical representation of a three-dimensional scenario.
  • According to the invention, a representation of three-dimensional scenarios is provided that allows both an overview display and a detail display simultaneously, as well as a simple and direct interaction with the scenario.
  • The presentation unit is designed to evoke the impression of a three-dimensional scenario in a viewer.
  • The presentation unit can have at least two projection devices which project a different image for each eye of the observer, so that a three-dimensional impression is created for the viewer.
  • The presentation unit can also be designed to display differently polarized images; with correspondingly polarized lenses in the observer's spectacles, each eye perceives one of the images, creating a three-dimensional impression for the viewer.
  • The touch unit is an input element for the touch-controlled selection of objects in the virtual scenario.
  • The touch unit may, for example, be transparent and arranged in the three-dimensional presentation space of the virtual scenario, so that an object of the virtual scenario is selected by the user reaching into the three-dimensional presentation space with one or both hands and touching the touch unit.
  • The touch unit may be located anywhere in the three-dimensional presentation space or outside it.
  • The touch unit can be designed as a plane or as an arbitrarily geometrically shaped surface.
  • The touch unit can be designed as a flexibly formable element so that it can be adapted to the three-dimensional virtual scenario.
  • The touch unit may comprise, for example, capacitive or resistive measurement systems, or infrared-based gratings, to determine the coordinates of one or more touch points at which the user touches the touch unit. Depending on the coordinates of a touch point, for example, the object in the three-dimensional virtual scenario that comes closest to the touch point is selected. According to an embodiment of the invention, the touch unit is adapted to display a selection area for the object; the object is selected by touching the selection area.
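  • A minimal sketch of this nearest-object rule, assuming that touch coordinates and object coordinates are expressed in one common frame on the touch surface; the names `VirtualObject` and `select_nearest` are illustrative and do not appear in the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    x: float  # object coordinates, assumed to be expressed
    y: float  # in the same frame as the touch surface
    z: float

def select_nearest(touch_x: float, touch_y: float,
                   objects: list[VirtualObject]) -> VirtualObject:
    """Return the object that comes closest to the touch point,
    measured in the plane of the touch unit."""
    return min(objects, key=lambda o: math.hypot(o.x - touch_x, o.y - touch_y))

objects = [VirtualObject("AC101", 0.50, 0.30, 0.2),
           VirtualObject("AC205", 0.80, 0.70, 0.4)]
print(select_nearest(0.52, 0.31, objects).name)  # -> AC101
```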
  • A computing unit may, for example, compare the position of a touch point with the representation coordinates of the selection areas in order to determine the selected object.
  • The touch unit may be configured to display a plurality of selection areas for a plurality of objects, each selection area being assigned to one object in the virtual scenario.
  • The feedback upon successful selection of one of the objects from the virtual scenario is provided at least partially by a vibration of the touch unit.
  • The touch unit may, for example, be vibrated as a whole, for instance with the aid of a motor, in particular a vibration motor, or individual areas of the touch unit may be vibrated.
  • Piezoelectric actuators can also be used as vibration elements, the piezoelectric actuators being made to oscillate at the point of contact when an object has been selected in the virtual scenario, thus signaling the selection to the user.
  • In one embodiment, the touch unit has a multiplicity of areas that can be selectively activated for tactile feedback on the selection of an object in the virtual scenario.
  • The touch unit may be configured to allow selection of multiple objects at the same time; for example, one object can be selected with the user's first hand and another object with the second hand.
  • The touch unit can output tactile feedback locally in the area of the selection area of an object, i.e. perform a vibration there, for example. This allows the user, in particular when selecting multiple objects, to recognize which of the objects has been selected and which has not.
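  • One way such localized, per-selection-area feedback could be dispatched is sketched below; the rectangular selection areas and the `vibrate_region` callback standing in for the actuator driver are assumptions, since the patent does not fix a data model. Touches outside every selection area return nothing, which also reflects the hand-resting behavior described further below:

```python
from dataclasses import dataclass

@dataclass
class SelectionArea:
    object_id: str
    x0: float
    y0: float
    x1: float
    y1: float  # rectangle occupied by the selection area on the touch unit

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def handle_touch(x, y, areas, vibrate_region):
    """Select the object whose selection area was hit and vibrate only
    that region; touches outside every selection area are ignored, so
    the operator can rest his hands on the touch unit."""
    for area in areas:
        if area.contains(x, y):
            vibrate_region(area)  # e.g. drive the piezo actuators under the area
            return area.object_id
    return None

areas = [SelectionArea("AC101", 0.10, 0.10, 0.20, 0.15),
         SelectionArea("AC205", 0.60, 0.40, 0.70, 0.45)]
print(handle_touch(0.12, 0.12, areas, lambda a: print(f"vibrate {a.object_id}")))
```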
  • The touch unit can be designed to allow changing the map scale and shifting the displayed map area.
  • Tactile feedback is understood to mean, for example, a vibration of the touch unit or the oscillation of a piezoelectric actuator.
  • The feedback on the successful selection of an object in the three-dimensional scenario can also be provided, at least partially, by outputting an optical signal.
  • The optical signal can be output as an alternative or in addition to the tactile feedback.
  • Feedback by means of an optical signal is understood to mean, for example, highlighting the selected object or displaying a selection pointer.
  • For example, the brightness of the selected object may be changed, the selected object may be provided with a border, or a pointing element pointing to the object may be displayed next to the selected object in the virtual scenario.
  • The feedback can likewise be provided, at least partially, by outputting an acoustic signal, as an alternative to the tactile feedback and/or the optical signal, but also in addition to them.
  • An acoustic signal is understood to mean, for example, the output of a short tone via an output unit, for example a loudspeaker.
  • In one embodiment, the display device has an overview area and a detail area. This structure allows the user to view the overall scenario in the overview area and to examine a user-selectable smaller section more closely in the detail area.
  • The overview area can be reproduced, for example, as a two-dimensional display and the detail area as a spatial representation.
  • Such a structure allows an airspace to be monitored in a simple and manageable way, providing both a view of the overall airspace situation in the overview area and a view of potential conflict situations in the detail area.
  • The invention allows the operator to change the detail area depending on particular needs, i.e. any area of the overview display can be selected for the detailed display. Of course, this selection can also be made such that the area currently shown in the detailed representation is indicated in the overview representation.
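  • A sketch of how a user-selectable extract of the overview could drive the detail display, including the shifting of the extract along the coordinates x, y and z mentioned further below; the axis-aligned box model and all names are assumptions made for illustration:

```python
from dataclasses import dataclass

@dataclass
class DetailExtract:
    # axis-aligned box of the overview map shown as the 3D detail scenario
    x: float
    y: float
    z: float      # altitude band of interest
    width: float
    depth: float
    height: float

    def shift(self, dx: float = 0.0, dy: float = 0.0, dz: float = 0.0) -> None:
        """Move the extract through the overview along x, y or z."""
        self.x += dx
        self.y += dy
        self.z += dz

    def contains(self, px: float, py: float, pz: float) -> bool:
        return (self.x <= px <= self.x + self.width and
                self.y <= py <= self.y + self.depth and
                self.z <= pz <= self.z + self.height)

# Show in detail only the traffic inside the currently selected extract.
extract = DetailExtract(x=10.0, y=20.0, z=0.0, width=5.0, depth=5.0, height=3.0)
aircraft = {"AC101": (12.0, 22.0, 1.5), "AC205": (40.0, 8.0, 9.0)}
print([k for k, p in aircraft.items() if extract.contains(*p)])  # -> ['AC101']
extract.shift(dx=2)  # operator moves the partial area in the overview
```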
  • According to a further aspect of the invention, a workstation device is provided with a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects, as described above and below.
  • The workstation device can also be used, for example, to control unmanned aerial vehicles or to monitor arbitrary scenarios, by one or more users.
  • For example, a flight plan can be displayed on a display and, if an entry from the flight plan is selected, the corresponding aircraft can be highlighted in the three-dimensional scenario.
  • The workstation device may include input elements that can be used alternatively or in addition to the direct interaction with the three-dimensional virtual scenario.
  • The workstation device may comprise a so-called computer mouse, a keyboard, or interaction devices typical of the application, such as those found at an air traffic controller's workplace. Likewise, all displays or presentation units may be conventional or touch-sensitive displays.
  • According to a further aspect of the invention, the use of a workstation device as described above and below for monitoring airspaces is provided.
  • The workstation device can also be used to monitor and control unmanned aerial vehicles and to analyze an airspace, making it easy and quick to detect whether an aircraft threatens, for example, to fly through a restricted zone or a hazardous area.
  • An exclusion zone or a danger zone can be represented, for example, as a virtual body with the size of the exclusion zone or hazardous area.
  • According to a further aspect of the invention, a method for selecting objects in a three-dimensional scenario is provided. In a first step, a selection area of a virtual object is touched in a display area of the three-dimensional virtual scenario. In a subsequent step, feedback is output to an operator after the virtual object has been selected.
  • The method further comprises the steps of: visualizing a selection element in the three-dimensional virtual scenario, moving the selection element according to a finger movement of the operator on the presentation surface, and selecting an object in the three-dimensional scenario by bringing the selection element into coincidence with the object to be selected.
  • The visualization of the selection element, the movement of the selection element, and the selection of the object take place after the selection area has been touched.
  • The selection element may be displayed in the virtual scenario when the operator touches the touch unit.
  • The selection element is represented in the virtual scenario, for example, as a vertically extending beam of light or a light cylinder, and moves through the three-dimensional virtual scenario in accordance with the movement of the operator's finger on the touch unit.
  • If the selection element encounters an object in the three-dimensional virtual scenario, this object is selected for further operations, provided the user holds the selection element essentially immobile on the object for a certain time.
  • For example, the selection of the object in the virtual scenario may occur after the selection element has remained unmoved on an object for one second. This waiting time prevents objects from being selected merely because the selection element was moved past them.
  • The display of a selection element in the virtual scenario simplifies the selection of an object and allows the operator to select an object without having to consider the position of his hand in the virtual scenario.
  • The selection of an object thus takes place in that the selection element is brought into coincidence with the object to be selected by a movement of the hand, which is made possible in that the selection element, for example in the form of a light cylinder, runs vertically through the virtual scenario.
  • Coincidence means that the selection element overlaps in at least one point with the coordinates of the virtual object to be selected.
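  • The light-cylinder selection with the dwell time described above can be sketched as follows; the polling loop, the hit radius and `get_finger_pos` are assumptions made for illustration:

```python
import math
import time

def cylinder_hits(sel_x, sel_y, obj, radius=0.01):
    """The selection element runs vertically through the scenario, so
    coincidence only depends on the horizontal distance to the object."""
    return math.hypot(obj["x"] - sel_x, obj["y"] - sel_y) <= radius

def dwell_select(get_finger_pos, objects, dwell_s=1.0, poll_s=0.05):
    """Select an object once the light cylinder rests on it for dwell_s
    seconds; merely passing over an object does not select it."""
    candidate, since = None, None
    while True:
        x, y = get_finger_pos()  # finger position on the touch unit
        hit = next((o for o in objects if cylinder_hits(x, y, o)), None)
        if hit is not candidate:
            candidate, since = hit, time.monotonic()
        elif candidate is not None and time.monotonic() - since >= dwell_s:
            return candidate
        time.sleep(poll_s)

# Usage (not executed here, since it blocks until a selection happens):
# obj = dwell_select(tracker.finger_pos, [{"x": 0.5, "y": 0.3, "id": "AC101"}])
```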
  • According to a further aspect of the invention, a computer program element is provided for controlling a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects, which is designed to carry out the method for selecting virtual objects in a three-dimensional virtual scenario as described above and below when the computer program element is executed on a processor of a computing unit.
  • The computer program element can serve to instruct a processor of the computing unit to carry out the method.
  • According to a further aspect of the invention, a computer-readable medium with the computer program element as described above and below is provided.
  • A computer-readable medium may be any volatile or non-volatile storage medium, such as a hard disk, a CD, a DVD, a floppy disk, a memory card, or any other computer-readable medium.
  • Fig. 1 shows a side view of a workstation device according to one embodiment of the invention.
  • Fig. 2 shows a perspective view of a workstation device according to another embodiment of the invention.
  • Fig. 3 shows a schematic view of a display device according to an embodiment of the invention.
  • Fig. 4 shows a schematic view of a display device according to another embodiment of the invention.
  • Fig. 5 shows a schematic view of a workstation device according to an embodiment of the invention.
  • Fig. 6 shows a schematic view of a display device according to an embodiment of the invention.
  • Fig. 7 shows a schematic view of a method for selecting objects in a three-dimensional virtual scenario according to an embodiment of the invention.
  • Fig. 1 shows a workstation device 200 for an operator of a three-dimensional virtual scenario.
  • The workstation device 200 has a display device 100 with a display unit 110 and a touch unit 120.
  • The touch unit 120 may, in particular, overlay a part of the display unit 110. The touch unit can also overlay the entire display unit 110. In such a case the touch unit is transparent, so that the operator of the workstation or the viewer of the display device retains an unobstructed view of the display unit.
  • The display unit 110 and the touch unit 120 together constitute a display device 100; statements regarding the touch unit 120 and the display unit 110 apply mutatis mutandis to the display device 100.
  • The touch unit may be configured to cover the presentation unit, i.e. the entire presentation unit is provided with a touch-sensitive surface.
  • The display unit 110 has a first display area 111 and a second display area 112, the second display area being angled relative to the first display area in the direction of the user so that the two display areas enclose an inclusion angle α 115.
  • The first display area 111 of the display unit 110 and the second display area 112 of the display unit 110, by their mutually angled position, span together with a viewer position 195, i.e. the position of the viewer's eyes, a presentation space 130 for the three-dimensional virtual scenario.
  • The presentation space 130 is thus the space in which the three-dimensional virtual scenario is displayed.
  • An operator sitting on the seat 190 during use of the workstation device 200 can, in addition to the presentation space 130 for the three-dimensional virtual scenario, also use a working surface 140 on which further touch-sensitive or conventional displays may be located.
  • The inclusion angle α 115 can be dimensioned such that all virtual objects in the presentation space 130 lie within arm's reach of the user of the workstation device 200.
  • An inclusion angle α between 90 degrees and 150 degrees has proven particularly suitable.
  • The inclusion angle α can also be adapted to the individual needs of a single user, whereby the range of 90 to 150 degrees can be both undershot and exceeded.
  • In one embodiment, the inclusion angle α is 120 degrees.
  • The angled geometry of the presentation unit can reduce the conflict between convergence and accommodation in a viewer of a virtual three-dimensional scenario.
  • The three-dimensional virtual scenario can be represented, for example, such that the second display area 112 of the display unit 110 corresponds to the virtually represented surface of the earth or a terrain surface.
  • The workstation device according to the invention is particularly suitable for longer-term, low-fatigue processing of three-dimensional virtual scenarios with integrated spatial representation of geographically referenced data, such as aircraft, waypoints, control zones, threat areas, terrain topographies and weather events, with simple, intuitive possibilities for interaction.
  • The presentation unit 110 may also have a rounded transition from the first display area 111 to the second display area 112. This avoids or reduces the disturbing influence of a real, visible edge between the first display area and the second display area on the three-dimensional impression of the virtual scenario.
  • The display unit 110 can also be formed in the shape of a circular arc.
  • The workstation device as described above and below thus enables a large stereoscopic display volume or presentation space.
  • Furthermore, the workstation device enables a virtual reference surface in the virtual three-dimensional scenario, for example a terrain surface, to be positioned in the same plane as the actual presentation unit or touch unit.
  • As a result, the distance of the virtual objects from the surface of the presentation unit can be reduced, thus reducing the conflict between convergence and accommodation for the viewer. Disturbing influences on the three-dimensional impression are also reduced, which arise when the operator reaches into the presentation space with one hand and the observer's eye thus perceives a real object, i.e. the operator's hand, and virtual objects at the same time.
  • The touch unit 120 is designed to output feedback to the operator when the touch unit is touched with the operator's hand. This can be done by a detection unit (not shown) detecting the touch coordinates on the touch unit, whereupon, for example, the display unit outputs optical feedback or a sound output unit (not shown) outputs acoustic feedback.
  • In addition, the touch unit can output tactile feedback by means of vibration or the oscillation of piezo actuators.
  • Fig. 2 shows a workstation device 200 with a display device 100, which is designed to display a three-dimensional virtual scenario, furthermore with three conventional display elements 210, 211, 212 for the two-dimensional display of graphics and information, and with two conventional input/interaction devices, such as a computer mouse 171 and a so-called space mouse 170, an interaction device with six degrees of freedom with which elements in space, for example in a three-dimensional scenario, can be controlled.
  • The three-dimensional impression of the presentation device 100 arises for a viewer with the aid of a pair of glasses 160.
  • The glasses are designed to provide the eyes of the viewer with different images, giving the viewer the impression of a three-dimensional scenario.
  • The glasses 160 have a multiplicity of so-called reflectors 161, which serve to determine the eye position of a viewer in front of the presentation device 100 and thus, if necessary, to adapt the reproduction of the three-dimensional virtual scene to the position of the observer.
  • For this purpose the workstation device 200 can, for example, have a position detection unit (not shown) which detects the eye position from the position of the reflectors 161, for example by means of a camera system with a plurality of cameras.
  • Fig. 3 shows a perspective view of a display device 100 with a display unit 110 and a touch unit 120, wherein the display unit 110 has a first display area 111 and a second display area 112.
  • A selection area 302 is indicated for each virtual object 301 in the presentation space 130.
  • Each selection area 302 may be connected to the virtual object 301 assigned to it via a selection element 303.
  • The selection element 303 makes it easier for a user to assign a selection area 302 to a virtual object 301.
  • The touch unit 120 is designed to transmit the touch coordinates of the operator's finger to an evaluation unit, which matches the touch coordinates with the representation coordinates of the selection areas 302 and can thus determine the selected virtual object.
  • The touch unit 120 may be configured to respond to the operator's touch only at the locations where a selection area is displayed. This allows the operator to rest his hands on the touch unit in such a way that no selection area is touched; resting the hands can prevent operator fatigue and facilitates an easy interaction with the virtual scenario.
  • The described construction of the presentation device 100 thus enables an operator to interact with a virtual three-dimensional scene and to receive real feedback on this interaction.
  • The successful selection of a virtual object 301 can be signaled to the operator, for example, by a vibration of the touch unit 120.
  • Alternatively, the touch unit 120 can be vibrated only over the extent of the selected selection area 302. This can be achieved, for example, by the use of oscillating piezo actuators in the touch unit, the piezo actuators being made to oscillate at the appropriate position after the touch coordinates have been detected.
  • The virtual objects can also be selected by means of a selection element in the form of a vertical plane which, when the touch unit 120 is touched, runs through the virtual three-dimensional scene at the touch position.
  • A virtual object 301 is then selected by bringing the selection element into coincidence with the virtual object to be selected. In order to avoid an accidental selection of a virtual object, the selection can take place with a delay, such that a virtual object is not selected until the selection element has remained in coincidence with it for a certain time.
  • Fig. 4 shows a display device 100 with a display unit 110 and a touch unit 120.
  • In a first display area 111, an overview area is displayed in a two-dimensional representation, and a partial area 401 of the overview area is displayed in detail in a presentation space 130 as a three-dimensional scenario 402.
  • The objects located in the partial section of the overview area are displayed as virtual three-dimensional objects 301.
  • The display device 100 as described above and below enables the operator to change the detail area 402 by moving the partial area 401 in the overview area, or by shifting the extract of the overview area shown three-dimensionally in the detail area 402 in the direction of at least one of the three indicated coordinates x, y or z.
  • Fig. 5 shows a workstation device 200 with a display device 100 for a three-dimensional virtual scenario, with a presentation unit 110 and a touch unit 120, which together with the eyes of the operator 501 span the presentation space 130 in which the virtual objects 301 of the three-dimensional virtual scenario are displayed.
  • Fig. 6 shows a display device 100 for a three-dimensional virtual scenario with a display unit 110 and a touch unit 120. In the presentation space 130, virtual three-dimensional objects 301 are displayed.
  • In the presentation space, a virtual surface 601 is arranged on which a marking element 602 can be moved.
  • The marking element 602 moves only on the virtual surface 601, so that it has two degrees of freedom in its movement. In other words, the marking element 602 is designed to perform a two-dimensional movement.
  • The selection of a virtual object in the three-dimensional scenario takes place in that the position of at least one eye 503 of the user is detected by means of the reflectors 161 on a pair of glasses worn by the user, and a connecting line 504 from the determined position of the eye 503 to the marking element 602 is determined.
  • Obviously, the position of the user's eyes can be determined with glasses carrying corresponding reflectors or without glasses; any mechanisms and methods for determining the position of the eyes can be used in the context of the invention.
  • The selection of a virtual object 301 in the three-dimensional scenario is carried out by extending the connecting line 504 into the presentation space 130 and selecting the virtual object whose virtual coordinates are crossed by the connecting line 504.
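  • A sketch of this line-of-sight picking, assuming point-like object coordinates and a small hit radius; eye and marker positions are taken as given by the tracking described above, and all names are illustrative:

```python
import numpy as np

def pick_along_gaze(eye, marker, objects, hit_radius=0.05):
    """Extend the connecting line from the tracked eye position through
    the marking element into the presentation space and return the first
    virtual object whose coordinates the line crosses."""
    eye, marker = np.asarray(eye, float), np.asarray(marker, float)
    direction = marker - eye
    direction /= np.linalg.norm(direction)
    best, best_t = None, np.inf
    for obj_id, pos in objects.items():
        rel = np.asarray(pos, float) - eye
        t = rel @ direction                     # distance along the ray
        if t <= 0:
            continue                            # behind the eye
        if np.linalg.norm(rel - t * direction) <= hit_radius and t < best_t:
            best, best_t = obj_id, t
    return best

objects = {"AC101": (0.2, 0.1, 0.6), "AC205": (0.5, 0.4, 0.9)}
eye = (0.0, 0.0, 0.0)      # from the reflector-based eye tracking
marker = (0.1, 0.05, 0.3)  # marking element on the virtual surface
print(pick_along_gaze(eye, marker, objects))
# -> AC101; the marking element could now snap to this object's coordinates
```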
  • The selection of a virtual object 301 is then indicated, for example, by means of a selection indicator 603.
  • The virtual surface 601, on which the marking element 602 moves, can be arranged in the presentation space 130 in such a way that, from the user's perspective, virtual objects 301 are located in front of and/or behind the virtual surface 601.
  • If the marking element 602 is moved on the virtual surface 601 so that the connecting line 504 crosses the coordinates of a virtual object 301, the marking element 602 can be represented in the three-dimensional scenario as taking on the three-dimensional coordinates of the selected object. From the user's point of view, this change appears such that, as soon as a virtual object 301 is selected, the marking element 602 makes a spatial movement toward the user or away from the user.
  • This type of selection can replace interaction devices such as a computer mouse, since an input device with fewer degrees of freedom can represent a simpler and faster-to-learn interaction with a three-dimensional scenario.
  • Fig. 7 shows a schematic view of a method for selecting objects in a three-dimensional virtual scenario according to an embodiment of the invention.
  • In a first step 701, a selection area of a virtual object is touched in a display area of a three-dimensional virtual scenario.
  • The selection area is coupled to the virtual object in such a way that touching the selection area allows a unique determination of the correspondingly selected virtual object.
  • In a second step 702, the visualization of a selection element takes place in the three-dimensional virtual scenario.
  • The selection element can be, for example, a light cylinder running vertically through the three-dimensional virtual scenario.
  • The selection element can be visualized as a function of the contact duration on the selection area, i.e. the selection element is visualized as soon as a user touches the selection area and can be hidden again as soon as the user removes his finger from the selection area. This allows the user to pause or abort a selection operation of a virtual object, for example because he determines that he wishes to select another virtual object.
  • In a third step 703, the selection element is moved according to a finger movement of the operator on the presentation surface: if the finger is not removed from the touch unit, the visualized selection element persists in the virtual scenario and can be moved within it by moving the finger on the display surface or touch unit. This allows a user to select a virtual object by gradually approaching the selection element to the virtual object to be selected.
  • In a fourth step 704, an object is selected in the three-dimensional scenario by bringing the selection element into coincidence with the object to be selected.
  • The selection of the object can take place, for example, in that the selection element is kept in coincidence with the object to be selected for a certain time, for example one second.
  • The period of time after which a virtual object is indicated as selected can be set arbitrarily.
  • In a fifth step 705, feedback is output to the operator after the virtual object has been selected.
  • The feedback can be haptic/tactile, optical or acoustic, as described above.
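  • The five steps of the method can be summarized as a small state machine; this is a sketch under the assumption that steps 702, 703 and 705 follow the numbering of steps 701 and 704 given above, and the callback stands in for the tactile, optical or acoustic feedback channel:

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()        # no finger on the touch unit
    SELECTING = auto()   # selection element visualized and movable
    SELECTED = auto()    # object chosen, feedback issued

class SelectionMethod:
    """State machine mirroring steps 701-705: touch, visualize, move,
    select by coincidence, then feedback."""

    def __init__(self, feedback):
        self.phase = Phase.IDLE
        self.feedback = feedback  # stands in for the feedback channel

    def touch_down(self, in_selection_area: bool):   # step 701
        if in_selection_area:
            self.phase = Phase.SELECTING             # step 702: visualize element

    def move(self, coincides_with, dwell_s: float):  # step 703
        if self.phase is Phase.SELECTING and coincides_with and dwell_s >= 1.0:
            self.phase = Phase.SELECTED              # step 704: selection
            self.feedback(coincides_with)            # step 705: feedback

    def touch_up(self):
        if self.phase is Phase.SELECTING:            # aborting hides the element
            self.phase = Phase.IDLE

m = SelectionMethod(feedback=lambda obj: print(f"selected {obj}"))
m.touch_down(in_selection_area=True)
m.move(coincides_with="AC101", dwell_s=1.2)  # -> selected AC101
```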

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)
EP12780399.7A 2011-09-08 2012-09-06 Interaktion mit einem dreidimensionalen virtuellen szenario Ceased EP2753951A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011112618A DE102011112618A1 (de) 2011-09-08 2011-09-08 Interaktion mit einem dreidimensionalen virtuellen Szenario
PCT/DE2012/000892 WO2013034133A1 (de) 2011-09-08 2012-09-06 Interaktion mit einem dreidimensionalen virtuellen szenario

Publications (1)

Publication Number Publication Date
EP2753951A1 true EP2753951A1 (de) 2014-07-16

Family

ID=47115084

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12780399.7A Ceased EP2753951A1 (de) 2011-09-08 2012-09-06 Interaktion mit einem dreidimensionalen virtuellen szenario

Country Status (7)

Country Link
US (1) US20140282267A1 (en)
EP (1) EP2753951A1 (de)
KR (1) KR20140071365A (ko)
CA (1) CA2847425C (en)
DE (1) DE102011112618A1 (de)
RU (1) RU2604430C2 (ru)
WO (1) WO2013034133A1 (de)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2976681B1 (fr) * 2011-06-17 2013-07-12 Inst Nat Rech Inf Automat Système de colocalisation d'un écran tactile et d'un objet virtuel et dispositif pour la manipulation d'objets virtuels mettant en oeuvre un tel système
JP2015132888A (ja) * 2014-01-09 2015-07-23 キヤノン株式会社 表示制御装置及び表示制御方法、プログラム、並びに記憶媒体
DE102014107220A1 (de) * 2014-05-22 2015-11-26 Atlas Elektronik Gmbh Eingabevorrichtung, Rechner oder Bedienanlage sowie Fahrzeug
US10140776B2 (en) 2016-06-13 2018-11-27 Microsoft Technology Licensing, Llc Altering properties of rendered objects via control points
DE102017117223A1 (de) * 2017-07-31 2019-01-31 Hamm Ag Arbeitsmaschine, insbesondere Nutzfahrzeug

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100007636A1 (en) * 2006-10-02 2010-01-14 Pioneer Corporation Image display device
WO2011044936A1 (en) * 2009-10-14 2011-04-21 Nokia Corporation Autostereoscopic rendering and display apparatus

Family Cites Families (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5320538A (en) * 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method
US5394202A (en) * 1993-01-14 1995-02-28 Sun Microsystems, Inc. Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US7225404B1 (en) * 1996-04-04 2007-05-29 Massachusetts Institute Of Technology Method and apparatus for determining forces to be applied to a user through a haptic interface
US6302542B1 (en) * 1996-08-23 2001-10-16 Che-Chih Tsao Moving screen projection technique for volumetric three-dimensional display
JP2985847B2 (ja) * 1997-10-17 1999-12-06 日本電気株式会社 入力装置
US6031519A (en) * 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
US6377229B1 (en) * 1998-04-20 2002-04-23 Dimensional Media Associates, Inc. Multi-planar volumetric display system and method of operation using three-dimensional anti-aliasing
US6429846B2 (en) * 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6064354A (en) * 1998-07-01 2000-05-16 Deluca; Michael Joseph Stereoscopic user interface method and apparatus
US6373463B1 (en) * 1998-10-14 2002-04-16 Honeywell International Inc. Cursor control system with tactile feedback
US6842175B1 (en) * 1999-04-22 2005-01-11 Fraunhofer Usa, Inc. Tools for interacting with virtual environments
US6727924B1 (en) * 2000-10-17 2004-04-27 Novint Technologies, Inc. Human-computer interface including efficient three-dimensional controls
US20020175911A1 (en) * 2001-05-22 2002-11-28 Light John J. Selecting a target object in three-dimensional space
US7190365B2 (en) * 2001-09-06 2007-03-13 Schlumberger Technology Corporation Method for navigating in a multi-scale three-dimensional scene
US6753847B2 (en) * 2002-01-25 2004-06-22 Silicon Graphics, Inc. Three dimensional volumetric display input and output configurations
US7324085B2 (en) * 2002-01-25 2008-01-29 Autodesk, Inc. Techniques for pointing to locations within a volumetric display
GB0204652D0 (en) * 2002-02-28 2002-04-10 Koninkl Philips Electronics Nv A method of providing a display gor a gui
US6968511B1 (en) * 2002-03-07 2005-11-22 Microsoft Corporation Graphical user interface, data structure and associated method for cluster-based document management
JP2004199496A (ja) * 2002-12-19 2004-07-15 Sony Corp 情報処理装置および方法、並びにプログラム
JP2004334590A (ja) * 2003-05-08 2004-11-25 Denso Corp 操作入力装置
JP4576131B2 (ja) * 2004-02-19 2010-11-04 パイオニア株式会社 立体的二次元画像表示装置及び立体的二次元画像表示方法
KR20050102803A (ko) * 2004-04-23 2005-10-27 삼성전자주식회사 가상입력장치, 시스템 및 방법
EP1781893A1 (en) * 2004-06-01 2007-05-09 Michael A. Vesely Horizontal perspective simulator
US7348997B1 (en) * 2004-07-21 2008-03-25 United States Of America As Represented By The Secretary Of The Navy Object selection in a computer-generated 3D environment
JP2006053678A (ja) * 2004-08-10 2006-02-23 Toshiba Corp ユニバーサルヒューマンインタフェースを有する電子機器
EP1667088B1 (en) * 2004-11-30 2009-11-04 Oculus Info Inc. System and method for interactive 3D air regions
WO2006081198A2 (en) * 2005-01-25 2006-08-03 The Board Of Trustees Of The University Of Illinois Compact haptic and augmented virtual reality system
US20060267927A1 (en) * 2005-05-27 2006-11-30 Crenshaw James E User interface controller method and apparatus for a handheld electronic device
US20070064199A1 (en) * 2005-09-19 2007-03-22 Schindler Jon L Projection display device
US7834850B2 (en) * 2005-11-29 2010-11-16 Navisense Method and system for object control
US8384665B1 (en) * 2006-07-14 2013-02-26 Ailive, Inc. Method and system for making a selection in 3D virtual environment
JP4111231B2 (ja) * 2006-07-14 2008-07-02 富士ゼロックス株式会社 立体表示システム
KR100851977B1 (ko) * 2006-11-20 2008-08-12 삼성전자주식회사 가상 평면을 이용하여 전자 기기의 사용자 인터페이스를제어하는 방법 및 장치.
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
KR101136231B1 (ko) * 2007-07-30 2012-04-17 도쿠리츠 교세이 호진 죠호 츠신 켄큐 키코 다시점 공중 영상 표시 장치
RU71008U1 (ru) * 2007-08-23 2008-02-20 Дмитрий Анатольевич Орешин Оптическая система объемного изображения
WO2009044437A1 (ja) * 2007-10-01 2009-04-09 Pioneer Corporation 画像表示装置
US20090112387A1 (en) * 2007-10-30 2009-04-30 Kabalkin Darin G Unmanned Vehicle Control Station
US8233206B2 (en) * 2008-03-18 2012-07-31 Zebra Imaging, Inc. User interaction with holographic images
JP4719929B2 (ja) * 2009-03-31 2011-07-06 Necカシオモバイルコミュニケーションズ株式会社 表示装置、および、プログラム
US8896527B2 (en) * 2009-04-07 2014-11-25 Samsung Electronics Co., Ltd. Multi-resolution pointing system
US8760391B2 (en) * 2009-05-22 2014-06-24 Robert W. Hawkins Input cueing emersion system and method
JP5614014B2 (ja) * 2009-09-04 2014-10-29 ソニー株式会社 情報処理装置、表示制御方法及び表示制御プログラム
CN102741781A (zh) * 2009-12-04 2012-10-17 奈克斯特控股公司 用于位置探测的传感器方法和系统
KR101114750B1 (ko) * 2010-01-29 2012-03-05 주식회사 팬택 다차원 영상을 이용한 사용자 인터페이스 장치
US9693039B2 (en) * 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
US20120005624A1 (en) * 2010-07-02 2012-01-05 Vesely Michael A User Interface Elements for Use within a Three Dimensional Scene
US8643569B2 (en) * 2010-07-14 2014-02-04 Zspace, Inc. Tools for use within a three dimensional scene
US8970484B2 (en) * 2010-07-23 2015-03-03 Nec Corporation Three dimensional display device and three dimensional display method
US20120069143A1 (en) * 2010-09-20 2012-03-22 Joseph Yao Hua Chu Object tracking and highlighting in stereoscopic images
US8836755B2 (en) * 2010-10-04 2014-09-16 Disney Enterprises, Inc. Two dimensional media combiner for creating three dimensional displays
US9001053B2 (en) * 2010-10-28 2015-04-07 Honeywell International Inc. Display system for controlling a selector symbol within an image
US20120113223A1 (en) * 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality
JP5671349B2 (ja) * 2011-01-06 2015-02-18 任天堂株式会社 画像処理プログラム、画像処理装置、画像処理システム、および画像処理方法
US8319746B1 (en) * 2011-07-22 2012-11-27 Google Inc. Systems and methods for removing electrical noise from a touchpad signal

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100007636A1 (en) * 2006-10-02 2010-01-14 Pioneer Corporation Image display device
WO2011044936A1 (en) * 2009-10-14 2011-04-21 Nokia Corporation Autostereoscopic rendering and display apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2013034133A1 *

Also Published As

Publication number Publication date
KR20140071365A (ko) 2014-06-11
RU2014113395A (ru) 2015-10-20
DE102011112618A1 (de) 2013-03-14
WO2013034133A1 (de) 2013-03-14
CA2847425C (en) 2020-04-14
RU2604430C2 (ru) 2016-12-10
CA2847425A1 (en) 2013-03-14
US20140282267A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
EP2754298B1 (de) Auswahl von objekten in einem dreidimensionalen virtuellen szenario
DE102019002898B4 (de) Robotersimulationsvorrichtung
EP3067874A1 (de) Verfahren und vorrichtung zum testen eines in einem luftfahrzeug zu bedienenden geräts
DE102018109463C5 (de) Verfahren zur Benutzung einer mehrgliedrigen aktuierten Kinematik, vorzugsweise eines Roboters, besonders vorzugsweise eines Knickarmroboters, durch einen Benutzer mittels einer mobilen Anzeigevorrichtung
DE60302063T2 (de) Graphische benutzeroberfläche für einen flugsimulator basierend auf einer client-server-architektur
EP3458939B1 (de) Interaktionssystem und -verfahren
DE69631947T2 (de) Positionierung eines Eingabezeigers
EP2196892B1 (de) Verfahren und Vorrichtung zum Anzeigen von Informationen
EP3709133B1 (de) System zur haptischen interaktion mit virtuellen objekten für anwendungen in der virtuellen realität
DE112017005059T5 (de) System und verfahren zum projizieren graphischer objekte
EP2753951A1 (de) Interaktion mit einem dreidimensionalen virtuellen szenario
EP3507681A1 (de) Verfahren zur interaktion mit bildinhalten, die auf einer anzeigevorrichtung in einem fahrzeug dargestellt werden
DE102021122362A1 (de) Wechsel zwischen zuständen in einer hybriden virtual-reality-desktop-rechenumgebung
DE19704677A1 (de) Verfahren und Einrichtung zum automatischen Erzeugen und Manipulieren eines dynamischen Kompaß-Cursors
EP1665023B1 (de) Verfahren und vorrichtung zur steuerung eines graphiksystems der virtuellen realität mittels interaktionen
WO2020126240A1 (de) Verfahren zum betreiben eines feldgeräts der automatisierungstechnik in einer augmented-reality/mixed-reality-umgebung
DE102004021379B4 (de) Bedien- und Beobachtungssystem für industrielle Anlagen und Verfahren
WO2013034129A2 (de) Kooperativer 3d-arbeitsplatz
EP3534240A1 (de) Verfahren und vorrichtung zur daten-annotation
DE102018212944A1 (de) Verfahren zur Unterstützung der Kollaboration zwischen Mensch und Roboter mittels einer Datenbrille
DE112019003579T5 (de) Informationsverarbeitungseinrichtung, programm undinformationsverarbeitungsverfahren
DE102019131740A1 (de) Verfahren und Anzeigevorrichtung zur Erzeugung eines Tiefeneffekts in der Perspektive eines Beobachters auf einem flachen Anzeigemedium sowie Kraftfahrzeug
DE102013211046A1 (de) Verfahren und Vorrichtung zum Gewinnen eines Stellsignals aus einer Bediengeste
DE102021212928B4 (de) Verfahren, Computerprogramm und Vorrichtung zum Erproben eines Einbaus oder Ausbaus zumindest eines Bauteils
DE102016106993A1 (de) Steuer- und Konfiguriereinheit sowie Verfahren zum Steuern und Konfigurieren eines Mikroskops

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140401

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: VOGELMEIER, LEONHARD

Inventor name: WITTMANN, DAVID

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: AIRBUS DEFENCE AND SPACE GMBH

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20160617

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20210830