EP2764698A2 - Cooperative 3D workstation - Google Patents

Cooperative 3D workstation

Info

Publication number
EP2764698A2
EP2764698A2 (application EP12783853.0A)
Authority
EP
European Patent Office
Prior art keywords
display device
operator
dimensional
holographic
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12783853.0A
Other languages
German (de)
English (en)
Inventor
Leonhard Vogelmeier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airbus Defence and Space GmbH
Original Assignee
EADS Deutschland GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by EADS Deutschland GmbH filed Critical EADS Deutschland GmbH
Publication of EP2764698A2
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/32 Holograms used as optical elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00 Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/26 Processes or apparatus specially adapted to produce multiple sub-holograms or to obtain images from them, e.g. multicolour technique
    • G03H2001/2605 Arrangement of the sub-holograms, e.g. partial overlapping
    • G03H2001/261 Arrangement of the sub-holograms, e.g. partial overlapping in optical contact
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/024 Multi-user, collaborative environment

Definitions

  • The invention relates to the representation of, and the interaction with, a three-dimensional scenario.
  • In particular, the invention relates to a display device for displaying a three-dimensional scenario and to a presentation device for the cooperative processing of the three-dimensional scenario by a plurality of operators.
  • Stereoscopic systems with individual imaging for one or more users are known.
  • In such systems, individual images for the viewer's left eye and right eye are displayed on a screen.
  • Each eye perceives only the image intended for it, so that the viewer fuses the two images perceived by the eyes into the impression of a spatial view.
  • This separation of the images for the respective eyes of the viewer can be achieved, for example, by the use of prisms, which refract the light so that each eye looks at a different image.
  • Alternatively, glasses with differently polarized lenses for the user or viewer are known.
  • With such systems, a second viewer, or each additional viewer, can easily be allowed to view the three-dimensional scenario, as long as all viewers observe the same scenario.
  • If each viewer is to have an individual view, a display must present, in addition to the two images for the first viewer, two further images for the second viewer. It must then be ensured not only that each eye sees its particular image, and only this image, but also that the images are kept separate between the viewers.
  • It is an object of the invention to provide a device for displaying a three-dimensional scenario which is scalable to a plurality of users or viewers, so that the plurality of viewers can interact with the displayed three-dimensional scenario simultaneously and cooperatively, without loss of display quality, while each viewer nevertheless has an individual view of the three-dimensional scenario.
  • According to one aspect, a display device for displaying a three-dimensional scenario is specified, with a first projection device and a second projection device and a holographic display device with a first holographic unit and a second holographic unit.
  • The first projection device and the second projection device are designed to project a first image and a second image, respectively, onto the holographic display device.
  • The first holographic unit and the second holographic unit are configured to scatter the first image and the second image, respectively, such that a first eye of an operator perceives the first image and a second eye of the operator perceives the second image, giving the operator the impression of a three-dimensional scenario.
  • The holographic display device may, for example, be a display pane, and the first and the second holographic unit may, for example, each be a hologram applied to it.
  • The first and second projection devices may, for example, be laser projectors.
  • The holograms may be set up so that each receives only the light of one of the two projection devices and directs or scatters it in a specific direction, such that the corresponding image can be seen only from a certain angle.
  • The first projection device is arranged to project light onto the first hologram so that it impinges on the first hologram at the corresponding angle of incidence.
  • The first holographic unit may be designed to direct the light of the first projection device into a first half-space, and the second holographic unit may be configured to direct the light of the second projection device into a second half-space.
  • For a viewer of the holographic display device, this creates the impression of a three-dimensional scenario, with the first eye of the viewer perceiving the first image and the second eye perceiving the second image.
  • This requires that the first eye is located in the first half-space and the second eye in the second half-space.
  • According to one embodiment, the display device further comprises a detection unit and an actuator. The detection unit is designed to determine a position of the first eye and the second eye of the operator.
  • The actuator is designed to move the holographic display device such that the viewing direction of the operator in a sagittal plane of the display device strikes the display device perpendicularly.
  • The viewing direction of the operator in the sagittal plane can also strike the display device at any other angle, as long as this angle corresponds to the emission direction of the images through the holograms, so that each eye can perceive the image assigned to it.
  • Here, a half-space designates a spatial region on the viewing side of the holographic display device in which only the image of the first projection device or only the image of the second projection device can be perceived by the eye located in that half-space.
  • The arrangement of the first half-space and the second half-space is predetermined by the arrangement of the eyes of the operator, the first half-space and the second half-space being horizontally offset from each other so that one eye of the operator is located in each of the first half-space and the second half-space.
  • The first holographic unit and the second holographic unit may be designed such that the positioning of the first half-space and the second half-space is adjusted depending on an eye distance between the first eye and the second eye of the operator and/or depending on a distance of the operator from the holographic display device.
  • The display device can also be moved so that the distance of the operator's eyes from the display device remains substantially constant. In other words, the display device can be moved in one direction towards the operator and away from the operator.
  • It is also possible for the first holographic unit or the second holographic unit to be designed so that they can be adjusted to vertical movements of the operator, so that perception of the first image and the second image remains possible despite a fluctuating distance of the operator's eyes from the display device.
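The half-space geometry just described lends itself to a brief illustration. The following Python sketch is not part of the patent; the function names, the simplified flat geometry, and all numeric values are assumptions made for illustration only. It computes the scatter directions the two holographic units would need so that each image reaches the half-space containing the corresponding eye, and the display translation that restores a nominal eye-to-display distance.

```python
import math

# Illustrative only, not from the patent text. Simplified flat geometry:
# the display normal points at the midpoint between the operator's eyes.

def half_space_directions_deg(ipd_m, viewer_dist_m):
    """Angles (degrees, relative to the display normal) under which the two
    holographic units would scatter their images so that each image reaches
    the half-space containing the corresponding eye."""
    a = math.degrees(math.atan2(ipd_m / 2.0, viewer_dist_m))
    return (-a, +a)  # (towards the left eye, towards the right eye)

def display_translation_m(current_eye_dist_m, nominal_eye_dist_m):
    """Signed move of the display along the viewing direction that restores
    the nominal eye-to-display distance (positive = towards the operator)."""
    return current_eye_dist_m - nominal_eye_dist_m
```

For an assumed interpupillary distance of about 6.4 cm and a viewing distance of 0.8 m, the two scatter directions differ by only a few degrees, which illustrates why even a small horizontal head movement can carry an eye out of its half-space.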
  • To improve the ease of use of the display device, the holographic display device can be rotated by means of the actuator about a vertical axis of the display device, so that the eyes of the operator, regardless of their horizontal position, always remain in the half-space assigned to the respective eye.
  • The sagittal plane of the display device is spanned by the viewing direction of the operator and a horizontal axis of the display device.
  • A vertical movement of the operator's eyes corresponds to a change in the inclination of the sagittal plane with respect to the display device, in which the sagittal plane rotates about the horizontal axis of the display device.
  • A movement of the operator's eyes in a horizontal direction, however, can cause an eye to leave its assigned half-space, thus disturbing the impression of the three-dimensional scenario.
  • To maintain the three-dimensional impression for the operator, the holographic display device must then be rotated about its vertical axis so that the extent of the first half-space and the second half-space is brought into alignment with the first eye and the second eye of the operator.
  • This angle can be an arbitrary angle; it is crucial only that it remains constant during a horizontal movement of the operator's eyes.
  • According to one embodiment, the viewing direction in the sagittal plane forms a right angle with the display device, i.e. an angle of 90 degrees.
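The tracking rotation can be sketched as follows. This is an illustration under assumed conventions (display normal along the z axis, horizontal offset along x), not an implementation from the patent:

```python
import math

# Assumed coordinate convention: the display normal points along +z and the
# operator's horizontal offset is measured along x, both in the display frame.

def required_yaw_deg(eye_mid_x_m, eye_mid_z_m):
    """Yaw angle (degrees) by which the actuator would rotate the display
    about its vertical axis so that the display normal again points at the
    midpoint between the operator's eyes, keeping the sagittal-plane viewing
    angle at its predetermined value (here: perpendicular)."""
    return math.degrees(math.atan2(eye_mid_x_m, eye_mid_z_m))
```

An operator standing directly in front of the display needs no rotation; an operator offset sideways by the same amount as his distance from the display would require a 45 degree rotation.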
  • According to one embodiment, the detection unit has at least one camera.
  • The camera may be configured to determine the position of at least one eye of the operator, so as to instruct the actuator to move the holographic display device into a position, i.e. to rotate it about its vertical axis, so that the operator perceives the first image with the first eye and the second image of the holographic display device with the second eye.
  • the camera can also be designed to follow a clearly identifiable object, which is located, for example, next to one of the operator's eyes.
  • This uniquely identifiable object may be, for example, a sticker with a visual coding feature, such as a bar code.
  • the holographic display device is configured to display a pointing element.
  • The operator can interact with the three-dimensional scenario in that a connecting line is formed from the first eye or the second eye of the operator via the pointing element into the three-dimensional scenario.
  • The connecting line can also be formed as the mean of the two connecting lines from the left eye and the right eye via the pointing element into the three-dimensional scenario.
  • the calculated connecting line from an operator's eye to the pointing element can be used to determine which element was selected in the three-dimensional scenario.
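The connecting-line construction can be illustrated with a few lines of code. The representation of points as plain 3-tuples and all function names are assumptions for illustration; the patent itself prescribes no implementation:

```python
# Illustrative sketch of the connecting line from an eye via the pointing
# element into the scenario. Points are plain (x, y, z) tuples; t = 0 is the
# eye, t = 1 the pointing element, t > 1 lies beyond it inside the scenario.

def point_on_line(eye, pointer, t):
    """Point at parameter t on the line from `eye` through `pointer`."""
    return tuple(e + t * (p - e) for e, p in zip(eye, pointer))

def mean_line_direction(left_eye, right_eye, pointer):
    """Direction of the averaged connecting line, formed as the mean of the
    two lines from the left eye and the right eye via the pointing element."""
    d_l = tuple(p - e for e, p in zip(left_eye, pointer))
    d_r = tuple(p - e for e, p in zip(right_eye, pointer))
    return tuple((a + b) / 2.0 for a, b in zip(d_l, d_r))
```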
  • According to one embodiment, the pointing element is placed on the holographic display device by the operator touching the display device with a finger.
  • For this purpose, the holographic display device may include a scanning device; by means of Frustrated Total Internal Reflection (FTIR), for example, the position of a finger on the holographic display device can be determined.
  • the pointing element is placed on the holographic display device by the detection unit detecting a position of the finger of the operator.
  • The detection unit can determine the position of the finger in a manner essentially analogous to the detection of the eye position described in detail above.
  • For this purpose, the detection unit may comprise a plurality of detection elements, of which a first group of several detection elements is designed for detecting the position of the operator's eyes and a second group of several detection elements for detecting the position of the operator's finger.
  • The positioning of the pointing element on the display device can also be done by means of an input device, such as a computer mouse or a trackball, or via a keyboard with arrow keys, or any other input device.
  • According to one embodiment, the display device further comprises a two-dimensional display element which is designed to provide the operator with information in graphical and written form.
  • The information displayed on the two-dimensional display element can be any information that cannot or must not be presented in the three-dimensional scenario.
  • If the three-dimensional scenario is, for example, an airspace to be monitored with aircraft in it, information on a selected aircraft, such as its speed, altitude, weather data or other data, may be displayed on the two-dimensional display element.
  • the two-dimensional display element may be touch-sensitive.
  • According to a further aspect, a presentation device for a three-dimensional scenario for the cooperative processing of the three-dimensional scenario by a plurality of operators is specified, which has a plurality of display devices as described above and below, at least one display device being assigned to each operator.
  • The presentation device thus enables the joint and cooperative processing of a scenario by multiple operators.
  • Here, the display devices can be arranged spatially separated or adjacent to one another. For example, a plurality of display devices can be arranged side by side at a workstation, for example a table, and thus, in addition to the common interaction of the operators with the three-dimensional scenario, also allow immediate communication between the operators.
  • However, the display devices can also be arranged spatially separated from each other and still enable the joint cooperative editing of a three-dimensional scenario.
  • For this purpose, each display device can be controlled via a decentralized computing device, and the decentralized computing device may be connected to a central computing system, the central computing system merely performing the control and coordination of the plurality of decentralized computing devices.
  • The design of the display devices according to the invention makes it possible to scale the number of operators as desired. For example, the presentation device can be designed for four, eight, twelve or any other number of operators, one parameter for determining the number of operators being the complexity and extent of the monitored scenario.
  • A central computer system controls the display devices in such a way that every operator gets the impression of a three-dimensional scenario.
  • The display device can be used, for example, for the cooperative mission planning of land, water and air vehicles, for the joint mission execution of several unmanned land, water and air vehicles by the respective vehicle operators, or for the cooperative monitoring of airspaces or national borders.
  • a display device can be extended so that each operator is provided with one or more display devices.
  • The central computing system controls the representations distributed to the individual display devices.
  • The presentation device can also be used for training and evaluation purposes.
  • According to one embodiment of the invention, each operator sees the airspace to be monitored from the viewpoint he would have if he were positioned relative to the real airspace as he is positioned relative to the three-dimensional scenario of the airspace.
  • This means, for example, that of four operators uniformly distributed around the presentation device, a first operator views the three-dimensional scenario of the airspace to be monitored from an easterly direction, a second operator from a southerly direction, a third operator from a westerly direction, and a fourth operator from a northerly direction.
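One conceivable way of realising these four natural viewing directions is a per-display rotation of the scenario coordinates. The compass-to-angle mapping below is an assumption chosen for illustration, not taken from the patent:

```python
import math

# Assumed mapping of compass viewing directions to yaw angles about the
# vertical (z) axis of the scenario; illustration only.
VIEW_YAW_DEG = {"north": 0.0, "east": 90.0, "south": 180.0, "west": 270.0}

def rotate_about_z(point, yaw_deg):
    """Rotate a scenario point (x, y, z) about the vertical axis by yaw_deg."""
    x, y, z = point
    a = math.radians(yaw_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def view_for_operator(point, compass):
    """Scenario point as seen on the display assigned to the operator who
    looks at the common scenario from the given compass direction."""
    return rotate_about_z(point, VIEW_YAW_DEG[compass])
```

Operators facing each other across the table then see mirror-opposite views of the same scenario, as one would expect when standing on opposite sides of a real miniature.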
  • each operator sees the three-dimensional scenario from a cloned perspective.
  • the cloned perspective corresponds to a given perspective of the operator on the three-dimensional scenario.
  • In other words, all operators, or a predefinable subset of the operators, can view the three-dimensional scenario from the same perspective.
  • In this way, an operator can be assisted by a second operator, in that the same view of the three-dimensional scenario is displayed to both operators on their respective display devices.
  • the individual perspective corresponds to a perspective view of the three-dimensional scenario that can be set as desired by each operator. In other words, this means that an operator can adjust his perspective as if he were moving freely in the space shown.
  • the operator may select and enlarge an area of the three-dimensional scenario, for example, to obtain increased detail of the presentation.
  • According to one embodiment, each operator is assigned a second display device, on which, for example, the view of an operator of a spatially remote display device can also be represented.
  • According to a further aspect, the display device as described above and below is used for the cooperative monitoring of, for example, airspaces.
  • a method for representing a three-dimensional scenario is provided.
  • In a first step, a respective first image and a second image are projected onto each holographic display device of a multiplicity of holographic display devices, so that the impression of a three-dimensional scenario arises for a viewer of each display device, each holographic display device representing the three-dimensional overall scenario from a specific perspective.
  • In a further step, the holographic display device is rotated about a vertical axis so that the viewing direction of the viewer in a sagittal plane of the display device falls on the holographic display device at a predetermined angle.
  • a method for cooperatively processing a three-dimensional scenario is provided.
  • In a first step, the eye position of a viewer of the three-dimensional overall scenario is detected.
  • In a further step, a reference point is set on the holographic display device.
  • In a further step, a connecting line is calculated from the eye position via the reference point into the three-dimensional overall scenario.
  • Fig. 1 shows a plan view of a display device according to an embodiment of the invention.
  • Fig. 2 shows a plan view of a display device according to another embodiment of the invention.
  • Fig. 3 shows an isometric view of a holographic display device according to an embodiment of the invention.
  • Fig. 4 shows an isometric view of a display device according to a further embodiment of the invention.
  • Fig. 5 shows a side view of a display device according to another embodiment of the invention.
  • Fig. 6 shows a plan view of a presentation device for a three-dimensional scenario for cooperative processing by a plurality of operators according to an embodiment of the invention.
  • Fig. 7 shows a schematic view of a method for the representation and cooperative processing of a three-dimensional overall scenario.
  • Fig. 1 shows a display device 100 according to an embodiment of the invention.
  • The display device has a first projection device 111 and a second projection device 112, as well as a holographic display device 130 with a first holographic unit 131 and a second holographic unit 132.
  • The first projection device 111 is designed to project an image onto the first holographic unit 131; the image of the first projection device is directed by the first holographic unit 131 in the direction of a first eye 121 of an operator, and the image of the second projection device 112 is directed by the second holographic unit 132 in the direction of a second eye 122 of the operator.
  • the different images perceived by the first eye and the second eye give the operator the impression of a three-dimensional scenario.
  • Furthermore, Fig. 1 shows a first half space 151 and a second half space 152, in which the first eye and the second eye, respectively, can be located without the impression of a three-dimensional scenario being disturbed. Only when the first eye or the second eye leaves the first half-space or the second half-space is this impression disturbed. Furthermore, the first half space and the second half space are limited by the fact that a vertical movement of the operator changes the distance of the operator's eyes from the display device, which can also interfere with the perception of the first image and/or the second image. To counteract this effect, the display device can be moved toward or away from the operator, so that a change in the distance of the operator's eyes from the display device is compensated.
  • Fig. 2 shows a display device 100 according to another embodiment of the invention.
  • the display device 100 has a holographic display device 130, an actuator 202 and a detection unit 220.
  • The detection unit 220 is designed to detect a position of the operator of the display device. In order to ensure that the operator's eyes perceive different images, giving the operator the impression of a three-dimensional scenario, it may be necessary, depending on the operator's position relative to the holographic display device 130, to rotate the holographic display device 130 about a vertical axis 135 along the directional arrow 136, so that each eye of the operator remains in the half-space assigned to it.
  • Fig. 3 shows a holographic display device 130 in an isometric view.
  • A sagittal plane 310 is spanned by a viewing direction 301 of the operator on the display device 130 and a horizontal axis 320 of the display device 130. The sagittal plane 310 thus intersects the display device 130 at an angle α 311. The viewing direction 301 in the sagittal plane 310 and the display device 130 enclose an angle β 321.
  • A change in the angle α 311 corresponds to a vertical movement of the operator in front of the display device 130.
  • a horizontal movement of the operator in front of the display device 130 causes at least one eye to leave the half-space provided for this eye and thus perceive either the wrong image or no image, so that the impression of the three-dimensional scenario is disturbed.
  • The holographic display device 130 is moved by the actuator 202 so that the angle β 321 constantly maintains a predetermined value.
  • Fig. 4 shows a display device 100 according to another embodiment of the invention. The display device has a holographic display device 130, a two-dimensional display element 430, a second holographic display device 230 and four cameras 221, 222, 223, 224, which form the detection unit for the position of the operator's eyes and/or the operator's finger.
  • the cameras can be designed to determine a positioning of the finger on the display device, but also to determine a positioning of the finger in the room.
  • Both the first display device 130 and the second display device 230 may be rotated about their respective vertical axis by an actuator (not shown) such that a viewing direction 301 falls on the display device 130 and the display device 230 at a constant predeterminable angle.
  • Fig. 5 shows a side view of a holographic display device 130 and a schematically represented three-dimensional scenario 550. The holographic display device 130 is configured to display a pointing element 510.
  • The pointing element 510 is movable on the display device 130. This can be done, for example, by touching the display device 130 with a finger or by moving or actuating an input element, such as a computer mouse.
  • A connecting line 511 is formed from the operator's eye 121, 122 via the pointing element 510 into the three-dimensional scenario 550.
  • Via this connecting line, the element 555 selected by the operator can be determined in the three-dimensional scenario.
  • the selected element 555 may be a single object or part of an object in the three-dimensional scenario.
  • a vehicle such as an aircraft, or any part of the vehicle, such as a wing or a rudder, may be selected.
  • The connecting line 511 corresponds to the viewing direction 301 of the operator on the display device 130, whereby the connecting line 511 and the display device 130 enclose the angle α 311.
  • A change in the angle α does not influence the operator's perception of the three-dimensional scenario.
  • Fig. 6 shows a presentation device 600 for the cooperative processing of the three-dimensional scenario by a plurality of operators according to an embodiment of the invention.
  • Each display device is assigned one operator 120.
  • Each operator 120 looks at the display device 100 assigned to him, so that each operator is given the impression of a three-dimensional scenario. From the operators' perspective, the situation arises as if the operators were viewing a common virtual three-dimensional overall scenario 610.
  • The presentation device shown in Fig. 6 thus enables the joint cooperative processing of a three-dimensional overall scenario by a plurality of operators, the processing of the three-dimensional scenario being assisted by the possibility of immediate communication between the operators.
  • FIG. 7 shows a method 700 for the representation and cooperative processing of a three-dimensional overall scenario according to an embodiment of the invention.
  • In a first step, a respective first image and a second image are projected onto each holographic display device of a multiplicity of holographic display devices, so that the impression of a three-dimensional scenario arises for a viewer of each display device, each holographic display device representing the three-dimensional overall scenario from a specific perspective.
  • In this case, the first image and the second image are each projected by a first projection device and a second projection device, respectively, onto a holographic display device. By projecting a plurality of first images and second images, one pair onto each of a plurality of holographic display devices, each of a plurality of operators can view the three-dimensional scenario from his own perspective.
  • Here, an operator's perspective on the three-dimensional scenario is composed of a pair of a first image and a second image, which are projected onto the display device assigned to that operator.
  • The provision of the first image and the second image for the plurality of display devices may be controlled, for example, by a central control system or central computer system.
  • For example, the central control system may be designed to carry out the various method steps described above and below.
  • The three-dimensional overall scenario can be represented such that a plurality of display devices arranged at one workstation show the three-dimensional overall scenario from a so-called natural perspective.
  • This means that a first display device shows the three-dimensional overall scenario from a first perspective, for example from the east, a second display device from a second perspective, for example from the south, a third display device from a third perspective, for example from the west, and a fourth display device from a fourth perspective, for example from the north.
  • The perspectives of the operators can thus correspond to the perspectives the operators would have on a miniature representation of the monitored scenario.
  • However, each display device can also show any perspective on the overall three-dimensional scenario.
  • In this case, for example, only the position of one eye, for example the left eye or the right eye, can be detected and the position of the right eye or the left eye of the operator inferred from it. Alternatively, the positions of both the right eye and the left eye of the operator can be detected, so as to accommodate the individual horizontal eye distance of different operators.
  • For this purpose, the central control system can have an operator identification, by means of which, after detecting the position of a first eye, the position of the second eye is determined from the individual horizontal eye distance known to the central control system.
  • The detection of the eye position can take place by means of a detection unit, for example one or a plurality of cameras, in which case the eye position can be determined by means of image recognition. Similarly, the eye position can be detected by placing a marker at a certain distance and angle from an eye, for example on the forehead of the operator. By detecting the position of the marker, the central control system determines the position of one eye or both eyes of the operator.
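The operator-identification step can be sketched as follows. The purely horizontal eye offset and the calibrated marker-to-eye vector are simplifying assumptions for illustration, not details given in the patent:

```python
# Illustrative sketch (not from the patent): inferring eye positions from one
# detected eye plus the operator's known horizontal eye distance, or from a
# tracked marker with a previously calibrated offset to the left eye.

def infer_other_eye(eye_pos, ipd_m, detected="left"):
    """Offset the detected eye horizontally (x axis) by the individual
    interpupillary distance to obtain the other eye's position."""
    x, y, z = eye_pos
    return (x + ipd_m, y, z) if detected == "left" else (x - ipd_m, y, z)

def eyes_from_marker(marker_pos, marker_to_left_eye, ipd_m):
    """Derive both eye positions from a marker (e.g. on the forehead) whose
    fixed offset to the left eye was calibrated beforehand."""
    left = tuple(m + o for m, o in zip(marker_pos, marker_to_left_eye))
    return left, infer_other_eye(left, ipd_m, detected="left")
```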
  • In a third step 703, the holographic display device is rotated about a vertical axis so that the viewing direction of the viewer in a sagittal plane of the display device falls on the holographic display device at a predetermined angle.
  • The rotation of the display device prevents one eye of the operator from perceiving no image, or both eyes from perceiving the same image, either of which would disturb the three-dimensional impression.
  • In a fourth step 704, a reference point is set on the holographic display device.
  • The reference point can be set as a point on the holographic display device. It is then a definition on a plane corresponding to the display device, i.e. by a two-dimensional coordinate.
  • the reference point can also be defined as a point in space.
  • For this purpose, in particular a finger of the operator can be used, which defines the reference point via a touch-sensitive detection layer on the display device.
  • Likewise, a position of a finger of the operator in space or on the display device can be determined by means of a detection system.
  • The finger position can basically be determined by the same methods as the detection of the operator's eye position described above.
  • Alternatively, the setting of the reference point can take place by movement of a conventional graphical pointing device, such as a computer mouse or a trackball, which is connected to the central control system as an input device.
  • In a fifth step 705, a connecting line from the eye position via the reference point into the three-dimensional overall scenario is calculated.
  • the central control system knows the position of at least one eye of the operator, which serves as the first point of the connecting line. Furthermore, the reference point is known, which serves as the second point of the connecting line.
  • the connecting line is extended into the three-dimensional scenario, thus identifying an object in this scenario, which is the Has targeted or selected operator, ie to which object the operator has shown.
  • Any available actions can then be applied to the selected object.
  • Via the central control system, the operator can be offered a selection of actions from a total set of actions.
  • The total set of actions may include all actions for an unmanned aerial vehicle, while the selection of actions comprises only those actions that are permissible in the specific situation.
  • In such a case, for example, an instruction to reduce the altitude may be inadmissible if the aircraft would thereby fall below the minimum altitude.
  • The device according to the invention and/or the method according to the invention thus allow simple, fast and intuitive cooperative handling of three-dimensional scenarios by multiple operators.
  • the number of operators can be any number, since the method according to the invention and the device according to the invention are arbitrarily scalable.
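The rotation step described above can be illustrated with a rough sketch: the yaw angle about the vertical axis that turns the display toward the operator follows from the horizontal eye and display positions. This is a minimal illustration under assumed coordinate conventions; the function name, the 2D ground-plane coordinates, and the angular offset parameter are assumptions for illustration, not taken from the patent.

```python
import math

def display_yaw_deg(eye_xy, display_xy, offset_deg=0.0):
    """Yaw angle (degrees, about the vertical axis) that orients the
    display toward the operator's eye position, plus a predetermined
    offset so the line of sight meets the holographic surface at the
    desired angle.

    eye_xy, display_xy: horizontal (x, y) positions of the eye and of
    the display centre in a common ground-plane coordinate system.
    """
    dx = eye_xy[0] - display_xy[0]
    dy = eye_xy[1] - display_xy[1]
    # Azimuth of the operator as seen from the display, plus the offset.
    return math.degrees(math.atan2(dy, dx)) + offset_deg
```

An operator standing on the display's x-axis yields a yaw of 0 degrees plus the configured offset; an operator on the y-axis yields 90 degrees plus the offset.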
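The selection step, extending a line from the detected eye position through the reference point into the scenario, can be sketched as a simple ray cast. The function name, the angular tolerance, and the point-like object model below are illustrative assumptions; the patent does not prescribe a particular intersection test.

```python
import numpy as np

def select_object(eye, reference, objects, max_angle_deg=2.0):
    """Pick the scenario object closest to the ray from `eye` through `reference`.

    eye, reference: 3-vectors in a common scene coordinate system.
    objects: mapping of name -> 3-vector object position.
    Returns the name of the object whose direction from the eye deviates
    least from the pointing ray, or None if every deviation exceeds the
    angular tolerance `max_angle_deg`.
    """
    eye = np.asarray(eye, dtype=float)
    direction = np.asarray(reference, dtype=float) - eye
    direction /= np.linalg.norm(direction)

    best_name, best_angle = None, np.inf
    for name, pos in objects.items():
        to_obj = np.asarray(pos, dtype=float) - eye
        to_obj /= np.linalg.norm(to_obj)
        # Angle between the pointing ray and the direction to the object.
        angle = np.degrees(np.arccos(np.clip(direction @ to_obj, -1.0, 1.0)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name if best_angle <= max_angle_deg else None
```

For example, with the eye at the origin and the reference point straight ahead, an object directly along that line is selected in preference to one off to the side.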
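Restricting the total set of actions to those permissible in the current situation, e.g. rejecting a descent that would fall below the minimum altitude, could be filtered along these lines. The action names, the minimum altitude, and the descent step are invented for illustration and are not specified in the patent.

```python
# Hypothetical total set of actions for an unmanned aerial vehicle.
ALL_ACTIONS = {"climb", "descend", "turn_left", "turn_right", "hold"}

MIN_ALTITUDE_M = 150.0   # assumed safety floor
DESCENT_STEP_M = 100.0   # assumed altitude change per descend command

def permissible_actions(current_altitude_m):
    """Return the subset of ALL_ACTIONS allowed in the current situation."""
    actions = set(ALL_ACTIONS)
    # A descend instruction is inadmissible if it would take the vehicle
    # below the minimum altitude.
    if current_altitude_m - DESCENT_STEP_M < MIN_ALTITUDE_M:
        actions.discard("descend")
    return actions
```

At a safe altitude the full set is offered; near the safety floor the descend action is withheld from the operator's selection.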

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Holo Graphy (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a display device (100) for representing a three-dimensional scenario, comprising a first (111) and a second projection device (112). The display device furthermore has a holographic display device (130) with a first (131) and a second holographic unit (132). The invention further relates to a representation apparatus (600) with a plurality of display devices for the cooperative representation and processing of a three-dimensional scenario (610) by a plurality of operators (120), each operator observing the three-dimensional scenario from an individually determinable perspective.
EP12783853.0A 2011-09-08 2012-09-04 Poste de travail 3d coopératif Withdrawn EP2764698A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011112617A DE102011112617A1 (de) 2011-09-08 2011-09-08 Kooperativer 3D-Arbeitsplatz
PCT/DE2012/000882 WO2013034129A2 (fr) 2011-09-08 2012-09-04 Poste de travail 3d coopératif

Publications (1)

Publication Number Publication Date
EP2764698A2 true EP2764698A2 (fr) 2014-08-13

Family

ID=47148549

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12783853.0A Withdrawn EP2764698A2 (fr) 2011-09-08 2012-09-04 Poste de travail 3d coopératif

Country Status (7)

Country Link
US (1) US20140289649A1 (fr)
EP (1) EP2764698A2 (fr)
KR (1) KR20140054214A (fr)
CA (1) CA2847396A1 (fr)
DE (1) DE102011112617A1 (fr)
RU (1) RU2637562C2 (fr)
WO (1) WO2013034129A2 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150041482A (ko) * 2013-10-08 2015-04-16 Samsung Electronics Co., Ltd. Display apparatus and display method using the same
US10630965B2 (en) * 2015-10-02 2020-04-21 Microsoft Technology Licensing, Llc Calibrating a near-eye display
DE102018107113A1 (de) * 2018-03-26 2019-09-26 Carl Zeiss Ag Display device
CN110782815B (zh) * 2019-11-13 2021-04-13 Jilin University Holographic stereoscopic detection system and method

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291316A (en) * 1991-09-27 1994-03-01 Astronautics Corporation Of America Information display system having transparent holographic optical element
US5694142A (en) * 1993-06-21 1997-12-02 General Electric Company Interactive digital arrow (d'arrow) three-dimensional (3D) pointing
US5798761A (en) * 1996-01-26 1998-08-25 Silicon Graphics, Inc. Robust mapping of 2D cursor motion onto 3D lines and planes
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
AUPP690598A0 (en) * 1998-11-03 1998-11-26 Commonwealth Of Australia, The Control centre console arrangement
US6583808B2 (en) * 2001-10-04 2003-06-24 National Research Council Of Canada Method and system for stereo videoconferencing
US7554541B2 (en) * 2002-06-28 2009-06-30 Autodesk, Inc. Widgets displayed and operable on a surface of a volumetric display enclosure
US7324085B2 (en) * 2002-01-25 2008-01-29 Autodesk, Inc. Techniques for pointing to locations within a volumetric display
JP2003296757A (ja) * 2002-03-29 2003-10-17 Canon Inc Information processing method and apparatus
JP4147054B2 (ja) * 2002-05-17 2008-09-10 Olympus Corp Stereoscopic observation apparatus
DE10259968A1 (de) * 2002-12-16 2004-07-01 X3D Technologies Gmbh Autostereoscopic projection method and autostereoscopic projection arrangement
US9274598B2 (en) * 2003-08-25 2016-03-01 International Business Machines Corporation System and method for selecting and activating a target object using a combination of eye gaze and key presses
US7092001B2 (en) * 2003-11-26 2006-08-15 Sap Aktiengesellschaft Video conferencing system with physical cues
JP4643583B2 (ja) * 2004-09-10 2011-03-02 Hitachi Ltd Display device and imaging device
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
RU2277725C1 (ru) * 2004-10-25 2006-06-10 OJSC Penza Design Bureau of Modelling Device for displaying visual information of a flight simulator
EP1667088B1 (fr) * 2004-11-30 2009-11-04 Oculus Info Inc. Système et procédé pour régions aériennes tridimensionnelles interactives
JP4419903B2 (ja) * 2005-04-15 2010-02-24 Sony Corp Input device, input method and input control program, and playback device, playback control method and playback control program
US20070279483A1 (en) * 2006-05-31 2007-12-06 Beers Ted W Blended Space For Aligning Video Streams
US20080094398A1 (en) * 2006-09-19 2008-04-24 Bracco Imaging, S.P.A. Methods and systems for interacting with a 3D visualization system using a 2D interface ("DextroLap")
KR100907104B1 (ko) * 2007-11-09 2009-07-09 Gwangju Institute of Science and Technology Method and apparatus for calculating a pointing position, and remote collaboration system including the apparatus
US8319819B2 (en) * 2008-03-26 2012-11-27 Cisco Technology, Inc. Virtual round-table videoconference
GB0901084D0 (en) * 2009-01-22 2009-03-11 Trayner David J Autostereoscopic display
JP5465523B2 (ja) * 2009-01-29 2014-04-09 Sanyo Electric Co Ltd Stereoscopic image display system
US8363922B2 (en) * 2009-02-12 2013-01-29 International Business Machines Corporation IC layout pattern matching and classification system and method
CN101840265B (zh) * 2009-03-21 2013-11-06 Shenzhen Futaihong Precision Industry Co Ltd Visual perception device and control method thereof
US8494760B2 (en) * 2009-12-14 2013-07-23 American Aerospace Advisors, Inc. Airborne widefield airspace imaging and monitoring
US9083062B2 (en) * 2010-08-02 2015-07-14 Envia Systems, Inc. Battery packs for vehicles and high capacity pouch secondary batteries for incorporation into compact battery packs
US8823769B2 (en) * 2011-01-05 2014-09-02 Ricoh Company, Ltd. Three-dimensional video conferencing system with eye contact
JP5889539B2 (ja) * 2011-03-28 2016-03-22 Japan Oil, Gas and Metals National Corp Process for producing hydrocarbons
KR20160084502A (ko) * 2011-03-29 2016-07-13 Qualcomm Inc Modular mobile connected pico projectors for local multi-user collaboration
US9143724B2 (en) * 2011-07-06 2015-09-22 Hewlett-Packard Development Company, L.P. Telepresence portal system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2013034129A2 *

Also Published As

Publication number Publication date
CA2847396A1 (fr) 2013-03-14
WO2013034129A3 (fr) 2013-05-02
RU2014113158A (ru) 2015-10-20
KR20140054214A (ko) 2014-05-08
RU2637562C2 (ru) 2017-12-05
WO2013034129A2 (fr) 2013-03-14
US20140289649A1 (en) 2014-09-25
DE102011112617A1 (de) 2013-03-14

Similar Documents

Publication Publication Date Title
EP0836332B1 (fr) Moniteur autostéréoscopique, adaptant la position d'un observateur (PAM)
DE102005010250B4 (de) Verfahren und Einrichtung zum Nachführen von Sweet-Spots
WO1998015869A1 (fr) Systeme de projection, notamment de representations 3d sur un dispositif de visionnage
WO2008141596A1 (fr) Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel
DE102004016331B4 (de) Vorrichtung und Verfahren zur gleichzeitigen Darstellung virtueller und realer Umgebungsinformationen
EP2464098A2 (fr) Dispositif de représentation d'environnement ainsi que véhicule doté d'un tel dispositif de représentation d'environnement et procédé de représentation d'une image panoramique
DE102011112619A1 (de) Auswahl von Objekten in einem dreidimensionalen virtuellen Szenario
EP2764698A2 (fr) Poste de travail 3d coopératif
WO2009062492A2 (fr) Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel
EP2753951A1 (fr) Interaction avec un scénario tridimensionnel virtuel
DE102012211298A1 (de) Anzeigevorrichtung für ein Videoüberwachungssystem sowie Videoüberwachungssystem mit der Anzeigevorrichtung
EP3420539B1 (fr) Procédé de fonctionnement d'un dispositif d'affichage et système d'affichage de contenus d'image virtuels superposés à des contenus d'image réels d'un environnement réel
DE102018132921A1 (de) Verfahren zum Betreiben eines Feldgeräts der Automatisierungstechnik in einer Augmented-Reality/Mixed-Reality- Umgebung
DE102017208343B4 (de) Anzeigesystem für ein Fahrzeug und Fahrzeug
DE102012005880A1 (de) Verfahren zum Auslegen einer Produktionsumgebung
DE102018212944A1 (de) Verfahren zur Unterstützung der Kollaboration zwischen Mensch und Roboter mittels einer Datenbrille
DE102018102743A1 (de) Real Sound Transmission
DE102012010799B4 (de) Verfahren zur räumlichen Visualisierung von virtuellen Objekten
WO2017144033A1 (fr) Procédé de détermination et de visualisation de changements dans un environnement réel comportant un terrain réel et des objets réels se trouvant sur ce terrain
DE102019131740A1 (de) Verfahren und Anzeigevorrichtung zur Erzeugung eines Tiefeneffekts in der Perspektive eines Beobachters auf einem flachen Anzeigemedium sowie Kraftfahrzeug
DE202019105917U1 (de) Vorrichtungsanordnung zur interaktiven Großdarstellung von Fortbewegungsinformationen bewegter Objekte
DE102004032586B4 (de) Verfahren zur Erzeugung einer dreidimensionalen Darstellung
WO2015024685A1 (fr) Procédé pour représenter sur un écran un objet reproduit dans un ensemble de données de volumes
EP2753995A2 (fr) Écran d'affichage incliné pour la représentation tridimensionnelle d'un scénario
WO2017182021A1 (fr) Procédé et système pour représenter un environnement de simulation

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140401

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: AIRBUS DEFENCE AND SPACE GMBH

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180608

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20181019