WO2003102667A1 - Method and device for the consistent representation of real and virtual objects
- Publication number
- WO2003102667A1 (PCT/DE2003/001422)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- real
- virtual
- shadow
- objects
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
Definitions
- The invention relates to a method and a device for the consistent representation of real and virtual objects, in which at least one viewer is presented with the superimposed representation of a real and a virtual object, the real object being illuminated and the images of the real object and the virtual object being optically merged.
- Optical display devices are known in the prior art which, with the aid of partially transparent mirrors, present the viewer with a combination of the view of a real object and of a virtual object.
- The virtual image is presented to the viewer by optical imaging.
- The invention is based on the object of specifying a method and a device of the type mentioned at the outset with which consistent occlusion and lighting effects are presented to an observer even for changing observation positions.
- The invention enables a consistent overlay of real and virtual objects, i.e. at the transition between the images of these two objects presented to the viewer, no disturbances are recognizable. This is achieved by projecting an occlusion shadow onto the real objects arranged behind the virtual object, the shadow projection and the illumination of the objects being performed with light projectors. The projection takes place dynamically, so that changes of the observation position are taken into account.
- The occlusion shadows are shadows that depend on the viewpoint and enable a realistic occlusion of real objects by virtual objects. They are not visible to the viewer.
- The inner and outer parameters of the light projectors as well as the parameters of the virtual and real objects are determined.
- The viewer's view is constantly measured by tracking his head movement.
- The parameters of the light projectors are determined once when the arrangement is adjusted.
- The virtual objects can be manipulated interactively while viewing.
- From these data a projection matrix V is calculated, which captures the perspective of the viewer's current view and enables the object views to be converted according to the currently existing conditions.
- Illumination of the real objects is performed by the light projector.
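A matrix of this kind can be sketched as the standard product of a perspective projection and a view matrix built from the tracked head position. The following sketch is illustrative only (the function names, values, and OpenGL-style conventions are assumptions, not taken from the patent):

```python
import numpy as np

def look_at(eye, target, up):
    """View matrix mapping world coordinates into the tracked
    viewer's eye space (right-handed, looking down -z)."""
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up); s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye          # translate eye to the origin
    return m

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    t = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = t / aspect
    m[1, 1] = t
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

# V for the current head-tracked eye position (values are examples).
eye = np.array([0.0, 0.3, 1.2])
V = perspective(45.0, 4 / 3, 0.1, 10.0) @ look_at(
    eye, np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
```

Whenever the head tracker reports a new eye position, V is recomputed and the object views are re-rendered with it.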
- The invention is characterized by a number of advantages. These include in particular:
- The arrangement according to the invention enables a realistic presentation for both one and several viewers.
- The occlusion shadow and the lighting are created dynamically, directly on the surfaces of the real objects.
- The occlusion shadow is not visible to the viewer as such.
- Additional phantom bodies can be represented which occlude virtual objects arranged behind them.
- The combination of occlusion shadow and phantom body can be used both in stationary showcases and in portable display devices worn by the viewer on the head, similar to glasses.
- The invention is explained in more detail below using an exemplary embodiment.
- The exemplary embodiment explains the application of the invention to a stationary device which is referred to as a virtual showcase.
- FIG. 1 shows a schematic representation of the beam paths of the virtual showcase.
- FIG. 2 shows a schematic illustration of the electrical couplings.
- FIG. 3 shows the algorithm for determining the occlusion shadow.
- The optical conditions of the showcase are shown in FIG. 1.
- The observer B simultaneously looks at the real object 1 and the virtual object 3, whose image 3' is shown on the monitor 3.1.
- The beam paths emanating from the two objects 1 and 3 are brought together by a partially transparent mirror 2.
- The real object 1 is illuminated with a light projector 5.
- The lighting conditions can be determined with the camera 6.
- The camera 6 captures the video image 9 of the real object 1.
- With the light projector 5, the occlusion shadow OS is projected onto the real object 1.
- The light projector 5 is designed as a video projector, whose image buffer 8 contains the image of the occlusion shadow OS together with the radiance image L.
- The light projector 5 can be adjusted.
- The light projector 5 is adjusted in the following way:
- Points on the surface of the real three-dimensional objects 1 are scanned within the image space 8 captured by the light projector 5.
- The three-dimensionally scanned reference points are marked on the surfaces of the real objects 1 by being displayed and superimposed on the display device of the virtual showcase.
- The real objects 1 must be captured beforehand.
- A crosshair is then imaged in the processed image 8 of the light projector 5 and aligned with the marked surface points in order to determine their two-dimensional projections in the corresponding areas of the screen.
- The user selects sampling points that provide suitable point correspondences.
- Inner parameters are the vertical field of view and the aspect ratio of the projection.
- Outer parameters are the position and the direction of the optical axis of the projector.
- The perspective can be calculated from n points using Powell's direction set method. This method is described in Press et al.: Numerical Recipes in C - The Art of Scientific Computing (2nd edition), Cambridge University Press, ISBN 0-521-43108-5, pp. 412-420, 1992. The result is the matrix of the perspective projection P, which describes the transformation of the model relative to the origin of the scene.
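As an illustration of this calibration step, the sketch below fits a deliberately simplified projector model (vertical field of view plus a position looking at the scene origin, i.e. a reduced set of the inner and outer parameters named above) to 2D-3D point correspondences with a minimal Powell-style direction-set search. The model, names, and data are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def project(params, pts):
    """Simplified projector: params = (fov_y_rad, cx, cy, cz); the
    projector sits at (cx, cy, cz) and looks at the scene origin
    (no roll, square pixels). Returns Nx2 image-plane coordinates."""
    fov, cx, cy, cz = params
    eye = np.array([cx, cy, cz])
    f = -eye / np.linalg.norm(eye)              # optical axis toward origin
    s = np.cross(f, [0.0, 1.0, 0.0]); s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    cam = (pts - eye) @ np.stack([s, u, -f]).T  # world -> projector space
    t = 1.0 / np.tan(fov / 2.0)
    return t * cam[:, :2] / -cam[:, 2:3]        # perspective divide

def reproj_error(params, pts, observed):
    d = project(params, pts) - observed
    return float((d * d).sum())

def line_min(fun, x, d):
    """Crude 1-D minimization of fun(x + a*d): a coarse scan followed
    by a fine scan around the best coarse sample."""
    a_best, f_best = 0.0, fun(x)
    for span, n in ((0.5, 101), (0.01, 101)):
        for a in a_best + np.linspace(-span, span, n):
            fa = fun(x + a * d)
            if fa < f_best:
                a_best, f_best = a, fa
    return x + a_best * d

def powell(fun, x0, iters=20):
    """Powell's direction-set method: minimize along each direction in
    turn, then replace the oldest direction with the net step."""
    x = np.asarray(x0, float)
    dirs = list(np.eye(len(x)))
    for _ in range(iters):
        x_start = x.copy()
        for d in dirs:
            x = line_min(fun, x, d)
        net = x - x_start
        if np.linalg.norm(net) > 1e-12:
            dirs = dirs[1:] + [net / np.linalg.norm(net)]
            x = line_min(fun, x, dirs[-1])
    return x

# Synthetic correspondences: scanned 3D surface points and their 2D
# projections, here generated from known "true" parameters.
true = np.array([np.radians(40), 0.5, 0.8, 2.0])
pts = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.1],
                [-0.1, 0.05, 0.0], [0.05, -0.1, 0.05], [0.02, 0.03, -0.04]])
observed = project(true, pts)
fit = powell(lambda p: reproj_error(p, pts, observed),
             [np.radians(45), 0.4, 0.7, 1.8])
```

In practice Powell's method only needs function evaluations (the reprojection error), no gradients, which is why it suits this kind of interactive calibration.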
- The occlusion shadow is generated for an individual viewpoint using the steps shown in FIG. 3:
- Step 4 shows the rendering of the lighting of the real content into an image buffer.
- This illumination can be calculated using known models in order to ensure a correct luminance, adapted to the real and virtual surfaces in relation to the virtual light sources. It is also possible to project monochrome light from the point of view of the light projector 5 onto the real surfaces, while the virtual objects are illuminated from the positions of the virtual light sources.
- This shadow mask is then mapped onto the known geometry of the real object 1. The steps for this are described in the following.
- The occlusion shadows created for other viewpoints are hard shadows cast by the virtual scene from a light source positioned at the respective viewer's point of view. If a virtual light source is attached to each viewpoint, then, in addition to the calculated lighting effects on the surfaces of the virtual scenery, additional adapted hard shadows are created on the surfaces of the real scene.
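The view-dependent test behind this can be sketched for a single virtual object approximated by a sphere (the geometry and all names are illustrative assumptions): a real-surface point receives the occlusion shadow exactly when its line of sight to the tracked eye passes through the virtual object.

```python
import numpy as np

def occlusion_mask(eye, surface_pts, centre, radius):
    """True for real-surface points whose ray to the viewer's eye
    intersects the virtual sphere -- those points lie behind the
    virtual object and are blacked out by the light projector."""
    v = surface_pts - eye                        # eye -> surface vectors
    dist = np.linalg.norm(v, axis=1)
    d = v / dist[:, None]                        # unit view rays
    tc = d @ (centre - eye)                      # closest approach along ray
    d2 = np.sum((centre - eye) ** 2) - tc ** 2   # squared distance ray-centre
    return (d2 < radius ** 2) & (tc > 0) & (tc < dist)

# Viewer in front, virtual sphere at the origin, real surface behind it.
eye = np.array([0.0, 0.0, 1.5])
grid = np.array([[x, y, -0.5] for x in (-1.0, 0.0, 1.0)
                              for y in (-1.0, 0.0, 1.0)])
mask = occlusion_mask(eye, grid, np.zeros(3), 0.2)
```

Only the surface point directly behind the sphere (as seen from this eye position) is masked; moving `eye` moves the masked region, which is why the shadow must be recomputed per viewpoint.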
- Occlusion shadows are projected onto those real partial areas that are visible from only one point of view.
- In addition, parts of the real scene, namely the captured image of its surface, are reproduced where they are optically covered by the occlusion shadows of all viewers (or views).
- This virtual image of the surface is displayed under the same lighting conditions and shading as its real counterpart, so that there are smooth transitions between the real and virtual parts.
- This is transferred to the stencil buffer. This is done by displaying the geometry of the real scene from the perspective of the viewer and by adding the occlusion shadows by projecting a texture image.
- The stencil buffer is filled in such a way that the area around the occlusion shadow is cut out in the final image. Then the image of the real surface is read into the image buffer (also from the perspective of the viewer) and shaded with the virtual lighting situation. After the stencil function is switched off, the virtual objects can be added to the viewer's view. In order to create consistent lighting between real and virtual objects, the following steps are carried out: First, the image I_mi is generated as follows:
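The stencil pass just described can be mimicked with per-pixel boolean masks. The tiny frame below is an illustrative stand-in for the stencil and image buffers; all images and values are invented for the example:

```python
import numpy as np

H = W = 4
# Stencil mask: pixels covered by the occlusion shadow.
occlusion = np.zeros((H, W), bool)
occlusion[1:3, 1:3] = True
real_surface = np.full((H, W), 0.6)    # captured image of the real surface
virtual_light = np.full((H, W), 0.9)   # shading by the virtual lighting
virtual_obj = np.full((H, W), np.nan)  # virtual object covers part of the cut-out
virtual_obj[1:3, 1] = 0.8

frame = np.zeros((H, W))               # viewer's view (0 = optical see-through)
# Stencil on: inside the occlusion region only, draw the re-lit image of
# the real surface so the projected black shadow is hidden from the viewer.
frame[occlusion] = (real_surface * virtual_light)[occlusion]
# Stencil off: add the virtual objects on top of the view.
drawn = ~np.isnan(virtual_obj)
frame[drawn] = virtual_obj[drawn]
```

Where the virtual object does not fully cover the cut-out, the shaded copy of the real surface fills the remainder, giving the smooth real-to-virtual transition described above.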
- The real scene is rendered from the perspective of the projector with a virtual point light source at the position of the projector.
- This image contains the physical lighting of the real scene by the video projector.
- The radiance image L is then computed from this image.
- The radiance image L is projected onto the real object with the video projector.
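A per-pixel sketch of one plausible reading of this step: the desired appearance I_rad is divided by the projector's own physical contribution I_mi, so that after projection the surface shows the intended radiance. That quotient form is an assumption inferred from the division by I_mi mentioned later in the text, not a formula stated here; the images and the guard constant are invented:

```python
import numpy as np

# Illustrative images: i_rad is the desired appearance of the real
# surface (including virtual lighting and black shadows), i_mi the
# physical illumination the projector itself contributes.
i_rad = np.array([[0.45, 0.0], [0.3, 0.6]])
i_mi = np.array([[0.9, 0.9], [0.6, 0.5]])

# Radiance image sent to the projector: compensate the projector's own
# contribution, guard against division by zero, clip to the displayable range.
L = np.clip(i_rad / np.maximum(i_mi, 1e-6), 0.0, 1.0)
```

Pixels whose desired appearance exceeds what the projector can physically deliver are simply clipped to full output.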
- The determination of the required shadow information requires different shadow types to be considered separately:
- 1. shadows on real objects that are generated by real objects and the real light source, 2. shadows on virtual objects that are generated by virtual objects and virtual light sources, 3. shadows on virtual objects that are generated by real objects and virtual light sources, 4. shadows on real objects that are generated by real objects and virtual light sources, and 5. shadows on real objects that are generated by virtual objects and virtual light sources.
- The first type of shadow is the result of occlusion and self-occlusion of the physical environment by the real light source.
- The second and third types of shadow can be created using a standard shadow-mapping or shadow-buffering method.
- The recorded geometric representation of the real objects must be rendered together with the virtual objects when the shadow image is generated.
- Such geometric representations of the real world, which are sometimes also referred to as phantoms, are continuously rendered in order to create a realistic occlusion of the virtual objects.
- These methods create hard shadows with hardware acceleration, while general lighting methods can create soft shadows. Blending them with textures makes it possible to add ambient light to the shadow areas. This creates dark shadow areas with an underlying surface texture, so that no unrealistically black shadows are created.
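The blending described above can be sketched as modulating the surface texture by a shadow term lifted with an ambient floor, so shadowed pixels keep their texture instead of going fully black (the function name and constants are illustrative):

```python
import numpy as np

def shade_with_soft_shadow(texture, visibility, ambient=0.25):
    """Blend a shadow term ('visibility' in [0, 1]) with an ambient
    floor so that shadowed areas keep the underlying surface texture
    instead of becoming unrealistic pure black."""
    return texture * (ambient + (1.0 - ambient) * visibility)

texture = np.array([0.8, 0.8, 0.8])
visibility = np.array([1.0, 0.5, 0.0])   # lit, penumbra, fully shadowed
shaded = shade_with_soft_shadow(texture, visibility)
```

Even the fully shadowed pixel retains a quarter of its textured brightness rather than rendering as flat black.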
- The shadow types 4 and 5 can also be created using a shadow image. However, they are projected onto the surface of the real object together with the radiance image L.
- The image I_rad must contain the black (unlit) shadows of the virtual objects and the real phantoms. This can be accomplished by rendering all virtual objects and all phantoms in a first shadow pass to create a shadow image.
- This shadow image is applied to the rendered and shaded image of the reflectivity, i.e. the reflection texture and the shadow texture are mapped onto the geometric representation of the real objects.
- Dividing the areas of the black shadow by I_mi gives the shadow areas.
- Occlusion shadows are special shadows that depend on the viewer's point of view and are created with light projectors on the surface of the real environment. They are intended to enable a realistic occlusion of real objects by virtual objects. They are not visible from the perspective of the viewer, since they are created exactly under the graphic overlays. Before the projection takes place, the images of the occlusion shadows are added to the radiance image L by color blending.

LIST OF REFERENCE NUMBERS
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03724882A EP1502145B1 (de) | 2002-05-04 | 2003-05-02 | Verfahren und vorrichtung zur konsistenten darstellung von realen und virtuellen objekten |
DE10393114T DE10393114D2 (de) | 2002-05-04 | 2003-05-02 | Verfahren und Vorrichtung zur konsistenten Darstellung von realen und virtuelle Objekten |
DE50303958T DE50303958D1 (de) | 2002-05-04 | 2003-05-02 | Verfahren und vorrichtung zur konsistenten darstellung von realen und virtuellen objekten |
AU2003229283A AU2003229283A1 (en) | 2002-05-04 | 2003-05-02 | Method and device for the consistent representation of concrete and virtual objects |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US37790502P | 2002-05-04 | 2002-05-04 | |
US60/377,905 | 2002-05-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003102667A1 true WO2003102667A1 (de) | 2003-12-11 |
Family
ID=29711922
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2003/001422 WO2003102667A1 (de) | 2002-05-04 | 2003-05-02 | Verfahren und vorrichtung zur konsistenten darstellung von realen und virtuelle objekten |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP1502145B1 (de) |
AT (1) | ATE331234T1 (de) |
AU (1) | AU2003229283A1 (de) |
DE (2) | DE50303958D1 (de) |
ES (1) | ES2268358T3 (de) |
PT (1) | PT1502145E (de) |
WO (1) | WO2003102667A1 (de) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0537945A1 (de) * | 1991-10-12 | 1993-04-21 | British Aerospace Public Limited Company | Vom Computer erzeugte Bilder mit Überlagerung realer Sehwahrnehmung |
EP0977071A1 (de) * | 1998-07-27 | 2000-02-02 | Mixed Reality Systems Laboratory Inc. | Vorrichtung zur Bildbetrachtung |
EP1128317A2 (de) * | 2000-02-21 | 2001-08-29 | Siemens Aktiengesellschaft | Verfahren und Anordnung zur Interaktion mit einer in einem Schaufenster sichtbaren Darstellung |
US6361507B1 (en) * | 1994-06-16 | 2002-03-26 | Massachusetts Institute Of Technology | Inertial orientation tracker having gradual automatic drift compensation for tracking human head and other similarly sized body |
- 2003-05-02 ES ES03724882T patent/ES2268358T3/es not_active Expired - Lifetime
- 2003-05-02 EP EP03724882A patent/EP1502145B1/de not_active Expired - Lifetime
- 2003-05-02 DE DE50303958T patent/DE50303958D1/de not_active Expired - Fee Related
- 2003-05-02 AU AU2003229283A patent/AU2003229283A1/en not_active Abandoned
- 2003-05-02 DE DE10393114T patent/DE10393114D2/de not_active Expired - Fee Related
- 2003-05-02 AT AT03724882T patent/ATE331234T1/de not_active IP Right Cessation
- 2003-05-02 WO PCT/DE2003/001422 patent/WO2003102667A1/de not_active Application Discontinuation
- 2003-05-02 PT PT03724882T patent/PT1502145E/pt unknown
Non-Patent Citations (1)
Title |
---|
"DIRECTION SET (POWELL'S) METHODS IN MULTIDIMENSIONS", NUMERICAL RECIPES IN C: THE ART OF SCIENTIFIC COMPUTING, XX, XX, PAGE(S) 412-420, XP008023515 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005098513A1 (de) * | 2004-04-05 | 2005-10-20 | Volkswagen Aktiengesellschaft | Einbau-kombinationsinstrument |
US7872569B2 (en) | 2004-04-05 | 2011-01-18 | Volkswagen Ag | Built-in instrument cluster |
Also Published As
Publication number | Publication date |
---|---|
AU2003229283A1 (en) | 2003-12-19 |
DE10393114D2 (de) | 2005-05-12 |
EP1502145B1 (de) | 2006-06-21 |
PT1502145E (pt) | 2006-11-30 |
ATE331234T1 (de) | 2006-07-15 |
DE50303958D1 (de) | 2006-08-03 |
ES2268358T3 (es) | 2007-03-16 |
EP1502145A1 (de) | 2005-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE60120474T2 (de) | Rasterung von dreidimensionalen bildern | |
EP0789328B1 (de) | Bildverarbeitungsverfahren zur Darstellung von spiegelnden Objekten und zugehörige Vorrichtung | |
DE3873792T2 (de) | Elektronische bildverarbeitung. | |
DE69518907T2 (de) | Composition mit mehreren bildern | |
DE69632755T2 (de) | Vorrichtung zur Erzeugung von rechnererzeugten stereoskopischen Bildern | |
DE19825302B4 (de) | System zur Einrichtung einer dreidimensionalen Abfallmatte, welche eine vereinfachte Einstellung räumlicher Beziehungen zwischen realen und virtuellen Szeneelementen ermöglicht | |
DE19539048B4 (de) | Video-Konferenzsystem und Verfahren zum Bereitstellen einer Parallaxenkorrektur und zum Erzeugen eines Anwesenheitsgefühls | |
EP3427474B1 (de) | Bildverarbeitungsverfahren, bildverarbeitungsmittel und bildverarbeitungsvorrichtung zur erzeugung von abbildungen eines teils eines dreidimensionalen raums | |
EP0776576B1 (de) | Verfahren und vorrichtung zur darstellung von stereoskopischen videobildern auf einem display | |
EP1763845A1 (de) | Verfahren und vorrichtung zur bestimmung von optischen überdeckungen mit ar-objekten | |
EP1438697A2 (de) | Verfahren und vorrichtung zur erzeugung lichtmikroskopischer, dreidimensionaler bilder | |
DE19640936A1 (de) | Positionsadaptiver Autostereoskoper Monitor (PAM) | |
EP1972134A1 (de) | Kalibrierungsverfahren und kalibrierungssystem für projektionsvorrichtung | |
EP1678561A1 (de) | Verfahren und anordnung zur kombination von hologrammen mit computergrafik | |
DE19924096C2 (de) | System zur stereoskopischen Bilddarstellung | |
DE19906995A1 (de) | Erzeugen von Anpaßdaten für einen virtuellen Szenenaufbau | |
DE102007057208A1 (de) | Verfahren zum Darstellen von Bildobjekten in einem virtuellen dreidimensionalen Bildraum | |
WO2006094637A1 (de) | Verfahren zum vergleich eines realen gegenstandes mit einem digitalen modell | |
EP1502145B1 (de) | Verfahren und vorrichtung zur konsistenten darstellung von realen und virtuellen objekten | |
DE102005050350A1 (de) | System und Verfahren zur Überwachung einer technischen Anlage sowie Datenbrille auf Basis eines solchen Systems | |
DE60001852T2 (de) | Bildverarbeitungsgerät,-methode, -programm und speichermedium | |
DE19853608C2 (de) | Verfahren zur Darstellung eines autostereoskopischen Bildes | |
DE10134430A1 (de) | Verfahren und Anordnung zur stereoskopischen Projektion von Bildern | |
WO2007085482A1 (de) | Verfahren zur erzeugung und darstellung räumlich wahrnehmbarer bilder | |
DE112008002083T5 (de) | Verfahren und Software für Bildtransformation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003724882 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2003724882 Country of ref document: EP |
|
REF | Corresponds to |
Ref document number: 10393114 Country of ref document: DE Date of ref document: 20050512 Kind code of ref document: P |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10393114 Country of ref document: DE |
|
WWG | Wipo information: grant in national office |
Ref document number: 2003724882 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: JP |