WO2001063559A2 - Method for automatically linking image areas of a three-dimensional representation of an object with functions - Google Patents

Method for automatically linking image areas of a three-dimensional representation of an object with functions

Info

Publication number
WO2001063559A2
WO2001063559A2 PCT/DE2001/000468
Authority
WO
WIPO (PCT)
Prior art keywords
data
dimensional
image
functional
functional area
Prior art date
Application number
PCT/DE2001/000468
Other languages
German (de)
English (en)
Other versions
WO2001063559A3 (fr)
Inventor
Oliver Dehning
Wolfgang Niem
Marcus Steinmetz
Original Assignee
Uts United 3 D Software Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uts United 3 D Software Inc. filed Critical Uts United 3 D Software Inc.
Priority to AU2001239159A priority Critical patent/AU2001239159A1/en
Publication of WO2001063559A2 publication Critical patent/WO2001063559A2/fr
Publication of WO2001063559A3 publication Critical patent/WO2001063559A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/04Texture mapping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • G06T15/205Image-based rendering

Definitions

  • the invention relates to a method for automatically linking image areas of a three-dimensional representation of an object with functions, the three-dimensional representation being transformed into two-dimensional texture image data in at least one projection plane.
  • Three-dimensional objects are preferably described with a large number of three-sided polygons whose edge vectors lie in a common plane.
  • Each polygon is defined with a data set that contains the vertex coordinates, vertex normal vectors and the surface normal vector
  • the spatial orientation of the polygon plane is described by the surface normal vector, while the orientation of the polygon within its neighboring polygon set is represented by the vertex normal vectors
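As an illustrative sketch (not taken from the patent), such a per-polygon data set can be modeled with the vertex coordinates and a surface normal derived from the cross product of two edge vectors, which by construction lie in the polygon plane:

```python
# Sketch of a three-sided polygon record as described above: vertex
# coordinates plus a surface normal vector. Names are illustrative.
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def cross(a: Vec3, b: Vec3) -> Vec3:
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v: Vec3) -> Vec3:
    n = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

@dataclass
class Triangle:
    vertices: tuple[Vec3, Vec3, Vec3]

    @property
    def surface_normal(self) -> Vec3:
        # Cross product of two edge vectors gives the plane orientation.
        a, b, c = self.vertices
        e1 = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
        e2 = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
        return normalize(cross(e1, e2))

tri = Triangle(((0, 0, 0), (1, 0, 0), (0, 1, 0)))
print(tri.surface_normal)  # triangle in the XY plane -> (0.0, 0.0, 1.0)
```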
  • the three-dimensional object can be changed in its spatial position.
  • The polygons are subjected to corresponding transformation matrices, for example for translation, scaling or rotation.
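Such transformations are commonly implemented as 4x4 homogeneous matrices; the following minimal sketch (matrix layout and function names assumed, not taken from the patent) chains a scaling, a rotation and a translation on one vertex:

```python
# Illustrative 4x4 homogeneous transforms applied to a single vertex.
import math

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a homogeneous 4-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(4)) for i in range(4))

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def scaling(sx, sy, sz):
    return [[sx, 0, 0, 0], [0, sy, 0, 0], [0, 0, sz, 0], [0, 0, 0, 1]]

def rotation_z(angle):
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

v = (1.0, 0.0, 0.0, 1.0)                 # homogeneous vertex
v = mat_vec(scaling(2, 2, 2), v)          # -> (2, 0, 0)
v = mat_vec(rotation_z(math.pi / 2), v)   # ≈ (0, 2, 0)
v = mat_vec(translation(0, 0, 5), v)      # ≈ (0, 2, 5)
print([round(x, 6) for x in v[:3]])
```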
  • The back-facing process eliminates the polygons that are not visible from the viewing plane, since visible polygons lie in front of them.
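A minimal sketch of such a back-facing test, assuming the usual dot-product criterion between surface normal and viewing direction (not code from the patent):

```python
# A polygon faces away from the viewer when its surface normal points in
# the same general direction as the viewing ray (positive dot product),
# so it can be eliminated before rendering.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_back_facing(surface_normal, view_direction):
    return dot(surface_normal, view_direction) > 0

view = (0, 0, 1)                          # camera looking along +Z
print(is_back_facing((0, 0, 1), view))    # normal points away -> True
print(is_back_facing((0, 0, -1), view))   # normal faces camera -> False
```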
  • the spatial depth impression can be improved with the help of a perspective projection.
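Perspective projection amounts to a division by depth; the following minimal example (assumed focal length `f`, not from the patent) shows the foreshortening that produces the depth impression:

```python
# Project a 3D point onto an image plane at focal distance f. Points at
# greater depth z land closer to the image center.
def project_perspective(point, f=1.0):
    x, y, z = point
    return (f * x / z, f * y / z)

# Same x/y, different depth: the farther point appears smaller.
print(project_perspective((2.0, 2.0, 2.0)))  # (1.0, 1.0)
print(project_perspective((2.0, 2.0, 4.0)))  # (0.5, 0.5)
```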
  • Surface elements that lie only partially within the three-dimensional field of view are cut off at its boundaries. This process is called polygon clipping.
  • this wireframe representation only gives a relatively imprecise visual impression of the three-dimensional object.
  • Rendering processes are therefore also carried out, which serve to calculate and display the coordinates and the color values of the pixels located on the surfaces of the polygons to be displayed.
  • The surface texture of the three-dimensional object is not described by the polygon representation alone. Rather, the reflection properties are also stored as reflection data, the color properties as color value data, the transparency properties as transparency data, etc.
  • These image data referred to as texture image data, are a projection of the three-dimensional object onto a two-dimensional plane, comparable to a camera image from any angle. They are saved independently of one another as so-called maps. For example, a color map for the color values, a reflection map for the reflection properties for displaying the depth of the object, a transparency map for displaying the transparency properties, etc. are provided.
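The independent storage of these maps can be sketched as follows; the 2x2 resolution and all names are purely illustrative:

```python
# Each surface property is stored as its own independent map, all
# addressed with the same texture coordinates (u, v).
color_map = [[(255, 0, 0), (0, 255, 0)],
             [(0, 0, 255), (255, 255, 255)]]   # RGB color values
reflection_map = [[0.1, 0.8], [0.5, 0.0]]      # reflection coefficients
transparency_map = [[1.0, 1.0], [0.3, 1.0]]    # 1.0 = fully opaque

maps = {
    "color": color_map,
    "reflection": reflection_map,
    "transparency": transparency_map,
}

# One (u, v) lookup yields the full set of surface properties:
u, v = 1, 0
print(maps["color"][v][u], maps["reflection"][v][u], maps["transparency"][v][u])
```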
  • The rendering process is described, for example, in Peter Oel, Jens Riemschneider, "In the Beginning Was the Picture", c't 1999, issue 17, pages 164 to 169.
  • Image-based rendering is based on projection rays that project a three-dimensional object onto a two-dimensional surface.
  • the method corresponds to photographing an environment on a flat surface with a camera.
  • the rendering processes are used to calculate new views of an object.
  • an imaginary camera and thus a new projection plane is placed at a position from which there is no real image of the object.
  • image information is obtained with which the new view of the object can be calculated.
  • In conventional methods, image areas are linked to functions with the aid of computer-aided design (CAD) tools. This is relatively expensive.
  • the assignment of the image areas to the functions is relatively imprecise if the position of the displayed object is shifted in space.
  • the method of assigning functions to image positions with the aid of CAD tools is therefore only suitable for two-dimensional representations.
  • the object of the invention was therefore to create an improved method for automatically linking image areas of a three-dimensional representation of an object with functions.
  • The object is achieved in the method according to the invention by a) generating functional area data as a projection of the image areas to be linked of a three-dimensional representation of an object into the two-dimensional plane; b) linking functions with corresponding functional areas and storing the functional area data and the function links separately from the texture image data; c) carrying out a geometric transformation of the functional area data, when a function is called up by selecting an image area, in order to adapt the functional area data to the spatial position of the image areas to be linked and to determine the functional area belonging to the selected image area; and d) executing the function linked to the determined functional area.
  • a two-dimensional texture of the functional area data is thus generated parallel to the texture for describing the three-dimensional representation, by defining image areas on the surface of the three-dimensional object and linking them to functions and storing these image areas with the function links as a two-dimensional projection.
  • this functional area data can be subjected to a geometric transformation in the same way as the texture image data after a function is called up by selecting an image area. In this way, the functional area belonging to the selected image area can be determined and the corresponding assigned function can be carried out.
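The idea above can be sketched as follows, with all names invented and a simple horizontal flip standing in for the geometric transformation that is applied identically to texture and functional area data:

```python
# The functional area data form a "function map" alongside the texture
# maps. The same transformation is applied to it, so a lookup at the
# selected pixel still hits the correct functional area.
function_map = [
    [None, "A", "A"],
    [None, "A", "B"],
    [None, None, "B"],
]  # function identifiers per texel; None = no function linked

def flip_horizontal(grid):
    """Stand-in for the geometric transformation applied to all maps."""
    return [list(reversed(row)) for row in grid]

transformed = flip_horizontal(function_map)

def pick(grid, x, y):
    """Return the function identifier under the selected pixel, if any."""
    return grid[y][x]

print(pick(transformed, 0, 2))  # "B": the identifier moved with the transform
```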
  • the texture image data are preferably assigned to three-sided polygons with which the surface of the three-dimensional object or image is defined. With these polygons, a two-dimensional wire model representation of the three-dimensional image can be generated, which gives a three-dimensional impression.
  • reflection data, color value data and / or transparency data can be defined and stored as texture image data. They are used to describe the reflection properties, the color properties and the transparency properties of the three-dimensional object, in particular in relation to the polygon surfaces. These different texture image data are stored independently of one another as a reflection map, color value map and transparency map. According to the invention, the functional area data are treated in the same way as the aforementioned texture image data as a so-called functional map.
  • A so-called rendering process is advantageously carried out, in which object views are calculated from the polygon representation and the texture image data: the coordinates, color properties, reflection properties and/or transparency properties of the pixels on the polygons are determined by means of the reflection data, color value data and transparency data, so that the object can be displayed, for example, on a screen.
  • the rendering process is also carried out with the functional area data when a function is called up by selecting a pixel.
  • the function that is linked to the corresponding functional area is determined and executed from the functional area data that correlate with the pixel.
  • The functional area representation is thus shifted in space together with the object; since the rendering process projects it onto the two-dimensional plane of the display, the functional area belonging to the selected pixel and its associated function can be determined immediately.
  • the functions are advantageously stored in a database and linked with function identifiers, that is to say with function code words.
  • the functional area data stored as texture then only contain a link between functional areas and the functional identifiers. This can save storage space.
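A minimal sketch of this indirection, with invented identifiers and placeholder functions; the map stores only short identifiers while a separate database holds the actual functions:

```python
# The function map stores compact identifiers ("A", "B"); the database
# maps each identifier to the function to execute. This keeps per-texel
# storage small. All names and payloads are illustrative.
function_db = {
    "A": lambda: "play product video",
    "B": lambda: "open user manual hyperlink",
}

def execute(identifier):
    """Resolve a function identifier and run the linked function."""
    return function_db[identifier]()

print(execute("A"))
print(execute("B"))
```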
  • Figure 1 Process flow for linking image areas with functions according to the invention
  • Figure 2 conventional method for linking image areas with functions.
  • FIG. 1 shows a sketch of the method for automatically linking image areas of a three-dimensional representation of an object 1 with functions.
  • the three-dimensional object 1 is sketched in the three-dimensional perspective representation 2 as a wire model in the three planes X, Y and Z.
  • the wire model consists of three-sided polygons 3 for an approximate description of the surface of the object 1.
  • the invisible polygons in the background are eliminated with the so-called backfacing method.
  • A two-dimensional projection of the image is created on the projection plane of the recording surface 5, e.g. a film.
  • This two-dimensional projection is outlined in the two-dimensional texture map 6, which shows the projection of a selected polygon 3a.
  • With the reflection data maps, color value data maps and transparency data maps, the surfaces of the individual polygons 3 can in particular be described in more detail and the spatial impression of the object 1 improved.
  • the invention provides a further corresponding functional area data map 7, in which functional areas 8 are defined and linked with functional identifiers A, B.
  • the functional areas 8 correspond to selected image areas which do not have to have the shape of a polygon 3.
  • Functions Funkt.a and Funkt.b can be executed, which are linked via the function identifiers A, B to the corresponding functional areas 8.
  • a corresponding database 9 is provided to describe the linking of the function identifiers A, B with the associated functions Funkt.a, Funkt.b.
  • the functional area data map 7 for defining the functional areas 8 is treated in accordance with the texture map 6 and the reflection data maps, color value data maps and transparency data maps as part of the two-dimensional projection of the three-dimensional object 1.
  • If an image area is selected with a pointing instrument and a function is to be executed, the functional area view is adapted to the current perspective view of object 1.
  • the new perspective is calculated from the functional area data maps 7 with the aid of known geometry transformation and rendering methods.
  • the functional area data is thus treated like the other maps as texture data maps.
  • the function identifier assigned to the selected image area can be determined and the function Funkt.a, Funkt.b to be carried out can be determined and executed with the aid of the function database 9.
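The pick-to-execute flow described above can be sketched end to end; all names (`function_map`, the Funkt.a/Funkt.b stand-ins) are illustrative, not from the patent:

```python
# A pixel is selected, the function map -- already adapted to the current
# view -- yields an identifier, and the function database resolves and
# runs the linked function.
function_map = [["A", "A"],
                [None, "B"]]                       # per-pixel identifiers
function_db = {"A": lambda: "Funkt.a executed",
               "B": lambda: "Funkt.b executed"}    # function database 9

def on_click(x, y):
    identifier = function_map[y][x]
    if identifier is None:
        return None                                # no function linked here
    return function_db[identifier]()

print(on_click(1, 1))  # -> Funkt.b executed
print(on_click(0, 1))  # -> None
```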
  • FIG. 2 shows a conventional method for linking functions to three-dimensional objects 1.
  • an object 1 is sketched in the three-dimensional view 2 and the two-dimensional projection as a texture map 6.
  • the texture map 6 shows a selected polygon 3a, which is identified as such in the method.
  • the link between the number of the selected polygon 3a and the associated function Funkt.a, Funkt.b is stored in the function database 9. Only polygons 3 can thus be linked with functions as selectable image areas. In this case, there is no need to adapt the functional area data map 7 to the projection shown. However, the selectable image areas are relatively imprecise.
  • The functions that can be linked can be of any type. Conceivable are functions for transforming and moving the object 1, playing video and audio data as multimedia applications, exchanging texture data, activating hyperlinks for selecting functions and pages on the Internet or an intranet, displaying explanatory texts, or executing a data exchange. By exchanging texture data, image areas can be changed and certain effects achieved in this way, such as the blinking of an eye of a depicted person. By activating hyperlinks, user manuals for the displayed object or the selected picture element can be called up, for example. Simple explanatory texts can also be displayed on the screen next to the object 1; these can serve, for example, as user help or as marketing information. Any other actions and interactions are conceivable as functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Method for automatically linking image areas of a three-dimensional representation of an object (1) with functions, in which the three-dimensional representation (2) is transformed into two-dimensional texture image data in at least one projection plane. The method consists of (a) generating functional area data as a projection of the image areas to be linked of a three-dimensional representation into the two-dimensional plane; (b) linking functions with corresponding functional areas (8) and storing the functional area data and the function links separately from the texture image data; (c) carrying out a geometric transformation of the functional area data, when a function is called up by selecting an image area, in order to adapt the functional area data to the spatial position of the image areas to be linked of a three-dimensional representation and to determine the functional area (8) belonging to the selected image area; and (d) executing the function that is linked to the selected functional area (8).
PCT/DE2001/000468 2000-02-23 2001-02-07 Method for automatically linking image areas of a three-dimensional representation of an object with functions WO2001063559A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001239159A AU2001239159A1 (en) 2000-02-23 2001-02-07 Method for automatically linking image areas of a 3-dimensional representation of an object with functions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10008385.4 2000-02-23
DE2000108385 DE10008385A1 (de) 2000-02-23 2000-02-23 Method for automatically linking image areas of a three-dimensional representation of an object with functions

Publications (2)

Publication Number Publication Date
WO2001063559A2 true WO2001063559A2 (fr) 2001-08-30
WO2001063559A3 WO2001063559A3 (fr) 2002-03-14

Family

ID=7632062

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2001/000468 WO2001063559A2 (fr) 2000-02-23 2001-02-07 Method for automatically linking image areas of a three-dimensional representation of an object with functions

Country Status (3)

Country Link
AU (1) AU2001239159A1 (fr)
DE (1) DE10008385A1 (fr)
WO (1) WO2001063559A2 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993023835A1 (fr) * 1992-05-08 1993-11-25 Apple Computer, Inc. Rendu de sphere a surface texturee et de carte geographique spherique a l'aide d'un adressage indirect double d'une carte texturee
EP0676724A2 (fr) * 1994-04-05 1995-10-11 Kabushiki Kaisha Toshiba Méthode de mappage de texture et appareil de traitement d'images

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347628A (en) * 1990-01-18 1994-09-13 International Business Machines Corporation Method of graphically accessing electronic data
US5629594A (en) * 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
EP0623799A1 (fr) * 1993-04-03 1994-11-09 SECOTRON ELEKTROGERÄTEBAU GmbH Système vidéo interactif
TW299410B (fr) * 1994-04-04 1997-03-01 At & T Corp
JP3193238B2 (ja) * 1994-08-11 2001-07-30 株式会社東芝 操作検証装置
US5678015A (en) * 1995-09-01 1997-10-14 Silicon Graphics, Inc. Four-dimensional graphical user interface


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BLINN J F: "Hyperbolic Interpolation", IEEE Computer Graphics and Applications, IEEE Inc., New York, US, Vol. 12, No. 4, 1 July 1992 (1992-07-01), pages 89-94, XP000281908, ISSN: 0272-1716 *
GHAZANFARPOUR D ET AL: "A High-Quality Filtering Using Forward Texture Mapping", Computers and Graphics, Pergamon Press Ltd., Oxford, GB, Vol. 15, No. 4, 1991, pages 569-577, XP000281890, ISSN: 0097-8493 *
RICHARDS J ET AL: "A Real-Time Texture Mapping System -- DME-9000 --", International Television Symposium & Technical Exhibition, Montreux, CH, CCETT, Vol. SYMP. 16, 17 June 1989 (1989-06-17), pages 675-684, XP000041151 *

Also Published As

Publication number Publication date
AU2001239159A1 (en) 2001-09-03
DE10008385A1 (de) 2001-08-30
WO2001063559A3 (fr) 2002-03-14


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: DETERMINATION OF A LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC (EPO FORM 1205A OF 14.01.03)

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP