
EP1730970A1 - Device and method for simultaneously representing virtual and real environment information - Google Patents

Device and method for simultaneously representing virtual and real environment information

Info

Publication number: EP1730970A1
Authority: EP
Grant status: Application
Prior art keywords: information, unit, device, environment, reality
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: EP20050731750
Other languages: German (de), French (fr)
Inventor: Soeren Moritz
Current Assignee: Siemens AG (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Siemens AG

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic or multiview television systems; Details thereof
    • H04N13/02: Picture signal generators
    • H04N13/0203: Picture signal generators using a stereoscopic image camera
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic or multiview television systems; Details thereof
    • H04N13/04: Picture reproducers
    • H04N13/0468: Picture reproducers using observer tracking

Abstract

The invention relates to a device and a method for representing virtual and real environment information for at least one user, whereby virtual arrangements and real arrangements are represented in such a way that occlusions of the virtual arrangements by real arrangements can be recognized. The relative position and orientation of the device in the real environment are detected by means of an environment detection unit (4). In addition, the reality is continuously captured and converted into a 3-dimensional surface model by means of a spatial detection unit (3). A processing system (9) transfers the 3-dimensional surface model of the real arrangement and the 3-dimensional model of the virtual arrangement into a common coordinate system and calculates possible occlusion surfaces of the virtual arrangement by the real arrangement.

Description


Device and method for the simultaneous representation of virtual and real environment information

The invention relates to an apparatus and a method for displaying information, in particular augmented-reality information, for at least one user. Such a device is used, for example, in the design of plants and machines. With this device, a first impression of a planned plant or a planned conversion measure in an existing environment is to be obtained quickly during the planning phase.

Augmented reality (AR) is a form of human-computer interaction in which information is superimposed into the user's field of view, for example by means of data goggles, thereby augmenting the reality he currently perceives. This is done in a context-dependent manner, i.e. matched to and derived from the observed object, such as a component, a tool, a machine, or its location. An example of this is a safety notice during an assembly or disassembly process.

Two common methods exist for displaying augmented reality. In optical see-through (OST), the virtual information is inserted directly into the field of view of the user, so that the user can continue to perceive reality directly. In this case, a miniature monitor worn on the head, a so-called head-mounted display, is typically used to display the image information. In video see-through (VST), reality is captured by a video camera. The virtual information is displayed in the recorded video image. The image information generated in this way can then be presented with one or more head-mounted displays or with a standard display such as a monitor. The augmented reality can thus be viewed by multiple users, as well as at a remote location. For video see-through, the video camera, and thus advantageously also the spatial detection system, can be mounted on the head of the user; in today's practice, the video camera is often mounted on a portable computer (e.g. a tablet PC) instead.
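As a rough illustration of the video see-through principle described above, the following Python sketch composites a rendered virtual image over a camera frame using an alpha channel. The function name, array shapes, and values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def composite_vst(camera_frame, virtual_rgba):
    """Video see-through compositing: blend the rendered virtual image
    over the camera frame wherever the virtual layer is opaque.
    camera_frame is H x W x 3 uint8; virtual_rgba is H x W x 4 uint8."""
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    out = (1.0 - alpha) * camera_frame + alpha * virtual_rgba[..., :3].astype(np.float32)
    return out.astype(np.uint8)

frame = np.zeros((4, 4, 3), dtype=np.uint8)       # stand-in for a camera image
virt = np.zeros((4, 4, 4), dtype=np.uint8)
virt[1:3, 1:3] = [255, 0, 0, 255]                 # opaque red virtual object
print(composite_vst(frame, virt)[1, 1])           # the opaque virtual pixel wins
```

In a real VST system the alpha mask would come from the occlusion calculation described below, so that virtual pixels hidden by real surfaces stay transparent.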

The object of the invention is to provide an apparatus and a method for displaying information, in particular augmented-reality information, for at least one user which, at reduced time and cost, allow the planning of new plants or the extension of existing plants in a currently existing environment, whereby dynamic processes of both the real and the virtual environment can be detected and displayed.

This object is achieved by a device for displaying information, in particular augmented-reality information, for at least one user, with

• at least one spatial detection unit for detecting a current reality and for generating corresponding spatial information,

• at least one environment detection unit for detecting an environment and for generating corresponding environment information, which defines a position and/or orientation of the device with respect to the environment,

• at least one processing unit for linking the environment information, the spatial information and information stored in a first storage medium, which serves to describe at least one object, into a set of image information such that mutual occlusions of the current reality and of the object described by the stored information can be made visible by at least one reproduction unit.

This object is further achieved by a method for displaying information, in particular augmented-reality information, for at least one user, in which

• a current reality is detected with the aid of at least one spatial detection unit and corresponding spatial information is generated,

• an environment is detected with the aid of at least one environment detection unit and corresponding environment information is generated, which defines a position and/or orientation of the device with respect to the environment,

• with the aid of at least one processing unit, the environment information, the spatial information and information stored in a first storage medium, which serves to describe at least one object, are linked into a set of image information such that mutual occlusions of the current reality and of the object described by the stored information are made recognizable by at least one reproduction unit.

In the device according to the invention and the method according to the invention, an AR system known from the prior art is supplemented by a spatial detection unit which allows a continuous detection of the current reality. The spatial detection unit detects which surfaces are located in a detection area of the device and what distance these surfaces have to the device. With the help of an environment detection unit, the relative position and orientation of the device with respect to the real environment are detected. By means of a processing system, the data generated by the spatial detection unit and the environment detection unit are linked with a further data set describing at least one object in such a way that mutual occlusions of the captured reality and of the objects described by the data set are visualized on a display in the field of view of the user.
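The occlusion relationship described above can be sketched as a per-pixel depth comparison between the detected real surfaces and the rendered virtual object. This is a minimal illustration of the idea under assumed depth-map inputs, not the patent's actual implementation:

```python
import numpy as np

def occlusion_mask(real_depth, virtual_depth):
    """Return True wherever the real surface model lies in front of the
    virtual object, i.e. where the virtual object must be hidden.
    Both inputs are per-pixel distances from the device, in meters."""
    return real_depth < virtual_depth

real = np.array([[1.0, 5.0],
                 [2.0, 2.0]])   # depths of detected real surfaces
virt = np.array([[3.0, 3.0],
                 [1.0, 3.0]])   # depths of the rendered virtual object
print(occlusion_mask(real, virt))
```

Pixels where the mask is True would be suppressed when rendering the virtual model, so the real arrangement appears to stand in front of it.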

The invention is based on the realization that, in the planning of new plants or the extension of existing plants in a currently existing environment, questions regarding spatial aspects need to be answered, such as:

- Can a new piece of equipment be installed in a certain position?

- What is the installation path?

- Can conflicts arise between moving machine parts and humans?

These questions can be answered with the device according to the invention or the method according to the invention with the help of AR systems, without a complete 3-dimensional model of the really existing environment having to be generated at great expense of time and money. By means of the information describing the detected reality, occlusions of the planned plant, which is represented in the form of a virtual 3-dimensional model, can be calculated and displayed in the augmented reality by hiding the occluded surfaces.

The continuous detection of the current reality located in the detection area of the device even allows a visualization of the virtual device and of dynamics existing in the real environment within the augmented reality. The major synchronization effort that such a visualization would cause with a complete 3-dimensional modeling of a non-static reality thereby becomes unnecessary.

In an advantageous embodiment of the invention, the spatial detection unit comprises

• a spatial detection device which is provided for detecting surface distance points of the current reality, and

• a processing unit which is provided for calculating the spatial information from these points, the spatial information in particular describing a 3-dimensional model of reality. In this way, only the information content of the current reality required for calculating possibly occurring occlusion surfaces is detected and modeled, whereby the complexity compared to the creation of a complete 3-dimensional model of a real geometrical arrangement is significantly reduced.
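How a set of surface distance points might be condensed into a coarse surface model can be sketched as follows. The gridding scheme, function name, and sample values are illustrative assumptions; a real spatial detection unit would use a denser and more sophisticated reconstruction:

```python
import numpy as np

def depth_map_from_points(points_uvd, shape):
    """Bin (u, v, distance) samples from the spatial detection device into
    a coarse per-cell depth map; keep the nearest sample per cell.
    Cells with no sample remain at infinity (no real surface detected)."""
    depth = np.full(shape, np.inf)
    for u, v, d in points_uvd:
        if d < depth[v, u]:
            depth[v, u] = d
    return depth

samples = [(0, 0, 2.5), (1, 1, 1.0), (1, 1, 3.0)]   # duplicate cell keeps nearest
print(depth_map_from_points(samples, (2, 2)))
```

The resulting depth map is exactly the kind of reduced surface description the embodiment relies on: enough to decide occlusions, far less than a full 3-dimensional model.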

The environment detection unit advantageously has

• an environment detection device which is provided for detecting the position and/or orientation of the device with respect to the real environment, and

• a processing unit for calculating the environment information, which describes the position and/or orientation of the device with respect to the real environment, for example in the form of a matrix.
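Position and orientation expressed "in the form of a matrix" can be illustrated with a standard homogeneous transform. The single-axis (yaw-only) rotation below is a simplifying assumption for brevity; a real tracking system would supply a full 3-DOF rotation:

```python
import numpy as np

def pose_matrix(position, yaw):
    """Homogeneous 4x4 pose of the device in the real environment:
    a rotation about the vertical axis plus a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0],
                 [s,  c, 0.0],
                 [0.0, 0.0, 1.0]]
    T[:3, 3] = position
    return T

T = pose_matrix([1.0, 2.0, 0.0], np.pi / 2)
p_device = np.array([1.0, 0.0, 0.0, 1.0])   # point in device coordinates
print(T @ p_device)                          # the same point in world coordinates
```

Multiplying a point in device coordinates by this matrix yields its position in the real environment, which is what the processing unit needs to relate the captured surfaces to the virtual model.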

In a typical application of the invention as a device, a model of a plant and/or a plant part is provided as the object described by the information stored in the first storage medium. For example, for a planned installation measure, a virtual model of the plant or plant part to be installed can be created in order to quickly get a first impression of the planned plant in the reality provided for it.

In an advantageous embodiment of the invention, the reproduction unit is designed as a head-mounted display, whereby the objects described by the image information generated by the processing unit are displayed directly in the field of view of the user, and the user continues to directly perceive the part of the current reality not hidden by the objects described by the image information. This type of display of augmented-reality information is the so-called optical see-through method. In this case, the spatial detection device and the environment detection device are advantageously attached to the head of the user so that they rest in relation to the user's eyes. The viewing angles of the spatial detection device and the environment detection device ideally overlap with the current field of view of the user, so that the entire field of view of the user is detected. The processing units of the spatial detection unit and the environment detection unit can be realized on a computer carried by the user.

In an alternative advantageous embodiment, the reproduction unit is designed such that both the objects described by the image information generated by the processing unit and the part of the current reality not concealed by those objects are represented; for this purpose, the device comprises in particular at least one image acquisition unit, which is for example designed as a video camera, for detecting the current reality. This embodiment enables the display of the augmented-reality information for multiple users. This type of display of augmented-reality information is the so-called video see-through method. Here, the parts of the virtual objects described by the image information and not covered by the current reality are displayed in the image captured by the video camera and shown on one or, e.g. by using a video splitter, multiple reproduction units. The reproduction units may be head-mounted displays or, in particular, conventional monitors, which can also be positioned at locations remote from the detected current reality. In this type of embodiment, the spatial detection device, the environment detection device and the image acquisition unit may be mounted on the head of a user or on a separate device such as a portable computer.

To be able to calibrate the viewing angle and the position of the user as exactly as possible with the position and the orientation of the spatial detection device in an embodiment of the apparatus using the optical see-through method, a second storage medium is advantageously provided which serves to store calibration information, the calibration information describing geometric deviations between the eye of the user, the position of the reproduction system, the spatial detection device and the environment detection device. The second storage medium may alternatively be implemented together with the first storage medium in the form of a common storage medium.

To be able to calibrate the viewing angle and position of the video camera as exactly as possible with the position and the orientation of the spatial detection device when using the video see-through method, in one embodiment of the device the second storage medium for storing calibration information is advantageously provided, the calibration information describing geometric deviations between the position of the video camera, the position of the reproduction system, the spatial detection device and the environment detection device. The second storage medium may alternatively be implemented together with the first storage medium in the form of a common storage medium.

A simple calculation of possible occlusions of virtual geometric arrangements by arrangements in the current reality is advantageously realized in that the processing unit, on the basis of the information generated by the spatial detection unit and the environment detection unit as well as the information stored in the storage media, represents the objects described by the spatial information and by the information stored in the first storage medium in a common coordinate system. Based on this common reference system, the processing unit can calculate new image information in which those areas described by the information stored in the first storage medium are hidden which, in the field of view of the user or of the video camera, are concealed by the areas described by the spatial information. The processing unit for combining the environment and spatial information may be implemented on a computer together with the processing unit of the spatial detection unit and/or the processing unit of the environment detection unit.
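Establishing the common coordinate system described above amounts to chaining the device pose (from the environment detection unit) with the fixed sensor offsets (from the calibration information). The concrete matrices below are illustrative placeholders for such transforms:

```python
import numpy as np

# Pose of the device in the world, as delivered by the environment
# detection unit (here: pure translation, for simplicity).
world_T_device = np.eye(4)
world_T_device[:3, 3] = [1.0, 0.0, 0.0]

# Fixed offset of the camera/eye relative to the device, as stored
# as calibration information (here: a 10 cm lateral offset).
device_T_camera = np.eye(4)
device_T_camera[:3, 3] = [0.0, 0.1, 0.0]

# Chaining both transforms expresses the camera, and hence everything
# it sees, in the single common world frame used for occlusion tests.
world_T_camera = world_T_device @ device_T_camera
print(world_T_camera[:3, 3])   # camera origin in world coordinates
```

Once the real surface model and the virtual model are both expressed in this one frame, the per-pixel depth comparison for occlusion becomes a straightforward operation.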

Dynamics of a virtual plant model can be implemented with the device according to the invention in that the device comprises at least one simulation system for generating the information stored in the first storage medium. The dynamic processes are calculated by the simulation system. The information stored in the first storage medium, which describes the virtual objects, is continuously adjusted according to the data calculated by the simulation system.
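The interplay between the simulation system and the stored model can be sketched as a simple update loop. The oscillating joint angle is an invented stand-in for the simulated dynamics; the dictionary stands in for the information held in the first storage medium:

```python
import math

def robot_joint_angle(t):
    """Toy simulation: a joint oscillating +/- 45 degrees around zero."""
    return math.radians(45) * math.sin(t)

# The simulation continuously refreshes the stored virtual model; after
# each update, the AR pipeline would re-render it and recompute the
# occlusion surfaces against the current surface model of reality.
stored_model = {"joint": 0.0}
for step in range(3):
    stored_model["joint"] = robot_joint_angle(step * 0.1)
print(stored_model["joint"])
```

Because occlusions are recomputed per frame anyway, animating the virtual model in this way requires no additional synchronization with a static 3-dimensional model of reality.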

The spatial detection device of the spatial detection unit may, for example, be configured as a radar system, an ultrasonic system, a laser system or a stereo camera system. To minimize the required hardware effort, the spatial detection device and the environment detection device can, in particular for camera-based systems, be realized in a common detection device. In an embodiment using the video see-through method, an integration of the spatial detection device and/or the environment detection device into the video camera required for this method is moreover possible.

In the following, the invention is described and explained in more detail with reference to the embodiments illustrated in the figures. Shown are:

FIG 1 a schematic representation of a device for displaying information, in particular augmented-reality information, for at least one user, FIG 2 a schematic representation of an embodiment of the device based on the video see-through method, FIG 3 an alternative device for displaying information, which is based on the video see-through method, and FIG 4 an illustration of a typical application scenario of the embodiment of the device shown in FIG 1.

FIG 1 shows a schematic representation of a device 1 for displaying information, in particular augmented-reality information, for at least one user 2. The illustrated apparatus relates to an embodiment which is based on the optical see-through method. The user 2 detects, by means of a spatial detection unit 3, a current reality 13 located in his field of view. Positioned on the head of the user 2, as part of the spatial detection unit 3, is a spatial detection device 3a which rests relative to the eyes of the user 2. By means of a processing unit 3b of the spatial detection unit 3, spatial information 10 is generated and passed on to a processing unit 9.

Also positioned on the head of the user 2 is an environment detection device 4a, with which the position of the user 2 and his viewing angle can be detected. The environment detection device 4a rests with respect to the spatial detection device 3a and with respect to the eyes of the user 2. A processing unit 4b generates, from the detected position and the detected viewing angle, environment information 5, which is also passed on to the processing unit 9.

In an alternative embodiment, the environment detection unit 4 includes an environment detection device comprising a sensor which is positioned on the head of the user, and a further detection device which is set up so that it can detect the position and orientation of the sensor, and thus also of the user, with respect to the current reality.

In a storage medium 7, information 8 is stored which describes, for example, a virtual geometric arrangement. The virtual geometric arrangement can, for example, be the three-dimensional model of a planned plant or a planned plant part. The information 8 describing the three-dimensional model of such a plant is also fed to the processing unit 9.

Another storage medium 12 contains calibration information 11, which describes the geometric deviations between the eye of the user 2, the position of a reproduction unit 6 located on the head of the user 2, the spatial detection device 3a and the environment detection device 4a.

The processing unit 9 now links the spatial information 10, the environment information 5, the information 8 describing the three-dimensional model of the virtual plant, and the calibration information 11 into a set of image information 14 such that mutual occlusions of the current reality and of the planned virtual plant can be made visible by the reproduction unit 6. Via the reproduction unit 6, which in this example is designed as a head-mounted display, only an image of the planned virtual plant is shown, in which the surfaces occluded by the current reality 13 in the field of view of the user 2 are hidden. The part of the current reality 13 which is not covered by the planned virtual plant is perceived directly by the user.

In this way, a mixed virtual-real environment is visualized for the user 2 without a time-consuming and costly complete 3-dimensional modeling of the real environment being required. Dynamic processes within the real environment can be visualized, whereby only a continuous calculation of the occlusion surfaces has to be performed.

FIG 2 shows a schematic representation of an embodiment of the device 1 based on the video see-through method. The same reference numerals as in FIG 1 are used here and in the description of the further figures. In this embodiment, the device 1 is supplemented by an image acquisition unit 18, which is in particular designed as a video camera. The reproduction unit 6 now represents the complete augmented reality. That is, in addition to the image information 14, which describes a model of the virtual plant in which the surfaces occluded by the current reality are hidden, the part of the current reality 13 captured by the video camera 18 which is not covered by the virtual plant is also shown with the aid of the reproduction unit 6. For this purpose, the image of the virtual plant, in which the surfaces occluded by the current reality 13 are hidden, is displayed in the image captured by the video camera 18. In this type of embodiment, a conversion unit 15 with mixing function is used to generate a corresponding signal displayable by the reproduction unit 6. This can be implemented both in software and in hardware, for example as a video card with appropriate functionality.

In the illustrated embodiment, the spatial detection device 3, the environment detection device 4 and the image acquisition unit 18 are positioned on the head of the user 2. The reproduction unit 6 used for the visualization of the augmented reality is also attached to the head of the user 2; for example, it is a head-mounted display.

FIG 3 shows an alternative device for displaying information, which is based on the video see-through method. In this embodiment of the device 1, the spatial detection device 3, the environment detection device 4, the image acquisition unit 18 and the reproduction unit 6 are mounted on a portable computer. This embodiment allows multiple users to view the augmented reality. Using a video splitter, the representation of the augmented reality on a plurality of reproduction units is also possible.

FIG 4 shows a typical application scenario of an embodiment of the apparatus shown in FIG 1. In the field of view of a user 2 there is a current reality 13, which may for example be a conveyor belt. The spatial detection device 3a of the spatial detection unit 3 detects the part of the conveyor belt 13 located in the viewing angle of the user 2. The processing unit 3b of the spatial detection unit 3 models the surfaces of the conveyor belt 13 located in the field of view of the user 2 in a three-dimensional surface model 10b.

By means of the environment detection unit 4, which is, for example, a commercially available tracking system, the position of the user 2 and his viewing angle onto the current reality are detected. The processing unit 4b of the environment detection unit 4 generates therefrom environment information 5, which is represented in the form of a matrix 5b.

A simulation system 16 continuously generates a data set 8, which describes the 3D model 8b of a virtual geometric arrangement, in this example a robot. In this way, a dynamic virtual robot is inserted into the augmented reality. The corresponding data set 8 is stored in a first storage medium 7.

In a second storage medium 12, the calibration information 11 described above is stored in the form of a matrix 11b or a plurality of matrices.

The processing unit 9 now represents, using the calibration information 11 and the environment information 5, the 3-dimensional surface model 10b of the real arrangement 13 (here the conveyor belt) and the 3-dimensional model 8b of the virtual arrangement (here a virtual robot) in a common coordinate system. In this coordinate system, the processing unit 9 calculates occlusions of the virtual robot which are caused by the conveyor belt. As a result, the processing unit 9 generates a new data set 14, which describes a virtual model 14b of the robot in which the surfaces hidden by the conveyor belt are suppressed.

The model 14b of the virtual robot, in which the occlusions by the conveyor belt are hidden, is converted by means of a video card 15 into a signal displayable by the reproduction unit 6.

The user 2, by simultaneously perceiving the model 14b of the virtual robot displayed by means of the reproduction unit 6 and the actual conveyor belt, sees a mixed virtual-real image 17 in which the hiding of the surfaces calculated by the processing unit 9 conveys the desired 3-dimensional impression. The need for a time-consuming and expensive 3-dimensional modeling of the conveyor belt is eliminated in the apparatus according to the invention and the method according to the invention.

In summary, the invention relates to an apparatus and a method for displaying virtual and real environment information for one or more users, whereby virtual arrangements and real arrangements are represented such that occlusions of the virtual arrangements by real arrangements are made recognizable. With the help of an environment detection unit 4, the relative position and orientation of the device in the real environment are detected. Additionally, a detection of the reality and its conversion into a 3-dimensional surface model is carried out continuously by means of a spatial detection unit 3. A processing system 9 transfers the 3-dimensional surface model of the real arrangement and the 3-dimensional model of the virtual arrangement into a common coordinate system and calculates possible occlusion surfaces of the virtual arrangement by the real arrangement.

Claims

1. Device (1) for displaying information, in particular augmented-reality information, for at least one user (2), with
• at least one spatial detection unit (3) for detecting a current reality (13) and for generating corresponding spatial information (10),
• at least one environment detection unit (4) for detecting an environment and for generating corresponding environment information (5), which defines a position and/or orientation of the device (1) in relation to the environment,
• at least one processing unit (9) for linking the environment information (5), the spatial information (10) and information (8) stored in a first storage medium (7), which serves to describe at least one object, into a set of image information (14) such that mutual occlusions of the current reality (13) and of the object described by the stored information (8) can be made visible by at least one reproduction unit (6).
2. Device according to claim 1, characterized in that the spatial detection unit (3) comprises
• a spatial detection device (3a) which is provided for the detection of surface distance points of the current reality (13), and
• a processing unit (3b) which is provided for calculating the spatial information (10), the spatial information (10) in particular describing a 3-dimensional model (10b) of the reality (13).
3. Device according to one of the preceding claims, characterized in that the environment detection unit (4) comprises
• an environment detection device (4a) which is provided for detecting the position and/or orientation of the device (1) with respect to the real environment, and
• a processing unit (4b) for calculating the environment information (5), which describes the position and/or orientation of the device (1) with respect to the real environment, for example in the form of a matrix (5b).
4. Device according to one of the preceding claims, characterized in that a model (8b) of a plant and/or a plant component is provided as the object described by the information (8) stored in the first storage medium (7).
5. Device according to one of the preceding claims, characterized in that the reproduction unit (6) is designed as a head-mounted display, the objects described by the image information (14) generated by the processing unit (9) appearing directly in the field of view of the user (2), and the user (2) continuing to directly perceive the part of the current reality (13) not covered by the objects described by the image information (14).
6. Device according to one of the preceding claims, characterized in that the reproduction unit (6) is designed such that the objects described by the image information (14) generated by the processing unit (9) and the part of the current reality (13) not covered by those objects are presented, the device comprising for this purpose in particular at least one image acquisition unit (18), which is for example designed as a video camera, for detecting the current reality (13).
7. Device according to one of claims 1 to 5, characterized in that a second storage medium (12) is provided which serves to store calibration information (11), the calibration information (11) describing geometric deviations between the eye of the user (2), the position of the reproduction system (6), the spatial detection device (3a) and the environment detection device (4a).
8. Device according to one of claims 1 to 4 or 6, characterized in that a second storage medium (12) is provided which serves to store calibration information (11), the calibration information (11) describing geometric deviations between the position of the image acquisition unit (18), the spatial detection device (3a) and the environment detection device (4a).
9. Device according to one of the preceding claims, characterized in that the processing unit (9) is designed such that, on the basis of the information (5, 10) generated by the spatial detection unit (3) and the environment detection unit (4) as well as the information (8, 11) stored in the storage media (7, 12), it represents the objects described by the spatial information (10) and by the information (8) stored in the first storage medium (7) in a common coordinate system.
10. Device according to one of the preceding claims, characterized in that the device (1) comprises at least one simulation system (16) for generating the information (8) stored in the first storage medium (7).
11. Device according to one of the preceding claims, characterized in that the spatial detection device (3a) of the spatial detection unit (3) is designed as a radar system, an ultrasonic system, a laser system or a stereo camera system.
12. Device according to one of the preceding claims, characterized in that the spatial detection device (3a) and the environment detection device (4a) are realized in a common detection device.
13. A method for displaying information, in particular augmented-reality information, for at least one user (2), in which
• a current reality (13) is detected by means of a spatial detection unit (3) and corresponding spatial information (10) is generated,
• an environment is detected with the aid of at least one environment detection unit (4) and corresponding environment information (5) is generated, which characterizes the position and/or orientation of the device (1) with respect to the environment,
• with the aid of at least one processing unit (9), the environment information (5), the spatial information (10) and information (8) stored in a first storage medium (7), which serves to describe at least one object, are linked into a set of image information (14) in such a way that mutual occlusions of the current reality (13) and of the object described by the stored information (8) can be made visible by at least one reproduction unit (6).
14. The method according to claim 13, characterized in that, with the aid of
• a spatial detection device (3a) that is part of the spatial detection unit (3), surface distance points of the current reality (13) are detected, and
• a processing unit (3b) that is part of the spatial detection unit (3), the spatial information (10) is calculated, the spatial information (10) in particular describing a three-dimensional model (10b) of the current reality (13).
15. The method according to claim 13 or 14, characterized in that, with the aid of
• an environment detection device (4a) that is part of the environment detection unit (4), the position and/or orientation of the device (1) with respect to the real environment is detected, and
• a processing unit (4b) that is part of the environment detection unit (4), the environment information (5) is calculated, which describes the position and/or orientation of the device (1) with respect to the real environment, for example in the form of a matrix (5b).
16. The method according to any one of claims 13 to 15, characterized in that the information (8) stored on the first storage medium (7) describes a model (8b) of a plant and/or of a plant component.
17. The method according to any one of claims 13 to 16, characterized in that the objects described by the image information (14) generated by the processing unit (9) are inserted directly into the field of view of the user (2) with the aid of the reproduction unit (6), which is in particular designed as a head-mounted display, so that the user (2) directly perceives, in addition to the objects described by the image information (14), the uncovered part of the current reality (13).
18. The method according to any one of claims 13 to 17, characterized in that the objects described by the image information (14) generated by the processing unit (9) and the part of the current reality (13) not covered by these objects are represented by means of the reproduction unit (6), the device for this purpose comprising in particular at least one image acquisition unit (18), designed for example as a video camera, for detecting the current reality (13).
19. A method according to any one of claims 13 to 17, characterized in that calibration information (11) is stored with the aid of a second storage medium (12), said calibration information (11) describing geometric deviations between the eye of the user (2), the position of the reproduction unit (6), the spatial detection device (3a) and the environment detection device (4a).
20. The method according to any one of claims 13 to 16 or 18, characterized in that calibration information (11) is stored with the aid of a second storage medium (12), the calibration information (11) describing geometric deviations between the position of the image acquisition unit (18), the spatial detection device (3a) and the environment detection device (4a).
21. The method according to any one of claims 13 to 20, characterized in that, with the aid of the processing unit (9), on the basis of the information (5, 10) generated by the spatial detection unit (3) and the environment detection unit (4) as well as the information (8, 11) stored on the storage media (7, 12), the objects described by the spatial information (10) and by the information (8) stored on the first storage medium (7) are represented in a common coordinate system.
22. The method according to any one of claims 13 to 21, characterized in that the information (8) stored on the first storage medium (7) is generated by means of at least one simulation system (16).
23. The method according to any one of claims 13 to 22, characterized in that a radar system, an ultrasound system, a laser system or a stereo camera system is used as the spatial detection device (3a) of the spatial detection unit (3).
24. The method according to any one of claims 13 to 23, characterized in that a common detection device is used for the spatial detection device (3a) and the environment detection device (4a).
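The linking step recited in claims 13 and 21 (bringing the stored object model and the detected spatial model into a common coordinate system, then making mutual masking of real and virtual content visible) can be sketched in a few lines. This code does not appear in the patent: the function names, the 4x4 pose matrices and the per-pixel depth test are illustrative assumptions for one possible implementation.

```python
import numpy as np

def to_common_frame(points_obj, pose_world_from_obj, pose_device_from_world):
    """Chain homogeneous 4x4 transforms so stored object geometry and the
    detected spatial model share one (device) coordinate system."""
    T = pose_device_from_world @ pose_world_from_obj
    homog = np.c_[points_obj, np.ones(len(points_obj))]  # Nx3 -> Nx4
    return (T @ homog.T).T[:, :3]

def compose_with_occlusion(real_depth, virtual_depth, virtual_rgba):
    """Per-pixel depth test: the virtual object is visible only where it lies
    in front of the detected real surface, so reality can mask the object
    and vice versa (mutual masking)."""
    visible = virtual_depth < real_depth
    overlay = np.zeros_like(virtual_rgba)
    overlay[visible] = virtual_rgba[visible]
    return overlay, visible
```

Here the real depth map stands in for the spatial information (10), the chained pose matrix for the environment information (matrix 5b of claim 15), and the object points for the stored information (8); a real system would obtain these from the detection units (3, 4) and the first storage medium (7).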
EP20050731750 2004-04-02 2005-03-16 Device and method for simultaneously representing virtual and real environment information Withdrawn EP1730970A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE200410016331 DE102004016331B4 (en) 2004-04-02 2004-04-02 Device and method for the simultaneous representation of the virtual and real environment information
PCT/EP2005/051195 WO2005096638A1 (en) 2004-04-02 2005-03-16 Device and method for simultaneously representing virtual and real environment information

Publications (1)

Publication Number Publication Date
EP1730970A1 true true EP1730970A1 (en) 2006-12-13

Family

ID=34963623

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20050731750 Withdrawn EP1730970A1 (en) 2004-04-02 2005-03-16 Device and method for simultaneously representing virtual and real environment information

Country Status (4)

Country Link
US (1) US8345066B2 (en)
EP (1) EP1730970A1 (en)
DE (1) DE102004016331B4 (en)
WO (1) WO2005096638A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9781697B2 (en) 2014-06-20 2017-10-03 Samsung Electronics Co., Ltd. Localization using converged platforms

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005060980B4 * 2005-12-20 2012-10-25 Metaio Gmbh Method and system for determining a collision-free three-dimensional space volume along a path of travel with respect to a real environment
US8094090B2 (en) * 2007-10-19 2012-01-10 Southwest Research Institute Real-time self-visualization system
US8606657B2 (en) * 2009-01-21 2013-12-10 Edgenet, Inc. Augmented reality method and system for designing environments and buying/selling goods
US20130137076A1 (en) * 2011-11-30 2013-05-30 Kathryn Stone Perez Head-mounted display based education and instruction
US20150105890A1 (en) * 2012-07-16 2015-04-16 Other Machine Company System and Method for CNC Machines and Software
FR3000241A1 (en) * 2012-12-21 2014-06-27 France Telecom Method for managing a geographic information system adapted to be used with at least one pointing device, with creation of purely digital virtual objects.
US20140375684A1 (en) * 2013-02-17 2014-12-25 Cherif Atia Algreatly Augmented Reality Technology
US9626773B2 (en) 2013-09-09 2017-04-18 Empire Technology Development Llc Augmented reality alteration detector
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5428724A (en) * 1992-04-29 1995-06-27 Canon Information Systems Method and apparatus for providing transparency in an object based rasterized image
US6151009A (en) * 1996-08-21 2000-11-21 Carnegie Mellon University Method and apparatus for merging real and synthetic images
US5986674A (en) * 1996-10-31 1999-11-16 Namco. Ltd. Three-dimensional game apparatus and information storage medium
US5923333A (en) * 1997-01-06 1999-07-13 Hewlett Packard Company Fast alpha transparency rendering method
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
US6369830B1 (en) * 1999-05-10 2002-04-09 Apple Computer, Inc. Rendering translucent layers in a display system
JP2000350865A (en) * 1999-06-11 2000-12-19 Mr System Kenkyusho:Kk Game device for composite real space, image processing method therefor and program storage medium
US6335765B1 (en) * 1999-11-08 2002-01-01 Weather Central, Inc. Virtual presentation system and method
DE10127396A1 (en) * 2000-06-13 2001-12-20 Volkswagen Ag Method for utilization of old motor vehicles using a sorting plant for removal of operating fluids and dismantling of the vehicle into components parts for sorting uses augmented reality (AR) aids to speed and improve sorting
JP2002157607A (en) * 2000-11-17 2002-05-31 Canon Inc System and method for image generation, and storage medium
JP3406965B2 (en) * 2000-11-24 2003-05-19 キヤノン株式会社 Mixed reality presentation apparatus and control method thereof
US20020133264A1 (en) * 2001-01-26 2002-09-19 New Jersey Institute Of Technology Virtual reality system for creation of design models and generation of numerically controlled machining trajectories
DE10240392A1 (en) * 2002-09-02 2004-03-11 Patron, Günter A system for determining relative spacing of virtual and real objects e.g. for planning of buildings and manufacturing equipment, requires involving an augmented reality system for environmental real object position
WO2005091220A1 (en) * 2004-03-12 2005-09-29 Bracco Imaging S.P.A Accuracy evaluation of video-based augmented reality enhanced surgical navigation systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005096638A1 *

Also Published As

Publication number Publication date Type
US20070202472A1 (en) 2007-08-30 application
DE102004016331A1 (en) 2005-11-03 application
US8345066B2 (en) 2013-01-01 grant
DE102004016331B4 (en) 2007-07-05 grant
WO2005096638A1 (en) 2005-10-13 application

Similar Documents

Publication Publication Date Title
US5130794A (en) Panoramic display system
US20050149231A1 (en) Method and a system for programming an industrial robot
US5808588A (en) Shutter synchronization circuit for stereoscopic systems
US20130141434A1 (en) Virtual light in augmented reality
US20060221098A1 (en) Calibration method and apparatus
US20100013738A1 (en) Image capture and display configuration
US20090300535A1 (en) Virtual control panel
US7138963B2 (en) Method for automatically tracking objects in augmented reality
US7162054B2 (en) Augmented reality technology
US20030210832A1 (en) Interacting augmented reality and virtual reality
US6690374B2 (en) Security camera system for tracking moving objects in both forward and reverse directions
US20100208057A1 (en) Methods and systems for determining the pose of a camera with respect to at least one object of a real environment
US20050001852A1 (en) System and method for inserting content into an image sequence
US20040056870A1 (en) Image composition apparatus and method
US20020060648A1 (en) Image-display control apparatus
US20120162204A1 (en) Tightly Coupled Interactive Stereo Display
US20040119662A1 (en) Arbitrary object tracking in augmented reality applications
US20120293506A1 (en) Avatar-Based Virtual Collaborative Assistance
Fuhrmann et al. Occlusion in collaborative augmented environments
US20110084983A1 (en) Systems and Methods for Interaction With a Virtual Environment
US5870136A (en) Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications
US20050179617A1 (en) Mixed reality space image generation method and mixed reality system
US20110243388A1 (en) Image display apparatus, image display method, and program
US5495576A (en) Panoramic image based virtual reality/telepresence audio-visual system and method
US20070076090A1 (en) Device for generating three dimensional surface models of moving objects

Legal Events

Date Code Title Description
AK Designated contracting states:

Kind code of ref document: A1

Designated state(s): DE FR GB IT

17P Request for examination filed

Effective date: 20060726

RBV Designated contracting states (correction):

Designated state(s): DE FR GB IT

DAX Request for extension of the european patent (to any country) deleted
17Q First examination report

Effective date: 20090626

RAP1 Transfer of rights of an ep published application

Owner name: SIEMENS AKTIENGESELLSCHAFT

18D Deemed to be withdrawn

Effective date: 20130319