US6278479B1 - Dual reality system - Google Patents

Dual reality system

Info

Publication number
US6278479B1
Authority
US
Grant status
Grant
Prior art keywords
points
position
point
orientation
dru
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US09028319
Inventor
Phillip C. Wilson
Erik A. Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wilson Hewitt and Assoc Inc
Original Assignee
Wilson Hewitt and Assoc Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Grant date

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/22Other optical systems; Other optical apparatus for producing stereoscopic or other three dimensional effects
    • G02B27/2228Stereoscopes or similar systems based on providing first and second images situated at first and second locations, said images corresponding to parallactically displaced views of the same object, and presenting the first and second images to an observer's left and right eyes respectively
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/373Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/376Image reproducers using viewer tracking for tracking left-right translational head movements, i.e. lateral movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/38Image reproducers using viewer tracking for tracking vertical translational head movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0077Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0096Synchronisation or controlling aspects

Abstract

An apparatus for merging real and virtual images in which points in three dimensions of an actual scene are obtained and selected in a random fashion and in a number sufficient to track the movement of the three dimensional scene relative to an observer wearing the apparatus. Updating of selected points occurs responsive to movement of a video camera pair observing the actual scene. The position and orientation of the unit housing the video camera pair is utilized to properly merge a generated virtual image with the real image. Selection of points and orientation of the unit relative to a three dimensional real image are obtained in real time. Microprocessor-based apparatus is utilized to perform the method of the invention.

Description

FIELD OF THE INVENTION

The present invention relates to personal display systems and more particularly to a display system which uniquely combines both virtual and real images in “real time”.

BACKGROUND OF THE INVENTION

Virtual reality systems are presently in widespread use and typically comprise a headset display which presents an image which is generated to simulate a real image (hence, “virtual” reality). Virtual reality systems lack the capability of presenting real images, and in particular lack the capability of presenting virtual images combined with real images in “real time”.

BRIEF DESCRIPTION OF THE INVENTION

The present invention provides a dual reality (DR) system which is characterized by providing apparatus and a process in which computer generated objects and images, similar to those created by virtual reality systems, appear to the participant in the real world.

Similar to virtual reality, the dual reality environment is presented to the observer by way of a headset or visor which, like the virtual reality visor, provides a virtual image and, in addition thereto, presents a stereoscopic view of the real environment through the employment of a pair of cameras coupled to the visor through a computer.

The computer performs triangulation calculations which determine the distances from the visor to the selected navigational points and thereby the position and orientation of the visor, which process will hereinafter be referred to as “visual navigation”.

As one example, a computer generated object, such as a ball, viewed through the dual reality visor appears to float in front of the wearer. The observed surroundings are coupled to the visor display from the cameras and present a view of the actual surroundings, as opposed to a computer generated display (i.e., a computer generated rendering of the actual surroundings).

The system employs environment mapping which can, for example, generate an image of the ball which appears to bounce on a real table in front of the participant or against real walls of a room that the participant is observing.

The dual reality system and method utilizes three basic components which include:

(a) a visual navigation technique which determines the location and orientation of the dual reality system in a real three-dimensional space utilizing random points selected from an observed field of view;

(b) the dual reality superimposition technique superimposes computer generated objects with images of the real three-dimensional space. An observer wearing a dual reality headset employing visual navigation can walk around a computer generated object in his or her real surroundings, observing the object from all sides. Hereafter, computer generated objects and images will be referred to as supplemental objects or collectively as supplemental reality;

(c) environment mapping is the process by which supplemental objects are caused to “interact” with the real environment and with the observer, by identifying surfaces (such as walls, tables, panels, hands, etc.) in three-dimensional space, wherein mathematical rules govern the interaction of supplemental objects with objects or surfaces in the real environment.

OBJECTS OF THE INVENTION

It is therefore one object of the present invention to provide a display which presents to an observer a dual reality image in which a computer generated image is superimposed with and reacts with a real image.

Another object of the present invention is to provide a display to an observer in which random points selected from a field of view are utilized to determine the location and orientation of the observer in a real three-dimensional space.

Still another object of the present invention is to provide a dual reality image presented to an observer by way of a visor worn by the observer, which dual reality image presents a computer generated image (virtual image) interacting with an actual three-dimensional image.

Still another object of the present invention is to provide a dual reality display presenting a virtual image interacting with a real, three-dimensional image wherein the interaction is obtained in accordance with pre-determined mathematical rules which govern the interaction.

BRIEF DESCRIPTION OF THE FIGURES

The above as well as other objects of the present invention will become apparent when reading the accompanying description and drawings in which:

FIG. 1 is a block diagram showing a dual reality operating system.

FIG. 1a shows a head set mounted upon a participant.

FIG. 2 is a block diagram showing the dual reality operating cycle.

FIG. 3 is a flow diagram depicting the visual navigation method.

FIGS. 4 through 16 are series of two and three-dimensional plots useful in describing the techniques of the present invention.

DETAILED DESCRIPTION OF THE INVENTION AND PREFERRED EMBODIMENTS THEREOF

FIG. 1 shows a dual reality system 10 in simplified block diagram form which is comprised of two cameras 12 and 14 having imaging systems preferably comprised of charge coupled devices (CCDs) each respectively coupled to at least one video capture card 16, 18. The video capture cards each enable an associated CPU 20, which may be one or more than one computer, to convert the video input to screen pixels whereby the video images can be analyzed in a memory as well as being displayed on a screen 22.

The system is preferably designed to be completely self-contained either in a headset 24 (see FIG. 1a) or a small backpack or sidepouch 24a coupled to unit 24 by a wire W or by a wireless transmitter/receiver combination. The system may also utilize a wireless communication link between the headset and a medium or large size computer remote from the headset.

Visual Navigation

Visual navigation is utilized to determine location and orientation. The observer, as shown in FIG. 1a, wears a headset 24. The orientation of the observer's head (the observer's viewing direction) directs the cameras 12, 14 to view the real environment from substantially the same location and orientation, in much the same way as an observer views the real environment with the naked eye.

Definition of a “point”

The camera views the real environment and identifies a point on the surface of the environment as a small square grid of pixels: few enough pixels that the grid is most likely part of only one physical object, yet enough pixels to reliably and uniquely identify that point. As one example, a point may be a 20×20 grid of pixels of the kind normally produced by a conventional CCD imaging device of the type employed in video cameras.

A distinguishing characteristic of a point is a pattern of relative brightness and/or color within the small square grid.

During a point selection cycle, points are selected randomly using a mathematical algorithm. Any point which cannot be relocated during the next cycle of point selection is rejected. To be relocated, a point must match the previous point in brightness/color at the location found significantly better than at any other location.
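
The relocation test above amounts to a small template match: the stored grid of pixels is compared against every candidate position in a local search window, and the point is kept only if one position matches significantly better than any other. Below is a minimal sketch in Python/NumPy; the patch size, search radius, and "significantly better" margin are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def relocate_point(frame, template, last_xy, search_radius=30, margin=1.5):
    """Try to relocate a stored brightness/color patch near its last position.

    frame:    2-D grayscale image from the current iteration.
    template: the pixel grid (e.g. 20x20) saved when the point was selected.
    last_xy:  (x, y) of the patch's top-left corner in the previous iteration.
    Returns the new (x, y), or None if no single position stands out."""
    h, w = template.shape
    x0, y0 = last_xy
    scores = {}
    for y in range(max(0, y0 - search_radius), min(frame.shape[0] - h, y0 + search_radius) + 1):
        for x in range(max(0, x0 - search_radius), min(frame.shape[1] - w, x0 + search_radius) + 1):
            # sum of absolute differences: lower means a better match
            diff = frame[y:y + h, x:x + w].astype(int) - template.astype(int)
            scores[(x, y)] = np.abs(diff).sum()
    if len(scores) < 2:
        return None
    (best_xy, best), (_, runner_up) = sorted(scores.items(), key=lambda kv: kv[1])[:2]
    # keep the point only if the best match is clearly better than the runner-up
    return best_xy if runner_up > margin * max(best, 1) else None
```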

Definition of Visual Navigation

Employing the pair of cameras 12 and 14, a stereoscopic video input from the cameras is utilized to mathematically determine both position and orientation of the dual reality unit DRU 24.

Employing a starting point and absolute position as a reference, the position and orientation of unit 24 over time is determined by observing and tracking multiple, visibly distinguishable points within the field of view. Although in theory the system can work with as few as four points, for practical purposes the minimum number of points should fall within a range of 10 to 20. An increase in the number of points increases the accuracy of the system but tends to slow the system down. The number of points chosen is therefore a compromise between the accuracy desired and the desired system speed.

“Visually distinguishable points” are defined as points with distinct patterns that can be relocated in the same general area within the field of view on subsequent search scans.

Tracking Points

The dual reality unit (“DRU”) 24 maintains a constant number of points at all times, which points are located within its field of view. The unit tracks points by scanning the field of view looking for the same patterns in the same general area. An entire scan or iteration of the field of view is completed every thirty milliseconds. In order for the DRU 24 to match a point from iteration to iteration, there must be one and only one position in the same general area which stands out as the best match for the pattern.

A randomly selected point will be discarded for any of the following three reasons:

(a) the point moved out of the field of view and is therefore no longer useful for tracking (i.e., as the wearer turns his head to the right, points on the trailing edge of the field of view (the left edge) cease to exist in subsequent iterations as they leave the field of view, whereas points closer to the center or leading edge (the right edge) will survive for a greater number of iterations);

(b) the point was not on a fixed object; the point shifted, losing its distinguishing pattern (e.g., the point was on a moveable object such as an individual's shirt and the individual turned around); or the point moved differently relative to the other points selected (e.g., the point was on a moveable object such as an individual's shirt and the individual moved to the right while other points remained in a constant position or moved in concert to the left, as would be the case if the participant moved his head to the right and the individual in his field of view also moved to the right);

(c) the point did not contain a pattern unique enough to be tracked (e.g., a point on a blank wall may be initially selected but would soon be rejected since its pattern would be located at multiple positions throughout the field of view).

Every time a point is discarded, a new point is randomly selected in the field of view. If the new point does not contain a sufficiently unique pattern, it will only survive one iteration, after which time another point will be selected.

The random selection is achieved by use of a random signal generator which randomly selects a new point by means of a standard mathematical formula seeded by the system clock of the microprocessor.
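
A minimal sketch of that selection step follows, assuming a pixel-grid field of view; the frame dimensions and patch size are illustrative, and seeding the generator from the system clock mirrors the description above.

```python
import random
import time

def select_new_point(frame_width, frame_height, patch=20, rng=None):
    """Randomly pick the top-left corner of a new candidate point (a patch of
    pixels) within the field of view.  The generator is seeded from the system
    clock, as the patent describes for the DRU's random point selection."""
    rng = rng or random.Random(time.time_ns())
    x = rng.randrange(0, frame_width - patch)
    y = rng.randrange(0, frame_height - patch)
    return x, y
```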

Calculation of Point Locations In Three Dimensions

Of the points that have proven reliable as a result of having existed for a number of iterations (e.g., 20 iterations—⅔ of a second), the DRU 24 calculates the location of those points in three dimensions which is accomplished by matching the point originally located by the right side camera (“camera A”, for example, camera 14) in the left-side camera (“camera B”, for example, camera 12).

Starting on the left perimeter of the general area of the field of view in which an original pattern is located, camera B scans to the right until a matching pattern is found, or the right perimeter of the field of view is reached, in which case the point is rejected. The triangulation calculation to be described hereinbelow is performed if one and only one perfect match is found. Using this match, the DRU 24 determines three (3) angles, as follows:

Angle a: from camera A to the point along the x-axis;

Angle b: from camera A to the point along the y-axis;

Angle c: from camera B to the point along the x-axis.

The angle from camera B to the point along the y-axis is the same as the angle from camera A to the point along the y-axis.

Using the above angles, the distance between the detected point and camera A is calculated employing analytic geometry and trigonometry. The position of the point will then be defined in terms of the position and orientation of camera A, which information is converted into relative Cartesian coordinates of that point in space, using the position and orientation of camera A as the reference for the Cartesian space.
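
The patent leaves the details to "analytic geometry and trigonometry", so the following is only a sketch of one conventional way to triangulate from the three angles. It assumes camera A is the origin of the relative Cartesian frame, camera B is offset from camera A by a known baseline along the x-axis, and the angles are measured from the optical (z) axis; those conventions are assumptions, not stated in the patent.

```python
import math

def triangulate(angle_a, angle_b, angle_c, baseline):
    """Recover the (x, y, z) position of a matched point relative to camera A.

    angle_a: camera A to the point along the x-axis (radians)
    angle_b: camera A to the point along the y-axis (radians)
    angle_c: camera B to the point along the x-axis (radians)
    baseline: separation between cameras A and B along the x-axis."""
    # Horizontal parallax: the two x-angles differ because of the baseline.
    z = baseline / (math.tan(angle_a) - math.tan(angle_c))
    x = z * math.tan(angle_a)   # lateral offset as seen from camera A
    y = z * math.tan(angle_b)   # vertical offset (the y-angle is shared by both cameras)
    distance = math.sqrt(x * x + y * y + z * z)
    return (x, y, z), distance
```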

Calculating Movement of the DRU Through Space

The relative Cartesian coordinates derived for each point from the most recent iteration are transferred to a separate calculation module in the computer which determines the movement of the unit 24 through space. The module performs the necessary calculations to determine the absolute position and orientation of unit 24. In calculating these values, the module uses the relative Cartesian coordinates for points contained only in the current iteration and the absolute Cartesian coordinates for any points contained in the current iteration to which an absolute Cartesian coordinate was assigned in the previous iteration. A detailed description of the calculations will be set forth hereinbelow.

The participant's position is calculated relative to the participant's starting position (i.e. relative to the position and orientation of the unit 24 worn by the participant), and the point positions and the participant's position are described in terms of absolute and relative coordinates. From all of the points transferred to the calculation module, ten sets of four points are selected in a random fashion by means of a standard mathematical formula seeded by the system clock of the micro-processor. For each of the ten sets of four points, the relative and absolute Cartesian coordinates are placed in a series of equations which, using analytic geometry, trigonometry, and basic matrix operations, provide a result set containing the absolute position and orientation of unit 24. Each result set is then compared with every other result set, and a subset of a specific number of result sets which most agree with each other is averaged together. This average result set represents the absolute position and orientation of unit 24.
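
This select-solve-compare step is a simple consensus scheme. The sketch below assumes a helper solve_pose(four_points) that returns a flat pose vector for one set of four points (the matrix solution described later in this description); solve_pose, the agreement metric, and the size of the averaged subset are illustrative assumptions, while the ten random sets come from the text above.

```python
import random
import time
import numpy as np

def consensus_pose(points, solve_pose, n_sets=10, subset_size=5):
    """Estimate the unit's absolute position and orientation from tracked points.

    points:     list of (relative_xyz, absolute_xyz) pairs for this iteration.
    solve_pose: hypothetical helper mapping four such pairs to a flat pose
                vector (position and orientation); an assumed helper, not
                part of the patent.
    Ten random four-point sets are solved independently; the subset of result
    sets that agree most closely is averaged to give the final answer."""
    rng = random.Random(time.time_ns())
    results = [np.asarray(solve_pose(rng.sample(points, 4))) for _ in range(n_sets)]
    # score each result set by its total distance to every other result set
    scores = [sum(np.linalg.norm(r - other) for other in results) for r in results]
    best = sorted(range(n_sets), key=lambda i: scores[i])[:subset_size]
    return np.mean([results[i] for i in best], axis=0)
```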

Visual Navigation Calculations

The sequence of calculations is as follows:

Establishing the Absolute Basis and Position of the Unit in the Absolute Basis

The term “basis” is a mathematical term defined as a “frame of reference”.

The Absolute Basis is defined by the x, y and z axes. The position of unit 24 in this basis is defined as O (Ox, Oy, Oz). See FIG. 4.

Determining the DRU Basis and Orientation in Absolute Basis

The unit Basis is defined by the vectors X′, Y′ and Z′. The view of the unit 24 points along the Z′ vector and the Y′ vector extends upwardly through the top of unit 24. The orientation of the unit in Absolute Basis is defined by (see FIG. 5):

X′ is defined by the components (X′x, X′y, X′z),

Y′ is defined by the components (Y′x, Y′y, Y′z)

Z′ is defined by the components (Z′x, Z′y, Z′z) and

X′, Y′, and Z′ are all unit vectors, i.e., they all have unit length.

Determining the Position of a Point in both the Absolute Basis and the DRU Basis

An observed object or point is defined as p, with a position (a, b, c) in the Absolute Basis, and a position (a′, b′, c′) in the DRU Basis. See FIG. 6.

Determining the Relationship between the Position of p in the Absolute Basis and the Position of p in the DRU Basis

Vector addition defines the relationship between p(a, b, c), and p(a′, b′, c′), as the following set of equations:

Equation Set A

Equation 1 (FIG. 7): a = (X′x)(a′) + (Y′x)(b′) + (Z′x)(c′) + Ox

Equation 2 (FIG. 8): b = (X′y)(a′) + (Y′y)(b′) + (Z′y)(c′) + Oy

Equation 3 (FIG. 9): c = (X′z)(a′) + (Y′z)(b′) + (Z′z)(c′) + Oz

By observing points with known absolute positions, these equations establish the DRU Basis (X′, Y′, Z′), and the position of the DRU 24 in the Absolute Basis (O). This is the essence of Visual Navigation.
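
Equation Set A is the standard rigid-body change of basis: the DRU-basis coordinates are multiplied by the matrix whose columns are X′, Y′ and Z′, then offset by O. A short sketch follows, assuming NumPy arrays; the function name is illustrative.

```python
import numpy as np

def to_absolute(p_dru, X, Y, Z, O):
    """Equation Set A: map a point from DRU Basis coordinates (a', b', c') to
    Absolute Basis coordinates (a, b, c).

    X, Y, Z: the DRU basis unit vectors expressed in the Absolute Basis.
    O:       the position of the DRU in the Absolute Basis."""
    R = np.column_stack([X, Y, Z])            # columns are X', Y', Z'
    return R @ np.asarray(p_dru) + np.asarray(O)
```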

Determining the DRU Basis and the Position of the DRU in the Absolute Basis

Overview of Initial Process: Establishing the DRU basis and the Initial Position of the DRU in the Absolute Basis

When the DRU is first turned on, the value of X′ is set to a unit vector extending along the positive x axis (1,0,0), the value of Y′ is set to a unit vector along the positive y axis (0,1,0), the value of Z′ is set to a unit vector along the positive z axis (0,0,1), and O is set to (0,0,0). Thus, the DRU Basis and the position of the DRU in the Absolute Basis are established.

The first process the DRU will go through is the initial observation of points. At least four points are required (p1-p4) to determine the DRU Basis and the position of the DRU in the Absolute Basis, although the DRU generally observes a minimum of between 10 and 20 points.

As p1-p4 are selected by the DRU, their positions in the Absolute Basis [p1(a1, b1, c1), p2(a2, b2, c2) p3(a3, b3, c3), p4(a4, b4, c4)] are set to the same value of their positions in the DRU Basis [p1(a′1, b′1, c′1), p2(a′2, b′2, c′2), p3(a′3, b′3, c′3), p4(a′4, b′4, c′4)] and stored in the computer.

Overview of Secondary Process: Establishing the Position of the DRU in the Absolute Basis Over Time

The DRU then goes through the process of tracking points, using the following calculations:

Starting with Equation 1 of Equation Set A, four (4) points (p1-p4) are selected to solve for position and orientation. The following equations result:

Equation System 1 (based on Equation 1 using p1-p4):

Equation 5: a1 = (X′x)(a′1) + (Y′x)(b′1) + (Z′x)(c′1) + Ox

Equation 6: a2 = (X′x)(a′2) + (Y′x)(b′2) + (Z′x)(c′2) + Ox

Equation 7: a3 = (X′x)(a′3) + (Y′x)(b′3) + (Z′x)(c′3) + Ox

Equation 8: a4 = (X′x)(a′4) + (Y′x)(b′4) + (Z′x)(c′4) + Ox

To solve this system of equations, the coefficients (a1-a4, a′1-a′4, b′1-b′4, c′1-c′4, and 1) are placed in a matrix, moving the first column to the end:

Matrix 1 (based on Equation 1 using p1-p4):

| a′1  b′1  c′1  1 | a1 |
| a′2  b′2  c′2  1 | a2 |
| a′3  b′3  c′3  1 | a3 |
| a′4  b′4  c′4  1 | a4 |

When reduced by row operations to reduced row-echelon form, this solves for the following:

Result Matrix 1 (based on Equation 1 using p1-p4):

| 1  0  0  0 | X′x |
| 0  1  0  0 | Y′x |
| 0  0  1  0 | Z′x |
| 0  0  0  1 | Ox  |

To solve for the rest of X′, Y′, Z′, and O, the same process is performed using equations (2) and (3), as follows:

Transformation of Matrix 2 into Result Matrix 2 (based on Equation 2 for p1-p4):

| a′1  b′1  c′1  1 | b1 |        | 1  0  0  0 | X′y |
| a′2  b′2  c′2  1 | b2 |   →    | 0  1  0  0 | Y′y |
| a′3  b′3  c′3  1 | b3 |        | 0  0  1  0 | Z′y |
| a′4  b′4  c′4  1 | b4 |        | 0  0  0  1 | Oy  |

Transformation of Matrix 3 into Result Matrix 3 (based on Equation 3 for p1-p4):

| a′1  b′1  c′1  1 | c1 |        | 1  0  0  0 | X′z |
| a′2  b′2  c′2  1 | c2 |   →    | 0  1  0  0 | Y′z |
| a′3  b′3  c′3  1 | c3 |        | 0  0  1  0 | Z′z |
| a′4  b′4  c′4  1 | c4 |        | 0  0  0  1 | Oz  |

As these row operations are all identical, the three matrices can be combined by appending additional augmented columns, yielding the following:

Transformation of Matrix 4 into Result Matrix 4 (based on Equations 1-3 for p1-p4):

| a′1  b′1  c′1  1 | a1  b1  c1 |        | 1  0  0  0 | X′x  X′y  X′z |
| a′2  b′2  c′2  1 | a2  b2  c2 |   →    | 0  1  0  0 | Y′x  Y′y  Y′z |
| a′3  b′3  c′3  1 | a3  b3  c3 |        | 0  0  1  0 | Z′x  Z′y  Z′z |
| a′4  b′4  c′4  1 | a4  b4  c4 |        | 0  0  0  1 | Ox   Oy   Oz  |

Finally, the system is improved by replacing p4 with a point derived from a cross-product of vectors based on p1, p2, and p3. This eliminates the possibility of the points not sufficiently spanning the DRU Basis. By performing the same cross-product on both the (an, bn, cn) points, and the (a′n, b′n, c′n) points, a consistent (a4, b4, c4) and (a′4, b′4, c′4) set are derived to describe the same imaginary point.

When the DRU locates previously observed points, it enters the new observed and the stored absolute positions for each point ((a′n, b′n, c′n) and (an, bn, cn)) into Matrix 4, which will give Result Matrix 4 and the new X′, Y′, Z′, and O of the DRU.

With the position and orientation of the DRU point of reference known, all points can be given an updated absolute position (an, bn, cn) if necessary, using Equation Set A.
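
In code, this solve is a small linear system. The sketch below builds the augmented system of Matrix 4 from three tracked point correspondences, synthesizes the fourth point from a cross product (one plausible reading of the construction described above), and reads the new X′, Y′, Z′ and O out of the solution rows; it is an illustration of the linear algebra, not the patent's actual implementation.

```python
import numpy as np

def solve_dru_pose(rel_pts, abs_pts):
    """Solve Matrix 4 for the DRU basis vectors X', Y', Z' and position O.

    rel_pts: three points (a'n, b'n, c'n) in the DRU Basis, shape (3, 3).
    abs_pts: the same three points (an, bn, cn) in the Absolute Basis, shape (3, 3).
    A synthetic fourth point is built from a cross product of the vectors
    p1->p2 and p1->p3, applied consistently in both bases (assumed construction)."""
    rel = np.asarray(rel_pts, dtype=float)
    ab = np.asarray(abs_pts, dtype=float)
    rel4 = rel[0] + np.cross(rel[1] - rel[0], rel[2] - rel[0])
    ab4 = ab[0] + np.cross(ab[1] - ab[0], ab[2] - ab[0])
    rel = np.vstack([rel, rel4])
    ab = np.vstack([ab, ab4])
    # Matrix 4: coefficient rows (a'n, b'n, c'n, 1); three right-hand-side
    # columns holding an, bn and cn.
    A = np.hstack([rel, np.ones((4, 1))])
    solution = np.linalg.solve(A, ab)                 # 4x4 system, 3 right-hand sides
    X, Y, Z = solution[0], solution[1], solution[2]   # rows of Result Matrix 4
    O = solution[3]                                   # last row is the position
    return X, Y, Z, O
```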

Overview of Tertiary Process: Locating New Points in the DRU Basis

As described in previous sections, as the DRU's field of view changes, it will observe new points pn for which the positions in the DRU Basis are (a′n, b′n, c′n). The system will then plug these new values for (a′n, b′n, c′n) into Equation Set A to determine the new (an, bn, cn). The new absolute values will then be stored in the computer so the points can be used in subsequent iterations. In this way the system can track the movement of the DRU through the Absolute Basis.

FIG. 3 is a flowchart of the program steps comprising visual navigation.

Recapitulating, at Step S1 the system starts upon turn-on and is cleared and initialized, whereupon the program advances to S2 where new points are collected until the desired number of points (n) is obtained. As was described hereinabove, the images are obtained and converted into pixels which are stored.

The stored images are scanned to identify distinctive points.

At S3, if this is the first iteration, the program jumps to S4 to set the point absolute positions. Then, at S5, an attempt is made to locate all the old points in the same area of the field of view in which they were last observed. All old points which cannot be relocated are deleted at S6, at which time the program returns to S2 to collect new points until the total number of points equals n. On the second and subsequent passes through the routine, the program proceeds from S3 (a flag having been set after the first pass) to S7, at which the three (3) points most likely to be stationary are selected. A fourth point, normal to the plane formed by the three (3) points selected at S7, is created at S8. These points are entered into the matrices at S9. The calculated unit position and orientation is transferred to the virtual object generation module at S10 and, when a new unit position is determined at S11, the program returns to S5.
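
For orientation, the skeleton below loosely mirrors the FIG. 3 cycle; the dru object and every method on it are hypothetical placeholders standing in for the steps described above (and for the helpers sketched earlier), not an API defined by the patent.

```python
def visual_navigation_cycle(dru, n_points=15):
    """Loose sketch of the FIG. 3 flow.  All methods on 'dru' are assumed."""
    dru.initialize()                                       # S1: clear and initialize
    first_iteration = True
    while dru.running():
        dru.collect_new_points(until=n_points)             # S2
        if first_iteration:                                # S3
            dru.set_point_absolute_positions()             # S4
            first_iteration = False
        else:
            p1, p2, p3 = dru.select_stationary_points(3)   # S7
            p4 = dru.create_normal_point(p1, p2, p3)       # S8
            pose = dru.solve_matrices(p1, p2, p3, p4)      # S9
            dru.send_to_virtual_object_generator(pose)     # S10, S11
        dru.relocate_old_points()                          # S5
        dru.delete_unmatched_points()                      # S6
```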

Dual Reality Definition

Dual reality is the visual merging of real three-dimensional space (Observed Reality) with Supplemental Reality. Computer generated graphics are superimposed on a live stereoscopic video image in such a manner that any and all movement of the DRU through Observed Reality is reflected simultaneously by its movement in Supplemental Reality, creating an illusion of a single merged reality. Objects in Supplemental Reality can either be stationary, appearing in correct perspective relative to the participant, or can be moved around the field of view by computer, or both.

FIG. 10 shows the DRU 24 in the starting position (OO). The DRU has selected three points (p1, p2, p3) and projects a Virtual Object on Vector O.

FIG. 11 shows the DRU 24 moved to its second position relative to the starting position 24′ shown in FIG. 10. In this position the DRU has located points p3 and p4 and has further selected a new point (p2). The Virtual Object is projected along vectors VO and VO′ in the same position as is shown in FIG. 10.

FIG. 12 shows the DRU moved to a third position relative to the starting position of FIG. 10. At this position, it has located points p2, p3 and p4. The Virtual Object is projected along vector VO and VO′ into the same position as shown in FIGS. 10 and 11.

FIG. 13 shows the DRU in the same position 24″ as in FIG. 12, but wherein the Virtual Object is projected along a new set of vectors (V1 and V1′).

The above effect is achieved through the use of a self-contained headset 24, worn by the participant, independent of any external equipment or navigational aids, and is accomplished solely by the analysis of the stereoscopic video feed received from the unit 24 (i.e. by visual navigation). As an example, if the computer were to generate a Supplemental Reality cube suspended in mid-air with each surface a different color, a participant walking around the cube would be able to observe the different color on each respective side of the computer generated cube.

Environment Mapping

Definition of Environment Mapping

Environment Mapping is a system in which the module that generates the supplemental reality is provided with information concerning the positions and orientations of surfaces existing in Observed Reality. This results in the two (2) realities being able to interact.

Calculation of the Orientation and Location of Surfaces

Calculation of the orientation and location of surfaces is accomplished by obtaining points in the same manner as described under Visual Navigation above.

When an area of the field of view yields a sufficient number of points to reasonably assume they belong to a flat surface in observed reality, unit 24 will create a virtual plane in a corresponding position and orientation, as is shown in FIGS. 14 and 15. The coordinates of the plane are stored in the computer and are not displayed to the participant.
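
The patent does not give the surface-fitting calculation, so the sketch below shows one standard way such a virtual plane could be derived: a least-squares fit in which the smallest singular vector of the centered point cloud serves as the plane normal. The point-count and flatness thresholds are illustrative assumptions.

```python
import numpy as np

def fit_virtual_plane(points, min_points=8, flatness=0.02):
    """Fit a plane to 3-D points believed to lie on one real surface.

    Returns (centroid, unit_normal), or None if the points are too few or
    not flat enough to reasonably assume a single flat surface."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < min_points:
        return None
    centroid = pts.mean(axis=0)
    # the smallest singular vector of the centered cloud is the plane normal
    _, s, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    rms_out_of_plane = s[-1] / np.sqrt(len(pts))
    return (centroid, normal) if rms_out_of_plane < flatness else None
```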

Once the virtual planes have been created, the Supplemental Objects can interact with them (e.g., a computer-generated object can be produced to give the effect of “bouncing off” a series of virtual planes corresponding to actual objects and surfaces), to create the effect of bouncing off “real” walls and objects within a room. Note also FIG. 16.

The process is also used to identify real objects interacting with the computer-generated objects in a dynamic manner. Using points which are identified on the near surface of the real object, unit 24 derives a three-dimensional model of the object and creates a matching virtual object (again only in the computer memory and not shown in the visor) which then interacts with the computer-generated object. For example, if a participant attempted to hit the computer-generated image of a ball with his hand, the unit identifies the points on the observer's hand and, based on the placement of the ball in Supplemental Reality, determines that the two (2) objects overlap and causes an appropriate reaction, wherein the ball could move, disappear or remain in position while the participant's hand appears to go “through” the object.
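
As a concrete illustration of that overlap test, the sketch below checks whether any tracked point on the near surface of a real object (the participant's hand, say) falls inside a spherical supplemental ball; the spherical collision model is an assumed simplification, not a shape prescribed by the patent.

```python
import numpy as np

def hand_hits_ball(hand_points, ball_center, ball_radius):
    """Return True if any observed 3-D point on the real object lies inside
    the supplemental ball, i.e. the two objects overlap in the merged space."""
    pts = np.asarray(hand_points, dtype=float)
    distances = np.linalg.norm(pts - np.asarray(ball_center), axis=1)
    return bool((distances <= ball_radius).any())
```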

The system described hereinabove may be used independently by individuals or may be integrated into a Dual Reality network enabling multiple participants to interact with the Dual Reality as well as with each other.

Recapitulating the Dual Reality operating cycle with reference to FIG. 2, the 3D video input 30, shown in simplified block diagram form as consisting of the video cameras of FIG. 1, obtains the video inputs and provides a 3D video display at output 32, as well as providing information at 34 for collecting and recognizing points in the real image; this is accomplished by way of the memory devices, storing image frames and scanning these frames with an X by Y grid to identify points. These points are provided for environmental mapping purposes at 36 and for calculating position and orientation at 38. The environmental mapping function 36, performed by the computer, is utilized by the virtual object generator 40 to give the generated virtual object the proper position and orientation relative to the real image. The position and orientation data determined at 38 and the virtual object generated at 40 are presented to the virtual object display 42, which presents a display to the 3D video output 32.

A latitude of modification, change and substitution is intended in the foregoing disclosure, and in some instances, some features of the invention will be employed without a corresponding use of other features. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the spirit and scope of the invention herein described.

Claims (2)

What is claimed is:
1. A method for determining in real time a position and orientation of a camera pair based upon an observed environment within said camera pair's field of view comprising the steps of:
(a) obtaining an image of a region observed by each camera of said camera pair;
(b) identifying distinctive points in said image;
(c) determining current three dimensional positions of said distinctive points by comparing information obtained from each camera of said camera pair;
(d) analyzing said three-dimensional positions to determine position and orientation of said camera pair; and
(e) repeating steps (c) and (d) at given intervals;
and further comprising:
discarding points identified during a given interval which are not identified in a subsequent interval.
2. The method of claim 1 further comprising replacing said discarded points in a random manner.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09028319 US6278479B1 (en) 1998-02-24 1998-02-24 Dual reality system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09028319 US6278479B1 (en) 1998-02-24 1998-02-24 Dual reality system
US09847128 US6498618B2 (en) 1998-02-24 2001-05-02 Dual reality system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US09847128 Division US6498618B2 (en) 1998-02-24 2001-05-02 Dual reality system

Publications (1)

Publication Number Publication Date
US6278479B1 true US6278479B1 (en) 2001-08-21

Family

ID=21842784

Family Applications (2)

Application Number Title Priority Date Filing Date
US09028319 Expired - Fee Related US6278479B1 (en) 1998-02-24 1998-02-24 Dual reality system
US09847128 Expired - Fee Related US6498618B2 (en) 1998-02-24 2001-05-02 Dual reality system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US09847128 Expired - Fee Related US6498618B2 (en) 1998-02-24 2001-05-02 Dual reality system

Country Status (1)

Country Link
US (2) US6278479B1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6408257B1 (en) * 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
WO2004012141A2 (en) * 2002-07-26 2004-02-05 Zaxel Systems, Inc. Virtual reality immersion system
US20040104935A1 (en) * 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
US6774869B2 (en) * 2000-12-22 2004-08-10 Board Of Trustees Operating Michigan State University Teleportal face-to-face system
US6898307B1 (en) * 1999-09-22 2005-05-24 Xerox Corporation Object identification method and system for an augmented-reality display
US20070164988A1 (en) * 2006-01-18 2007-07-19 Samsung Electronics Co., Ltd. Augmented reality apparatus and method
WO2009003749A1 (en) * 2007-06-29 2009-01-08 Robert Bosch Gmbh Camera-assisted navigation system and method for operating it
US20090137860A1 (en) * 2005-11-10 2009-05-28 Olivier Lordereau Biomedical Device for Treating by Virtual Immersion
US20130026220A1 (en) * 2011-07-26 2013-01-31 American Power Conversion Corporation Apparatus and method of displaying hardware status using augmented reality
US20130083154A1 (en) * 2011-09-30 2013-04-04 Lg Electronics Inc. Electronic Device And Server, And Methods Of Controlling The Electronic Device And Server

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6677982B1 (en) * 2000-10-11 2004-01-13 Eastman Kodak Company Method for three dimensional spatial panorama formation
US6885939B2 (en) 2002-12-31 2005-04-26 Robert Bosch Gmbh System and method for advanced 3D visualization for mobile navigation units
CN101213833A (en) 2005-06-08 2008-07-02 汤姆逊许可公司 Method, device and system for alternate image/video insertion
US20070248283A1 (en) * 2006-04-21 2007-10-25 Mack Newton E Method and apparatus for a wide area virtual scene preview system
EP2395765B1 (en) 2010-06-14 2016-08-24 Nintendo Co., Ltd. Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method
US9384711B2 (en) 2012-02-15 2016-07-05 Microsoft Technology Licensing, Llc Speculative render ahead and caching in multiple passes
US9177533B2 (en) 2012-05-31 2015-11-03 Microsoft Technology Licensing, Llc Virtual surface compaction
US9230517B2 (en) * 2012-05-31 2016-01-05 Microsoft Technology Licensing, Llc Virtual surface gutters
US9235925B2 (en) * 2012-05-31 2016-01-12 Microsoft Technology Licensing, Llc Virtual surface rendering
US9286122B2 (en) 2012-05-31 2016-03-15 Microsoft Technology Licensing, Llc Display techniques using virtual surface allocation
US9307007B2 (en) 2013-06-14 2016-04-05 Microsoft Technology Licensing, Llc Content pre-render and pre-fetch techniques

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5983161A (en) * 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
US5920337A (en) * 1994-12-27 1999-07-06 Siemens Corporate Research, Inc. Omnidirectional visual image detector and processor
EP0807352A1 (en) * 1995-01-31 1997-11-19 Transcenic, Inc Spatial referenced photography
US5850550A (en) * 1995-08-31 1998-12-15 International Business Machine Corporation No preprocessor and a source level debugger for embedded SQL in a 3GL

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4402053A (en) 1980-09-25 1983-08-30 Board Of Regents For Education For The State Of Rhode Island Estimating workpiece pose using the feature points method
US4396945A (en) * 1981-08-19 1983-08-02 Solid Photography Inc. Method of sensing the position and orientation of elements in space
US4737972A (en) 1982-02-24 1988-04-12 Arnold Schoolman Stereoscopic fluoroscope arrangement
US4573191A (en) 1983-03-31 1986-02-25 Tokyo Shibaura Denki Kabushiki Kaisha Stereoscopic vision system
US4649425A (en) 1983-07-25 1987-03-10 Pund Marvin L Stereoscopic display
US4654872A (en) 1983-07-25 1987-03-31 Omron Tateisi Electronics Co. System for recognizing three-dimensional objects
US4791478A (en) 1984-10-12 1988-12-13 Gec Avionics Limited Position indicating apparatus
US4853771A (en) 1986-07-09 1989-08-01 The United States Of America As Represented By The Secretary Of The Navy Robotic vision system
US4884219A (en) 1987-01-21 1989-11-28 W. Industries Limited Method and apparatus for the perception of computer-generated imagery
US5072218A (en) 1988-02-24 1991-12-10 Spero Robert E Contact-analog headup display method and apparatus
US5259037A (en) 1991-02-07 1993-11-02 Hughes Training, Inc. Automated video imagery database generation using photogrammetry
US5394517A (en) 1991-10-12 1995-02-28 British Aerospace Plc Integrated real and virtual environment display system
US5321416A (en) 1992-07-27 1994-06-14 Virtual Research Systems Head-mounted visual display apparatus
US5320538A (en) 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method
US5495576A (en) 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5311203A (en) 1993-01-29 1994-05-10 Norton M Kent Viewing and display apparatus
US5499306A (en) 1993-03-08 1996-03-12 Nippondenso Co., Ltd. Position-and-attitude recognition method and apparatus by use of image pickup means
US5368309A (en) 1993-05-14 1994-11-29 The Walt Disney Company Method and apparatus for a virtual video game
US5526812A (en) 1993-06-21 1996-06-18 General Electric Company Display system for enhancing visualization of body structures during medical procedures
US5423554A (en) 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5424556A (en) 1993-11-30 1995-06-13 Honeywell Inc. Gradient reflector location sensing system
US5530420A (en) 1993-12-27 1996-06-25 Fuji Jukogyo Kabushiki Kaisha Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof
US5444790A (en) 1994-02-28 1995-08-22 Shure Brothers, Inc. Microphone windscreen mounting
US5488675A (en) 1994-03-31 1996-01-30 David Sarnoff Research Center, Inc. Stabilizing estimate of location of target region inferred from tracked multiple landmark regions of a video image
US5699444A (en) * 1995-03-31 1997-12-16 Synthonics Incorporated Methods and apparatus for using image data to determine camera location and orientation
US5684531A (en) * 1995-04-10 1997-11-04 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Ranging apparatus and method implementing stereo vision system
US5889550A (en) * 1996-06-10 1999-03-30 Adaptive Optics Associates, Inc. Camera tracking system
US5850469A (en) * 1996-07-09 1998-12-15 General Electric Company Real time tracking of camera pose

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Hearn, D., and Baker, M. Pauline, Computer Graphics (Second Edition), pp. 49-53, 1994.

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6408257B1 (en) * 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
US6898307B1 (en) * 1999-09-22 2005-05-24 Xerox Corporation Object identification method and system for an augmented-reality display
US6774869B2 (en) * 2000-12-22 2004-08-10 Board Of Trustees Operating Michigan State University Teleportal face-to-face system
US20050083248A1 (en) * 2000-12-22 2005-04-21 Frank Biocca Mobile face capture and image processing system and method
US20040104935A1 (en) * 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
WO2004012141A2 (en) * 2002-07-26 2004-02-05 Zaxel Systems, Inc. Virtual reality immersion system
WO2004012141A3 (en) * 2002-07-26 2004-05-06 Zaxel Systems Inc Virtual reality immersion system
US7946974B2 (en) 2005-11-10 2011-05-24 Olivier Lordereau Biomedical device for treating by virtual immersion
US20090137860A1 (en) * 2005-11-10 2009-05-28 Olivier Lordereau Biomedical Device for Treating by Virtual Immersion
US7817104B2 (en) * 2006-01-18 2010-10-19 Samsung Electronics Co., Ltd. Augmented reality apparatus and method
US20070164988A1 (en) * 2006-01-18 2007-07-19 Samsung Electronics Co., Ltd. Augmented reality apparatus and method
WO2009003749A1 (en) * 2007-06-29 2009-01-08 Robert Bosch Gmbh Camera-assisted navigation system and method for operating it
US20100235080A1 (en) * 2007-06-29 2010-09-16 Jens Faenger Camera-based navigation system and method for its operation
JP2010532031A (en) 2007-06-29 2010-09-30 Robert Bosch Gmbh Camera-based navigation system and method for its operation
US8649974B2 (en) 2007-06-29 2014-02-11 Robert Bosch Gmbh Camera-based navigation system and method for its operation
US20130026220A1 (en) * 2011-07-26 2013-01-31 American Power Conversion Corporation Apparatus and method of displaying hardware status using augmented reality
US9965564B2 (en) * 2011-07-26 2018-05-08 Schneider Electric It Corporation Apparatus and method of displaying hardware status using augmented reality
US20130083154A1 (en) * 2011-09-30 2013-04-04 Lg Electronics Inc. Electronic Device And Server, And Methods Of Controlling The Electronic Device And Server
US9118804B2 (en) * 2011-09-30 2015-08-25 Lg Electronics Inc. Electronic device and server, and methods of controlling the electronic device and server

Also Published As

Publication number Publication date Type
US6498618B2 (en) 2002-12-24 grant
US20020005891A1 (en) 2002-01-17 application

Similar Documents

Publication Publication Date Title
Caspi et al. Aligning non-overlapping sequences
US6310627B1 (en) Method and system for generating a stereoscopic image of a garment
US8094928B2 (en) Stereo video for gaming
US7693702B1 (en) Visualizing space systems modeling using augmented reality
US7162054B2 (en) Augmented reality technology
US20040105573A1 (en) Augmented virtual environments
Kanade et al. A stereo machine for video-rate dense depth mapping and its new applications
US6100925A (en) Image insertion in video streams using a combination of physical sensors and pattern recognition
US20060202986A1 (en) Virtual clothing modeling apparatus and method
US20050024388A1 (en) Image displaying method and apparatus
US6778171B1 (en) Real world/virtual world correlation system using 3D graphics pipeline
US6611283B1 (en) Method and apparatus for inputting three-dimensional shape information
US20070035562A1 (en) Method and apparatus for image enhancement
Hirota et al. Superior augmented reality registration by integrating landmark tracking and magnetic tracking
US7317812B1 (en) Method and apparatus for robustly tracking objects
US20040104935A1 (en) Virtual reality immersion system
Kim et al. A helmet mounted display for telerobotics
US20130230215A1 (en) Identifying components of a humanoid form in three-dimensional scenes
Rangarajan et al. Establishing motion correspondence
US20070002037A1 (en) Image presentation system, image presentation method, program for causing computer to execute the method, and storage medium storing the program
US7002551B2 (en) Optical see-through augmented reality modified-scale display
US20110187746A1 (en) Image Processing Device, Image Processing Method, and Program
US7003136B1 (en) Plan-view projections of depth image data for object tracking
US20100195867A1 (en) Visual target tracking using model fitting and exemplar
US20030012410A1 (en) Tracking and pose estimation for augmented reality using real features

Legal Events

Date Code Title Description
AS Assignment

Owner name: WILSON, HEWITT & ASSOCIATES INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILSON, PHILLIP C.;MARTIN, ERIK A.;REEL/FRAME:008992/0085

Effective date: 19980223

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
FP Expired due to failure to pay maintenance fee

Effective date: 20090821