WO2002073287A2 - Mixed reality system which reduces measurement errors of viewpoint position and direction of an observer - Google Patents

Mixed reality system which reduces measurement errors of viewpoint position and direction of an observer

Info

Publication number
WO2002073287A2
WO2002073287A2 (application PCT/JP2002/002297)
Authority
WO
WIPO (PCT)
Prior art keywords
transmitter
observer
receiver
position sensor
real space
Prior art date
Application number
PCT/JP2002/002297
Other languages
French (fr)
Other versions
WO2002073287A3 (en)
Inventor
Hiroki Yonezawa
Yasuhiro Okuno
Kenji Morita
Original Assignee
Canon Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha
Publication of WO2002073287A2 publication Critical patent/WO2002073287A2/en
Publication of WO2002073287A3 publication Critical patent/WO2002073287A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 - Head tracking input arrangements
    • G06F 3/013 - Eye tracking input arrangements

Definitions

  • the present invention relates to an augmented or mixed reality system which creates and displays a mixed reality space picture by combining a real space picture received from a video camera or the like and a virtual space picture created by CG or the like, and a head mounted display apparatus which can be used for the mixed reality system.
  • the present invention also relates to a method of determining the position of a transmitter used to measure the position and direction of a predetermined region of an observer in the above mixed reality system.
  • a mixed reality system, to merge a real space with a virtual space, must acquire the 3D coordinates and direction of a real space object by some means.
  • the following methods are used: a method using a position/direction measurement apparatus typified by FASTRAK (trademark) available from Polhemus Inc. in the US, a method of combining information obtained by image processing of a real space picture based on known camera parameters with information obtained by a gyro, and a method of measuring the position and direction of a real space object by using a special multi-eye camera.
  • Fig. 22 shows an example of how a conventional position/direction measurement apparatus, typified by FASTRAK available from Polhemus Inc. in the US, is used.
  • reference numerals 300a and 300b denote observers; 301, an HMD (Head Mounted Display) used by an observer to observe a mixed real space; 302, a position/direction measurement receiver mounted on the HMD 301; and 304, a position/direction measurement transmitter hidden by a wall.
  • the observers 300a and 300b stand to face each other in a set 310. Each observer can freely determine a viewpoint position and line-of-sight direction within a movable area 311.
  • Reference numeral 312 denotes a mixed real space picture observable area.
  • when the observer 300a or 300b sees this mixed real space picture observable area 312, he/she can observe, through the HMD 301, a mixed real space picture obtained by merging a CG picture as a virtual space picture with a real space picture.
  • when the observer does not see the mixed real space picture observable area 312, only a real space picture is displayed on the HMD 301.
  • a virtual space picture is displayed at a position determined on the basis of the viewpoint position and line-of-sight direction of the observer. It is therefore important to reduce measurement errors of the viewpoint position and line-of-sight direction when the observer sees the mixed real space picture observable area 312.
  • if the receiver 302 is fixed to the left side of the HMD 301 to measure the viewpoint position and line-of-sight direction of an observer, the distance between the receiver 302 and the transmitter 304 becomes long. This may result in an increase in measurement errors.
  • the present invention has been made in consideration of the above problem in the prior art, and has as its object to reduce positional shifts between a real space picture and a virtual space picture with an inexpensive arrangement.
  • a mixed reality system which creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized by comprising a transmitter, a first receiver which receives a signal generated by said transmitter to measure a viewpoint position and direction of an observer, and a second receiver which receives a signal generated by said transmitter to measure a position and direction of another body region of the observer, wherein said transmitter is positioned such that a distance between said transmitter and said first receiver becomes shorter than a distance between said transmitter and said second receiver.
  • a head mounted display apparatus which is used in a mixed reality system that creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, and which can mount a receiver used to measure a viewpoint position and direction of an observer to be mounted, characterized in that a plurality of mount portions for detachably mounting said receiver are formed.
  • a method of positioning a transmitter in a mixed reality system which includes the transmitter, a first receiver which receives a signal generated by the transmitter to measure a viewpoint position and direction of an observer, and a second receiver which receives a signal generated by the transmitter to measure a position and direction of another body region of the observer, and creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized in that said transmitter is positioned such that a distance between the transmitter and the first receiver becomes shorter than a distance between the transmitter and the second receiver.
  • Fig. 1 is a view showing the schematic arrangement of a mixed reality system to which the first embodiment of the present invention is applied;
  • Fig. 2 is a view showing the arrangement of an HMD
  • Fig. 3 is a flow chart showing the processing of creating a mixed real space picture
  • Figs. 4A and 4B are views showing a real space picture example and a virtual space picture example;
  • Fig. 5 is a view showing an ideal (with no error in position sensor measurement values) mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B;
  • Fig. 6 is a view showing a mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B in a case where an error exists only in an eye position sensor measurement value;
  • Fig. 7 is a view showing a mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B in a case where an error exists only in a hand position sensor measurement value;
  • Fig. 8 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the first embodiment of the present invention;
  • Fig. 9 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the second embodiment of the present invention.
  • Fig. 10 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the third embodiment of the present invention.
  • Fig. 11 is a view showing the schematic arrangement of a mixed reality system to which the third embodiment of the present invention is applied;
  • Fig. 12 is a view showing the schematic arrangement of a mixed reality system to which the fourth embodiment of the present invention is applied;
  • Fig. 13 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the fourth embodiment of the present invention.
  • Fig. 14 is a view showing the schematic arrangement of a mixed reality system to which the fifth embodiment of the present invention is applied;
  • Fig. 15 is a view for explaining the distance relationship between position sensor transmitters and position sensors in the fifth embodiment of the present invention.
  • Fig. 16 is a view showing the schematic arrangement of a mixed reality system to which the sixth embodiment of the present invention is applied;
  • Fig. 17 is a perspective view showing a case where the HMD according to the sixth and seventh embodiments is viewed from the direction of a photographing unit;
  • Fig. 18 is a perspective view showing a case where the HMD according to the sixth and seventh embodiments is viewed from the direction of a display unit;
  • Fig. 19 is a block diagram showing a hardware arrangement corresponding to one observer in the system shown in Fig. 16;
  • Fig. 20 is a view for explaining the mount position of a position sensor (receiver) in the sixth embodiment of the present invention.
  • Fig. 21 is a view for explaining the mount position of a position sensor (receiver) in the seventh embodiment of the present invention.
  • Fig. 22 is a view for explaining an example of a problem in the prior art.
  • Fig. 1 is a view showing the schematic arrangement of a mixed reality system to which the first embodiment of the present invention is applied.
  • An observer 100 is wearing an HMD (Head Mounted Display) 110 on the head and a glove 120 on his/her hand.
  • the HMD 110 is comprised of a video camera 111, LCD 112, position/direction measurement apparatus receiver (eye position sensor) 113, and optical prisms 114 and 115.
  • the video camera 111 photographs a real space picture at/in the viewpoint position and line-of-sight direction of the observer through the optical prism 115.
  • the eye position sensor 113 is used to measure the viewpoint position and line-of-sight direction of an observer.
  • the LCD 112 displays a mixed real space picture. This picture is guided to the pupil of the observer through the optical prism 114.
  • the glove 120 incorporates a hand position sensor 121 and speaker 122 (not shown).
  • the hand position sensor 121 is used as a sensor for measuring the position and direction of the hand of the observer.
  • the speaker 122 generates a sound corresponding to an event that has occurred at the position of the hand. This sound may be a sound generated when the user touches or hits a virtual space object with his/her hand, a sound generated when the condition of a virtual space object displayed in synchronism with the position of the hand changes, or the like.
  • Reference numeral 130 denotes a position/direction measurement apparatus transmitter (position sensor transmitter); and 131, a position/direction measurement apparatus body (position sensor body).
  • the eye position sensor 113, hand position sensor 121, and position sensor transmitter 130 are connected to the position sensor body 131. Magnetism is transmitted from the position sensor transmitter 130. This magnetism is received by the eye position sensor 113 and hand position sensor 121.
  • the position sensor body 131 calculates the position and direction of the eye and hand on the basis of the reception intensity signals from the respective position sensors.
  • As this position/direction measurement apparatus, FASTRAK available from Polhemus Inc. in the US or the like can be used.
  • Reference numeral 140 denotes a processing apparatus for creating a mixed real space picture for one observer and displaying it on the HMD 110.
  • This processing apparatus 140 is comprised of, for example, a personal computer, a video capture card, a video card having a CG function, a sound card, and the like.
  • the HMD 110, speaker 122, and position sensor body 131 are connected to the processing apparatus 140.
  • in step S301, the processing apparatus 140 receives a viewpoint position, a line-of-sight direction, and the position and direction of the hand transmitted from the position sensor body 131.
  • the operation in step S301 uses thread S311, which periodically receives the viewpoint position, the line-of-sight direction, and the position and direction of the hand transmitted from the position sensor body 131.
  • the virtual space time is then updated, and the state of the virtual space (the type, position, and condition of a virtual space object) is updated (step S302).
  • if there is a virtual space object whose position and direction change in synchronism with the position and direction of a real space object, the state of such an object is also updated. If, for example, the user is made to see that he/she is always wearing a glove as a virtual space object on his/her hand, the position and direction of the glove are updated in accordance with changes in the position and direction of the hand in step S302.
  • if it is determined that a predefined event has occurred, the state of the virtual space is updated in accordance with the event (step S303).
  • the occurrence of an event corresponds, for example, to a case where it is determined that the user touches the virtual space object with his/her hand. Updating the state of a virtual space in accordance with this event may be equivalent to changing, for example, the picture of the touched virtual space object into a picture of an explosion.
  • a real space picture at/in the viewpoint position and line-of-sight direction of the observer which is obtained from the video camera 111 is received (step S304).
  • the operation in step S304 uses thread S314, which periodically acquires the real space picture obtained from the video camera 111 through a video capture card.
  • a virtual space picture at/in the viewpoint position and line-of-sight direction of the observer, which are obtained in step S301, is created in accordance with the state of the virtual space updated in steps S302 and S303 (step S305).
  • finally, the virtual space picture created in step S305 is combined with the real space picture received in step S304, and the resultant picture is output to the LCD 112 of the HMD 110 (step S306).
  • the above processing is repeatedly executed until some terminating operation is performed (step S307).
  • a method unique to this embodiment will be described next, which is used to reduce the differences in position and direction between a virtual space object and a real space object in a mixed real space picture observed by each observer.
  • the position/direction measurement apparatus can properly measure the viewpoint position and line-of-sight direction of an observer and the position and direction of the hand, there are no differences in position and direction between the virtual space object and the real space object.
  • the position/direction measurement apparatus uses magnetism or ultrasonic waves, and hence is very susceptible to disturbances and has a small measurable range.
  • the actual position/direction measurement apparatus can measure with effective accuracy only within a range of about 80 cm from the position sensor transmitter.
  • the principal object of the first embodiment is therefore to improve the measurement accuracy of the eye position sensor 113.
  • the position sensor transmitter 130 is placed such that the distance between the eye position sensor 113 and the position sensor transmitter 130 becomes shorter than the distances between the position sensor transmitter 130 and the remaining sensors connected to the position sensor body 131 at the positions and directions at/in which the respective position sensors are used with high frequencies, as shown in Fig. 8, thereby improving the measurement accuracy of the viewpoint position and line-of-sight direction as compared with the measurement accuracy of the position and direction of the remaining position sensors.
  • This makes it possible to make positional shifts with respect to all the virtual space objects less noticeable.
  • the position of the position sensor transmitter 130 cannot be specified unconditionally; it should be chosen flexibly depending on the type of mixed reality picture the mixed reality system displays and the position and direction viewed by an observer.
  • mixed reality can be further augmented by setting a situation in which the observers can observe each other. In general, therefore, the direction in which each observer observes with a high frequency becomes the horizontal direction. In such a situation, placing the position sensor transmitter 130 above the head of the observer as shown in Fig. 8 makes it possible to have the merits of this embodiment and prevent a picture of the transmitter, which should not be combined as a real space object in the mixed real space, from being combined. In addition, this allows the observer to freely move in all directions. Furthermore, since measurement errors caused when the observer moves become equal to each other in all directions, management is facilitated. As described above, a mixed real space picture without any unnecessary real object, which has high quality as a whole, can be observed while the freedom of movement is ensured.
  • the second embodiment exemplifies a case where a position sensor transmitter 130 is placed such that the distance from a hand position sensor 121 is shortened, as well as the distance from an eye position sensor 113, at/in the viewpoint position and line-of-sight direction at/in which the hand is observed with a high frequency and at/in the hand position and direction, as shown in Fig. 9.
  • in a situation where an observer frequently observes the hand, the position sensor transmitter 130 is so placed as to shorten the distance between the eye position sensor 113 and the position sensor transmitter 130 as well as the distance between the hand position sensor 121 and the position sensor transmitter 130. This makes it possible to reduce measurement errors in both the eye position sensor 113 and the hand position sensor 121. As compared with the first embodiment, a virtual space object that moves in synchronism with the hand can be observed while the object is localized on the hand with a small error.
  • the posture of an observer in which the hand is frequently observed depends on the contents of a mixed real space picture which the mixed reality system is to present to the observer. For this reason, the position sensor transmitter 130 must be adjusted such that the distance from the hand position sensor 121 becomes almost equal to the distance from the eye position sensor 113 when the hand is frequently observed.
  • This embodiment has exemplified the case where the position of a virtual space object is changed in synchronism with the hand.
  • the present invention is not limited to this, and can also be applied to a case where the position of a virtual space object is changed in synchronism with another body region such as a foot.
  • Fig. 11 is a view showing the schematic arrangement of a mixed reality system to which the third embodiment is applied.
  • This system is based on the assumption that two observers simultaneously observe the same mixed real space.
  • This system includes two systems each identical to the one shown in Fig. 1.
  • a position sensor transmitter 130 and position sensor body 131 are shared by the two observers to realize a reduction in the total cost of the mixed real space system.
  • the position sensor body 131 is connected to only one processing apparatus 140 and is not connected to the other processing apparatus 140. For this reason, the processing apparatuses 140 for the respective observers are connected to each other through a network 150 to allow the processing apparatus 140 to which the position sensor body 131 is not connected to create a mixed real space picture.
  • the position sensor transmitter 130 is so placed as to almost match the distance (a or a') between an eye position sensor 113 for each observer and the position sensor transmitter 130 with the distance (b or b') between a hand position sensor 121 and the position sensor transmitter 130.
  • the position sensor transmitter 130 is so placed as to minimize the respective distances (a, a', b, and b').
  • each observer can observe a virtual space object that moves in synchronism with a hand with a small positional shift on the hand.
  • positional shifts can be made uniform with respect to all observers by placing the position sensor transmitter 130 such that the distance (a) between the eye position sensor 113 for a given observer and the position sensor transmitter 130 becomes almost equal to the corresponding distance (a') for the other observer.
  • The procedure executed by each processing apparatus 140 is almost the same as that in the first embodiment except that the processing apparatus 140 to which the position sensor body 131 is connected controls the position sensor body 131 through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire measurement values. This processing apparatus 140 then accesses the network 150 to distribute the measurement values obtained by the respective position sensors to the other processing apparatus.
  • the processing apparatus 140 to which the position sensor body 131 is not connected accesses the network through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire the measurement values obtained by the respective position sensors which are sent from the processing apparatus 140 to which the position sensor body 131 is connected.
  • the fourth embodiment extends the first embodiment so that each position sensor has its own position sensor transmitter and position sensor body.
  • two components, i.e., a position sensor transmitter and a position sensor body, are required for each position sensor, and hence the cost increases accordingly.
  • a viewpoint position, a line-of-sight direction, and the position and direction of the hand can be measured with high accuracy.
  • Fig. 12 is a view showing the schematic arrangement of a mixed reality system to which the fourth embodiment is applied.
  • This system additionally has a hand position sensor transmitter 160 and position sensor body 161 as compared with the arrangement of the first embodiment shown in Fig. 1.
  • the hand position sensor body 161 is connected to a processing apparatus 140, and the hand position sensor transmitter 160 is connected to the hand position sensor body 161.
  • a hand position sensor 121 is designed to receive magnetism from the hand position sensor transmitter 160.
  • a position sensor transmitter 130 and position sensor body 131 function as dedicated devices for the eyes.
  • the hand position sensor transmitter 160 is so placed as to shorten the distance between the hand position sensor 121 and the hand position sensor transmitter 160 at/in the hand position and direction where an observer observes the hand with a high frequency.
  • the eye position sensor transmitter 130 is so placed as to shorten the distance between an eye position sensor 113 and the eye position sensor transmitter 130.
  • a position/direction acquisition thread controls the plurality of position sensor bodies 131 and 161 to acquire measurement values from the respective sensor bodies.
  • the fourth embodiment is especially effective in preventing an observer from feeling a positional shift when a translucent virtual space object like seawater or a virtual space picture almost equal in size to a hand (real space picture) is to be expressed as if it were always on the hand of the observer.
  • if a virtual space object is not translucent, the hand is completely hidden by the virtual space object. Even if, therefore, a slight positional shift occurs, the resultant mixed real space picture does not look strange to the observer.
  • this embodiment can prevent the observer from observing positional shifts between the virtual space object and the real space object.
  • the quality of a mixed real space picture can be improved by placing the eye position sensor transmitter 130 above the head of the observer, as shown in Fig. 13, as in the first embodiment.
  • the hand position sensor transmitter 160 may be placed below the hand position at/in the hand position and direction where the observer uses the system with a high frequency. With this arrangement, the hand position sensor transmitter 160 does not interfere with the field of view of the observer in the horizontal direction, thereby presenting a mixed real space picture with higher quality.
  • Fig. 14 is a view showing the arrangement of a mixed reality system to which the fifth embodiment is applied.
  • position sensor transmitters 130 and 160 and position sensor bodies 131 and 161 are respectively provided for an eye position sensor 113 and hand position sensor 121 to allow high-accuracy measurement of a viewpoint position, a line-of-sight direction, and the position and direction of the hand.
  • the position sensor transmitters 130 and 160 and position sensor bodies 131 and 161 are shared by two observers to attain a reduction in cost.
  • the two position sensor bodies 131 and 161 are connected to only one processing apparatus 140 and are not connected to the other processing apparatus 140.
  • the processing apparatuses 140 for the respective observers are therefore connected to each other through a network 150 to allow the processing apparatus 140 to which the position sensor bodies 131 and 161 are not connected to create a mixed real space picture.
  • the distance between the eye position sensor 113 for each observer and the eye position sensor transmitter 130 and the distance between the hand position sensor 121 and the hand position sensor transmitter 160 are minimized at a position where each observer makes observations with a high frequency.
  • the eye position sensor transmitter 130 and hand position sensor transmitter 160 are placed such that the distances between the eye position sensors 113 for the respective observers and the position sensor transmitter 130 become almost equal to each other, and the distances between the hand position sensors 121 for the respective observers and the hand position sensor transmitter 160 become almost equal to each other.
  • The procedure executed by each processing apparatus 140 is the same as that in the first embodiment except that the processing apparatus 140 to which the position sensor bodies 131 and 161 are connected controls the position sensor bodies 131 and 161 through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire measurement values. The apparatus then accesses the network 150 to distribute the measurement values obtained by the respective position sensors to the other processing apparatus.
  • the position sensor transmitters 130 and 160 do not interfere with the field of view of the observer in the horizontal direction. This allows the observer to observe a high-quality mixed real space picture in the direction in which a mixed real space is observed with a high frequency.
  • Fig. 16 is a view showing the schematic arrangement of a mixed reality system to which the sixth embodiment is applied.
  • reference numerals 300a and 300b denote observers who observe mixed real space pictures created in this system. Each of the observers 300a and 300b wears an HMD 301 on which a receiver 302 is mounted. Each observer can observe a mixed real space corresponding to position/direction measurement values through the HMD 301.
  • Reference numeral 306 denotes a position/direction measurement apparatus. A transmitter 304 and receiver 302 are connected to this position/direction measurement apparatus 306. The magnetism generated from the transmitter 304 is received by the receiver 302. The position/direction measurement apparatus 306 then measures the viewpoint position and line-of-sight direction of the observer from the strength of the magnetism.
  • the position/direction measurement apparatus 306 is connected to a processing apparatus 307, which is thus always notified of the viewpoint position and line-of-sight direction of the observer by the position/direction measurement apparatus 306.
  • the processing apparatus 307 creates a mixed real space picture corresponding to the viewpoint position and line-of-sight direction of each observer on the basis of this information, and displays it on the HMD 301.
  • the processing apparatuses 307 for the respective observers are connected to each other through a network 330.
  • the respective processing apparatuses 307 share the viewpoint positions and line-of-sight directions of the respective observers and the position and direction of a virtual space object by using the network 330. This allows a given observer to display a virtual space object at the position of the other observer.
  • Figs. 17 and 18 are perspective views of the HMD 301 to which the sixth embodiment is applied.
  • Fig. 17 is a perspective view from the direction of a photographing unit.
  • Fig. 18 is a perspective view from the direction of a display unit.
  • Reference numeral 201 denotes an HMD display unit.
  • This HMD display unit 201 includes two units, i.e., a right-eye display 201R and left-eye display 201L, each of which has a color liquid crystal and prism. A mixed real space picture corresponding to the viewpoint position and line-of-sight direction of an observer is displayed on each display unit.
  • Reference numerals 204 to 208 denote constituent members for head mounting.
  • the observer wears it while the length adjusting portion 206 is loosened by the adjuster 205.
  • the length adjusting portion 206 may be fastened by the adjuster 205 to bring the temple bridges 204 and occiput pad 207 into tight contact with the temple and occipital portion of the observer, respectively.
  • Reference numeral 203 denotes an HMD photographing unit for photographing a real space picture at/in the viewpoint position and line-of-sight direction of an observer.
  • This HMD photographing unit 203 includes two units, i.e., a right-eye photographing unit 203R and left-eye photographing unit 203L, each of which is formed by an NTSC compact video camera.
  • the photographed real space picture is superimposed on a virtual space picture to create a mixed real space picture.
  • the receiver 302 is used to receive magnetism generated by the transmitter 304 as information for the measurement of the viewpoint position and line-of-sight direction of the observer.
  • three portions, i.e., receiver joints 200R, 200L, and 200C, are formed.
  • the receiver 302 can be detachably mounted on an arbitrary one of these receiver joints 200R, 200L, and 200C. Referring to Figs. 17 and 18, the receiver 302 is mounted on the receiver joint 200R on the right side in the direction of the line of sight of the observer.
  • the receiver 302 can also be mounted on the receiver joint 200L on the left side in the direction of the line of sight of the observer or the receiver joint 200C on the median line of the observer.
  • the receiver joints 200R, 200L, and 200C are designed to have receptacles in which the receiver 302 is fitted to be fixed.
  • other detachable joint (mount) schemes may be used.
  • Reference numeral 210 denotes a receiver signal line, which is exposed outside the HMD 301 at a position near the receiver joint 200C. This receiver signal line 210 is long enough to allow the receiver 302 to be mounted on either of the receiver joints 200R, 200L, and 200C.
  • Reference numeral 209 denotes a tied linear member obtained by binding various kinds of lines, e.g., signal lines and power feed lines to the HMD photographing unit 203 and the like and the above receiver signal line 210.
  • the tied linear member 209 is attached to the occipital portion mount portion 207.
  • Signal lines and power feed lines to the right and left displays 201R and 201L, HMD photographing units 203R and 203L, and the like in the tied linear member 209 pass through the right and left temple mount portions 204.
  • Fig. 19 is a block diagram showing a hardware arrangement corresponding to one observer in the system shown in Fig. 16.
  • the processing apparatus 307 incorporates a right-eye video capture board 350, left-eye video capture board 351, right-eye graphic board 352, left-eye graphic board 353, I/O interface 354, and network interface 359. These constituent elements are connected to a CPU 356, HDD 355, and memory 357.
  • the left- and right-eye video capture boards 351 and 350 are respectively connected to the left- and right-eye video cameras 203L and 203R, and convert pictures actually photographed by these video cameras 203L and 203R into a form that the processing apparatus 307 can process and combine with a virtual space picture.
  • the left- and right-eye graphic boards 353 and 352 are respectively connected to the left- and right-eye display units (devices) 201L and 201R to perform display control on the left- and right-eye display units 201L and 201R.
  • the I/O interface 354 is connected to the position/direction measurement apparatus 306.
  • the network interface 359 is connected to the network 330. Note that each of the processing apparatuses 140 according to the first to fifth embodiments is constituted by the same constituent elements as those of the processing apparatus 307 according to this embodiment.
  • the video capture boards 350 and 351 convert video signals from the video cameras 203R and 203L into digital signals and continuously store them in the memory 357.
  • the viewpoint position and line-of-sight direction of the observer are kept transmitted from the position/direction measurement apparatus 306 to the processing apparatus 307 through the I/O interface 354.
  • the processing apparatus 307 updates the time in a virtual space on the basis of a program stored in the memory 357 and calculates the position and direction of the virtual space object again.
  • a virtual space picture can also be created in consideration of the positional relationship between the real space object and the virtual space object.
  • a virtual space object can be seen as if it were hidden behind a real wall.
  • Fig. 20 shows a state where the receiver 302 is mounted on the right receiver joint 200R of the HMD 301 for the observer 300a, and the receiver 302 is mounted on the left receiver joint 200L of the HMD 301 for the observer 300b, in the same situation as that shown in Fig. 22, which illustrates the problem in the prior art.
  • the transmitters 304 for measuring the viewpoint positions and line-of-sight directions of the observers 300a and 300b are placed at the positions in Fig. 20.
  • the distances between the transmitters 304 and the receivers 302 used by the observers 300a and 300b are shortened when both the observers 300a and 300b set their viewpoint positions in the movable areas 311 and set their line-of-sight directions toward the mixed real space picture observable area 312.
  • position/direction measurement errors can be reduced in a situation where the observers 300a and 300b make observations with high frequencies.
  • the seventh embodiment is applied to a case where transmitters 304 are placed in front of observers 300a and 300b, as shown in Fig. 21.
  • receivers 302 are mounted on the central receiver joints 200C of the HMDs 301 to minimize the distances between the receivers 302 on the HMDs 301 and the transmitters 304 (a sketch of this joint-selection rule follows this list).
  • position/direction measurement errors can be reduced in a situation where the observers 300a and 300b make observations with high frequencies.
  • position/direction measurement errors associated with the two observers 300a and 300b can be made uniform.
  • positional shifts between a real space picture and a virtual space picture can be reduced at no additional cost, i.e., by adjusting the mount position of the transmitter for position/direction measurement or the mount position of the receiver on each HMD.
  • the present invention is not limited to the above embodiments.
  • as the transmitters and receivers of the position/direction measurement apparatus, devices based on an ultrasonic scheme instead of a magnetic scheme can be used.
  • the respective embodiments described above can be properly combined with each other.
  • the present invention can be applied to a system comprising either a plurality of units or a single unit. Needless to say, the present invention can also be attained by supplying a program which executes the process defined by the present system or invention.
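To make the mount-position choice of the sixth and seventh embodiments concrete, the sketch below picks whichever receiver joint on the HMD 301 minimizes the receiver-to-transmitter distance at the viewpoint pose the observer uses most frequently. This is a minimal illustration under assumed coordinates, not a procedure given in the patent; `joint_positions` and the sample numbers are hypothetical.

```python
import math

JOINTS = ("200L", "200C", "200R")  # receiver joints formed on the HMD 301

def choose_receiver_joint(joint_positions: dict, transmitter: tuple) -> str:
    """Return the joint closest to the transmitter 304.

    joint_positions maps each joint name to an assumed (x, y, z) world
    position of that joint with the observer at his/her most frequent
    viewpoint pose; transmitter is the fixed position of the transmitter 304.
    """
    return min(JOINTS, key=lambda j: math.dist(joint_positions[j], transmitter))

# A transmitter to the observer's right favors joint 200R (sixth embodiment);
# a transmitter straight ahead favors the central joint 200C (seventh embodiment).
positions = {"200L": (-0.1, 1.7, 0.0), "200C": (0.0, 1.7, 0.1), "200R": (0.1, 1.7, 0.0)}
print(choose_receiver_joint(positions, (1.0, 1.7, 0.0)))  # -> 200R
```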

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

A mixed reality system which reduces positional shifts between a real space picture and a virtual space picture with an inexpensive arrangement is disclosed. The mixed reality system according to this invention detects the viewpoint position and direction of an observer and the position and direction of another region of the observer by using reception results on signals output from a transmitter. In this mixed reality system, the transmitter is positioned such that the distance between the transmitter and a receiver used for the measurement of the viewpoint position and direction of the observer becomes shorter than the distance between the transmitter and a receiver used for the measurement of the position and direction of another region of the observer. This makes it possible to reduce measurement errors of the viewpoint position and direction, which have the dominant influence on positional shifts.

Description

DESCRIPTION
MIXED REALITY SYSTEM WHICH REDUCES MEASUREMENT ERRORS OF VIEWPOINT POSITION AND DIRECTION OF OBSERVER
TECHNICAL FIELD
The present invention relates to an augmented or mixed reality system which creates and displays a mixed reality space picture by combining a real space picture received from a video camera or the like and a virtual space picture created by CG or the like, and a head mounted display apparatus which can be used for the mixed reality system.
The present invention also relates to a method of determining the position of a transmitter used to measure the position and direction of a predetermined region of an observer in the above mixed reality system.
BACKGROUND ART
In a mixed reality system, to merge a real space with a virtual space, the 3D coordinates and direction of a real space object must be acquired by some means. As methods of acquiring these pieces of information, for example, the following methods are used: a method using a position/direction measurement apparatus typified by FASTRAK (trademark) available from Polhemus Inc. in the US, a method of combining information obtained by image processing of a real space picture based on known camera parameters with information obtained by a gyro, and a method of measuring the position and direction of a real space object by using a special multi-eye camera.
Apparatuses used in these methods are expensive, and besides, have measurement errors. To reduce such measurement errors, an even more expensive measurement apparatus is required, which is one of the factors that hinder a reduction in the cost of a mixed reality system. Fig. 22 shows an example of how a conventional position/direction measurement apparatus, typified by FASTRAK available from Polhemus Inc. in the US, is used.
Referring to Fig. 22, reference numerals 300a and 300b denote observers; 301, an HMD (Head Mounted Display) used by an observer to observe a mixed real space; 302, a position/direction measurement receiver mounted on the HMD 301; and 304, a position/direction measurement transmitter hidden by a wall. The observers 300a and 300b stand to face each other in a set 310. Each observer can freely determine a viewpoint position and line-of-sight direction within a movable area 311.
Reference numeral 312 denotes a mixed real space picture observable area. When the observer 300a or 300b sees this mixed real space picture observable area 312, he/she can observe, through the HMD 301, a mixed real space picture obtained by merging a CG picture as a virtual space picture with a real space picture. When the observer does not see the mixed real space picture observable area 312, only a real space picture is displayed on the HMD 301. A virtual space picture is displayed at a position determined on the basis of the viewpoint position and line-of-sight direction of the observer. It is therefore important to reduce measurement errors of the viewpoint position and line-of-sight direction when the observer sees the mixed real space picture observable area 312.
As shown in Fig. 22, if, however, the receiver 302 is fixed to the left side of the HMD 301 to measure the viewpoint position and line-of-sight direction of an observer, the distance between the receiver 302 and the transmitter 304 becomes long. This may result in an increase in measurement errors.
DISCLOSURE OF INVENTION
The present invention has been made in consideration of the above problem in the prior art, and has as its object to reduce positional shifts between a real space picture and a virtual space picture with an inexpensive arrangement.
According to the first aspect of the present invention, a mixed reality system which creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized by comprising a transmitter, a first receiver which receives a signal generated by said transmitter to measure a viewpoint position and direction of an observer, and a second receiver which receives a signal generated by said transmitter to measure a position and direction of another body region of the observer, wherein said transmitter is positioned such that a distance between said transmitter and said first receiver becomes shorter than a distance between said transmitter and said second receiver.
According to the second aspect of the present invention, a head mounted display apparatus which is used in a mixed reality system that creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, and which can mount a receiver used to measure a viewpoint position and direction of an observer to be mounted, characterized in that a plurality of mount portions for detachably mounting said receiver are formed.
According to the third aspect of the present invention, a method of positioning a transmitter in a mixed reality system which includes the transmitter, a first receiver which receives a signal generated by the transmitter to measure a viewpoint position and direction of an observer, and a second receiver which receives a signal generated by the transmitter to measure a position and direction of another body region of the observer, and creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized in that said transmitter is positioned such that a distance between the transmitter and the first receiver becomes shorter than a distance between the transmitter and the second receiver.
BRIEF DESCRIPTION OF DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a view showing the schematic arrangement of a mixed reality system to which the first embodiment of the present invention is applied;
Fig. 2 is a view showing the arrangement of an HMD; Fig. 3 is a flow chart showing the processing of creating a mixed real space picture;
Figs. 4A and 4B are views showing a real space picture example and a virtual space picture example;
Fig. 5 is a view showing an ideal (with no error in position sensor measurement values) mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B;
Fig. 6 is a view showing a mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B in a case where an error exists only in an eye position sensor measurement value;
Fig. 7 is a view showing a mixed real space picture corresponding to the real space picture example and virtual space picture example in Figs. 4A and 4B in a case where an error exists only in a hand position sensor measurement value; Fig. 8 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the first embodiment of the present invention;
Fig. 9 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the second embodiment of the present invention;
Fig. 10 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the third embodiment of the present invention;
Fig. 11 is a view showing the schematic arrangement of a mixed reality system to which the third embodiment of the present invention is applied; Fig. 12 is a view showing the schematic arrangement of a mixed reality system to which the fourth embodiment of the present invention is applied;
Fig. 13 is a view for explaining the distance relationship between a position sensor transmitter and a position sensor in the fourth embodiment of the present invention;
Fig. 14 is a view showing the schematic arrangement of a mixed reality system to which the fifth embodiment of the present invention is applied;
Fig. 15 is a view for explaining the distance relationship between position sensor transmitters and position sensors in the fifth embodiment of the present invention;
Fig. 16 is a view showing the schematic arrangement of a mixed reality system to which the sixth embodiment of the present invention is applied; Fig. 17 is a perspective view showing a case where the HMD according to the sixth and seventh embodiments is viewed from the direction of a photographing unit;
Fig. 18 is a perspective view showing a case where the HMD according to the sixth and seventh embodiments is viewed from the direction of a display unit;
Fig. 19 is a block diagram showing a hardware arrangement corresponding to one observer in the system shown in Fig. 16;
Fig. 20 is a view for explaining the mount position of a position sensor (receiver) in the sixth embodiment of the present invention;
Fig. 21 is a view for explaining the mount position of a position sensor (receiver) in the seventh embodiment of the present invention; and Fig. 22 is a view for explaining an example of a problem in the prior art.
BEST MODE OF CARRYING OUT THE INVENTION
Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
[First Embodiment]
Fig. 1 is a view showing the schematic arrangement of a mixed reality system to which the first embodiment of the present invention is applied. An observer 100 is wearing an HMD (Head Mounted Display) 110 on the head and a glove 120 on his/her hand.
As shown in Fig. 2, the HMD 110 is comprised of a video camera 111, LCD 112, position/direction measurement apparatus receiver (eye position sensor) 113, and optical prisms 114 and 115. The video camera 111 photographs a real space picture at/in the viewpoint position and line-of-sight direction of the observer through the optical prism 115. The eye position sensor 113 is used to measure the viewpoint position and line-of-sight direction of an observer. The LCD 112 displays a mixed real space picture. This picture is guided to the pupil of the observer through the optical prism 114.
The glove 120 incorporates a hand position sensor 121 and speaker 122 (not shown). The hand position sensor 121 is used as a sensor for measuring the position and direction of the hand of the observer. The speaker 122 generates a sound corresponding to an event that has occurred at the position of the hand. This sound may be a sound generated when the user touches or hits a virtual space object with his/her hand, a sound generated when the condition of a virtual space object displayed in synchronism with the position of the hand changes, or the like.
Reference numeral 130 denotes a position/direction measurement apparatus transmitter (position sensor transmitter); and 131, a position/direction measurement apparatus body (position sensor body). The eye position sensor 113, hand position sensor 121, and position sensor transmitter 130 are connected to the position sensor body 131. Magnetism is transmitted from the position sensor transmitter 130. This magnetism is received by the eye position sensor 113 and hand position sensor 121. The position sensor body 131 calculates the position and direction of the eye and hand on the basis of the reception intensity signals from the respective position sensors. As this position/direction measurement apparatus, FASTRAK available from Polhemus Inc. in the US or the like can be used.
Reference numeral 140 denotes a processing apparatus for creating a mixed real space picture for one observer and displaying it on the HMD 110. This processing apparatus 140 is comprised of, for example, a personal computer, a video capture card, a video card having a CG function, a sound card, and the like. The HMD 110, speaker 122, and position sensor body 131 are connected to the processing apparatus 140.
A procedure for creating a mixed real space picture in the processing apparatus 140 will be described next with reference to the flow chart of Fig. 3. First of all, the processing apparatus 140 receives a viewpoint position, a line-of-sight direction, and the position and direction of the hand transmitted from the position sensor body 131 (step S301). Note that the operation in step S301 uses thread S311, which periodically receives the viewpoint position, the line-of-sight direction, and the position and direction of the hand transmitted from the position sensor body 131.
The virtual space time is then updated, and the state of the virtual space (the type, position, and condition of a virtual space object) is updated (step S302) . In this case, if there is a virtual space object whose position and direction change in synchronism with the position and direction of a real space object, the state of such an object is also updated. If, for example, the user is made to see that he/she is always wearing a glove as a virtual space object on his/her hand, the position and direction of the glove are updated in accordance with changes in the position and direction of the hand in step S302.
If the relationship between the position and direction (the hand position and viewpoint position) of the real space object and the position and direction of the virtual space object is then checked, and it is determined that a predefined event has occurred, the state of the virtual space is updated in accordance with the event (step S303). For example, the occurrence of an event corresponds to a case where it is determined that the user touches the virtual space object with his/her hand. Updating the state of a virtual space in accordance with this event may be equivalent to changing, for example, the picture of the touched virtual space object into a picture of an explosion. A real space picture at/in the viewpoint position and line-of-sight direction of the observer which is obtained from the video camera 111 is received (step S304). The operation in step S304 uses thread S314, which periodically acquires the real space picture obtained from the video camera 111 through a video capture card. A virtual space picture at/in the viewpoint position and line-of-sight direction of the observer, which are obtained in step S301, is created in accordance with the state of the virtual space updated in steps S302 and S303 (step S305). Finally, the virtual space picture created in step S305 is combined with the real space picture received in step S304, and the resultant picture is output to the LCD 112 of the HMD 110 (step S306). The above processing is repeatedly executed until some terminating operation is performed (step S307).
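As a rough, non-authoritative sketch of how steps S301 to S307 and the two acquisition threads S311 and S314 could be organized, consider the following Python outline. Every name in it (`sensor_body.read`, `video_capture.grab`, `virtual_space`, `overlay`, and so on) is an illustrative placeholder rather than an API taken from the patent:

```python
import threading

class MixedRealityLoop:
    """Illustrative per-frame pipeline for steps S301 to S307."""

    def __init__(self, sensor_body, video_capture, virtual_space, hmd_lcd):
        self.sensor_body = sensor_body      # position sensor body 131 (placeholder)
        self.video_capture = video_capture  # video capture card (placeholder)
        self.virtual_space = virtual_space  # virtual space state (placeholder)
        self.hmd_lcd = hmd_lcd              # LCD 112 of the HMD 110 (placeholder)
        self.latest_pose = None             # written by thread S311
        self.latest_frame = None            # written by thread S314
        self.running = True

    def pose_thread(self):
        # Thread S311: periodically receive the viewpoint position,
        # line-of-sight direction, and hand position/direction.
        while self.running:
            self.latest_pose = self.sensor_body.read()

    def capture_thread(self):
        # Thread S314: periodically acquire the real space picture.
        while self.running:
            self.latest_frame = self.video_capture.grab()

    def run(self):
        threading.Thread(target=self.pose_thread, daemon=True).start()
        threading.Thread(target=self.capture_thread, daemon=True).start()
        while self.running:                            # repeat until step S307
            pose = self.latest_pose                    # S301: latest measurements
            self.virtual_space.advance_time(pose)      # S302: update virtual space
            for event in self.virtual_space.check_events(pose):
                self.virtual_space.apply(event)        # S303: event-driven updates
            real = self.latest_frame                   # S304: real space picture
            virtual = self.virtual_space.render(pose)  # S305: virtual space picture
            self.hmd_lcd.show(overlay(virtual, real))  # S306: combine and display

def overlay(virtual, real):
    """Combine the CG picture with the captured frame (composition omitted)."""
    ...
```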
A method unique to this embodiment will be described next, which is used to reduce the differences in position and direction between a virtual space object and a real space object in a mixed real space picture observed by each observer.
If the position/direction measurement apparatus can properly measure the viewpoint position and line-of-sight direction of an observer and the position and direction of the hand, there are no differences in position and direction between the virtual space object and the real space object. Consider the real space picture shown in Fig. 4A and the virtual space picture shown in Fig. 4B. In this case, ideally, a mixed reality picture like the one shown in Fig. 5 is created. In practice, however, the position/direction measurement apparatus uses magnetism or ultrasonic waves, and hence is very susceptible to disturbances and has a small measurable range. The actual position/direction measurement apparatus can measure with effective accuracy only within a range of about 80 cm from the position sensor transmitter. It is also known that even in measurement in a space without any ferromagnetic object, position and direction measurement errors monotonically increase with an increase in distance between the position sensor transmitter and the position sensor (receiver). In order to improve this situation, it is important to minimize the distance between the position sensor transmitter and the receiver.
Measurement errors in the position/direction measurement apparatus pose a more serious problem in the measurement of viewpoint position and line-of-sight direction. If measurement errors about the hand position and direction increase, only a positional shift occurs between the hand and the virtual space object that moves in synchronism with the hand, as shown in Fig. 7. If, however, measurement errors about the viewpoint position and line-of-sight direction increase, positional shifts occur with respect to all virtual space objects as well as a positional shift with respect to the hand, as shown in Fig. 6.
The principal object of the first embodiment is therefore to improve the measurement accuracy of the eye position sensor 113. To reduce positional shifts between a real space object and a virtual space object, the position sensor transmitter 130 is placed such that the distance between the eye position sensor 113 and the position sensor transmitter 130 becomes shorter than the distances between the position sensor transmitter 130 and the remaining sensors connected to the position sensor body 131 at the positions and directions at/in which the respective position sensors are used with high frequencies, as shown in Fig. 8, thereby improving the measurement accuracy of the viewpoint position and line-of-sight direction as compared with the measurement accuracy of the position and direction of the remaining position sensors. This makes it possible to make positional shifts with respect to all the virtual space objects less noticeable. The position of the position sensor transmitter 130 cannot be specified unconditionally; it should be chosen flexibly depending on the type of mixed reality picture the mixed reality system displays and the position and direction viewed by an observer.
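Because measurement error grows with transmitter-receiver distance, the placement rule of this embodiment reduces to a simple geometric check. The following sketch only illustrates that rule under assumed (x, y, z) world coordinates; it is not a procedure given in the patent:

```python
import math

def placement_favors_eye_sensor(transmitter, eye_sensor, other_sensors):
    """First embodiment's rule: at the poses in which the sensors are used
    with the highest frequency, the transmitter 130 must be closer to the
    eye position sensor 113 than to every other sensor on the body 131."""
    d_eye = math.dist(transmitter, eye_sensor)
    return all(d_eye < math.dist(transmitter, s) for s in other_sensors)

# Example (hypothetical coordinates in metres): a transmitter above the head
# is closer to the eye sensor than to a hand sensor held near the waist.
print(placement_favors_eye_sensor(
    (0.0, 2.2, 0.0),       # transmitter 130
    (0.0, 1.7, 0.1),       # eye position sensor 113
    [(0.2, 1.0, 0.4)]))    # hand position sensor 121 -> True
```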
Assume that a plurality of observers exist. In this case, mixed reality can be further augmented by setting up a situation in which the observers can observe each other. The direction in which each observer most frequently looks is then, in general, the horizontal direction. In such a situation, placing the position sensor transmitter 130 above the heads of the observers, as shown in Fig. 8, provides the merits of this embodiment and also prevents a picture of the transmitter, which should not be combined as a real space object into the mixed real space, from being combined. In addition, this allows each observer to move freely in all directions. Furthermore, since the measurement errors caused when the observer moves are equal in all directions, management is facilitated. As described above, a mixed real space picture without any unnecessary real object, which has high quality as a whole, can be observed while freedom of movement is ensured.
[Second Embodiment]
The second embodiment exemplifies a case where a position sensor transmitter 130 is placed such that the distance from a hand position sensor 121 is shortened, as well as the distance from an eye position sensor 113, at/in the viewpoint position and line-of-sight direction at/in which the hand is observed with a high frequency and at/in the hand position and direction, as shown in Fig. 9.
In the second embodiment, in a situation where an observer frequently observes the hand, the position sensor transmitter 130 is so placed as to shorten the distance between the eye position sensor 113 and the position sensor transmitter 130 as well as the distance between the hand position sensor 121 and the position sensor transmitter 130. This makes it possible to reduce measurement errors in both the eye position sensor 113 and the hand position sensor 121. As compared with the first embodiment, a virtual space object that moves in synchronism with the hand can be observed while the object is localized on the hand with a small error.
Obviously, the posture of an observer in which the hand is frequently observed depends on the contents of a mixed real space picture which the mixed reality system is to present to the observer. For this reason, the position sensor transmitter 130 must be adjusted such that the distance from the hand position sensor 121 becomes almost equal to the distance from the eye position sensor 113 when the hand is frequently observed.
This embodiment has exemplified the case where the position of a virtual space object is changed in synchronism with the hand. However, the present invention is not limited to this, and can also be applied to a case where the position of a virtual space object is changed in synchronism with another body region such as a foot.
[Third Embodiment]
Fig. 11 is a view showing the schematic arrangement of a mixed reality system to which the third embodiment is applied. This system is based on the assumption that two observers simultaneously observe the same mixed real space. It includes two systems, each identical to the one shown in Fig. 1. Note that a position sensor transmitter 130 and position sensor body 131 are shared by the two observers to reduce the total cost of the mixed reality system. The position sensor body 131 is connected to only one processing apparatus 140 and is not connected to the other processing apparatus 140. For this reason, the processing apparatuses 140 for the respective observers are connected to each other through a network 150 to allow the processing apparatus 140 to which the position sensor body 131 is not connected to create a mixed real space picture.
In the third embodiment, as shown in Fig. 10, at the position where each observer observes a hand with a high frequency, the position sensor transmitter 130 is so placed that the distance (a or a') between an eye position sensor 113 for each observer and the position sensor transmitter 130 almost matches the distance (b or b') between a hand position sensor 121 and the position sensor transmitter 130. In addition, the position sensor transmitter 130 is so placed as to minimize the respective distances (a, a', b, and b').
With this arrangement, each observer can observe a virtual space object that moves in synchronism with the hand while the object remains localized on the hand with only a small positional shift.
Note that positional shifts can be made uniform with respect to all observers by placing the position sensor transmitter 130 such that the distance (a) between the eye position sensor 113 for a given observer and the position sensor transmitter 130 becomes almost equal to the distance (a') between the eye position sensor 113 for the other observer and the position sensor transmitter 130, and the distance (b) between the hand position sensor 121 for a given observer and the position sensor transmitter 130 becomes almost equal to the distance (b') between the hand position sensor 121 for the other observer and the position sensor transmitter 130.
The procedure executed by each processing apparatus 140 is almost the same as that in the first embodiment, except that the processing apparatus 140 to which the position sensor body 131 is connected controls the position sensor body 131 through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire measurement values. This processing apparatus 140 then accesses the network 150 to distribute the measurement values obtained by the respective position sensors to the other processing apparatus.
The processing apparatus 140 to which the position sensor body 131 is not connected accesses the network through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire the measurement values obtained by the respective position sensors which are sent from the processing apparatus 140 to which the position sensor body 131 is connected.
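A minimal sketch of this distribution step follows, assuming a UDP transport with JSON-encoded measurements; the address, port, and message format are illustrative assumptions, not from the patent.

import json
import socket

PEER = ("192.168.0.2", 5005)  # hypothetical address of the other apparatus

def distribute(sock, eye_pose, hand_pose):
    # Run on the apparatus connected to the position sensor body 131.
    payload = json.dumps({"eye": eye_pose, "hand": hand_pose}).encode()
    sock.sendto(payload, PEER)

def receive(sock):
    # Run on the apparatus without a sensor body; replaces the local sensor read.
    data, _ = sock.recvfrom(4096)
    msg = json.loads(data)
    return msg["eye"], msg["hand"]

# Sender:   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Receiver: sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM); sock.bind(("", 5005))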
[Fourth Embodiment]
The fourth embodiment extends the first embodiment by providing one position sensor transmitter and one position sensor body for each position sensor. In this case, two components, i.e., a position sensor transmitter and a position sensor body, are required for each position sensor, and hence the cost increases accordingly. However, the viewpoint position, the line-of-sight direction, and the position and direction of the hand can all be measured with high accuracy.
Fig. 12 is a view showing the schematic arrangement of a mixed reality system to which the fourth embodiment is applied. This system additionally has a hand position sensor transmitter 160 and position sensor body 161 as compared with the arrangement of the first embodiment shown in Fig. 1. The hand position sensor body 161 is connected to a processing apparatus 140, and the hand position sensor transmitter 160 is connected to the hand position sensor body 161. A hand position sensor 121 is designed to receive magnetism from the hand position sensor transmitter 160. In this embodiment, a position sensor transmitter 130 and position sensor body 131 function as dedicated devices for the eyes.
The hand position sensor transmitter 160 is so placed as to shorten the distance between the hand position sensor 121 and the hand position sensor transmitter 160 at/in the hand position and direction where an observer observes the hand with a high frequency. The position sensor transmitter 130 is so placed as to shorten the distance between an eye position sensor 113 and the eye position sensor transmitter 130.
The procedure executed by the processing apparatus 140 is almost the same as that in the first embodiment, except that in thread S311 in Fig. 3, a position/direction acquisition thread controls the plurality of position sensor bodies 131 and 161 to acquire measurement values from the respective sensor bodies.
By arranging the dedicated position sensor transmitter and position sensor body for each position sensor in this manner, positional shifts between a real space picture and a virtual space picture can be further reduced in the viewpoint position, the line-of-sight direction, and the hand position and direction where the hand is observed with high frequencies, as compared with the second embodiment. This allows the observer to observe a mixed real space picture with higher quality.
The fourth embodiment is especially effective in preventing an observer from perceiving a positional shift when a translucent virtual space object like seawater, or a virtual space picture almost equal in size to the hand (a real space picture), is to be expressed as if it were always on the hand of the observer. Assume that a virtual space object is not translucent. In this case, if the virtual space object is larger than the hand, the hand is completely hidden by the virtual space object. Even if, therefore, a slight positional shift occurs, the resultant mixed real space picture does not look strange to the observer. In the case of a virtual space object equal in size to the hand, however, even a slight positional shift leaves the hand incompletely hidden by the virtual space object, and the positional shift between the hand and the virtual space object is seen by the observer. Furthermore, in the case of a translucent virtual space object, a positional shift between the virtual space object and the real space object is always seen by the observer regardless of the size of the virtual space object.
Even in such a situation, this embodiment can prevent the observer from observing positional shifts between the virtual space object and the real space object.
Note that the quality of a mixed real space picture can be improved by placing the eye position sensor transmitter 130 above the head of the observer, as shown in Fig. 13, as in the first embodiment. In addition, if possible, the hand position sensor transmitter 160 may be placed below the hand position at/in the hand position and direction where the observer observes the hand with a high frequency. With this arrangement, the hand position sensor transmitter 160 does not interfere with the field of view of the observer in the horizontal direction, thereby presenting a mixed real space picture with higher quality.
[Fifth Embodiment]
Fig. 14 is a view showing the arrangement of a mixed reality system to which the fifth embodiment is applied. In this embodiment, as in the fourth embodiment, position sensor transmitters 130 and 160 and position sensor bodies 131 and 161 are respectively provided for an eye position sensor 113 and hand position sensor 121 to allow high-accuracy measurement of a viewpoint position, a line-of-sight direction, and the position and direction of the hand. In addition, the position sensor transmitters 130 and 160 and position sensor bodies 131 and 161 are shared by two observers to attain a reduction in cost.
The two position sensor bodies 131 and 161 are connected to only one processing apparatus 140 and are not connected to the other processing apparatus 140. The processing apparatuses 140 for the respective observers are therefore connected to each other through a network 150 to allow the processing apparatus 140 to which the position sensor bodies 131 and 161 are not connected to create a mixed real space picture. In this embodiment, as shown in Fig. 15, the distance between the eye position sensor 113 for each observer and the eye position sensor transmitter 130 and the distance between the hand position sensor 121 and the hand position sensor transmitter 160 are minimized at a position where each observer makes observations with a high frequency. In addition, the eye position sensor transmitter 130 and hand position sensor transmitter 160 are placed such that the distances between the eye position sensors 113 for the respective observers and the position sensor transmitter 130 become almost equal to each other, and the distances between the hand position sensors 121 for the respective observers and the hand position sensor transmitter 160 become almost equal to each other.
With this arrangement, all observers can observe a high-quality mixed real space picture in which positional shifts between a virtual object and a real object are further reduced, and the positional shifts are made uniform for all the observers.
The procedure executed by each processing apparatus 140 is the same as that in the first embodiment except that the processing apparatus 140 to which the position sensor bodies 131 and 161 are connected controls the position sensor bodies 131 and 161 through a position/direction acquisition thread in thread S311 in Fig. 3 to acquire measurement values. The apparatus then accesses the network 150 to distribute the measurement values obtained by the respective position sensors to the other processing apparatus.
If the eye position sensor transmitter 130 is placed above the head of an observer and the hand position sensor transmitter 160 is placed below the hand position where the observer observes the hand with a high frequency as in the fourth embodiment, the position sensor transmitters 130 and 160 do not interfere with the field of view of the observer in the horizontal direction. This allows the observer to observe a high-quality mixed real space picture in the direction in which a mixed real space is observed with a high frequency.
[Sixth Embodiment]
Fig. 16 is a view showing the schematic arrangement of a mixed reality system to which the sixth embodiment is applied. Referring to Fig. 16, reference numerals 300a and 300b denote observers who observe mixed real space pictures created in this system. Each of the observers 300a and 300b wears an HMD 301 on which a receiver 302 is mounted. Each observer can observe a mixed real space corresponding to position/direction measurement values through the HMD 301. Reference numeral 306 denotes a position/direction measurement apparatus. A transmitter 304 and receiver 302 are connected to this position/direction measurement apparatus 306. The magnetism generated from the transmitter 304 is received by the receiver 302. The position/direction measurement apparatus 306 then measures the viewpoint position and line-of-sight direction of the observer from the strength of the magnetism.
The position/direction measurement apparatus 306 is connected to a processing apparatus 307, which is continuously notified of the viewpoint position and line-of-sight direction of the observer by the position/direction measurement apparatus 306. The processing apparatus 307 creates a mixed real space picture corresponding to the viewpoint position and line-of-sight direction of each observer on the basis of this information, and displays it on the HMD 301.
The processing apparatuses 307 for the respective observers are connected to each other through a network 330. The respective processing apparatuses 307 share the viewpoint positions and line-of-sight directions of the respective observers and the position and direction of a virtual space object by using the network 330. This allows a given observer to display a virtual space object at the position of the other observer.
Figs. 17 and 18 are perspective views of the HMD 301 to which the sixth embodiment is applied. Fig. 17 is a perspective view from the direction of the photographing unit. Fig. 18 is a perspective view from the direction of the display unit.
Reference numeral 201 denotes an HMD display unit. This HMD display unit 201 includes two units, i.e., a right-eye display 201R and left-eye display 201L, each of which has a color liquid crystal and prism. A mixed real space picture corresponding to the viewpoint position and line-of-sight direction of an observer is displayed on each display unit.
Reference numerals 204 to 208 denote constituent members for head mounting. To mount the HMD 301 on the head, the observer wears it while the length adjusting portion 206 is loosened by the adjuster 205. After the forehead pad 208 is brought into tight contact with the forehead, the length adjusting portion 206 may be fastened by the adjuster 205 to bring the temple bridges 204 and occiput pad 207 into tight contact with the temple and occipital portion of the observer, respectively. Reference numeral 203 denotes an HMD photographing unit for photographing a real space picture at/in the viewpoint position and line-of-sight direction of an observer. This HMD photographing unit 203 includes two units, i.e., a right-eye photographing unit 203R and left-eye photographing unit 203L, each of which is formed by an NTSC compact video camera. The photographed real space picture is superimposed on a virtual space picture to create a mixed real space picture.
The receiver 302 is used to receive magnetism generated by the transmitter 304 as information for the measurement of the viewpoint position and line-of-sight direction of the observer. As the mount portions of the receiver 302 on the HMD 301, three portions, i.e., receiver joints 200R, 200L, and 200C, are formed. The receiver 302 can be detachably mounted on an arbitrary one of these receiver joints 200R, 200L, and 200C. Referring to Figs. 17 and 18, the receiver 302 is mounted on the receiver joint 200R on the right side in the direction of the line of sight of the observer. However, the receiver 302 can also be mounted on the receiver joint 200L on the left side in the direction of the line of sight of the observer, or on the receiver joint 200C on the median line of the observer. In this embodiment, the receiver joints 200R, 200L, and 200C are designed to have receptacles in which the receiver 302 is fitted to be fixed. However, other detachable joint (mount) schemes may be used. Reference numeral 210 denotes a receiver signal line, which is exposed outside the HMD 301 at a position near the receiver joint 200C. This receiver signal line 210 is long enough to allow the receiver 302 to be mounted on any of the receiver joints 200R, 200L, and 200C. Reference numeral 209 denotes a tied linear member obtained by binding various kinds of lines, e.g., the signal lines and power feed lines to the HMD photographing unit 203 and the like, and the above receiver signal line 210. The tied linear member 209 is attached to the occiput pad 207. The signal lines and power feed lines to the right and left displays 201R and 201L, the HMD photographing units 203R and 203L, and the like in the tied linear member 209 pass through the right and left temple bridges 204.
Fig. 19 is a block diagram showing a hardware arrangement corresponding to one observer in the system shown in Fig. 16. The processing apparatus 307 incorporates a right-eye video capture board 350, left-eye video capture board 351, right-eye graphic board 352, left-eye graphic board 353, I/O interface 354, and network interface 359. These constituent elements are connected to a CPU 356, HDD 355, and memory 357.
The left- and right-eye video capture boards 351 and 350 are respectively connected to the left- and right-eye video cameras 203L and 203R, and convert the pictures actually photographed by these video cameras 203L and 203R into a form that the processing apparatus 307 can process together with a virtual space picture. The left- and right-eye graphic boards 353 and 352 are respectively connected to the left- and right-eye display units (devices) 201L and 201R to perform display control on them. The I/O interface 354 is connected to the position/direction measurement apparatus 306. The network interface 359 is connected to the network 330. Note that each of the processing apparatuses 140 according to the first to fifth embodiments is constituted by the same constituent elements as those of the processing apparatus 307 according to this embodiment.
An outline of the processing performed by the mixed reality system according to this embodiment will be described next. In this system, (1) a real space picture at/in the current viewpoint position and line-of-sight direction of an observer is acquired, (2) the viewpoint position and line-of-sight direction of the observer are acquired, (3) a virtual space picture at/in the acquired viewpoint position and line-of-sight direction is created, taking into account the positional relationship between an object in the virtual space viewed from the viewpoint position and an object in the real space, whose position, shape, and the like are stored in advance, (4) the real space picture is superimposed on the virtual space picture to create a mixed real space picture, and (5) the mixed real space picture is presented to the observer. By repeating this series of operations, the observer is made to experience a mixed real space.
In this series of operations, acquisition of a real space picture at/in the current viewpoint position and line-of-sight direction of the observer is performed by the video capture boards 350 and 351. More specifically, the video capture boards 350 and 351 convert video signals from the video cameras 203R and 203L into digital signals and continuously store them in the memory 357. The viewpoint position and line-of-sight direction of the observer are continuously transmitted from the position/direction measurement apparatus 306 to the processing apparatus 307 through the I/O interface 354.
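The continuous-capture path can be sketched as a background thread that always holds the newest frame, so the render loop never blocks on I/O. This is an illustration only: it assumes OpenCV is available and uses a generic capture device (index 0 is an assumption); the patent's capture boards are dedicated hardware.

import threading

import cv2  # OpenCV, assumed available

class LatestFrame:
    # Background capture thread that always holds the newest camera frame.
    def __init__(self, device: int = 0):   # device index is an assumption
        self._cap = cv2.VideoCapture(device)
        self._lock = threading.Lock()
        self._frame = None
        threading.Thread(target=self._pump, daemon=True).start()

    def _pump(self):
        while True:
            ok, frame = self._cap.read()
            if ok:
                with self._lock:
                    self._frame = frame  # overwrite: readers get the newest frame

    def get(self):
        with self._lock:
            return self._frame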
The processing apparatus 307 updates the time in the virtual space on the basis of a program stored in the memory 357 and recalculates the position and direction of the virtual space object. A virtual space picture corresponding to the viewpoint position and direction of the observer, which are acquired through the I/O interface 354, is then created. In this case, by defining in advance an object existing in the real space such that it also exists in the virtual space, a virtual space picture can be created in consideration of the positional relationship between the real space object and the virtual space object. For example, a virtual space object can be made to appear as if it were hidden behind a real wall.
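One common way to realize this occlusion effect (a sketch under assumptions, not the patent's stated implementation) is to render the pre-registered real object as an invisible "phantom" whose depth masks the virtual picture; numpy and the array shapes below are illustrative.

import numpy as np

def composite(real_rgb, virt_rgb, virt_depth, phantom_depth):
    # Show a virtual pixel only where the rendered virtual surface is nearer
    # to the viewpoint than any registered real surface; elsewhere the real
    # picture (e.g. the wall) shows through and occludes the virtual object.
    visible = np.isfinite(virt_depth) & (virt_depth < phantom_depth)
    out = real_rgb.copy()
    out[visible] = virt_rgb[visible]
    return out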
Finally, the real space picture and virtual space picture created in the memory 357 are superimposed on each other to create a mixed real space picture. The created mixed real space picture is displayed on the HMD displays 201R and 201L under the control of the graphic boards 352 and 353.

A method of mounting the receiver 302, which is a characteristic feature of this embodiment, will be described next. Fig. 20 shows a state where the receiver 302 is mounted on the right receiver joint 200R of the HMD 301 for the observer 300a, and the receiver 302 is mounted on the left receiver joint 200L of the HMD 301 for the observer 300b, in the same situation as that shown in Fig. 22, which illustrates the problem in the prior art.
Assume that the transmitters 304 for measuring the viewpoint positions and line-of-sight directions of the observers 300a and 300b are placed at the positions shown in Fig. 20. In this case, by selecting receiver joints for mounting the receivers 302 in the above manner, the distances between the transmitters 304 and the receivers 302 used by the observers 300a and 300b are shortened whenever both observers keep their viewpoint positions within the movable areas 311 and direct their lines of sight toward the mixed real space picture observable area 312. In other words, position/direction measurement errors can be reduced in the situation where the observers 300a and 300b make observations with the highest frequency.
In addition, by selecting mount positions for the receivers 302 as shown in Fig. 20, since the observers 300a and 300b, receivers 302, and transmitters 304 are symmetrical about a line segment 313 passing through the center of the mixed real space picture observable area 312, measurement errors of positions and directions associated with the two observers 300a and 300b can be made uniform.
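The joint-selection rule can be sketched as follows (an illustration only: the joint offsets are invented numbers, head rotation is ignored, and the pose samples are assumptions): for each of the three receiver joints, average its distance to the transmitter over the head positions the observer assumes most often, and mount the receiver on the joint with the smallest mean distance.

import math

# Offsets of the three receiver joints from the head centre, in head
# coordinates (metres); the numbers are illustrative assumptions.
JOINT_OFFSETS = {"200R": (0.09, 0.0, 0.0),
                 "200L": (-0.09, 0.0, 0.0),
                 "200C": (0.0, 0.0, 0.08)}

def best_joint(head_positions, transmitter):
    # Pick the joint whose mean distance to the transmitter, over the
    # frequently assumed head positions, is smallest.
    def mean_dist(offset):
        return sum(math.dist([h + o for h, o in zip(head, offset)], transmitter)
                   for head in head_positions) / len(head_positions)
    return min(JOINT_OFFSETS, key=lambda j: mean_dist(JOINT_OFFSETS[j]))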
[Seventh Embodiment]
The seventh embodiment is applied to a case where transmitters 304 are placed in front of observers 300a and 300b, as shown in Fig. 21.
When the transmitters 304 are placed in front of the observers 300a and 300b in this manner, receivers 302 are mounted on central receiver joints 200C of HMDs 301 to minimize the distances between the receivers 302 on the HMD 301 and the transmitters 304.
In this case, as in the sixth embodiment, position/direction measurement errors can be reduced in a situation where the observers 300a and 300b make observations with high frequencies. In addition, position/direction measurement errors associated with the two observers 300a and 300b can be made uniform.
As described above, according to the first to seventh embodiments, positional shifts between a real space picture and a virtual space picture can be reduced at no additional cost, i.e., simply by adjusting the mount position of the transmitter for position/direction measurement or the mount position of the receiver on each HMD.
The present invention is not limited to the above embodiments. For example, as the transmitters and receivers (position/direction measurement apparatus) , for example, devices based on an ultrasonic scheme instead of a magnetic scheme can be used. In addition, the respective embodiments described above can be properly combined with each other.
As has been described above, according to the present invention, positional shifts between a real space picture and a virtual space picture can be reduced with an inexpensive arrangement.
Furthermore, the present invention can be applied to a system comprising either a plurality of units or a single unit. Needless to say, the present invention can also be attained by supplying a program which executes the processing defined by the present invention to such a system or apparatus.

Claims

1. A mixed reality system which creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized by comprising: a transmitter; a first receiver which receives a signal generated by said transmitter to measure a viewpoint position and direction of an observer; and a second receiver which receives a signal generated by said transmitter to measure a position and direction of another body region of the observer, wherein said transmitter is positioned such that a distance between said transmitter and said first receiver becomes shorter than a distance between said transmitter and said second receiver.
2. The mixed reality system according to claim 1, characterized in that said transmitter is positioned such that the distance between said transmitter and said first receiver is always shorter than the distance between said transmitter and said second receiver within at least a predetermined movement range of the observer.
3. The mixed reality system according to claim 1, characterized in that the mount position of said transmitter for measuring the viewpoint position and direction of the observer is above the head of the observer.
4. The mixed reality system according to claim 1, characterized in that said transmitter is shared by said plurality of first and second receivers.
5. The mixed reality system according to claim 1, characterized in that said receiver and transmitter receive and transmit magnetism or an ultrasonic wave.
6. A head mounted display apparatus which is used in a mixed reality system that creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, and on which a receiver used to measure a viewpoint position and direction of an observer can be mounted, characterized in that a plurality of mount portions for detachably mounting said receiver are formed.
7. The head mounted display apparatus according to claim 6, characterized in that said mount portions are formed at least on a median line of the observer and on both left and right sides in a line-of-sight direction of the observer.
8. The head mounted display apparatus according to claim 6 or 7, characterized in that a signal line connected to said receiver extends from a portion near said mount portion on the median line to the outside.
9. The head mounted display apparatus according to claim 6, characterized in that said receiver receives magnetism or an ultrasonic wave.
10. A method of positioning a transmitter in a mixed reality system which includes the transmitter, a first receiver which receives a signal generated by the transmitter to measure a viewpoint position and direction of an observer, and a second receiver which receives a signal generated by the transmitter to measure a position and direction of another body region of the observer, and creates and displays a mixed real space picture by combining a real space picture with a virtual space picture, characterized in that said transmitter is positioned such that a distance between the transmitter and the first receiver becomes shorter than a distance between the transmitter and the second receiver.
11. The method of positioning the transmitter in the mixed reality system according to claim 10, characterized in that said transmitter is positioned such that the distance between said transmitter and said first receiver is always shorter than the distance between said transmitter and said second receiver within at least a predetermined movement range of the observer.
12. The method of positioning the transmitter in the mixed reality system according to claim 10, characterized in that the mount position of said transmitter for measuring the viewpoint position and direction of the observer is above the head of the observer.
13. The method of positioning the transmitter in the mixed reality system according to claim 10, characterized in that said transmitter is shared by said plurality of first and second receivers.
PCT/JP2002/002297 2001-03-13 2002-03-12 Mixed reality system which reduces measurement errors of viewpoint position and direction of an observer WO2002073287A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2001071117A JP4708590B2 (en) 2001-03-13 2001-03-13 Mixed reality system, head mounted display device, mixed reality realization method and program
JP2001-071117 2001-03-13

Publications (2)

Publication Number Publication Date
WO2002073287A2 true WO2002073287A2 (en) 2002-09-19
WO2002073287A3 WO2002073287A3 (en) 2003-10-30

Family

ID=18928886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2002/002297 WO2002073287A2 (en) 2001-03-13 2002-03-12 Mixed reality system which reduces measurement errors of viewpoint position and direction of an observer

Country Status (2)

Country Link
JP (1) JP4708590B2 (en)
WO (1) WO2002073287A2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1686554A3 (en) * 2005-01-31 2008-06-18 Canon Kabushiki Kaisha Virtual space generating system, image processing apparatus and information processing method
CN102200881A (en) * 2010-03-24 2011-09-28 索尼公司 Image processing apparatus, image processing method and program
US8585476B2 (en) 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
WO2014199159A1 (en) * 2013-06-11 2014-12-18 Sony Computer Entertainment Europe Limited Head-mountable apparatus and systems
US9662582B2 (en) 2003-09-02 2017-05-30 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
WO2018013237A1 (en) * 2016-07-15 2018-01-18 Qualcomm Incorporated Virtual, augmented, and mixed reality
US9958934B1 (en) 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
WO2018118727A1 (en) * 2016-12-22 2018-06-28 Microsoft Technology Licensing, Llc Magnetic tracker dual mode

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4769942B2 (en) * 2006-03-14 2011-09-07 国立大学法人電気通信大学 3D design support system and 3D design support method
KR101838603B1 (en) * 2016-10-11 2018-03-14 (주)세이프인 Fire extinguisher of augmented reality for training
US10713485B2 (en) 2017-06-30 2020-07-14 International Business Machines Corporation Object storage and retrieval based upon context
KR102637117B1 (en) * 2018-04-18 2024-02-15 삼성전자주식회사 Method and apparatus for virtual reality, augmented reality and mixed reality

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10334275A (en) * 1997-05-29 1998-12-18 Canon Inc Method and system for virtual reality and storage medium
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
US6061064A (en) * 1993-08-31 2000-05-09 Sun Microsystems, Inc. System and method for providing and using a computer user interface with a view space having discrete portions
EP1060772A2 (en) * 1999-06-11 2000-12-20 Mixed Reality Systems Laboratory Inc. Apparatus and method to represent mixed reality space shared by plural operators, game apparatus using mixed reality apparatus and interface method thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08160879A (en) * 1994-12-09 1996-06-21 Shimadzu Corp Head mounted display device
JP2887104B2 (en) * 1996-04-12 1999-04-26 オリンパス光学工業株式会社 Head mounted video display

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6061064A (en) * 1993-08-31 2000-05-09 Sun Microsystems, Inc. System and method for providing and using a computer user interface with a view space having discrete portions
US6005548A (en) * 1996-08-14 1999-12-21 Latypov; Nurakhmed Nurislamovich Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods
JPH10334275A (en) * 1997-05-29 1998-12-18 Canon Inc Method and system for virtual reality and storage medium
EP1060772A2 (en) * 1999-06-11 2000-12-20 Mixed Reality Systems Laboratory Inc. Apparatus and method to represent mixed reality space shared by plural operators, game apparatus using mixed reality apparatus and interface method thereof

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11103785B2 (en) * 2003-09-02 2021-08-31 Jeffrey D Mullen Systems and methods for location based games and employment of the same on location enabled devices
US11033821B2 (en) 2003-09-02 2021-06-15 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US10974151B2 (en) 2003-09-02 2021-04-13 Jeffrey D Mullen Systems and methods for location based games and employment of the same on location enabled devices
US9662582B2 (en) 2003-09-02 2017-05-30 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US10967270B2 (en) 2003-09-02 2021-04-06 Jeffrey David Mullen Systems and methods for location based games and employment of the same on location enabled devices
US10179277B2 (en) 2004-11-16 2019-01-15 Jeffrey David Mullen Location-based games and augmented reality systems
US8585476B2 (en) 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
US9352216B2 (en) 2004-11-16 2016-05-31 Jeffrey D Mullen Location-based games and augmented reality systems
US9744448B2 (en) 2004-11-16 2017-08-29 Jeffrey David Mullen Location-based games and augmented reality systems
US10828559B2 (en) 2004-11-16 2020-11-10 Jeffrey David Mullen Location-based games and augmented reality systems
US7843470B2 (en) 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
EP1686554A3 (en) * 2005-01-31 2008-06-18 Canon Kabushiki Kaisha Virtual space generating system, image processing apparatus and information processing method
US10838485B2 (en) 2006-05-01 2020-11-17 Jeffrey D. Mullen Home and portable augmented reality and virtual reality game consoles
US9958934B1 (en) 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
CN102200881B (en) * 2010-03-24 2016-01-13 索尼公司 Image processing apparatus and image processing method
CN102200881A (en) * 2010-03-24 2011-09-28 索尼公司 Image processing apparatus, image processing method and program
US10078366B2 (en) 2013-06-11 2018-09-18 Sony Interactive Entertainment Europe Limited Head-mountable apparatus and system
WO2014199159A1 (en) * 2013-06-11 2014-12-18 Sony Computer Entertainment Europe Limited Head-mountable apparatus and systems
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
US9906885B2 (en) 2016-07-15 2018-02-27 Qualcomm Incorporated Methods and systems for inserting virtual sounds into an environment
WO2018013237A1 (en) * 2016-07-15 2018-01-18 Qualcomm Incorporated Virtual, augmented, and mixed reality
KR20190028697A (en) * 2016-07-15 2019-03-19 퀄컴 인코포레이티드 Virtual, Augmented, and Mixed Reality
CN109416585A (en) * 2016-07-15 2019-03-01 高通股份有限公司 Virtually, enhancing and mixed reality
KR102609668B1 (en) * 2016-07-15 2023-12-04 퀄컴 인코포레이티드 Virtual, Augmented, and Mixed Reality
CN110073314A (en) * 2016-12-22 2019-07-30 微软技术许可有限责任公司 Magnetic tracking device double mode
WO2018118727A1 (en) * 2016-12-22 2018-06-28 Microsoft Technology Licensing, Llc Magnetic tracker dual mode
US10139934B2 (en) 2016-12-22 2018-11-27 Microsoft Technology Licensing, Llc Magnetic tracker dual mode
CN110073314B (en) * 2016-12-22 2021-11-16 微软技术许可有限责任公司 Magnetic tracker dual mode

Also Published As

Publication number Publication date
WO2002073287A3 (en) 2003-10-30
JP4708590B2 (en) 2011-06-22
JP2002271817A (en) 2002-09-20

Similar Documents

Publication Publication Date Title
US11854171B2 (en) Compensation for deformation in head mounted display systems
EP2979127B1 (en) Display method and system
US20020075286A1 (en) Image generating system and method and storage medium
US6359601B1 (en) Method and apparatus for eye tracking
US11226406B1 (en) Devices, systems, and methods for radar-based artificial reality tracking
US20220130116A1 (en) Registration of local content between first and second augmented reality viewers
US20140347456A1 (en) Viewer with varifocal lens and video display system
WO2002073287A2 (en) Mixed reality system which reduces measurement errors of viewpoint position and direction of an observer
CA2268864A1 (en) Projection system, in particular for three-dimensional representations on a viewing device
JP2003242527A (en) Information processor and method
JP2008146109A (en) Image processing method and image processor
US7229174B2 (en) Method to detect misalignment and distortion in near-eye displays
US11956415B2 (en) Head mounted display apparatus
JP3579585B2 (en) Multi-view simultaneous observation type horizontally arranged stereoscopic image display system
JP2011010126A (en) Image processing apparatus, and image processing method
JP6859447B2 (en) Information processing system and object information acquisition method
JP2002176661A (en) Image display device
US11454700B1 (en) Apparatus, system, and method for mitigating systematic distance errors in radar-based triangulation calculations
JP3425402B2 (en) Apparatus and method for displaying stereoscopic image
JPH11341517A (en) Horizontal layout stereoscopic image display system
JP2002269593A (en) Image processing device and method, and storage medium
JP2595942B2 (en) 3D image display device
JP2004194033A (en) System for displaying three-dimensional image and method for displaying three-dimensional pointer
EP3547081B1 (en) Data processing
JP4125085B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): DE FR GB IT NL

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase