WO2004020946A1 - Head Tracking Method and Apparatus - Google Patents
- Publication number
- WO2004020946A1 (PCT/JP2003/010776)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- angle
- head
- sensor
- output
- head tracking
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C19/00—Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B2027/0192—Supplementary details
- G02B2027/0198—System for aligning or maintaining alignment of an image in a predetermined direction
Definitions
- the present invention relates to a head tracking method and apparatus for detecting the direction in which a head is facing on a head mount display or the like.
- sensors have been used to detect the three-dimensional direction in which the human head is facing, and an image of the detected direction is displayed on a head mounted display (HMD) mounted on the head.
- HMD head mounted display
- FIG. 11 is a diagram showing a configuration example of a conventional head mount display.
- a sensor section 70 for detecting the movement of the head, a head mount display section 80 mounted on the head, and a host section 90 for supplying a video signal to the head mount display section 80 are provided.
- the sensor unit 70 is composed of three sensors 71, 72, and 73 that three-dimensionally detect the movement of the human head, a central control unit 74 that calculates the three-dimensional movement of the head based on the outputs of the sensors 71, 72, and 73, and a control interface unit 75 that transmits the head-direction data calculated by the central control unit 74 to the host unit 90.
- the three sensors 71, 72, and 73 are composed of, for example, acceleration sensors that individually detect accelerations in three mutually orthogonal axial directions, and the central control unit 74 judges the three-dimensional movement of the head from the three-axis accelerations.
- the host unit 90 includes, for example, a memory 91 that stores video data for the entire surroundings of a certain point, a central control unit 92 that reads out, from the video data stored in the memory 91, the video data in the direction detected by the sensor unit 70 and supplies it to a 3D processing unit 93, the 3D processing unit 93 that converts the supplied video data into video data for image display, and a video interface unit 94 that supplies the video data created by the 3D processing unit 93 to the head mount display unit 80.
- the head mount display unit 80 includes a central control unit 81 for controlling the display of video, a video interface unit 82 for receiving the video data supplied from the host unit 90, and a video display unit 83 for display-processing the video data received by the video interface unit 82.
- in the video display unit 83, liquid crystal display panels arranged near the left and right eyes are used as the display means.
- the sensor section 70 and the head mount display section 80 are integrally formed.
- the host unit 90 includes, for example, a personal computer device and a large-capacity storage means such as a hard disk and an optical disk.
- with a head mount display configured in this way, it is possible to display images linked to the movement of the wearer's head, that is, so-called virtual reality images.
- however, the conventional head mount display requires, as the sensor part that detects the movement of the head, three acceleration sensors that individually detect the accelerations of three orthogonal axes, and there was a problem that the configuration is complicated.
- since the head mount display is a device mounted on the user's head, it should be small and lightweight, and requiring three sensors is not preferable.
- the present invention has been made in view of this point, and an object of the present invention is to make it possible to detect the direction in which the head is facing with a simple sensor configuration. Disclosure of the Invention
- the three-dimensional direction in which the head is oriented is detected as three axes: a yaw angle, which is the angle of rotation about an upright axis standing on the horizontal plane of the head, and a pitch angle and a roll angle, which are the angles formed with two axes orthogonal to the upright axis. The yaw angle is determined from an integrated value of the output of a gyro sensor, and the pitch angle and the roll angle are calculated from the output of a tilt sensor that detects the tilt of a plane orthogonal to the direction of the upright axis.
- in this case, the cycle for determining the yaw angle from the output of the gyro sensor is made shorter than the cycle for calculating the pitch angle and the roll angle from the output of the tilt sensor.
- in this way, the yaw angle can be accurately determined by short-period judgment of the dynamic angular velocity output by the gyro sensor, while the pitch angle and the roll angle are calculated from the static acceleration of gravity, so accurate detection is always possible even if their detection cycle is somewhat long, and the angles of the three axes can be detected accurately with a good distribution of computation.
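The dual-rate scheme described above can be sketched as follows. This is an illustrative sketch only: the function names, the callback interface, and the default rates (the 1.25 kHz / 125 Hz figures mentioned later in this document) are assumptions, not the patent's implementation.

```python
def run_head_tracker(read_gyro_yaw_rate, read_tilt_angles,
                     gyro_hz=1250, tilt_hz=125, steps=1250):
    """Dual-rate sketch: yaw is integrated from the gyro's angular-velocity
    output every short cycle, while pitch and roll are re-read from the
    gravity-referenced tilt sensor on a longer cycle."""
    dt = 1.0 / gyro_hz
    ratio = gyro_hz // tilt_hz            # gyro cycles per tilt cycle
    yaw, pitch, roll = 0.0, 0.0, 0.0
    for step in range(steps):
        yaw += read_gyro_yaw_rate() * dt  # integrate deg/s -> deg
        if step % ratio == 0:             # slower, drift-free update
            pitch, roll = read_tilt_angles()
    return yaw, pitch, roll
```

Because the tilt angles come from gravity (an absolute reference), the slower cycle does not accumulate error, while the fast gyro loop keeps the yaw integration accurate.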
- further, the yaw angle determined from the output of the gyro sensor is corrected based on the calculated pitch angle and roll angle. In this way, a more accurate yaw angle can be determined.
- a fourth invention is a head tracking device that detects three axes: a yaw angle, which is the angle of rotation about an upright axis standing on the horizontal plane of the head, and a pitch angle and a roll angle, which are the angles formed with two axes orthogonal to the upright axis. The device comprises a gyro sensor for detecting the yaw angle, a tilt sensor for detecting the tilt of a plane orthogonal to the direction of the upright axis, and calculating means for determining the yaw angle from the integrated value of the output of the gyro sensor and for calculating the pitch angle and the roll angle from the output of the tilt sensor.
- a fifth invention is the head tracking device according to the fourth invention, wherein the cycle in which the calculating means determines the yaw angle from the output of the gyro sensor is shorter than the cycle in which it calculates the pitch angle and the roll angle from the output of the tilt sensor.
- in this way, the yaw angle can be accurately determined by short-period judgment of the dynamic angular velocity output by the gyro sensor, while the pitch angle and the roll angle are calculated from the static acceleration of gravity, so accurate detection is always possible even if their detection cycle is somewhat long, and the angles of the three axes can be detected accurately with a good distribution of computation.
- a sixth invention is the head tracking device according to the fourth invention, wherein the calculating means corrects the yaw angle determined from the output of the gyro sensor based on the calculated pitch angle and roll angle. In this way, a more accurate yaw angle can be determined.
- FIG. 1 is a perspective view showing an example of mounting a head mounted display according to an embodiment of the present invention.
- FIG. 2 is a perspective view showing an example of the shape of the head mount display according to one embodiment of the present invention.
- FIG. 3 is a side view of the head mount display of the example of FIG.
- FIG. 4 is a perspective view showing an example of a state in which a video display unit of the head mount display of FIG. 2 is raised.
- FIG. 5 is an explanatory diagram showing the reference axes according to one embodiment of the present invention.
- FIG. 6 is an explanatory diagram showing a detection state by the sensor according to the embodiment of the present invention.
- FIG. 7 is a block diagram illustrating a system configuration example according to an embodiment of the present invention.
- FIG. 8 is a flowchart showing an example of head tracking processing according to an embodiment of the present invention.
- FIG. 9 is a flowchart illustrating an example of a two-axis sensor process according to an embodiment of the present invention.
- FIG. 10 is a flowchart showing an example of gyro sensor processing according to an embodiment of the present invention.
- FIG. 11 is a block diagram showing an example of a system configuration of a conventional head mount display. BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a diagram showing an example of mounting the head mount display of this example.
- the head mount display 100 of the present example is shaped like a headphone worn on the left and right auricles of the user's head h, with a video display unit attached to the headphone-shaped body.
- FIG. 1 shows a state in which the video display unit 110 is positioned in front of the user to view video and audio.
- the head mount display 100 is connected to a video signal source (not shown) by a cable 148.
- the video supplied from the video signal source is displayed on the video display unit 110, and the supplied audio is output from the left and right driver units worn on the user's auricles.
- a sensor for detecting the direction in which the wearer is facing is built into the head mount display 100; the direction the wearer faces is detected based on the output of the sensor, and an image corresponding to that direction is supplied from the video signal source to the head mount display 100 and displayed.
- the sound may also be output as a stereo sound signal whose phase corresponds to the direction in which the wearer is facing.
- FIG. 2 is a diagram showing an example of the shape of the head mount display 100.
- the head mount display 100 is composed of a left driver unit 140 and a right driver unit 150 connected by a band 130, and a horizontally long image display unit 110 is mounted in a state supported by the left and right driver units 140 and 150.
- the band 130 is made of an elastic material and is configured so that the left driver unit 140 and the right driver unit 150 are pressed against and held on the auricle sides of the wearer's head with relatively weak force. When not mounted on the head, the left and right driver units 140 and 150 are in close contact with each other.
- the band 130 has a wide portion 131 formed at its center, so that the head mount display 100 can be stably held on the wearer's head.
- U-shaped bracket holders 132 and 133 are formed at one end and the other end of the band 130, respectively, and hold the middle portions of U-shaped brackets 144 and 154 attached to the upper ends of the driver units 140 and 150.
- by changing the position at which the U-shaped brackets 144 and 154 are held by the holders 132 and 133, the fit can be adjusted according to the size of the wearer's head.
- each of the driver units 140 and 150 has a driver placement section 141, 151 in which a circular driver (speaker unit) that outputs sound when supplied with an audio signal is arranged.
- a ring-shaped ear pad 142, 152 is attached around each of the driver placement sections 141 and 151.
- cavities are provided between the driver placement sections 141, 151 of this example and the ear pads 142, 152, so that the driver placement sections 141 and 151 are positioned slightly floating with respect to the wearer's pinnae; that is, this is a so-called full-open-air headphone.
- in the image display unit 110, an image display panel 100L for the left eye is arranged in front of the wearer's left eye, and an image display panel 100R for the right eye is arranged in front of the wearer's right eye. Since FIGS. 1 and 2 are viewed from the outside, the image display panels 100L and 100R are not visible. Each of the video display panels 100L and 100R is, for example, a liquid crystal display panel.
- Fig. 3 shows the wearing state viewed from the side; it can be seen that the left and right image display panels 100L and 100R are positioned in front of the wearer's eyes.
- the image display means such as the liquid crystal display panels need not necessarily be located close to the eyes; in some cases the display panels are provided inside the image display unit 110 and arranged so that an image is displayed in front of the wearer via optical components. When lighting means such as a backlight is required, it is also incorporated in the image display unit 110.
- in the lower part between the left and right liquid crystal display panels 100L and 100R there is a cutout for the nose, so that when the display is worn as shown in FIG. 3, the image display unit 110 does not touch the wearer's nose.
- as a mechanism for supporting the image display unit 110 on the left and right driver units 140 and 150, one end and the other end of the image display unit 110 are attached at connection portions 111 and 112 to connecting members 113 and 114 so as to be rotatable in a horizontal plane, and the ends of the connecting members 113 and 114 are attached at connection portions 115 and 116 to rod-shaped connecting members 117 and 118 so as to be rotatable in a horizontal plane.
- as described above, the head mount display has four connection portions 111, 112, 115, and 116, two on each of the left and right, so that the image display unit 110 can be held satisfactorily.
- the rod-shaped connecting members 117 and 118 connected to the image display unit 110 are passed through through-holes 121a and 122a of shaft holding units 121 and 122 fixed to connecting members 123 and 124, respectively, and by adjusting the length by which the rod-shaped connecting members 117 and 118 protrude from the through-holes 121a and 122a, the distance between the image display unit 110 and the wearer's eyes can be adjusted.
- FIG. 4 is a diagram illustrating an example of a state in which the image display unit 110 is lifted up.
- the image display unit 110 is connected to the left and right driver units 140 and 150 through the cords 146 and 156 exposed to the outside from the rear ends of the rod-shaped connecting members 117 and 118.
- a video signal obtained through a cord 148 connected to the video signal source is supplied to the video display unit 110, and the audio signal from the video signal source is also supplied to the left and right driver units 140 and 150 via the cords 146 and 156.
- two sensors are incorporated in the driver unit 150 (or the video display unit 110), and control data based on the sensor outputs is supplied to the video signal source side via the cord 148.
- a reset switch is attached at a predetermined position (for example, on one driver unit 140) of the head mount display 100 of this example, and other operation means such as key switches are also arranged as necessary.
- the axis standing upright from the head h in the upright state is defined as the Z axis, and the X axis and the Y axis, two axes orthogonal to the Z axis, are considered: the X axis is the left-right axis of the head, and the Y axis is the front-rear axis of the head.
- the horizontal rotation of the head h is represented by the yaw angle θ, which is the angle of rotation about the Z axis; the front-back inclination of the head h is represented by the pitch angle, which is the angle formed with the Y axis (rotation about the X axis); and the left-right inclination of the head h is represented by the roll angle, which is the angle formed with the X axis (rotation about the Y axis).
- the yaw angle θ is detected by one gyro sensor, and the roll angle and the pitch angle are judged from the output of a tilt sensor (two-axis tilt sensor) that, as shown in FIG. 5, detects the tilts in the X-axis direction and the Y-axis direction with respect to the plane (XY plane) formed by the X axis and the Y axis, with the center of the sensor as the origin of the coordinate system.
- the tilt S1 in the Y-axis direction corresponds to the pitch angle, the rotation about the X axis, and the tilt S2 in the X-axis direction corresponds to the roll angle, the rotation about the Y axis.
- since the tilt sensor is a sensor that measures gravity, which is a static acceleration, it can detect only within ±90 degrees; this is sufficient to detect the rotational position of the human head. Furthermore, because the pitch angle and the roll angle are output with gravity, a static acceleration, as the absolute reference, the sensor does not suffer from a drift phenomenon. Since the tilts S1 and S2 are inclinations with respect to the same plane, as shown in FIG. 6, both S1 and S2 are detected by one two-axis tilt (acceleration) sensor 12 to determine the roll angle and the pitch angle, while the yaw angle θ is determined from the angular velocity output of the gyro sensor 11 that detects rotation about the Z axis.
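Recovering pitch and roll from the static gravity components measured along the sensor's X and Y axes can be sketched as below. This is a minimal sketch under standard assumptions; the arcsin form, the gravity constant, and the function name are not taken from the patent, which does not give the formula. The ±90° limit mentioned above falls out of the arcsin domain.

```python
import math

G = 9.80665  # standard gravity, m/s^2 (assumed unit of the sensor output)

def tilt_to_pitch_roll(ax, ay):
    """Pitch and roll (degrees) from the static accelerations a two-axis
    tilt sensor reports along X and Y. arcsin limits detection to +/-90
    degrees, matching the range stated for the tilt sensor."""
    # clamp to the valid arcsin domain to tolerate sensor noise
    sy = max(-1.0, min(1.0, ay / G))
    sx = max(-1.0, min(1.0, ax / G))
    pitch = math.degrees(math.asin(sy))  # tilt about the X axis
    roll = math.degrees(math.asin(sx))   # tilt about the Y axis
    return pitch, roll
```

Because gravity is an absolute reference, angles computed this way do not drift over time, unlike angles obtained by integration.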
- these two sensors 11 and 12 may be arranged at any position of the head mount display 100. Next, the circuit configuration of the head mount display 100 of this example will be described with reference to the block diagram of FIG. 7. FIG. 7 also shows the configuration of a video signal source 20 connected to the head mount display 100 via the cord 148.
- the angular velocity signal output from the gyro sensor 11 attached to the head mount display 100 is supplied to an analog processing unit 13, subjected to analog processing such as low-pass filtering and amplification, then converted to digital data and supplied to the central control unit 14.
- the tilt sensor 12 is a sensor that outputs its acceleration signals as PWM signals, that is, pulse-width-modulated signals; the tilt states in the X-axis direction and the Y-axis direction detected by the tilt sensor 12 are individually supplied to the central control unit 14 as PWM signals, and the roll angle and the pitch angle are calculated based on the supplied PWM signals.
- the central control unit 14 takes the position at which the reset switch 15 was operated as the reference position, and detects the movement of the wearer's head from that reference position based on the outputs of the gyro sensor 11 and the tilt sensor 12; the yaw angle, which is the direction the front of the head faces, is calculated based on the output of the gyro sensor 11.
- the yaw angle calculated based on the output of the gyro sensor 11 may be corrected using the roll angle and the pitch angle calculated based on the output of the tilt sensor 12.
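The patent does not spell out the form of this correction, so the following is only one plausible sketch: when the head is pitched or rolled, the gyro's sense axis tilts away from the vertical Z axis, so the measured rate under-reads the true yaw rate and can be rescaled by the cosines of the tilt angles. `corrected_yaw_rate` is a hypothetical helper, not the patent's formula.

```python
import math

def corrected_yaw_rate(gyro_rate, pitch_deg, roll_deg):
    """Rescale the gyro's measured rate by the projection of its sense
    axis onto the vertical, using the tilt-sensor pitch and roll.
    This is one plausible correction, not taken from the patent."""
    scale = math.cos(math.radians(pitch_deg)) * math.cos(math.radians(roll_deg))
    if abs(scale) < 1e-6:   # near +/-90 deg tilt: correction is undefined
        return gyro_rate
    return gyro_rate / scale
```

With the head upright the scale factor is 1 and the rate passes through unchanged; at a 60° pitch the measured rate would be doubled before integration.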
- the data of the calculated angles for the three axes (yaw angle, roll angle, and pitch angle) calculated by the central control unit 14 are sent as head tracking angle data from the control interface unit 18 to the video signal source 20 side.
- the video signal source 20 comprises, for example, a memory 21 that stores video data for the entire surroundings of a certain point together with the accompanying audio data, a central control unit that reads out, from the video data stored in the memory 21, the video data in the direction indicated by the head tracking angle data detected by the head mount display 100 and supplies it to a 3D processing unit 23, the 3D processing unit 23 that converts the supplied video data into video data for image display, a video interface unit that supplies the video data created by the 3D processing unit 23 to the head mount display 100, and a control interface unit 25 that receives the head tracking angle data detected by the head mount display 100.
- the video data supplied from the video signal source 20 to the head mount display 100 is received by the video interface unit 17 of the head mount display 100 and sent to the video display unit 110, which performs processing to display it on the left and right video display panels 100L and 100R.
- when the video data is video data for displaying a stereoscopic video, the video data supplied to the left and right video display panels 100L and 100R for display becomes individual data for each eye.
- the reception by the video interface unit 17 and the video display by the video display unit 110 are also executed under the control of the central control unit 14.
- the configuration for processing audio data is omitted.
- for the audio data as well, the direction in which the sound image is localized may be changed according to the angle indicated by the head tracking angle data.
- the video signal source 20 comprises, for example, means for executing arithmetic processing such as a personal computer device, a video game device, a PDA (Personal Digital Assistants), or a mobile phone terminal, together with large-capacity storage means built into (or attached to) these devices, such as a hard disk, an optical disc, or a semiconductor memory.
- when the power of the head mount display 100 is turned on (step S11), various initialization processes are performed (step S12), and thereafter reset signal processing is executed (step S13).
- in the reset signal processing of step S13, when the reset switch 15 is operated or a reset signal is requested from the video signal source 20, the head tracking data for the wearer's posture at that time is stored, and the head tracking angle at that position is set to 0°. Since the yaw angle can be detected over ±180°, there is no problem for that axis; for the two tilt axes, however, processing is performed to limit the posture angles at which a reset is accepted to the vicinity of a plane orthogonal to the Z axis shown in the figures.
- a three-axis angle detection process is performed (step S14).
- a two-axis tilt sensor process and a gyro sensor process are performed.
- FIG. 9 is a flowchart showing the two-axis tilt sensor processing.
- in the two-axis tilt sensor processing, the X-axis duty ratio and the Y-axis duty ratio of the PWM signals supplied from the tilt sensor 12 are detected (steps S21 and S22), and the pitch angle and the roll angle are calculated from each duty ratio (step S23).
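The duty-ratio-to-angle conversion of steps S21 to S23 can be sketched for one axis as below. The scale factors (50 % duty at 0 g, 12.5 % duty change per g, typical of duty-cycle-output accelerometers) are assumptions, since the patent does not state them.

```python
import math

DUTY_AT_ZERO_G = 0.5  # assumed duty ratio at 0 g (flat)
DUTY_PER_G = 0.125    # assumed duty change per 1 g of tilt

def duty_to_angle_deg(duty):
    """Convert one axis's measured PWM duty ratio into a tilt angle in
    degrees: duty -> g-level -> arcsin. Scale factors are illustrative."""
    g_level = (duty - DUTY_AT_ZERO_G) / DUTY_PER_G  # duty -> g
    g_level = max(-1.0, min(1.0, g_level))          # clamp for asin
    return math.degrees(math.asin(g_level))
```

Running the same conversion on the X-axis and Y-axis duty ratios yields the roll and pitch angles respectively.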
- if the acceleration detection axes of the tilt sensor 12 are displaced in the yaw direction on the XY plane with respect to the wearer's X axis and Y axis, the deviations of the calculated pitch angle and roll angle are corrected (step S24), and the two-axis tilt sensor processing ends (step S25).
- FIG. 10 is a flowchart showing the gyro sensor process.
- in the gyro sensor processing, first, data obtained by digitally converting the output from the gyro sensor is obtained (step S31). Next, gain-ranging processing using digital conversion with a plurality of different gains is performed to widen the dynamic range (step S32), and the DC offset of the gyro sensor output is cut (step S33). Then, a coring process that cuts the noise component is performed (step S34), the yaw angle is calculated by integrating the angular velocity data (step S35), and the gyro sensor processing ends (step S36). As described above, when calculating the yaw angle in step S35, the yaw angle may be corrected based on the pitch angle and the roll angle detected by the two-axis tilt sensor processing.
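Steps S33 to S35 (offset removal, coring, integration) can be sketched as follows; the specific offset and noise-floor values are made-up parameters, not figures from the patent.

```python
def integrate_yaw(samples, dt, dc_offset, noise_floor):
    """Sketch of steps S33-S35: subtract the gyro's DC offset, core out
    readings below the noise floor, then integrate angular velocity
    (deg/s) over the sample period dt into a yaw angle (deg)."""
    yaw = 0.0
    for w in samples:
        w -= dc_offset             # S33: cut DC offset
        if abs(w) < noise_floor:   # S34: coring removes noise that would
            w = 0.0                #      otherwise drift the integral
        yaw += w * dt              # S35: integrate rate into angle
    return yaw
```

The coring step is what keeps a stationary head from slowly drifting in yaw: residual noise after offset removal is zeroed instead of accumulating.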
- when the angle data for the three axes has been obtained in this way, a process of transferring it to the video signal source side is performed (step S15), and the process returns to the reset signal processing of step S13.
- if the reset switch is not operated and no reset signal is supplied in the reset signal processing, the process returns directly to the three-axis angle detection processing of step S14.
- here, the gyro sensor process detects a dynamic angular velocity component and integrates it to obtain the yaw angle, whereas the two-axis tilt sensor process detects the static acceleration of gravity and calculates the tilt angles at the moment of detection, so the cycles at which the two processes are performed may differ.
- in practice, the delay time of the head tracking detection is important, so the head tracking processing must be completed and the data transferred at least within the video update rate, and it is important to execute the two-axis tilt sensor processing of the flowchart of FIG. 9 and the gyro sensor processing of FIG. 10 accordingly.
- for example, when a general-purpose microprocessor that performs 16-bit calculations is used as the central control unit, the update rate described above can be satisfied by performing the two-axis tilt sensor processing at a 125 Hz cycle and the gyro sensor processing at a 1.25 kHz cycle.
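The timing claim can be checked with simple arithmetic. The 125 Hz and 1.25 kHz cycles come from the text above; the 60 Hz video update rate is an assumption for illustration.

```python
VIDEO_HZ = 60     # assumed video update rate (not stated in the text)
TILT_HZ = 125     # two-axis tilt sensor processing cycle, from the text
GYRO_HZ = 1250    # gyro sensor processing cycle (1.25 kHz), from the text

def updates_per_frame(sensor_hz, video_hz=VIDEO_HZ):
    """How many completed sensor-processing cycles fit in one video frame.
    A value >= 1 means fresh angle data exists for every frame."""
    return sensor_hz // video_hz
```

At these rates the gyro integration runs about 20 times per assumed frame and the tilt update twice per frame, so fresh three-axis angle data is always ready within the video update period.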
- with the head mount display 100 configured as described above, it is possible to display images linked to the movement of the wearer's head, that is, so-called virtual reality images. It is also possible to output head-tracked audio.
- as the sensors for detecting the head tracking angles, it is sufficient to prepare one gyro sensor and one two-axis tilt sensor, so the three-dimensional head tracking angles can be detected well with a simple configuration using only two sensors.
- in this case, the detection range of the pitch angle and the roll angle is limited to ±90°, but this is sufficient for detecting posture angles within the range of normal human head movement, so there is no practical problem.
- since the pitch angle and the roll angle are detected using the tilt sensor, no drift phenomenon occurs, a virtual 3D space image that is stable in the horizontal direction is obtained, and the system is simple and low in cost.
- the head mount display itself can be configured to be small, and the wearing feeling of the head mount display can be improved.
- further, since the head mount display is constructed by attaching the video display unit to a so-called full-open-air headphone, it can be worn just as comfortably as a conventional full-open-air headphone. Moreover, as shown in FIG. 4, when the video display unit 110 is flipped up, the device can be used as a headphone, giving it high versatility.
- the external shape of the head mount display shown in FIGS. 1 to 4 is merely an example, and the present invention can be applied to a head mount display having another shape.
- the head tracking processing of the present invention may also be applied to a headphone device (that is, a device without a video display function) that controls the sound image localization position of stereo sound by head tracking.
- in the embodiment described above, a reset switch is provided on the head mount display, the position at which the reset switch is operated is set as the reference position, and movement from that position is detected; however, the absolute direction may instead be detected by some method (for example, a geomagnetic sensor), and head tracking processing using absolute angles may be performed without providing a reset switch.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Controls And Circuits For Display Device (AREA)
- Gyroscopes (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/525,925 US20050256675A1 (en) | 2002-08-28 | 2003-08-26 | Method and device for head tracking |
EP03791296A EP1541966A4 (en) | 2002-08-28 | 2003-08-26 | METHOD AND DEVICE FOR MONITORING THE POSITION OF THE HEAD |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-249443 | 2002-08-28 | ||
JP2002249443A JP2004085476A (ja) | 2002-08-28 | 2002-08-28 | ヘッドトラッキング方法及び装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004020946A1 true WO2004020946A1 (ja) | 2004-03-11 |
Family ID: 31972580
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/010776 WO2004020946A1 (ja) | 2002-08-28 | 2003-08-26 | ヘッドトラッキング方法及び装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20050256675A1 (ja) |
EP (1) | EP1541966A4 (ja) |
JP (1) | JP2004085476A (ja) |
KR (1) | KR20050059110A (ja) |
WO (1) | WO2004020946A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111736353A (zh) * | 2020-08-25 | 2020-10-02 | 歌尔光学科技有限公司 | 一种头戴式设备 |
Families Citing this family (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7749089B1 (en) | 1999-02-26 | 2010-07-06 | Creative Kingdoms, Llc | Multi-media interactive play system |
US6761637B2 (en) | 2000-02-22 | 2004-07-13 | Creative Kingdoms, Llc | Method of game play using RFID tracking device |
US7445550B2 (en) | 2000-02-22 | 2008-11-04 | Creative Kingdoms, Llc | Magical wand and interactive play experience |
US7878905B2 (en) | 2000-02-22 | 2011-02-01 | Creative Kingdoms, Llc | Multi-layered interactive play experience |
US7066781B2 (en) | 2000-10-20 | 2006-06-27 | Denise Chapman Weston | Children's toy with wireless tag/transponder |
US6967566B2 (en) | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US20070066396A1 (en) | 2002-04-05 | 2007-03-22 | Denise Chapman Weston | Retail methods for providing an interactive product to a consumer |
US7674184B2 (en) | 2002-08-01 | 2010-03-09 | Creative Kingdoms, Llc | Interactive water attraction and quest game |
US9446319B2 (en) | 2003-03-25 | 2016-09-20 | Mq Gaming, Llc | Interactive gaming toy |
CN1774206A (zh) * | 2003-04-11 | 2006-05-17 | 松下电器产业株式会社 | 加速度传感器轴信息校正装置、及其校正方法 |
US7672781B2 (en) * | 2005-06-04 | 2010-03-02 | Microstrain, Inc. | Miniaturized wireless inertial sensing system |
DE102005033957B4 (de) * | 2005-07-20 | 2008-08-28 | Siemens Ag | Vorrichtung und Verfahren zur kabellosen Bedienung eines insbesondere medizinischen Geräts |
US7942745B2 (en) | 2005-08-22 | 2011-05-17 | Nintendo Co., Ltd. | Game operating device |
US8313379B2 (en) | 2005-08-22 | 2012-11-20 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
JP4805633B2 (ja) | 2005-08-22 | 2011-11-02 | 任天堂株式会社 | ゲーム用操作装置 |
US7927216B2 (en) | 2005-09-15 | 2011-04-19 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
JP4262726B2 (ja) | 2005-08-24 | 2009-05-13 | 任天堂株式会社 | ゲームコントローラおよびゲームシステム |
US8870655B2 (en) | 2005-08-24 | 2014-10-28 | Nintendo Co., Ltd. | Wireless game controllers |
US8308563B2 (en) | 2005-08-30 | 2012-11-13 | Nintendo Co., Ltd. | Game system and storage medium having game program stored thereon |
US8157651B2 (en) | 2005-09-12 | 2012-04-17 | Nintendo Co., Ltd. | Information processing program |
JP4530419B2 (ja) | 2006-03-09 | 2010-08-25 | 任天堂株式会社 | 座標算出装置および座標算出プログラム |
JP4151982B2 (ja) | 2006-03-10 | 2008-09-17 | 任天堂株式会社 | 動き判別装置および動き判別プログラム |
JP4684147B2 (ja) | 2006-03-28 | 2011-05-18 | 任天堂株式会社 | 傾き算出装置、傾き算出プログラム、ゲーム装置およびゲームプログラム |
JP4810295B2 (ja) * | 2006-05-02 | 2011-11-09 | キヤノン株式会社 | 情報処理装置及びその制御方法、画像処理装置、プログラム、記憶媒体 |
WO2008001635A1 (fr) | 2006-06-27 | 2008-01-03 | Nikon Corporation | Dispositif d'affichage de vidéo |
JP5228305B2 (ja) | 2006-09-08 | 2013-07-03 | ソニー株式会社 | 表示装置、表示方法 |
JP5127242B2 (ja) | 2007-01-19 | 2013-01-23 | 任天堂株式会社 | 加速度データ処理プログラムおよびゲームプログラム |
US7463953B1 (en) | 2007-06-22 | 2008-12-09 | Volkswagen Ag | Method for determining a tilt angle of a vehicle |
DE102007030972B3 (de) | 2007-07-04 | 2009-01-29 | Siemens Ag | MR-kompatibles Videosystem |
EP2012170B1 (en) | 2007-07-06 | 2017-02-15 | Harman Becker Automotive Systems GmbH | Head-tracking system and operating method thereof |
US20090119821A1 (en) * | 2007-11-14 | 2009-05-14 | Jeffery Neil Stillwell | Belt with ball mark repair tool |
KR100947046B1 (ko) * | 2007-11-19 | 2010-03-10 | 황진상 | 운동체 자세 추적 장치, 운동체 자세 추적 방법, 이를이용한 칫솔 자세 추적 장치 및 칫솔 자세 추적 방법 |
EP2327003B1 (en) | 2008-09-17 | 2017-03-29 | Nokia Technologies Oy | User interface for augmented reality |
US10015620B2 (en) * | 2009-02-13 | 2018-07-03 | Koninklijke Philips N.V. | Head tracking |
JP2011019035A (ja) * | 2009-07-08 | 2011-01-27 | Ricoh Co Ltd | 情報装置、該装置を搭載した撮像装置および角度補正方法 |
US9491560B2 (en) | 2010-07-20 | 2016-11-08 | Analog Devices, Inc. | System and method for improving headphone spatial impression |
US20130007672A1 (en) * | 2011-06-28 | 2013-01-03 | Google Inc. | Methods and Systems for Correlating Head Movement with Items Displayed on a User Interface |
US9529426B2 (en) | 2012-02-08 | 2016-12-27 | Microsoft Technology Licensing, Llc | Head pose tracking using a depth camera |
WO2014069090A1 (ja) * | 2012-11-02 | 2014-05-08 | ソニー株式会社 | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム |
CN103018907A (zh) * | 2012-12-19 | 2013-04-03 | 虢登科 | 显示方法及头戴显示器 |
JP2014215053A (ja) * | 2013-04-22 | 2014-11-17 | 富士通株式会社 | 方位検知装置、方法及びプログラム |
GB201310359D0 (en) | 2013-06-11 | 2013-07-24 | Sony Comp Entertainment Europe | Head-Mountable apparatus and systems |
USD741474S1 (en) * | 2013-08-22 | 2015-10-20 | Fresca Medical, Inc. | Sleep apnea device accessory |
US9785231B1 (en) * | 2013-09-26 | 2017-10-10 | Rockwell Collins, Inc. | Head worn display integrity monitor system and methods |
KR20150041453A (ko) * | 2013-10-08 | 2015-04-16 | 엘지전자 주식회사 | 안경형 영상표시장치 및 그것의 제어방법 |
KR101665027B1 (ko) * | 2014-03-05 | 2016-10-11 | (주)스코넥엔터테인먼트 | 헤드 마운트 디스플레이용 헤드 트래킹 바 시스템 |
JP6340301B2 (ja) | 2014-10-22 | 2018-06-06 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイ、携帯情報端末、画像処理装置、表示制御プログラム、表示制御方法、及び表示システム |
JP6540004B2 (ja) * | 2014-12-08 | 2019-07-10 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
WO2016155019A1 (zh) * | 2015-04-03 | 2016-10-06 | 深圳市柔宇科技有限公司 | 头戴式电子装置 |
US10225641B2 (en) * | 2015-04-30 | 2019-03-05 | Shenzhen Royole Technologies Co., Ltd. | Wearable electronic device |
WO2016172988A1 (zh) * | 2015-04-30 | 2016-11-03 | 深圳市柔宇科技有限公司 | 穿戴式电子装置 |
CN106031194A (zh) * | 2015-04-30 | 2016-10-12 | 深圳市柔宇科技有限公司 | 穿戴式电子装置 |
SG10201912070VA (en) | 2015-05-13 | 2020-02-27 | Kolibree | Toothbrush system with magnetometer for dental hygiene monitoring |
CN107407809B (zh) * | 2016-01-26 | 2020-04-14 | 深圳市柔宇科技有限公司 | 头戴式设备、耳机装置及头戴式设备分离控制方法 |
US20180061103A1 (en) * | 2016-08-29 | 2018-03-01 | Analogix Semiconductor, Inc. | Systems and Methods for Generating Display Views Tracking User Head Movement for Head-Mounted Display Devices |
KR102614087B1 (ko) * | 2016-10-24 | 2023-12-15 | 엘지전자 주식회사 | Hmd 디바이스 |
JP2019054360A (ja) | 2017-09-14 | 2019-04-04 | セイコーエプソン株式会社 | 電子機器、モーションセンサー、位置変化検出プログラムおよび位置変化検出方法 |
JP2019114884A (ja) | 2017-12-22 | 2019-07-11 | セイコーエプソン株式会社 | 検出装置、表示装置、検出方法 |
JP2018160249A (ja) * | 2018-05-14 | 2018-10-11 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイシステム、ヘッドマウントディスプレイ、表示制御プログラム、及び表示制御方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0889011A (ja) * | 1994-09-28 | 1996-04-09 | Mitsubishi Agricult Mach Co Ltd | 移動農機の傾斜検出装置 |
JPH09222921A (ja) * | 1996-02-14 | 1997-08-26 | Mitsubishi Heavy Ind Ltd | 無人車輛の走行制御装置 |
JP2000020017A (ja) * | 1998-07-02 | 2000-01-21 | Canon Inc | 分離型表示装置 |
US6201883B1 (en) * | 1998-01-22 | 2001-03-13 | Komatsu Ltd. | Topography measuring device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5645077A (en) * | 1994-06-16 | 1997-07-08 | Massachusetts Institute Of Technology | Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body |
US5841409A (en) * | 1995-04-18 | 1998-11-24 | Minolta Co., Ltd. | Image display apparatus |
US5991085A (en) * | 1995-04-21 | 1999-11-23 | I-O Display Systems Llc | Head-mounted personal visual display apparatus with image generator and holder |
US6369952B1 (en) * | 1995-07-14 | 2002-04-09 | I-O Display Systems Llc | Head-mounted personal visual display apparatus with image generator and holder |
JP2002141841A (ja) * | 2000-10-31 | 2002-05-17 | Toshiba Corp | 頭部装着型情報処理装置 |
2002
- 2002-08-28 JP JP2002249443A patent/JP2004085476A/ja active Pending

2003
- 2003-08-26 EP EP03791296A patent/EP1541966A4/en not_active Withdrawn
- 2003-08-26 WO PCT/JP2003/010776 patent/WO2004020946A1/ja not_active Application Discontinuation
- 2003-08-26 KR KR1020057003247A patent/KR20050059110A/ko not_active Application Discontinuation
- 2003-08-26 US US10/525,925 patent/US20050256675A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP1541966A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1541966A1 (en) | 2005-06-15 |
US20050256675A1 (en) | 2005-11-17 |
KR20050059110A (ko) | 2005-06-17 |
EP1541966A4 (en) | 2006-02-01 |
JP2004085476A (ja) | 2004-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004020946A1 (ja) | ヘッドトラッキング方法及び装置 | |
US10638213B2 (en) | Control method of mobile terminal apparatus | |
JP5832900B2 (ja) | 慣性センサからのユーザ入力を判断するための方法および装置 | |
EP2451187A2 (en) | Headset with accelerometers to determine direction and movements of user head and method | |
JP2004096224A (ja) | 電源制御方法及びヘッドマウント装置 | |
US9916004B2 (en) | Display device | |
JP2008067877A (ja) | ゲーム装置およびゲームプログラム | |
JP2010147529A (ja) | 情報処理システムおよび情報処理方法 | |
US11589183B2 (en) | Inertially stable virtual auditory space for spatial audio applications | |
US11698258B1 (en) | Relative inertial measurement system with visual correction | |
WO2014147946A1 (ja) | 加速度感覚呈示装置、加速度感覚呈示方法および加速度感覚呈示システム | |
CN107710105A (zh) | 操作输入装置和操作输入方法 | |
JP5428261B2 (ja) | 制御装置、ヘッドマウントディスプレイ装置、プログラム及び制御方法 | |
CN107209205B (zh) | 重心移动力设备 | |
CN108393882B (zh) | 机器人姿态控制方法及机器人 | |
JP5937137B2 (ja) | 地磁気検出装置 | |
KR20170105334A (ko) | 카메라 모듈 | |
CN108844529A (zh) | 确定姿态的方法、装置及智能设备 | |
WO2020062163A1 (zh) | 云台的控制方法及手持云台、手持设备 | |
CN114764241A (zh) | 运动状态的控制方法、装置、设备及可读存储介质 | |
JP2004046006A (ja) | 3次元情報表示装置 | |
US10560777B1 (en) | Bone conduction designs in wearable electronic devices | |
JP3940896B2 (ja) | 回転検出装置、コントローラ装置、ヘッドホン装置および信号処理システム | |
CN115480561A (zh) | 运动状态的控制方法、装置、轮腿式机器人及存储介质 | |
JP2013012010A (ja) | ポインタ表示装置、ポインタ表示方法、及びポインタ表示プログラム |
Legal Events
Code | Title | Description |
---|---|---|
AK | Designated states | Kind code of ref document: A1; Designated state(s): KR US |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 10525925; Country of ref document: US; Ref document number: 1020057003247; Country of ref document: KR |
WWE | Wipo information: entry into national phase | Ref document number: 2003791296; Country of ref document: EP |
WWP | Wipo information: published in national office | Ref document number: 2003791296; Country of ref document: EP |
WWP | Wipo information: published in national office | Ref document number: 1020057003247; Country of ref document: KR |
WWW | Wipo information: withdrawn in national office | Ref document number: 2003791296; Country of ref document: EP |