WO2022061468A1 - Head joystick interface system and methods thereof - Google Patents

Head joystick interface system and methods thereof Download PDF

Info

Publication number
WO2022061468A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
user
camera
observer
frame
Prior art date
Application number
PCT/CA2021/051335
Other languages
French (fr)
Inventor
Seyedebrahim HASHEMIAN
Bernhard RIECKE
Markus von der Heyde
Ernst KRUIJFF
Original Assignee
Hashemian Seyedebrahim
Riecke Bernhard
Von Der Heyde Markus
Kruijff Ernst
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hashemian Seyedebrahim, Riecke Bernhard, Von Der Heyde Markus, Kruijff Ernst
Publication of WO2022061468A1 publication Critical patent/WO2022061468A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05GCONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G1/00Controlling members, e.g. knobs or handles; Assemblies or arrangements thereof; Indicating position of controlling members
    • G05G1/52Controlling members specially adapted for actuation by other parts of the human body than hand or foot
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05GCONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G9/00Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
    • G05G9/02Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only
    • G05G9/04Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously
    • G05G9/047Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
    • G05G9/04737Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks with six degrees of freedom

Definitions

  • This disclosure relates generally to a joystick-like interface and more particularly to a joystick-like interface that uses the movements of a user’s head that carries a head-mounted display device to control the motions of an observer camera in a virtual reality or real environment.
  • VR interfaces usually require the user to control different degrees of freedom (DOFs) for changing position (translation) and direction (rotation) of the simulated or virtual camera inside a virtual reality.
  • Controlling movement in a VR environment using standard controllers such as handheld gamepads is cumbersome and can contribute to unwanted side effects such as motion sickness and disorientation, and restricts users in being able to use their hands for other relevant tasks such as interaction, gesturing, or communication.
  • a method for using movements of a user’s head as a joystick-like interface for controlling an observer camera that generates visuals of a head-mounted display (HMD) device coupled to the user’s head comprises a set of instructions that are executed by a processor such as a computer or an embedded microprocessor.
  • the instructions include receiving, by the processor, head movement measurements related to the user’s head translational and rotational movements in 6 spatial degrees of freedom (DOFs); and processing the user's head movement measurements, by the processor, to generate computer readable commands; the commands are configured to control translational motion of the observer camera using only the head’s translation, without being affected by the head’s rotation.
  • the rotation of the observer camera is controlled by the rotation of the user’s head
  • the translation of the observer camera is controlled by the user’s head translation.
  • One advantage of this method over other head-motion-based methods is that the head’s pure rotation would not cause translational movement in the VR environment.
  • the user’s head translation and rotation are decoupled and the head’s translational and rotational movements correspond to controlling the translational and rotational movements of the observer camera independently.
  • the observer camera that generates the visuals for the HMD may be a virtual camera configured to generate large-scale virtual reality environment visuals.
  • the camera may be a real camera operably coupled to a remote-controlled vehicular system such as a quadcopter, a mobile ground robot, a spacecraft, or a submarine.
  • the proposed method uses the user’s head translation to cause translational movement for the remote-controlled vehicular system, and hence the camera.
  • a joystick-like interface system to control locomotion of an observer camera that generates visuals.
  • the system comprises: a head-mounted display (HMD) coupled to a user’s head and configured to display the observer camera generated visuals; a primary sensory subsystem, comprised of one or more sensors, such as an HMD sensory subsystem, configured to measure the translation and rotation of the head in 6 spatial degrees of freedom (DOF); and a processor operably connected to the HMD and the primary sensory subsystem, the processor configured to execute a set of command codes; wherein the processor controls translational movement of the observer camera using the head’s translation such that it is not affected by the user’s head rotation.
  • the observer camera that generates the visuals for the HMD may be a virtual camera configured to generate virtual reality environment visuals.
  • the camera may be a real camera operably coupled to a remote-controlled vehicular system such as a quadcopter, a mobile ground robot, a spacecraft, or a submarine.
  • the proposed system uses the user’s head translation to cause translational movement for the remote-controlled vehicular system, and hence the camera.
  • the primary sensory subsystem of the proposed interface may measure the head’s movement in 6 spatial DOFs either directly, or indirectly by first measuring another reference’s movement, such as the HMD’s, and then deriving the head’s movement.
  • the primary sensory subsystem of the proposed interface may include: an inertial measurement unit (IMU), or any other type of sensor such as optical or magnetic sensors, coupled to the HMD, or to the user’s head, to measure the head’s rotation in 3 DOFs; at least one optical sensor to measure the HMD translation in spatial degrees of freedom; and a primary sensory processor which is configured to use the HMD’s translational and rotational measurements to calculate the user’s head translation and rotation in 6 spatial DOFs.
  • the primary sensory subsystem of the proposed interface may: assign a hypothetical head rotation center point to the head; identify an initial point as the location of the head rotation center point at the beginning of the observer camera movement; indicate a head displacement vector, the vector connecting the initial point to the head rotation center point in every moment during the head movement; and calculate the head’s translation and rotation by measuring the head displacement vector.
  • the processor of the proposed interface may control the translational movement of the observer camera by configuring the direction of the observer camera translation in alignment with the head displacement vector direction; and by configuring the velocity of the observer camera translation in proportion to the head displacement vector magnitude.
  • the proposed interface system may further include: a coordinate system which is assigned to a frame, such as a chair, wherein the frame has independent motions with respect to the head in at least one direction; and a secondary sensory subsystem, such as an optical tracking sensor, which is configured to measure the relative motion between the head and the frame’s coordinate system; and wherein, the head rotation center point is calculated with respect to the frame’s coordinate system.
  • the secondary sensory subsystem of the proposed interface system may be an optical tracking system which comprises: a tracker fixedly attached to the frame; and at least one optical tracking camera configured to measure the movement of the tracker in 6 spatial DOFs.
  • the secondary sensory subsystem of the proposed interface system may be a vision system attached to the HMD and configured to measure the relative motion between the head and the frame’s coordinate system.
  • the frame in the proposed interface system may be the user’s torso which often has independent movements from the user’s head.
  • the frame in the proposed interface system may be any type of vehicle that is operably configured to seat or otherwise support the user during the head movement.
  • the frame may be a regular office swivel chair.
  • the visual-vestibular sensory conflict may cause motion sickness.
  • vestibular sensory information cannot be judged on an absolute scale, but appears relative to, e.g., gravity or other accelerations.
  • the proposed joystick interface provides limited, but coherent vestibular cues of actual travel, it may cause less sensory conflict and thus, lower motion sickness compared to the handheld controllers.
  • Mental calculation of head movement toward a target direction when looking at another direction could be done automatically and intuitively with less mental effort compared to the thumb or hand movements. This behavior is true especially during a virtual flight, where a user may typically use two handheld controllers to control 4 DOFs including up/down, left/right, forward/backward, and yaw rotation.
  • using the proposed joystick interface system may be easier and less mentally computational than using handheld interfaces.
  • learning to use such an interface may be easier as well.
  • intuitive control of the proposed joystick interface may potentially result in faster reactions of the user and hence, the user may more likely reach a target faster.
  • Easier utilization of the proposed joystick interface system allows a user to psychologically be captivated by the task rather than spending their attention to handle the controller and hence may improve the immersion into a VR environment.
  • the proposed joystick interface may result in improved control of travel or motion of the observer camera.
  • the user moves their head similar to a joystick handle toward a target direction to control virtual translation velocity.
  • the user sits on a regular office swivel chair and rotates it physically to control virtual rotation using 1:1 mapping.
  • movement is used equivalently with the term motion. Also, the term movement is used to describe a general change in position and orientation, and is decomposed into translational movement and rotational movement throughout this draft.
  • Figure 1 is a perspective view of a head joystick interface system according to a first disclosed embodiment
  • Figure 2 is a block diagram of the head joystick interface system of Figure 1;
  • Figure 3 is a flowchart depicting blocks of code for directing the processor of Figure 2 to use the head joystick system of Figure 1 for controlling observer camera locomotion in a VR environment;
  • Figures 4A to 4C are side, top, and front views of the head joystick interface system of Figure 1;
  • Figures 5A and 5B are side views of a head joystick interface system according to an alternative disclosed embodiment;
  • Figure 6 is a block diagram of the head joystick interface system of Figure 5A;
  • Figure 7 is another embodiment of the head joystick interface system of Figure 5A;
  • FIG. 1 a perspective view of a head joystick interface system according to a first disclosed embodiment is generally shown at 100.
  • the interface system includes a head-mounted display (HMD) device 102 coupled to a user’s head 152 and a processor 130 operably connected to the HMD device 102 using a cable 131.
  • the processor 130 may be embedded into the HMD device 102 or may be connected to HMD device 102 using other means such as wireless connection.
  • the HMD device 102 displays, for the user 150, visuals (not shown in figures) that are generated by an observer camera (not shown in the figures).
  • the visuals may be computer simulations such as a virtual reality (VR) environment that are generated through a virtual camera or viewpoint.
  • the visuals may be representative of real scenes captured and generated by a real camera.
  • the visuals generated by the observer camera are displayed in the HMD device for the user 150.
  • the camera may be attached to a remote-controlled vehicular system (not shown in figures) such as a quadcopter (aerial vehicle), a mobile ground robot, a remote-controlled spacecraft, or a submarine.
  • the user’s head 152 in a first position is moved to a second position 154 resulting in a user’s head movement indicated by a solid vector 114 and, subsequently, an HMD’s movement indicated by a dashed vector 108.
  • the head joystick interface system 100 uses the user’s head movement vector 114 to control the movements of the observer camera in the VR or real environment. More generally, the head joystick system 100 is configured to use the user’s head 152 movements in 6 spatial degrees of freedom (DOF) 199 to control the movements of the observer camera in the VR or real environment. In different embodiments, the user’s head 152 movement may control various attributes of the observer camera motion.
  • the user’s head 152 position changes may control the speed of the observer camera while in other embodiments the position or acceleration of the observer camera may be controlled. In some embodiments there may be a 1 to 1 proportionality between the user’s head 152 movement and the observer camera movement while in other embodiments other uneven proportionalities may be used.
  • the head joystick interface system 100 uses the HMD’s movement vector 108 to calculate the user’s head movement vector 114.
  • the HMD 102 includes a primary sensory subsystem (not shown in Figure 1 but shown in Figure 2) such as an inertial measurement unit (IMU) to measure the HMD’s movement vector 108.
  • the HMD’s movement vector 108 may be measured using other sensory systems such as one or more cameras attached to the HMD 102 or optical tracking systems that track the HMD’s movement.
  • the measured HMD’s movement vector 108 is processed in the processor 130 to calculate the user’s head movement vector 114.
  • the calculated user’s head movement vector 114 is used to control the movements of the observer camera in the VR or real environment. Using the user’s head movement vector 114 for controlling the observer camera’s movement allows decoupling the rotational movement of the observer camera from the translational movement. In some other embodiments, the primary sensory subsystem may directly measure the user’s head movement vector 114 without requiring measuring the movement vector of any other reference object such as the HMD’s movement vector 108.
  • one of the main advantages of the proposed head joystick interface system 100 is that the user’s head movement vector 114 is used to control the observer camera’s movement. Otherwise, if the HMD’s movement vector 108 is used to control the observer camera’s movement, pure rotation of the user’s head will result in both rotation and translation of the HMD (non-zero HMD’s movement vector 108) which will result in rotation and translation of the observer camera.
  • This feature of the proposed head joystick interface system 100 allows for intuitive control of the observer camera’s movements, i.e. pure head rotation results in pure observer camera’s rotation, and pure translation of the head results in pure translation of the observer camera. It might also help to reduce motion sickness. This advantage will arguably improve the experience and immersion of a user while interfacing with the VR or real environment.
  • the observer camera’s motion is controlled by setting the observer camera’s speed vector in a direction aligned with the user’s head movement vector 114 and a magnitude proportional to the magnitude of the user’s head movement vector 114.
  • the HMD device 102 includes a primary sensory subsystem 160 such as an IMU configured to measure the position of the HMD 102 at each point in time.
  • the HMD position measurements are transferred to an I/O interface of the processor 130 using transfer line 131.
  • the processor 130 further may comprise a memory 134 to store processor executable program codes and parameters 136.
  • the processor 130 is configured to process the HMD position measurements using a program such as blocks of code presented in the flowchart of Figure 3 to control the observer camera locomotion in a VR or real environment.
  • the blocks of code in Figure 3 are directed toward a camera in a VR environment, however, in other embodiments, the blocks of code could be directed toward a camera in a real environment.
  • FIG. 3 a flowchart depicting blocks of code for directing the processor 130 of the head joystick interface system 100 is shown at 300.
  • the blocks generally represent codes that may be read from the program codes 136 of the memory 134 for directing the processor 130 to perform various functions for the purpose of controlling the observer camera in a VR environment.
  • the actual code to implement each block may be written in any suitable program language, such as C, C++, C#, Java, PHP, Python, and/or assembly code, for example.
  • the process 300 starts at block 310 where measurements related to the HMD’s rotational and translational movements are acquired by the processor 130.
  • the HMD’s rotational and translational measurements are processed to calculate the user’s head movement vector 114.
  • the calculated user’s head movement vector 114 is used to calculate the virtual camera’s translational movement in the VR environment.
  • the HMD’s rotational measurements are processed to calculate the virtual camera’s rotational movements in the VR environment.
  • the calculated rotational and translational movements related to the virtual camera are processed to render the VR environment accordingly.
  • the rendered VR environment is transmitted to the HMD for display.
  • FIG. 4A to 4C more detailed views are depicted showing the user’s head 152 movements in the head joystick interface system 100.
  • the user’s head 152 in the first position is shown with grayed-out lines and the user’s head at the second position 154 is shown with black lines.
  • the user’s head position at the second position 154 represents a head movement in all 6 DOFs in space.
  • the user’s head 152 is moved from an initial point 110 to a secondary point 112 by the user’s head movement vector 114.
  • the HMD 102 is moved from an HMD’s initial point 104 to an HMD’s secondary point 106 by the HMD’s movement vector 108.
  • the user’s head initial point 110 is a point attached to the user’s head 152 and may, in general, be chosen arbitrarily on the user’s head 152.
  • the initial point 110 is chosen as a point on the user’s head 152 which is apart from the HMD’s initial point 104 by a fixed vector 116.
  • the initial point 110 is chosen to represent the closest point to the user’s head 152 center of rotation for all 3 rotational DOFs and may be different from one user to another.
  • the initial point 110 is referred to as the zero point hereafter, since the initial point 110 corresponds to the zero movement of the user’s head 152.
  • the zero point 110 of a user’s head may be identified in a relaxed state of the user’s head. In other words, the zero point 110 is calibrated by the processor 130 at a relaxed position of the user’s head 152.
  • FIG 4A a side view of the user’s head 152 shows the user’s head movement vector 114 in the XZ plane.
  • Figure 4B a top view of the user’s head 152 shows the user’s head movement vector 114 in the XY plane.
  • Figure 4C a front view of the user’s head 152 shows the user’s head movement vector 114 in the YZ plane.
  • the Figures 4A to 4C show that the user’s head movement vector 114 is generally different from the HMD’s movement vector 108.
  • the user’s head movement vector 114 may be calculated from the HMD’s movement vector 108 using a given fixed vector 116. By adding the fixed vector 116 to the HMD’s initial point 104, the user’s head initial point 110 is obtained.
  • the user’s head secondary point 112 is obtained by first calculating an updated fixed vector 117 by rotating the fixed vector 116 by the HMD’s rotation in the yaw, pitch, and roll directions, and then adding the updated fixed vector 117 to the HMD’s secondary point 106.
  • the user’s head movement vector 114 can then be calculated by subtracting the user’s head initial point 110 from the user’s head secondary point 112. This method of calculating the user’s head movement vector 114 may be used in the calculations and processes of the block 320 of the process 300 of Figure 3.
  • the interface system 500 comprises an HMD device 502 coupled to a user’s head 552 and a processor 530 operably connected to the HMD 502 using a cable 531 or wirelessly.
  • the interface system 500 further includes a primary sensory subsystem 560 (as shown in Figure 6) that measures the movements of the HMD 502 with respect to a global coordinate system 570 (the XYZ coordinate system).
  • the primary sensory subsystem 560 includes a pair of optical tracking cameras 564 that optically measure and track the HMD’s 502 movement in the global coordinate system 570.
  • the measurements of the tracking cameras 564 are transmitted to the processor 530 wirelessly but in other embodiments may be transmitted using a wired connection.
  • the primary sensory subsystem 560 may include an IMU sensor attached to the HMD 502 or optical sensors attached to the HMD, for example.
  • the interface system 500 further comprises a local coordinate system 581 (the XFYFZF coordinate system) attached to a frame 580 which moves independently from the user’s head 552.
  • the interface system 500 further comprises a secondary sensory subsystem 561 (as shown in Figure 6) which is configured to measure the user’s head movements with respect to the local coordinate system 581.
  • the secondary sensory subsystem 561 is an optical tracking system comprising the pair of tracking cameras 564 and a tracker 566 affixed to the frame 580.
  • the frame 580 is a swivel chair which comprises a wheeled chassis 585, a seat 586 configured to rotate freely with respect to the wheeled chassis 585 (rotation around axis ZF), and a backrest 588 configured to rotate vertically with respect to the wheeled chassis 585 (rotation around axis YF).
  • the frame 580 can move freely with respect to the global coordinate system 570 at least in the XY plane but in other embodiments, the frame 580 may move freely in other directions as well.
  • the local coordinate system 581 has an origin 583 which is affixed to the frame 580.
  • the origin is conveniently located at the backrest’s center of rotation (pitch rotation center) but in other embodiments, the origin 583 may be located at other locations of the frame 580 such as at the chair’s 580 yaw rotation center.
  • the origin 583 could be calculated by the processor 530 by various methods, such as by knowing the fixed position of the tracker 566 with respect to the origin 583, or by measuring, using the secondary sensory subsystem 561, the position of the tracker 566 with respect to the global coordinate system 570 at multiple backrest 588 positions and then estimating the origin from the fact that the distance between the origin 583 and the tracker 566 is fixed (see the sketch after this list).
  • the swivel chair 580 is configured to seat the user 550.
  • the user’s head 552 initial point 510, also referred to as the zero point 510, is determined at a relaxed state of the user’s head 552 when the user’s 550 back is in a relaxed position against the backrest 588.
  • One advantage of using a rotating chair as the frame 580 is that the user 550 can freely rotate the virtual or real camera in 360 degrees by rotating the chair seat 586 in yaw direction without the need to rotate the head 552. This advantage will result in less fatigue for the user’s head 552 during VR or real experience.
  • the tracker 566 is affixed to the backrest 588 of the chair 580 and is configured to facilitate the measurement of the local coordinate system 581 with respect to the global coordinate system 570.
  • provided, by the primary sensory subsystem 560, the user’s head 552 measurements with respect to the global coordinate system 570, and provided, by the secondary sensory subsystem 561, the pose of the local coordinate system 581 with respect to the global coordinate system 570, the user’s head 552 measurements with respect to the local coordinate system 581 could be derived and calculated in the processor 530.
  • the tracker 566 may be affixed to other parts of the frame 580 in other embodiments.
  • the purpose of the secondary sensory subsystem 561 is to, at all times, measure and track the zero point 510 with respect to the local coordinate system 581 rather than the global coordinate system 570.
  • the interface system 500 is configured to use the movement of the zero point 510 with respect to the local coordinate system 581 for controlling the observer camera’s movement in the VR or real environment.
  • if the zero point 510 does not move in the local coordinate system 581, the virtual camera in the VR environment will not move.
  • This configuration is advantageous to avoid unnecessary virtual camera movements due to unintentional locomotion of the user 550 in the global coordinate system 570.
  • Another advantage of the head joystick interface system 500 is that the user 550 has physical force feedback for the neutral point or zero point 510 and can halt the virtual or real camera’s movement by resting against the backrest of the chair 580. This advantage will provide more intuitive control of the observer camera for the user 550 and, in turn, will result in a more pleasant interface experience with the VR or real environment for the user 550.
  • the user’s pose at initial state 550 is represented with greyed-out lines compared to the user’s pose in a second state 551 which is represented by black lines.
  • the user’s pose 550 is relaxed and rested against the chair’s backrest 588, the user’s head initial point, or zero point, is shown at 510, the tracker is shown at 566, and the origin of the local coordinate system 581 is shown at 583.
  • the user’s pose 551 is in an upright position and not against the backrest 588, the user’s head secondary point is shown at 512, the tracker is shown at 567, and the origin of the local coordinate system 582 is shown at 584 which is assumed to be different from origin 583.
  • the user’s head 552 is moved from an initial point 510, or zero point, to a secondary point 512 by the user’s head movement vector 514.
  • the vector 514 is obtained and used to control the observer camera’s movement in the VR or real environment using a similar method as described in Figure 3.
  • the local coordinate system 581 is changed from 581 to 582.
  • the local coordinate system 581 may change both in orientation and in the location of its origin 583.
  • the origin 583 is displaced to origin 584.
  • the head joystick interface system 500 can compensate for the change in the local coordinate system 581 by calculating the user’s head zero point 510 with respect to the frame’s 580 local coordinate system at each point in time, hence a dynamic zero point 510.
  • the zero point 510 is identified by vector 525 at the initial state 550, while at the second state 551 is identified by vector 526. This dynamic compensation and recalculation of the zero point 510 avoids unnecessary movement of the observer camera in the VR or real environment.
  • the interface system 500 includes an HMD device 502, a processor 530, and a primary sensory subsystem 560 that uses an optical tracking sensory system 564 configured to measure the position of the HMD 502 at each point in time.
  • the interface system 500 further comprises a frame 580 which is configured to move independently from the user’s head 552 (shown in Figure 5A).
  • the frame 580 includes a local coordinate system 581 attached to the frame 580.
  • the interface system 500 further comprises a secondary sensory subsystem 561 configured to measure the local coordinate system 581.
  • the optical tracking camera 564 may be used to measure the position and orientation of the local coordinate system 581.
  • the optical tracking camera 564 position measurements are transferred to an I/O interface of the processor 530 using a wireless communication line 534 through the wireless module 533 of the processor.
  • the processor 530 may further comprise a memory 534 to store processor executable program codes and parameters 536.
  • the processor 530 is configured to use a program code to process the HMD and the local coordinate system 581 position and orientation measurements for controlling the observer camera locomotion in a VR or real environment.
  • the interface system 700 includes an HMD device 702 coupled to a user’s head 752, a processor (not shown in Figure 7) operably connected to the HMD 702, a frame 780, a local coordinate system 781 affixed to the frame 780, a primary sensory subsystem (not explicitly shown in Figure 7) configured to measure the position and orientation of the HMD 702 in space with respect to a global coordinate system 770, and a secondary sensory subsystem (not explicitly shown in Figure 7) configured to measure the position and orientation of the local coordinate system 781 in space with respect to the global coordinate system 770.
  • the interface system 700 is configured to use the user’s head 752 movements to control an observer camera’s motion in a VR or real environment.
  • the primary sensory subsystem uses an IMU sensor (not shown in Figure 7) embedded in the HMD 702 to measure the HMD’s and subsequently the user’s head position and orientation in space with respect to the global coordinate system 770 using a method similar to the method of Figure 3.
  • the secondary sensory subsystem uses a camera system 764 embedded in the HMD device 702 to measure and track the position and orientation of certain features 766 from the frame 780.
  • the optical camera system 764 may be one or multiple vision cameras or depth cameras, or a combination of both, or any other type of cameras such as IR cameras or other tracking systems based on mechanical, magnetic, optical, inertial or other principles.
  • the frame feature 766 may be any distinct part of the frame 780, such as the chair headrest, which is in the field of view of the camera system 764, the field of view identified by lines 765, and is distinctly detectable by the camera system 764.
  • the measurement of the frame feature 766 by the secondary sensory system will be used in the processor to calculate the position and orientation of the local coordinate system 781, for example, by knowing a fixed distance and direction between the frame feature 766 and the local coordinate system 781.
  • the advantage of the interface system 700 compared with the interface system 500 introduced in Figure 5 is that the secondary sensory subsystem of interface system 700 does not need an additional tracker attached to the frame and instead relies on the frame’s other features, such as its natural features. Hence, the interface system 700 may be produced with fewer components and at a lower overall price.
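
As a worked illustration of the origin-estimation approach referenced earlier in this list (estimating the origin 583 from tracker 566 positions measured at several backrest 588 positions, using the fixed tracker-to-origin distance), a linear least-squares centre-of-rotation fit could be used. The following Python sketch, including its function name and formulation, is an assumption for illustration and is not taken from the disclosure:

```python
import numpy as np

def estimate_rotation_center(tracker_positions):
    """Estimate the origin 583 from tracker 566 positions (shape (N, 3), N >= 4)
    sampled at several backrest 588 poses, exploiting the fixed distance between
    the tracker and the origin.

    Note: if the backrest only rotates about a single axis, the sampled points
    lie on a circle and the component of the centre along that axis is
    under-determined; lstsq then returns the minimum-norm solution.
    """
    p = np.asarray(tracker_positions, dtype=float)
    # |p_i - c|^2 = r^2 for all i; subtracting the equation for p_0 from the
    # others gives the linear system  2 (p_i - p_0) . c = |p_i|^2 - |p_0|^2.
    A = 2.0 * (p[1:] - p[0])
    b = np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
    center, *_ = np.linalg.lstsq(A, b, rcond=None)
    return center
```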

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

There is provided a method for using a user's head as a joystick-like interface for controlling the motions of an observer camera that generates visuals of a head-mounted display device coupled to the user's head. A processor receives head movement measurements related to the user's head translational and rotational movements in 6 spatial degrees of freedom, and processes the measurements to generate computer readable commands. The commands are configured to control translational motion of the observer camera using the user's head translation and without being affected by the user's head rotation.

Description

HEAD JOYSTICK INTERFACE SYSTEM AND METHODS THEREOF
TECHNICAL FIELD
[0001] This disclosure relates generally to a joystick-like interface and more particularly to a joystick-like interface that uses the movements of a user’s head that carries a head-mounted display device to control the motions of an observer camera in a virtual reality or real environment.
BACKGROUND
[0002] Virtual reality (VR) interfaces usually require the user to control different degrees of freedom (DOFs) for changing position (translation) and direction (rotation) of the simulated or virtual camera inside a virtual reality. Controlling movement in a VR environment using standard controllers such as handheld gamepads is cumbersome and can contribute to unwanted side effects such as motion sickness and disorientation, and restricts users in being able to use their hands for other relevant tasks such as interaction, gesturing, or communication.
SUMMARY
[0003] In accordance with one disclosed aspect there is provided a method for using movements of a user’s head as a joystick-like interface for controlling an observer camera that generates visuals of a head-mounted display (HMD) device coupled to the user’s head. In general, the method comprises a set of instructions that are executed by a processor such as a computer or an embedded microprocessor. The instructions include receiving, by the processor, head movement measurements related to the user’s head translational and rotational movements in 6 spatial degrees of freedom (DOFs); and processing the user's head movement measurements, by the processor, to generate computer readable commands; the commands are configured to control translational motion of the observer camera using only the head’s translation, without being affected by the head’s rotation. In other words, the rotation of the observer camera is controlled by the rotation of the user’s head, and the translation of the observer camera is controlled by the user’s head translation. One advantage of this method over other head-motion-based methods is that the head’s pure rotation would not cause translational movement in the VR environment. In other words, in the proposed method the user’s head translation and rotation are decoupled and the head’s translational and rotational movements correspond to controlling the translational and rotational movements of the observer camera independently.
[0004] In the proposed method the observer camera that generates the visuals for the HMD may be a virtual camera configured to generate large-scale virtual reality environment visuals. Alternatively, the camera may be a real camera operably coupled to a remote-controlled vehicular system such as a quadcopter, a mobile ground robot, a spacecraft, or a submarine. For the case of a remote-controlled vehicular system, the proposed method uses the user’s head translation to cause translational movement for the remote-controlled vehicular system, and hence the camera.
[0005] In accordance with another disclosed aspect, there is provided a joystick-like interface system to control locomotion of an observer camera that generates visuals. The system comprises: a head-mounted display (HMD) coupled to a user’s head and configured to display the observer camera generated visuals; a primary sensory subsystem, comprised of one or more sensors, such as an HMD sensory subsystem, configured to measure the translation and rotation of the head in 6 spatial degrees of freedom (DOF); and a processor operably connected to the HMD and the primary sensory subsystem, the processor configured to execute a set of command codes; wherein the processor controls translational movement of the observer camera using the head’s translation such that it is not affected by the user’s head rotation.
[0006] In the proposed system the observer camera that generates the visuals for the HMD, may be a virtual camera configured to generate virtual reality environment visuals. Alternatively, the camera may be a real camera operably coupled to a remote-controlled vehicular system such as a quadcopter, a mobile ground robot, a spacecraft, or a submarine. For the case of a remote-controlled vehicular system, the proposed system uses the user’s head translation to cause translational movement for the remote-controlled vehicular system, and hence the camera.
[0007] The primary sensory subsystem of the proposed interface may measure the head’s movement in 6 spatial DOFs either directly, or indirectly by first measuring another reference’s movement, such as the HMD’s, and then deriving the head’s movement.
[0008] The primary sensory subsystem of the proposed interface may include: an inertial measurement unit (IMU), or any other type of sensor such as optical or magnetic sensors, coupled to the HMD, or to the user’s head, to measure the head’s rotation in 3 DOFs; at least one optical sensor to measure the HMD translation in spatial degrees of freedom; and a primary sensory processor which is configured to use the HMD’s translational and rotational measurements to calculate the user’s head translation and rotation in 6 spatial DOFs.
[0009] The primary sensory subsystem of the proposed interface may: assign a hypothetical head rotation center point to the head; identify an initial point as the location of the head rotation center point at the beginning of the observer camera movement; indicate a head displacement vector, the vector connecting the initial point to the head rotation center point in every moment during the head movement; and calculate the head’s translation and rotation by measuring the head displacement vector.
[0010] The processor of the proposed interface may control the translational movement of the observer camera by configuring the direction of the observer camera translation in alignment with the head displacement vector direction; and by configuring the velocity of the observer camera translation in proportion to the head displacement vector magnitude.
[0011] The proposed interface system may further include: a coordinate system which is assigned to a frame, such as a chair, wherein the frame has independent motions with respect to the head in at least one direction; and a secondary sensory subsystem, such as an optical tracking sensor, which is configured to measure the relative motion between the head and the frame’s coordinate system; and wherein, the head rotation center point is calculated with respect to the frame’s coordinate system.
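As an illustration of how the head rotation center point may be expressed in the frame’s coordinate system, the following Python sketch (using numpy and scipy; the function and variable names are assumptions for illustration, not taken from the disclosure) transforms a head point measured in the global coordinate system into frame-local coordinates and computes its displacement from a calibrated zero point:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def head_in_frame_coords(head_point_world, frame_origin_world, frame_rotation_world):
    """Express a head point (e.g. the head rotation center) in the frame's
    local coordinate system, given the frame's pose measured by the secondary
    sensory subsystem."""
    offset = np.asarray(head_point_world) - np.asarray(frame_origin_world)
    return frame_rotation_world.inv().apply(offset)

def head_displacement_in_frame(head_point_world, frame_origin_world,
                               frame_rotation_world, zero_point_frame):
    """Displacement of the head from its calibrated zero point, measured in the
    frame's coordinates, so that motion of the frame itself (e.g. rolling or
    swivelling a chair) does not register as head movement."""
    current = head_in_frame_coords(head_point_world, frame_origin_world,
                                   frame_rotation_world)
    return current - np.asarray(zero_point_frame)
```

With this representation, motion of the frame itself does not register as head displacement, which is the behaviour described below for the embodiment of Figures 5A and 5B.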
[0012] The secondary sensory subsystem of the proposed interface system may be an optical tracking system which comprises: a tracker fixedly attached to the frame; and at least one optical tracking camera configured to measure the movement of the tracker in 6 spatial DOFs.
[0013] The secondary sensory subsystem of the proposed interface system may be a vision system attached to the HMD and configured to measure the relative motion between the head and the frame’s coordinate system.
[0014] The frame in the proposed interface system may be the user’s torso which often has independent movements from the user’s head.
[0015] The frame in the proposed interface system may be any type of vehicle that is operably configured to seat or otherwise support the user during the head movement. For example, the frame may be a regular office swivel chair.
[0016] The advantages of the proposed joystick interface system compared to the handheld VR interfaces such as thumb-stick and touchpad interfaces are listed below:
[0017] Moving the user’s head toward a target direction mimics the vestibular sensory cues of actual human locomotion, which can enhance the presence or sensation of physically being in the virtual environment. A stronger presence may allow a user to easily forget that they are actually in the real world and thus help them to enjoy the virtual experience as if it happened in the real world. Moreover, mimicking the vestibular sensory cues provides a stronger perception of self-motion, known as vection. This stronger vection may allow a more realistic experience of travel as the user could feel like they move inside the virtual environment instead of the virtual environment moving around them. Additionally, as vestibular cues of travel support the automatic spatial updating process and prevent disorientation, the users of the proposed joystick interface may get less disoriented and lost in the virtual environment when compared to traveling with handheld interfaces.
[0018] Using a rotating chair or frame that enables a user’s 360-degree yaw rotation allows the user to control the yaw rotation of the observer camera by rotating the chair seat or frame in the yaw direction without the need to rotate the head. This advantage will result in less fatigue for the user’s head during a VR or real experience.
[0019] Based on the sensory cue conflict theory of motion sickness, the visual-vestibular sensory conflict may cause motion sickness. However, vestibular sensory information cannot be judged on an absolute scale, but appears relative to, e.g., gravity or other accelerations. As the proposed joystick interface provides limited, but coherent vestibular cues of actual travel, it may cause less sensory conflict and thus lower motion sickness compared to the handheld controllers.
[0020] Mental calculation of head movement toward a target direction when looking in another direction could be done automatically and intuitively with less mental effort compared to thumb or hand movements. This behavior is true especially during a virtual flight, where a user may typically use two handheld controllers to control 4 DOFs including up/down, left/right, forward/backward, and yaw rotation. Therefore, using the proposed joystick interface system may be easier and less mentally demanding than using handheld interfaces. As a result, learning to use such an interface may be easier as well. Moreover, intuitive control of the proposed joystick interface may potentially result in faster reactions of the user and hence, the user may more likely reach a target faster.
[0021] Easier utilization of the proposed joystick interface system allows a user to psychologically be captivated by the task rather than spending their attention to handle the controller and hence may improve the immersion into a VR environment.
[0022] Additionally, due to the higher range of movement in the neck muscles compared to the lower range of thumbstick movement in the handheld joystick interfaces, the proposed joystick interface may result in improved control of travel or motion of the observer camera.
[0023] In one possible implementation of the proposed interface system, using VR and a head-mounted display (HMD), the user moves their head similar to a joystick handle toward a target direction to control virtual translation velocity. In this example, the user sits on a regular office swivel chair and rotates it physically to control virtual rotation using 1:1 mapping.
[0024] Throughout this draft, the term movement is used equivalently with the term motion. Also, the term movement is used to describe a general change in position and orientation, and is decomposed into translational movement and rotational movement throughout this draft.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] For a fuller understanding of the nature and advantages of the disclosed subject matter, as well as the preferred modes of use thereof, reference should be made to the following detailed description, read in conjunction with the accompanying drawings. In the drawings, like reference numerals designate like or similar steps or parts.
[0026] Figure 1 is a perspective view of a head joystick interface system according to a first disclosed embodiment;
[0027] Figure 2 is a block diagram of the head joystick interface system of Figure 1;
[0028] Figure 3 is a flowchart depicting blocks of code for directing the processor of Figure 2 to use the head joystick system of Figure 1 for controlling observer camera locomotion in a VR environment;
[0029] Figures 4A to 4C are side, top, and front views of the head joystick interface system of Figure 1;
[0030] Figures 5A and 5B are side views of a head joystick interface system according to an alternative disclosed embodiment;
[0031] Figure 6 is a block diagram of the head joystick interface system of Figure 5A;
[0032] Figure 7 is another embodiment of the head joystick interface system of Figure 5A;
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
[0033] Referring to Figure 1, a perspective view of a head joystick interface system according to a first disclosed embodiment is generally shown at 100. The interface system includes a head-mounted display (HMD) device 102 coupled to a user’s head 152 and a processor 130 operably connected to the HMD device 102 using a cable 131. In other embodiments, the processor 130 may be embedded into the HMD device 102 or may be connected to HMD device 102 using other means such as wireless connection. The HMD device 102 displays, for the user 150, visuals (not shown in figures) that are generated by an observer camera (not shown in the figures). The visuals may be computer simulations such as a virtual reality (VR) environment that are generated through a virtual camera or viewpoint. Alternatively, the visuals may be representative of real scenes captured and generated by a real camera. In any case, the visuals generated by the observer camera are displayed in the HMD device for the user 150. In the case of the real camera, the camera may be attached to a remote-controlled vehicular system (not shown in figures) such as a quadcopter (aerial vehicle), a mobile ground robot, a remote-controlled spacecraft, or a submarine.
[0034] In the embodiment shown in Figure 1, the user’s head 152 in a first position is moved to a second position 154 resulting in a user’s head movement indicated by a solid vector 114 and, subsequently, an HMD’s movement indicated by a dashed vector 108. The head joystick interface system 100 uses the user’s head movement vector 114 to control the movements of the observer camera in the VR or real environment. More generally, the head joystick system 100 is configured to use the user’s head 152 movements in 6 spatial degrees of freedom (DOF) 199 to control the movements of the observer camera in the VR or real environment. In different embodiments, the user’s head 152 movement may control various attributes of the observer camera motion. For example, in one embodiment, the user’s head 152 position changes may control the speed of the observer camera while in other embodiments the position or acceleration of the observer camera may be controlled. In some embodiments there may be a 1 to 1 proportionality between the user’s head 152 movement and the observer camera movement while in other embodiments other uneven proportionalities may be used.
[0035] In the embodiment shown in Figure 1, the head joystick interface system 100 uses the HMD’s movement vector 108 to calculate the user’s head movement vector 114. The HMD 102 includes a primary sensory subsystem (not shown in Figure 1 but shown in Figure 2) such as an inertial measurement unit (IMU) to measure the HMD’s movement vector 108. In other embodiments, the HMD’s movement vector 108 may be measured using other sensory systems such as one or more cameras attached to the HMD 102 or optical tracking systems that track the HMD’s movement.
[0036] The measured HMD’s movement vector 108 is processed in the processor 130 to calculate the user’s head movement vector 114. The calculated user’s head movement vector 114 is used to control the movements of the observer camera in the VR or real environment. Using the user’s head movement vector 114 for controlling the observer camera’s movement allows decoupling the rotational movement of the observer camera from the translational movement. In some other embodiments, the primary sensory subsystem may directly measure the user’s head movement vector 114 without requiring measuring the movement vector of any other reference object such as the HMD’s movement vector 108.
[0037] In fact, one of the main advantages of the proposed head joystick interface system 100 is that the user’s head movement vector 114 is used to control the observer camera’s movement. Otherwise, if the HMD’s movement vector 108 is used to control the observer camera’s movement, pure rotation of the user’s head will result in both rotation and translation of the HMD (non-zero HMD’s movement vector 108) which will result in rotation and translation of the observer camera. This feature of the proposed head joystick interface system 100 allows for intuitive control of the observer camera’s movements, i.e. pure head rotation results in pure observer camera’s rotation, and pure translation of the head results in pure translation of the observer camera. It might also help to reduce motion sickness. This advantage will arguably improve the experience and immersion of a user while interfacing with the VR or real environment.
[0038] In some embodiments, the observer camera’s motion is controlled by setting the observer camera’s speed vector in a direction aligned with the user’s head movement vector 114 and a magnitude proportional to the magnitude of the user’s head movement vector 114.
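A minimal sketch of this mapping, assuming a simple proportional gain and a small dead zone around the zero point (both values are illustrative assumptions, not taken from the disclosure):

```python
import numpy as np

GAIN = 2.0          # camera speed (m/s) per metre of head displacement; hypothetical
DEAD_ZONE_M = 0.01  # ignore tiny displacements around the zero point; hypothetical

def camera_speed_vector(zero_point, head_point):
    """Speed vector aligned with the user's head movement vector 114, with a
    magnitude proportional to that vector's magnitude."""
    head_vector = np.asarray(head_point) - np.asarray(zero_point)  # vector 114
    magnitude = np.linalg.norm(head_vector)
    if magnitude < DEAD_ZONE_M:
        return np.zeros(3)      # head at (or near) the zero point: camera halts
    return GAIN * head_vector   # direction preserved, magnitude scaled
```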
[0039] Referring to Figure 2, a block diagram of the head joystick interface system 100 is shown. The HMD device 102 includes a primary sensory subsystem 160 such as an IMU configured to measure the position of the HMD 102 at each point in time. The HMD position measurements are transferred to an I/O interface of the processor 130 using transfer line 131. The processor 130 further may comprise a memory 134 to store processor executable program codes and parameters 136. The processor 130 is configured to process the HMD position measurements using a program such as blocks of code presented in the flowchart of Figure 3 to control the observer camera locomotion in a VR or real environment. The blocks of code in Figure 3 are directed toward a camera in a VR environment, however, in other embodiments, the blocks of code could be directed toward a camera in a real environment.
[0040] Referring to Figure 3, a flowchart depicting blocks of code for directing the processor 130 of the head joystick interface system 100 is shown at 300. The blocks generally represent codes that may be read from the program codes 136 of the memory 134 for directing the processor 130 to perform various functions for the purpose of controlling the observer camera in a VR environment. The actual code to implement each block may be written in any suitable program language, such as C, C++, C#, Java, PHP, Python, and/or assembly code, for example.
[0041] The process 300 starts at block 310 where measurements related to the HMD’s rotational and translational movements are acquired by the processor 130. At block 320 the HMD’s rotational and translational measurements are processed to calculate the user’s head movement vector 114. At block 330, the calculated user’s head movement vector 114 is used to calculate the virtual camera’s translational movement in the VR environment. At block 340, the HMD’s rotational measurements are processed to calculate the virtual camera’s rotational movements in the VR environment. At block 350, the calculated rotational and translational movements related to the virtual camera are processed to render the VR environment accordingly. Eventually, at block 360, the rendered VR environment is transmitted to the HMD for display.
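One possible per-frame sequencing of blocks 310 to 360 is sketched below in Python; the engine and hmd objects and the helper names (read_pose, head_vector_from_hmd, render, display) are placeholders assumed for illustration rather than part of the disclosure:

```python
def run_head_joystick(engine, hmd, head_vector_from_hmd, gain=2.0):
    """One possible per-frame sequencing of blocks 310-360 of process 300."""
    while engine.is_running():
        dt = engine.delta_time()
        # Block 310: acquire the HMD's rotational and translational measurements.
        hmd_position, hmd_rotation = hmd.read_pose()
        # Block 320: derive the user's head movement vector 114 from the HMD pose.
        head_vector = head_vector_from_hmd(hmd_position, hmd_rotation)
        # Block 330: compute the virtual camera's translation from vector 114.
        engine.camera.position += gain * head_vector * dt
        # Block 340: compute the virtual camera's rotation from the HMD's rotation.
        engine.camera.rotation = hmd_rotation
        # Block 350: render the VR environment from the updated camera pose.
        frame = engine.render()
        # Block 360: transmit the rendered frame to the HMD for display.
        hmd.display(frame)
```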
[0042] Referring to Figures 4A to 4C, more detailed views are depicted showing the user’s head 152 movements in the head joystick interface system 100. In Figures 4A to 4C the user’s head 152 in the first position is shown with grayed-out lines and the user’s head at the second position 154 is shown with black lines. The user’s head position at the second position 154 represents a head movement in all 6 DOFs in space. The user’s head 152 is moved from an initial point 110 to a secondary point 112 by the user’s head movement vector 114, while the HMD 102 is moved from an HMD’s initial point 104 to an HMD’s secondary point 106 by the HMD’s movement vector 108. The user’s head initial point 110 is a point attached to the user’s head 152 and may, in general, be chosen arbitrarily on the user’s head 152. However, in a preferred embodiment, the initial point 110 is chosen as a point on the user’s head 152 which is apart from the HMD’s initial point 104 by a fixed vector 116. In some other embodiments, the initial point 110 is chosen to represent the closest point to the user’s head 152 center of rotation for all 3 rotational DOFs and may be different from one user to another.
[0043] The initial point 110 is hereafter referred to as the zero point, since the initial point 110 corresponds to zero movement of the user's head 152. In some embodiments, the zero point 110 of a user's head may be identified while the user's head is in a relaxed state. In other words, the zero point 110 is calibrated by the processor 130 at a relaxed position of the user's head 152.
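One simple way such a calibration could be carried out, shown here only as a minimal sketch, is to average head-point samples recorded while the user holds the relaxed posture. The averaging rule and function name are assumptions for illustration, not a required implementation.

    import numpy as np

    def calibrate_zero_point(head_point_samples):
        """Estimate the zero point 110 as the mean head position recorded
        over a short window while the user holds a relaxed posture
        (an assumed calibration rule)."""
        return np.mean(np.asarray(head_point_samples, dtype=float), axis=0)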
[0044] In Figure 4A, a side view of the user's head 152 shows the user's head movement vector 114 in the XZ plane. In Figure 4B, a top view of the user's head 152 shows the user's head movement vector 114 in the XY plane. In Figure 4C, a front view of the user's head 152 shows the user's head movement vector 114 in the YZ plane. Figures 4A to 4C show that the user's head movement vector 114 is generally different from the HMD's movement vector 108.
[0045] The user's head movement vector 114 may be calculated from the HMD's movement vector 108 using a given fixed vector 116. Adding the fixed vector 116 to the HMD's initial point 104 yields the user's head initial point 110. The user's head secondary point 112 is obtained by first calculating an updated fixed vector 117, by rotating the fixed vector 116 by the HMD's rotation in the yaw, pitch, and roll directions, and then adding the updated fixed vector 117 to the HMD's secondary point 106. The user's head movement vector 114 can then be calculated as the difference between the user's head secondary point 112 and the user's head initial point 110. This method of calculating the user's head movement vector 114 may be used in the calculations and processes of block 320 of the process 300 of Figure 3.
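A minimal numeric sketch of this calculation is given below, using SciPy's rotation utilities. The intrinsic yaw-pitch-roll (ZYX) Euler convention and the function signature are assumptions chosen for illustration and are not mandated by the disclosure.

    import numpy as np
    from scipy.spatial.transform import Rotation

    def head_movement_vector(hmd_initial_point, hmd_secondary_point,
                             hmd_yaw_pitch_roll_deg, fixed_vector):
        """Compute the user's head movement vector 114 from the HMD pose.

        hmd_yaw_pitch_roll_deg: HMD rotation between the two poses, expressed
        as yaw, pitch, roll in degrees (an assumed convention)."""
        hmd_initial_point = np.asarray(hmd_initial_point, dtype=float)
        hmd_secondary_point = np.asarray(hmd_secondary_point, dtype=float)
        fixed_vector = np.asarray(fixed_vector, dtype=float)

        # Head initial point 110 = HMD initial point 104 + fixed vector 116.
        head_initial = hmd_initial_point + fixed_vector

        # Updated fixed vector 117 = fixed vector 116 rotated by the HMD rotation.
        rotation = Rotation.from_euler("ZYX", hmd_yaw_pitch_roll_deg, degrees=True)
        updated_fixed_vector = rotation.apply(fixed_vector)

        # Head secondary point 112 = HMD secondary point 106 + updated fixed vector 117.
        head_secondary = hmd_secondary_point + updated_fixed_vector

        # Head movement vector 114 = secondary point 112 - initial point 110.
        return head_secondary - head_initial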
[0046] Referring to Figures 5A and 5B, another embodiment of the proposed head joystick interface system is generally shown at 500 according to another disclosed aspect. The interface system 500 comprises an HMD device 502 coupled to a user's head 552 and a processor 530 operably connected to the HMD 502 by a cable 531 or wirelessly. In the embodiment shown in Figure 5A, the interface system 500 further includes a primary sensory subsystem 560 (as shown in Figure 6) that measures the movements of the HMD 502 with respect to a global coordinate system 570 (the XYZ coordinate system). The primary sensory subsystem 560 includes a pair of optical tracking cameras 564 that optically measure and track the HMD's 502 movement in the global coordinate system 570. The measurements of the tracking cameras 564 are transmitted to the processor 530 wirelessly, but in other embodiments may be transmitted using a wired connection. In other embodiments, the primary sensory subsystem 560 may include an IMU sensor attached to the HMD 502 or optical sensors attached to the HMD, for example.
[0047] The interface system 500 further comprises a local coordinate system 581 (the XFYFZF coordinate system) attached to a frame 580 which moves independently from the user's head 552. The interface system 500 further comprises a secondary sensory subsystem 561 (as shown in Figure 6) which is configured to measure the user's head movements with respect to the local coordinate system 581. In the embodiment shown in Figure 5A, the secondary sensory subsystem 561 is an optical tracking system comprising the pair of tracking cameras 564 and a tracker 566 affixed to the frame 580. In the embodiment shown in Figure 5A, the frame 580 is a swivel chair which comprises a wheeled chassis 585, a seat 586 configured to rotate freely with respect to the wheeled chassis 585 (rotation around axis ZF), and a backrest 588 configured to rotate vertically with respect to the wheeled chassis 585 (rotation around axis YF). The frame 580 can move freely with respect to the global coordinate system 570 at least in the XY plane, but in other embodiments, the frame 580 may move freely in other directions as well.
[0048] The local coordinate system 581 has an origin 583 which is affixed to the frame 580. For the embodiment shown in Figure 5A, the origin is conveniently located at the backrest's center of rotation (pitch rotation center), but in other embodiments, the origin 583 may be located at other locations of the frame 580, such as at the chair's 580 yaw rotation center. The origin 583 could be calculated by the processor 530 by various methods, such as from the known fixed position of the tracker 566 with respect to the origin 583, or by measuring, using the secondary sensory subsystem 561, the position of the tracker 566 with respect to the global coordinate system 570 at multiple backrest 588 positions and then solving for the origin using the fact that the distance between the origin 583 and the tracker 566 is fixed.
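For the first of the two origin-recovery methods mentioned above (a known fixed offset between the tracker 566 and the origin 583), a minimal sketch is given below. The quaternion representation of the tracker orientation and the assumption that the offset is expressed in the tracker's own axes are illustrative choices only.

    import numpy as np
    from scipy.spatial.transform import Rotation

    def local_origin_from_tracker(tracker_position, tracker_rotation_quat,
                                  tracker_to_origin_offset):
        """Recover the origin 583 from the tracked pose of the tracker 566,
        assuming the offset from tracker to origin is known and fixed in the
        tracker's own coordinate frame (an assumed convention).
        tracker_rotation_quat is in (x, y, z, w) order."""
        rotation = Rotation.from_quat(tracker_rotation_quat)
        return (np.asarray(tracker_position, dtype=float)
                + rotation.apply(np.asarray(tracker_to_origin_offset, dtype=float)))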
[0049] The swivel chair 580 is configured to seat the user 550. The user's head 552 initial point 510, also referred to as the zero point 510, is determined at a relaxed state of the user's head 552, when the user's 550 back is in a relaxed position against the backrest 588. One advantage of using a rotating chair as the frame 580 is that the user 550 can freely rotate the virtual or real camera through 360 degrees by rotating the chair seat 586 in the yaw direction, without needing to rotate the head 552. This results in less fatigue of the user's head 552 during a VR or real experience.
[0050] The tracker 566 is affixed to the backrest 588 of the chair 580 and is configured to facilitate the measurement of the local coordinate system 581 with respect to the global coordinate system 570. Given the user's head 552 measurements with respect to the global coordinate system 570 provided by the primary sensory subsystem 560, and the pose of the local coordinate system 581 with respect to the global coordinate system 570 provided by the secondary sensory subsystem 561, the user's head 552 measurements with respect to the local coordinate system 581 can be derived and calculated in the processor 530. The tracker 566 may be affixed to other parts of the frame 580 in other embodiments.
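The derivation described above amounts to composing the two measured poses. A minimal sketch follows, assuming the frame pose is available as a position plus quaternion in the global coordinate system 570; these representations are assumptions for illustration.

    import numpy as np
    from scipy.spatial.transform import Rotation

    def head_point_in_local_frame(head_point_global, frame_origin_global,
                                  frame_rotation_quat):
        """Express a head point measured in the global coordinate system 570
        in the frame's local coordinate system 581 (pose-composition sketch;
        quaternion order x, y, z, w is an assumption)."""
        frame_rotation = Rotation.from_quat(frame_rotation_quat)
        offset_global = (np.asarray(head_point_global, dtype=float)
                         - np.asarray(frame_origin_global, dtype=float))
        # Rotate the global offset into the frame's axes.
        return frame_rotation.inv().apply(offset_global)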
[0051] The purpose of the secondary sensory subsystem 561 is to, at all times, measure and track the zero point 510 with respect to the local coordinate system 581 rather than the global coordinate system 570. The interface system 500 is configured to use the movement of the zero point 510 with respect to the local coordinate system 581 for controlling the observer camera's movement in the VR or real environment. Thus, if the zero point 510 does not move in the local coordinate system 581, the virtual camera in the VR environment will not move. This configuration is advantageous to avoid unnecessary virtual camera movements due to the user's 550 unintentional locomotion in the global coordinate system 570.
[0052] Another advantage of the head joystick interface system 500 is that the user 550 has physical force feedback at the neutral point or zero point 510 and can halt the virtual or real camera's movement by resting against the backrest of the chair 580. This provides more intuitive control of the observer camera for the user 550 and, in turn, results in a more pleasant interface experience with the VR or real environment for the user 550.
[0053] Referring to Figure 5B, the user's pose at the initial state 550 is represented with greyed-out lines, while the user's pose in a second state 551 is represented with black lines. At the initial state, the user's pose 550 is relaxed and rests against the chair's backrest 588; the user's head initial point, or zero point, is shown at 510, the tracker is shown at 566, and the origin of the local coordinate system 581 is shown at 583. At the second state, the user's pose 551 is upright and away from the backrest 588; the user's head secondary point is shown at 512, the tracker is shown at 567, and the origin of the local coordinate system 582 is shown at 584, which is assumed to be different from the origin 583. The user's head 552 is moved from the initial point 510, or zero point, to a secondary point 512 by the user's head movement vector 514. The vector 514 is obtained and used to control the observer camera's movement in the VR or real environment using a method similar to that described in Figure 3.
[0054] As the user's body moves from pose 550 to 551, the local coordinate system changes from 581 to 582; it may change in both orientation and the location of its origin 583. In the embodiment shown in Figure 5B, the origin 583 is displaced to origin 584. The head joystick interface system 500 can compensate for the change in the local coordinate system 581 by recalculating the user's head zero point 510 with respect to the frame's 580 local coordinate system at each point in time, hence a dynamic zero point 510. For example, the zero point 510 is identified by vector 525 at the initial state 550, while at the second state 551 it is identified by vector 526. This dynamic compensation and recalculation of the zero point 510 avoids unnecessary movement of the observer camera in the VR or real environment.
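As an illustrative sketch of this dynamic zero-point compensation, the head displacement that drives the observer camera can be evaluated against the zero point expressed in the frame's current local coordinate system, so motion of the frame alone produces no camera motion. The sketch below reuses the head_point_in_local_frame helper introduced after paragraph [0050]; the argument names are assumptions.

    def head_movement_in_local_frame(zero_point_local, head_point_global,
                                     frame_origin_global, frame_rotation_quat):
        """Return the head displacement relative to the dynamic zero point 510,
        measured in the frame's current local coordinate system. A zero result
        means the observer camera should not move."""
        head_local = head_point_in_local_frame(head_point_global,
                                               frame_origin_global,
                                               frame_rotation_quat)
        return head_local - zero_point_local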
[0055] Referring to Figure 6, a block diagram of the head joystick interface system 500 is shown. The interface system 500 includes an HMD device 502, a processor 530, and a primary sensory subsystem 560 that uses an optical tracking sensory system 564 configured to measure the position of the HMD 502 at each point in time. The interface system 500 further comprises a frame 580 which is configured to move independently from the user's head 552 (shown in Figure 5A). The frame 580 includes a local coordinate system 581 attached to the frame 580. The interface system 500 further comprises a secondary sensory subsystem 561 configured to measure the local coordinate system 581. The optical tracking camera 564 may be used to measure the position and orientation of the local coordinate system 581.
[0056] The position measurements of the optical tracking camera 564 are transferred to an I/O interface of the processor 530 over a wireless communication line 534 through the wireless module 533 of the processor. The processor 530 may further comprise a memory 534 to store processor-executable program codes and parameters 536. The processor 530 is configured to use a program code to process the HMD and local coordinate system 581 position and orientation measurements for controlling the observer camera locomotion in a VR or real environment.
[0057] Referring to Figure 7, another embodiment of the head joystick interface system is shown generally at 700. The interface system 700 includes an HMD device 702 coupled to a user's head 752, a processor (not shown in Figure 7) operably connected to the HMD 702, a frame 780, a local coordinate system 781 affixed to the frame 780, a primary sensory subsystem (not explicitly shown in Figure 7) configured to measure the position and orientation of the HMD 702 in space with respect to a global coordinate system 770, and a secondary sensory subsystem (not explicitly shown in Figure 7) configured to measure the position and orientation of the local coordinate system 781 in space with respect to the global coordinate system 770. The interface system 700 is configured to use the user's head 752 movements to control an observer camera's motion in a VR or real environment. In the embodiment shown in Figure 7, the primary sensory subsystem uses an IMU sensor (not shown in Figure 7) embedded in the HMD 702 to measure the HMD's, and subsequently the user's head, position and orientation in space with respect to the global coordinate system 770, using a method similar to the method of Figure 3. The secondary sensory subsystem uses a camera system 764 embedded in the HMD device 702 to measure and track the position and orientation of certain features 766 of the frame 780. The optical camera system 764 may be one or multiple vision cameras or depth cameras, or a combination of both, or any other type of camera such as an IR camera, or other tracking systems based on mechanical, magnetic, optical, inertial, or other principles. The frame feature 766 may be any distinct part of the frame 780, such as the chair headrest, that is in the field of view of the camera system 764 (the field of view identified by lines 765) and is distinctly detectable by the camera system 764. The measurement of the frame feature 766 by the secondary sensory subsystem is used in the processor to calculate the position and orientation of the local coordinate system 781, for example, by knowing a fixed distance and direction between the frame feature 766 and the local coordinate system 781. The advantage of the interface system 700 compared with the interface system 500 introduced in Figure 5 is that the secondary sensory subsystem of interface system 700 does not need an additional tracker attached to the frame and instead relies on the frame's other features, such as its natural features. Hence, the interface system 700 may be produced with fewer components and at a lower overall cost.
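For illustration only, the sketch below shows one way the local coordinate system 781 could be recovered from a frame feature 766 seen by the HMD-mounted camera system 764. It assumes, as simplifying illustrative assumptions, that the camera frame coincides with the HMD frame, that the HMD's global pose is available from the primary sensory subsystem, and that the offset from the feature to the local origin is known in global axes.

    import numpy as np
    from scipy.spatial.transform import Rotation

    def local_frame_origin_from_feature(hmd_position_global, hmd_rotation_quat,
                                        feature_position_in_camera,
                                        feature_to_origin_offset_global):
        """Sketch of recovering the origin of the local coordinate system 781
        from a frame feature 766 detected by the camera system 764.
        hmd_rotation_quat is in (x, y, z, w) order."""
        hmd_rotation = Rotation.from_quat(hmd_rotation_quat)
        # Feature position expressed in the global coordinate system 770.
        feature_global = (np.asarray(hmd_position_global, dtype=float)
                          + hmd_rotation.apply(np.asarray(feature_position_in_camera,
                                                          dtype=float)))
        # Local origin = feature position plus a known fixed offset.
        return feature_global + np.asarray(feature_to_origin_offset_global, dtype=float)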
[0058] While specific embodiments have been described and illustrated, such embodiments should be considered illustrative of the invention only and not as limiting the invention as construed in accordance with the accompanying claims.

Claims

What is claimed is:
1. A method for using a user’s head as a joystick-like interface for controlling the motions of an observer camera that generates visuals of a head-mounted display device coupled to the user’s head, the method comprising: receiving, by a processor, head movement measurements related to the user’s head translational and rotational movements in 6 spatial degrees of freedom; and processing the head movement measurements, by the processor, to generate computer readable commands, wherein the commands are configured to control translational motion of the observer camera using the user’s head translation and without being affected by the user’s head rotation.
2. The method of claim 1, wherein the observer camera is a virtual camera configured to generate virtual reality environment visuals.
3. The method of claim 1, wherein the observer camera is a real camera operably coupled to a remote-controlled vehicular system.
4. The method of claim 3, ultimately dependent on claim 1, wherein the user’s head translation causes the translational motion of the observer camera by causing translational movement of the remote-controlled vehicular system.
5. A joystick-like interface system to control locomotion of an observer camera that generates visuals, the system comprising: a head-mounted display coupled to a user’s head and configured to display the observer camera generated visuals; a primary sensory subsystem comprised of one or more sensors configured to measure the translation and rotation of the head in 6 spatial degrees of freedom; a processor operably connected to the head-mounted display and the primary sensory subsystem, the processor configured to execute a set of command codes; wherein, the processor controls translational motion of the observer camera using the user’s head translation without being affected by the user’s head rotation.
6. The system of claim 5, wherein the observer camera is a virtual camera configured to generate virtual reality environment visuals.
7. The system of claim 5, wherein the observer camera is a real camera operably coupled to a remote-controlled vehicular system.
8. The system of claim 7, ultimately dependent on claim 5, wherein the user’s head translation causes the translational motion of the observer camera by causing translational movement of the remote-controlled vehicular system.
9. The system of claim 5, wherein the primary sensory subsystem indirectly measures the head’s movement in 6 spatial degrees of freedom.
10. The system of claim 5, wherein the primary sensory subsystem comprises: an inertial measurement unit coupled to the head-mounted display to measure the head’s rotation in 3 degrees of freedom; at least one optical sensor to measure the head-mounted display translation in spatial degrees of freedom; and a primary sensory processor configured to use the head-mounted display translational and rotational measurements to calculate the head’s translation and rotation in 6 spatial degrees of freedom.
11. The system of claim 5, wherein the primary sensory subsystem: assigns a hypothetical head rotation center point to the user’s head; identifies an initial point as the location of the head rotation center point at the beginning of the observer camera motion; indicates a head displacement vector as the vector connecting the initial point to the head rotation center point in every moment during the user’s head movement; and calculates the user’s head translation and rotation by measuring the head displacement vector.
12. The system of claim 11, wherein during the observer camera motion, the processor controls the observer camera’s translational motion by configuring: the observer camera’s translational direction in alignment with the head displacement vector direction; and the observer camera’s translational velocity in proportion to the head displacement vector magnitude.
13. The system of claim 12, wherein the system further comprises: a frame coordinate system assigned to a frame, the frame having independent motions with respect to the user’s head in at least one direction; and a secondary sensory subsystem configured to measure the relative motion between the user’s head and the frame coordinate system; wherein, the head rotation center point is calculated with respect to the frame coordinate system.
14. The system of claim 13, wherein the secondary sensory subsystem is an optical tracking system comprising: a tracker fixedly attached to the frame; and at least one optical tracking camera configured to measure the movement of the tracker in 6 spatial DOFs.
15. The system of claim 13, wherein the secondary sensory subsystem is a vision system attached to the head-mounted display and configured to measure the relative motion between the user’s head and the frame coordinate system.
16. The system of claim 13, wherein the frame is the user’s torso.
17. The system of claim 13, wherein the frame is a vehicle operably configured to seat or otherwise support the user during the user and/or frame movements.
18. The system of claim 17, wherein the frame is a swivel chair operably configured to seat the user during the user and/or frame movements.
PCT/CA2021/051335 2020-09-25 2021-09-24 Head joystick interface system and methods thereof WO2022061468A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063083108P 2020-09-25 2020-09-25
US63/083,108 2020-09-25

Publications (1)

Publication Number Publication Date
WO2022061468A1 true WO2022061468A1 (en) 2022-03-31

Family

ID=80845954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2021/051335 WO2022061468A1 (en) 2020-09-25 2021-09-24 Head joystick interface system and methods thereof

Country Status (1)

Country Link
WO (1) WO2022061468A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170278262A1 (en) * 2014-07-31 2017-09-28 Sony Corporation Information processing device, method of information processing, and image display system
WO2018146231A1 (en) * 2017-02-08 2018-08-16 Michael Bieglmayer Apparatus for capturing movements of a person using the apparatus for the purposes of transforming the movements into a virtual space
US20180348518A1 (en) * 2017-06-05 2018-12-06 Microsoft Technology Licensing, Llc Apparatus and method of 1:1 matching head mounted display view to head movement that controls articulated camera

Similar Documents

Publication Publication Date Title
US10875188B2 (en) Universal motion simulator
US11828939B2 (en) Method and apparatus for adjusting motion-based data space manipulation
CA2825563C (en) Virtual reality display system
Chung et al. Exploring virtual worlds with head-mounted displays
US20200055195A1 (en) Systems and Methods for Remotely Controlling a Robotic Device
Meyer et al. A survey of position trackers
Boman International survey: Virtual-environment research
US9996149B1 (en) Method for one-touch translational navigation of immersive, virtual reality environments
EP2499550A1 (en) Avatar-based virtual collaborative assistance
CN209821674U (en) Multi-degree-of-freedom virtual reality movement device
WO2021261595A1 (en) Vr training system for aircraft, vr training method for aircraft, and vr training program for aircraft
WO2022061468A1 (en) Head joystick interface system and methods thereof
Pryor et al. A Virtual Reality Planning Environment for High-Risk, High-Latency Teleoperation
EP3700641B1 (en) Methods and systems for path-based locomotion in virtual reality
Menezes et al. Touching is believing-Adding real objects to Virtual Reality
CN113593358A (en) Two-degree-of-freedom VR airship driving simulation system
JPH06337756A (en) Three-dimensional position specifying method and virtual space stereoscopic device
WO2022014429A1 (en) Information processing method, program, and system
TW201944365A (en) A method to enhance first-person-view experience
US20240169676A1 (en) Rotational navigation in a virtual environment with a visual reference
Rize et al. Real-time virtual reality environment for MAJIC attitude control system development and implementation
WO2024125465A1 (en) Viewing-angle following display system, operating system, reconstruction sensing control system, and control method
EP3958095A1 (en) A mobile computer-tethered virtual reality/augmented reality system using the mobile computer as a man machine interface
JP2023172180A (en) Image processing apparatus, image processing method, and program
KR20210133058A (en) Tourism experience system with 60DOF online multi-player

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21870648

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/07/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21870648

Country of ref document: EP

Kind code of ref document: A1