WO2022201693A1 - Controller and tracking system - Google Patents
Controller and tracking system
- Publication number
- WO2022201693A1 (application PCT/JP2021/047473)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- controller
- pen
- light emitting
- cameras
- axial direction
- Prior art date
Images
Classifications
- G06F3/0346 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012 — Head tracking input arrangements
- G06F3/0325 — Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
- G06F3/03545 — Pens or stylus
- G06F3/039 — Accessories therefor, e.g. mouse pads
- G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/10016 — Video; Image sequence
Definitions
- The present invention relates to a controller and a tracking system, and in particular to a controller used to indicate positions in a space configured by XR technology such as VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), or SR (Substitutional Reality) (hereinafter referred to as an "XR space"), and to tracking systems for tracking the movement of such controllers.
- XR space a space configured by XR technology
- VR Virtual Reality
- AR Augmented Reality
- MR Mixed Reality
- SR Substitutional Reality
- In XR technology, a handheld controller is used for the user to indicate a position in the XR space. Tracking of the controller is performed by a tracking system that includes a camera and a computer connected to the camera. When the user moves the controller within the shooting range of the camera, the computer detects the position and orientation of the controller based on the image captured by the camera and tracks the movement of the controller based on the detection results.
- Patent Document 1 discloses an example of a pen-type controller, which is a type of handheld controller.
- A plurality of light emitting diodes (LEDs) are provided on the surface of the controller disclosed in Patent Document 1.
- a computer that tracks the movement of the controller is configured to detect the position and orientation of the controller by detecting these LEDs in the image captured by the camera.
- one of the objects of the present invention is to provide a pen-type controller that can detect the position and orientation with high accuracy.
- one of the objects of the present invention is to provide a tracking system that can detect the position and orientation of a controller with high accuracy while using a camera with a rolling shutter.
- The controller according to the present invention includes a pen portion formed in a pen shape, a grip portion intersecting the axial direction of the pen portion, and a first light emitting unit arranged at the end of the grip portion that is nearer to the axis of the pen portion.
- A tracking system according to the present invention is a tracking system for tracking the movement of the controller, comprising one or more cameras each having a rolling shutter and arranged so that the sub-scanning direction of the rolling shutter coincides with the vertical direction, and a computer for tracking the movement of the controller based on the images captured by the one or more cameras.
- According to the controller of the present invention, it is possible to detect the position and orientation of the controller with high accuracy.
- According to the tracking system of the present invention, it is possible to detect the position and orientation of the controller with high accuracy while using cameras with rolling shutters.
- FIGS. 3 and 4 are perspective views of the controller 6 viewed from various angles.
- FIGS. 5 and 6 are diagrams showing rotation of the controller 6 around the pen axis.
- FIG. 7 is a diagram showing images of the controller 6 captured by the cameras.
- FIG. 8 is a diagram for explaining the arrangement of the cameras 4a to 4c; FIG. 9 shows (a) the image sensor 40 built into each of the cameras 4a to 4c and (b) the operation of the rolling shutter.
- FIG. 10 is a diagram showing the structure of the cameras 4a to 4c that allows them to be arranged so that the sub-scanning direction of the rolling shutter coincides with the vertical direction. FIG. 11 is a diagram showing a usage state of the tracking system 1 according to a modification of the embodiment of the present invention. FIG. 12 is a diagram showing the controller 6 according to a modification of the embodiment of the present invention.
- FIG. 1 is a diagram showing a usage state of a tracking system 1 according to this embodiment.
- The tracking system 1 includes a computer 2, a position detection device 3, three cameras 4a to 4c, a head-mounted display 5, and a pen-type controller 6.
- the computer 2, the position detection device 3, the cameras 4a to 4c, the head-mounted display 5, and the controller 6 are configured to communicate with each other by wire or wirelessly.
- the user uses the tracking system 1 while sitting on the desk chair 101, wearing the head mounted display 5 on the head, and holding the controller 6 in the right hand.
- An XR space rendered by the computer 2 is displayed on the display surface of the head-mounted display 5, and the user operates the controller 6 above the desk 100 while viewing this XR space.
- The controller 6 is a pen-shaped device having a pen portion with a grip, and is used to control 3D objects displayed in the XR space (specifically, to draw 3D objects, move 3D objects, and so on).
- The controller 6 is also used to perform 2D input using the position detection device 3.
- In this example, the computer 2 is a notebook personal computer placed at the center of the desk 100.
- the computer 2 does not necessarily have to be placed in the center of the desk 100, and may be placed at a position where it can communicate with the position detection device 3, the cameras 4a to 4c, the head-mounted display 5, and the controller 6.
- The computer 2 may also be any of various other types of computer, such as a desktop personal computer, a tablet personal computer, a smartphone, or a server computer, instead of a notebook personal computer.
- The computer 2 periodically detects the positions and tilts of the head-mounted display 5, the controller 6, and the position detection device 3 based on the images captured by the cameras 4a to 4c, thereby tracking their movements. Based on the tracked movement of each device and on the operation states of the operation buttons and dial button provided on the controller 6 (described later), the computer 2 generates the XR space and the 3D objects to be displayed in it, renders them, and transmits the result to the head-mounted display 5.
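- A minimal sketch of this periodic detect-and-render cycle follows; detect_leds, estimate_pose, and render_xr are hypothetical placeholders for the detection, pose-solving, and rendering stages, not names from the patent.

```python
# Hypothetical skeleton of the tracking cycle described above; the three
# callables are assumed stand-ins, not an actual API from the patent.
def tracking_cycle(cameras, detect_leds, estimate_pose, render_xr):
    while True:
        frames = [cam.capture() for cam in cameras]    # one frame per camera
        detections = [detect_leds(f) for f in frames]  # 2D LED bright spots
        poses = estimate_pose(detections)              # position/tilt per device
        render_xr(poses)                               # re-render the XR space
```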
- the head-mounted display 5 plays a role of displaying the XR space including one or more 3D objects by displaying the rendered image transmitted from the computer 2 .
- the position detection device 3 is composed of a tablet arranged on the upper surface of the desk 100 at a position corresponding to the front side of the computer 2 when viewed from the user.
- The position detection device 3 does not necessarily need to be placed at this position, and may be placed within reach of the user sitting on the desk chair 101.
- the position detection device 3 and the computer 2 may be configured by an integrated device such as a tablet terminal.
- The position detection device 3 has a function of periodically detecting the position of the pen tip of the controller 6 on the touch surface and sequentially transmitting the detected position to the computer 2.
- the computer 2 generates and renders stroke data that constitutes a 2D object or a 3D object based on the transmitted positions.
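- As a rough illustration of this flow, the sketch below accumulates the periodically reported pen-tip positions into stroke polylines; the pen_down flag and the method names are assumptions made for illustration, not the device's actual interface.

```python
# A sketch, under assumed event names, of turning periodic pen-tip
# positions from the position detection device into stroke data.
from typing import List, Optional, Tuple

Point = Tuple[float, float]

class StrokeRecorder:
    def __init__(self) -> None:
        self.strokes: List[List[Point]] = []
        self._current: Optional[List[Point]] = None

    def on_position(self, x: float, y: float, pen_down: bool) -> None:
        if pen_down:
            if self._current is None:       # pen touched down: start a stroke
                self._current = []
                self.strokes.append(self._current)
            self._current.append((x, y))    # extend the current stroke
        else:
            self._current = None            # pen lifted: stroke complete
```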
- Although the specific method of position detection used by the position detection device 3 is not particularly limited, it is preferable to use, for example, an active electrostatic method or an electromagnetic induction method.
- The cameras 4a to 4c are image capturing devices for capturing still images or moving images, and are configured to sequentially supply the captured images to the computer 2.
- the camera 4a is positioned facing the user with the desk 100 interposed therebetween, the camera 4b is positioned above the user's left side, and the camera 4c is positioned above the user's right side so that the upper surface of the desk 100 can be photographed.
- Each of the cameras 4a-4c has a rolling shutter, and is arranged so that the sub-scanning direction of the rolling shutter coincides with the vertical direction in order to minimize distortion of the controller 6 in the image. Details of this point will be described later.
- FIG. 2 is a diagram showing a state in which the user holds the controller 6 with his or her right hand.
- FIGS. 3(a), 3(b), 4(a), and 4(b) are perspective views of the controller 6 viewed from various angles.
- the controller 6 has a pen portion 6p formed in a pen shape and a grip portion 6g fixed to the pen portion 6p so that the longitudinal direction intersects the axial direction of the pen portion 6p.
- the axial direction of the pen portion 6p will be referred to as the x direction
- the direction in the plane formed by the x direction and the longitudinal direction of the grip portion 6g and perpendicular to the x direction will be referred to as the z direction
- a direction orthogonal to each of the x-direction and the z-direction is called a y-direction.
- pressure pads 6pa and 6pb and shift buttons 6pc and 6pd are provided on the surface of the pen portion 6p.
- the pressure pads 6pa and 6pb are members each including a pressure sensor and a touch sensor, and are arranged symmetrically with respect to the xz plane at a position near the pen tip on the side surface of the pen part 6p.
- the pressure detected by the pressure sensor is used for selection or drawing on the application.
- Information from the touch sensor indicating whether or not a touch is present is used to determine whether the pressure sensor output should be treated as on or off, and to realize a light double-tap.
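- The sketch below shows one way this gating and light double-tap detection could work; the 300 ms tap window and all names are illustrative assumptions, not values from the patent.

```python
# A sketch of touch-gated pressure output and light double-tap detection,
# under an assumed 300 ms double-tap window.
import time
from typing import Optional

class PressurePad:
    DOUBLE_TAP_WINDOW_S = 0.3  # assumed, not specified in the patent

    def __init__(self) -> None:
        self._last_tap: Optional[float] = None

    def sample(self, touched: bool, pressure: float) -> float:
        # Gate: report pressure only while the touch sensor sees a finger.
        return pressure if touched else 0.0

    def on_touch_down(self) -> bool:
        """Return True when this touch completes a light double-tap."""
        now = time.monotonic()
        is_double = (self._last_tap is not None
                     and now - self._last_tap < self.DOUBLE_TAP_WINDOW_S)
        self._last_tap = None if is_double else now
        return is_double
```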
- Shift buttons 6pc and 6pd are switches assigned to application menus, respectively, and are arranged symmetrically with respect to the xz plane between the pressure pads 6pa and 6pb and the grip portion 6g.
- As can be seen from FIG. 2, a user holding the controller 6 with the right hand can operate the pressure pad 6pa and the shift button 6pc with the thumb, and the pressure pad 6pb and the shift button 6pd with the index finger.
- the tact top button 6ga is a switch that functions as a power button when pressed for a long time, and is arranged on the surface of the end closer to the pen part 6p among the longitudinal ends of the grip part 6g.
- this end portion will be referred to as the "upper end portion”
- the end portion of the grip portion 6g that is farther from the pen portion 6p in the longitudinal direction will be referred to as the "lower end portion”.
- the dial button 6ge is a rotatable ring-shaped member configured to output the amount of rotation. This rotation amount is used, for example, to rotate the selected object.
- a dial button 6ge is also arranged on the upper end of the grip portion 6g so as to surround the tact top button 6ga.
- The grab button 6gb is a switch used to grab and move an object, and is located near the lower end of the pen-tip-side surface of the grip portion 6g.
- The tact buttons 6gc and 6gd are switches used for auxiliary functions such as that of the right button of a mouse. The tact button 6gc is arranged on the side of the thumb, and the tact button 6gd on the side of the index finger, when the controller 6 is held in the right hand.
- The user holding the controller 6 with the right hand presses the grab button 6gb with the middle finger, the tact button 6gc with the thumb, and the tact button 6gd with the index finger.
- a rotation operation of the dial button 6ge and a pressing operation of the tact top button 6ga are performed by the user's thumb.
- Since the tact top button 6ga and the dial button 6ge are positioned so that they cannot be operated unless the user intentionally raises the thumb to the upper end of the grip portion 6g, they remain exposed, not hidden by the user's hand, in the normal holding state.
- The recess 6gf is a portion shaped so that, when the user grips the controller 6, the part of the hand from the base of the index finger to the base of the thumb fits naturally. Providing the recess 6gf in the controller 6 reduces fatigue for the user.
- A plurality of LEDs are arranged on the surface of the controller 6; in the present embodiment, each LED is a so-called point light source LED.
- the computer 2 is configured to detect the position and orientation of the controller 6 by detecting these LEDs in the images captured by the cameras 4a-4c.
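- As a concrete illustration of this detection step, the following sketch, assuming OpenCV is available, extracts near-saturated bright spots from a camera frame and, given their correspondence to the known 3D LED layout on the controller, solves a Perspective-n-Point problem for position and orientation. The threshold value and the assumption that correspondences are already matched are illustrative simplifications, not details from the patent.

```python
# A minimal sketch of LED-based pose detection: find bright spots, then
# solve PnP against the known 3D LED layout on the controller.
import cv2
import numpy as np

def detect_led_blobs(gray: np.ndarray, threshold: int = 240) -> np.ndarray:
    # LEDs appear as near-saturated spots; keep only the brightest pixels.
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centers.append([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    return np.array(centers, dtype=np.float64)  # sub-pixel blob centroids

def controller_pose(model_pts_3d, image_pts_2d, camera_matrix):
    # model_pts_3d: LED positions in the controller's own frame, assumed
    # already matched one-to-one with image_pts_2d (>= 4 correspondences).
    ok, rvec, tvec = cv2.solvePnP(model_pts_3d, image_pts_2d,
                                  camera_matrix, None)
    return (rvec, tvec) if ok else None
```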
- one or more LEDs are arranged in each of the three portions PT1 to PT3 of the controller 6 shown in FIG. 3(b).
- the portion PT1 is a portion of the pen portion 6p located on the pen tip side when viewed from the grip portion 6g
- the portion PT2 is a portion of the pen portion 6p located on the pen rear side when viewed from the grip portion 6g.
- PT3 is the grip portion 6g.
- two LEDs 10a-1 and 10a-2 are arranged in the portion PT1
- four LEDs 10b-1 to 10b-4 are arranged in the portion PT2
- one LED 10c is arranged in the portion PT3.
- The two LEDs 10a-1 and 10a-2 corresponding to the portion PT1 are arranged side by side at the same position as viewed in the x direction, slightly closer to the grip portion 6g than to the pen tip. Among the four LEDs 10b-1 to 10b-4 corresponding to the portion PT2, the LED 10b-4 is arranged at the rear end of the pen, while the other three LEDs 10b-1 to 10b-3 are arranged in a zigzag pattern from the grip portion 6g toward the rear end of the pen. That is, the LEDs 10b-1 and 10b-3 are provided at positions closer to the right side of the portion PT2, and the LED 10b-2 is provided at a position closer to the left side of the portion PT2.
- the LED 10c corresponding to the portion PT3 is arranged on the surface of the upper end of the grip portion 6g (more specifically, the surface of the tact top button 6ga).
- As described above, the tact top button 6ga is exposed without being hidden by the user's hand. Therefore, by providing the LED 10c on the surface of the tact top button 6ga, the computer 2 can detect the controller 6 with consistently high probability, and the position and orientation of the controller 6 can thus be detected with high accuracy. Also, by not arranging LEDs in the lower part of the grip portion 6g, the pattern of the LEDs in the image is simplified, which facilitates shape recognition by the computer 2.
- FIGS. 5(b), 5(d), 6(b), and 6(d) are perspective views of the controller 6 seen from the left side.
- In the image from the camera 4b on the left side of the controller 6, the LED 10b-2 in the portion PT2 of the pen portion 6p is visible, while the LED 10c on the surface of the tact top button 6ga is not; as the controller 6 rotates around the pen axis, however, the LED 10c also comes into view.
- As shown in FIGS. 5 and 6, the distance Lz in the z direction between the LED 10c and the LED 10b-2 appearing in the image becomes shorter as the controller 6 rotates.
- On the other hand, the x-direction distance Lx between the LED 10c and the LED 10b-2 remains unchanged. Therefore, the computer 2 can derive the rotation angle of the controller 6 around the pen axis based on Lz, Lx, and other information such as the distance to and viewing angle of the LEDs.
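- The sketch below illustrates one simple form this derivation could take, under the assumption that the apparent ratio Lz/Lx falls off with the cosine of the rotation around the pen axis; the patent only states that Lz, Lx, and information such as the distance and angle to the LEDs are used, so this is a hedged simplification.

```python
# A sketch of the roll-angle derivation hinted at above, assuming the
# apparent z offset between LED 10c and LED 10b-2 shrinks with the cosine
# of the rotation around the pen axis while Lx stays fixed. Real code
# would also fold in camera distance and viewing angle, as the text notes.
import math

def roll_angle(lz_pixels: float, lx_pixels: float,
               lz0_over_lx0: float) -> float:
    """Estimate the rotation around the pen axis in radians.

    lz0_over_lx0 is the Lz/Lx ratio measured at zero roll; dividing by
    it normalizes out the camera's distance-dependent pixel scale.
    """
    ratio = (lz_pixels / lx_pixels) / lz0_over_lx0
    ratio = max(-1.0, min(1.0, ratio))   # clamp against measurement noise
    return math.acos(ratio)
```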
- In addition, the controller 6 has the LEDs 10a-1 and 10a-2 in the portion PT1 on the pen tip side.
- As a result, the centroid of the coordinates derived by the computer 2 can be brought closer to the pen tip than when no LED is provided in the portion PT1. From this point as well, it can be said that the position and orientation of the controller 6 can be detected with high accuracy by using the controller 6 according to the present embodiment.
- the LED 10a-1 and the LED 10a-2 provided in the portion PT1 are provided asymmetrically with respect to the xz plane including the axial direction of the pen portion 6p and the longitudinal direction of the grip portion 6g.
- The LEDs 10b-1, 10b-2, and 10b-3 provided in the portion PT2 are also provided asymmetrically with respect to the xz plane containing the axial direction of the pen portion 6p and the longitudinal direction of the grip portion 6g. That is, as described above, the three LEDs 10b-1, 10b-2, and 10b-3 are arranged in a zigzag pattern from the grip portion 6g toward the rear end of the pen. By doing so, the computer 2 can distinguish between the left and right sides of the controller 6.
- FIG. 7 is a diagram showing images of the controller 6 according to the present embodiment captured by the cameras 4b and 4c. Each bright spot in the images corresponds to an LED provided on the surface of the controller 6. As shown in the figure, with the controller 6 according to the present embodiment, the arrangement of the LEDs clearly differs between the image from the camera 4b photographing the controller 6 from the left side and the image from the camera 4c photographing it from the right side. Therefore, the computer 2 can determine the left and right sides of the controller 6 from the images of the cameras 4b and 4c.
- When the controller 6 is held in the right hand, its left side tends to fall in a blind spot of the cameras 4a to 4c; even so, this LED arrangement allows the LEDs to appear easily in the images captured by the cameras 4a to 4c.
- FIGS. 8(a) and 8(c) are diagrams for explaining the arrangement of the cameras 4a to 4c.
- The desk 100 and the desk chair 101 shown in FIGS. 8(a) and 8(c) are the same as those shown in FIG. 1; the user sits on the desk chair 101 and uses the tracking system 1.
- FIG. 8(b) is a sectional view of the cameras 4b and 4c corresponding to line AA in FIG. 8(a).
- the direction from the camera 4b to the camera 4c is called the X direction
- the direction from the user to the camera 4a is called the Y direction
- the vertical direction is called the Z direction.
- the illustrated position P1 is the position of the head mounted display 5 shown in FIG. 1, and the two positions P2 are the positions of the user's shoulders.
- the cameras 4a to 4c are arranged so as to be able to photograph the entire portion located on the desk 100 in the substantially fan-shaped area E extending from these positions toward the computer 2 side.
- Specifically, the arrangement of the cameras 4a to 4c is determined by obtaining the distance Y2 in the Y direction from the rear end of the desk 100 to the camera 4a, the distance Z1 from the floor to the cameras 4b and 4c, the distance Z2 from the floor to the camera 4a, the angle θ1 that the shooting directions of the cameras 4b and 4c form with the X direction in the XY plane, and the angle θ2 that the shooting directions of the cameras 4b and 4c form with the X direction in the XZ plane.
- By arranging the cameras in this way, the computer 2 can suitably detect the respective positions and tilts of the head-mounted display 5, the controller 6, and the position detection device 3 based on the images captured by the cameras 4a to 4c.
- The rolling shutter will be described below with reference to FIG. 9, and then the structure of the cameras 4a to 4c according to the present embodiment will be described specifically with reference to FIGS. 10 and 11.
- FIG. 9(a) is a diagram showing the image sensor 40 built into each of the cameras 4a to 4c.
- Each square shown in the figure represents a pixel, and as shown in the figure, the image sensor 40 is composed of a pixel matrix in which a plurality of pixels are arranged in a matrix. The number of rows of this pixel matrix is assumed to be N below. Also, the row direction of the pixel matrix is called the "main scanning direction", and the column direction is called the "sub-scanning direction".
- FIG. 9(b) is a diagram explaining the operation of the rolling shutter.
- the horizontal axis represents time
- the vertical axis represents the main scanning direction of the pixel matrix.
- The horizontally long rectangles shown in the figure each represent the time during which the plurality of pixels included in one row are scanned.
- That is, the cameras 4a to 4c, each having a rolling shutter, are configured to scan (expose and read) the plurality of pixels of each row while shifting the scanned row along the sub-scanning direction.
- In the present embodiment, the structure of the cameras 4a to 4c is devised so that they can be installed with the sub-scanning direction of the rolling shutter coinciding with the vertical direction. This minimizes distortion of the controller 6 in the video, because the user typically moves the controller 6 often in the horizontal direction and little in the vertical direction.
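- The following toy model, with assumed sensor numbers not taken from the patent, shows why this helps: with a vertically scanned sensor, horizontal motion displaces the controller by only the per-row delay times its vertical extent in rows, producing a small shear rather than a large stretch.

```python
# A toy model of rolling-shutter skew under assumed numbers: each of
# N_ROWS rows is read T_ROW after the previous one, so a point moving
# horizontally at v pixels/s shifts by v * T_ROW between adjacent rows,
# shearing vertical edges slightly instead of stretching the whole object.
N_ROWS = 1080
FRAME_READOUT_S = 0.010           # 10 ms full-frame readout, assumed
T_ROW = FRAME_READOUT_S / N_ROWS  # delay between adjacent rows

def horizontal_skew(speed_px_per_s: float, object_rows: int) -> float:
    """Horizontal displacement (pixels) across an object's vertical extent."""
    return speed_px_per_s * T_ROW * object_rows

# A pen spanning 100 rows, swept at 2000 px/s, skews by under 2 pixels:
print(horizontal_skew(2000.0, 100))  # ~1.85 px
```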
- FIG. 10 is a diagram showing the structure of the cameras 4a to 4c used for arranging the cameras 4a to 4c so that the sub-scanning direction of the rolling shutter is aligned with the vertical direction.
- the cameras 4a to 4c each have a shutter 41 and a screw hole 42 for fixing a tripod.
- the screw holes 42 are provided in the cameras 4a to 4c such that the axial direction is parallel to the sub-scanning direction of the shutter 41.
- By fixing the cameras 4a to 4c to tripods via these screw holes 42, the cameras can be arranged so that the sub-scanning direction of the rolling shutter coincides with the vertical direction. Therefore, it becomes possible to minimize the distortion of the controller 6 in the image.
- As described above, in the controller 6 according to the present embodiment, the LED 10c is provided at the upper end of the grip portion 6g, a portion that is unlikely to be hidden by the user's hand, so it is possible to detect the position and orientation of the controller 6 with high accuracy.
- Also, according to the tracking system 1 of the present embodiment, the distortion of the controller 6 in the image can be minimized even while using the cameras 4a to 4c having rolling shutters as the shutters 41, so the position and orientation of the controller 6 can be detected with high accuracy.
- the controller 6 is provided with seven LEDs (two LEDs 10a-1 and 10a-2, four LEDs 10b-1 to 10b-4, and one LED 10c).
- the number of seven is an example, and it goes without saying that other numbers of LEDs may be arranged.
- On the other hand, if too many LEDs are provided, they overlap in the images captured by the cameras 4a to 4c, making it difficult to distinguish individual LEDs, which is undesirable.
- The seven LEDs used in this embodiment represent a number optimized with this point in mind.
- FIG. 11 is a diagram showing a usage state of the tracking system 1 according to the modified example of the present embodiment.
- the tracking system 1 according to this modification is the same as the tracking system 1 according to this embodiment except that it has four cameras 4a to 4d.
- The cameras 4a to 4d are arranged so as to photograph the desk 100 from above its four corners.
- Also in this modification, the computer 2 can suitably detect the respective positions and tilts of the head-mounted display 5, the controller 6, and the position detection device 3 based on the images captured by the cameras 4a to 4d.
- The specific positions of the cameras 4a to 4d may be determined, as in the present embodiment, so that the cameras can capture the entire portion of the area E shown in FIG. 8 that is located on the desk 100.
- In the above embodiment, each LED arranged on the surface of the controller 6 is configured as a so-called point light source, but at least some of the LEDs may instead have a wider light-emitting area than a point light source.
- FIGS. 12(a) and 12(b) are diagrams showing a controller 6 according to a modification of the present embodiment.
- In this modification, the LED 10c provided on the surface of the upper end of the grip portion 6g and the LED 10b-4 provided at the rear end of the pen are LEDs having a wider light-emitting area than so-called point light sources.
- The LED 10c and the LED 10b-4 are each formed into a hemispherical shape by arranging them along the shape of their installation sites. In this way, the computer 2 can obtain the center and radius of each hemisphere from the circle appearing in the images of the cameras 4a to 4c, and can obtain coordinates from the images of the LEDs 10c and 10b-4 alone. Therefore, it becomes possible to detect the position and orientation of the controller 6 with even higher accuracy.
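- A sketch of the geometry implied here is shown below: a hemispherical emitter of known physical radius projects approximately to a circle, so its 3D center follows from the detected circle under a pinhole camera model. The circle detection itself is omitted, and all names and parameters are illustrative assumptions, not details from the patent.

```python
# Back-project a detected circle to the 3D center of a hemispherical
# emitter of known radius, assuming a pinhole camera model.
import numpy as np

def sphere_center_from_circle(u: float, v: float, r_px: float,
                              fx: float, fy: float,
                              cx: float, cy: float,
                              R_m: float) -> np.ndarray:
    """(u, v): circle center in pixels; r_px: circle radius in pixels;
    fx, fy, cx, cy: pinhole intrinsics; R_m: emitter radius in meters."""
    z = fx * R_m / r_px   # depth from apparent size
    x = (u - cx) * z / fx  # back-project the center ray
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```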
Description
2 Computer
3 Position detection device
4a-4d Cameras
5 Head-mounted display
6 Controller
6g Grip portion
6ga Tact top button
6gb Grab button
6gc, 6gd Tact buttons
6ge Dial button
6gf Recess
6p Pen portion
6pa, 6pb Pressure pads
6pc, 6pd Shift buttons
10a-1, 10a-2, 10b-1 to 10b-4, 10c LEDs
40 Image sensor
41 Shutter
42 Screw hole
50 Tripod
100 Desk
101 Desk chair
PT1 to PT3 Portions of the controller 6
Claims (11)
1. A controller comprising: a pen portion formed in a pen shape; a grip portion intersecting the axial direction of the pen portion; and a first light emitting unit arranged at the end of the grip portion that is nearer to the axis of the pen portion.
2. The controller according to claim 1, further comprising a first operation unit provided at the end of the grip portion that is nearer to the axis of the pen portion, wherein the first light emitting unit is arranged on the first operation unit.
3. The controller according to claim 2, further comprising a second operation unit provided so as to surround the first operation unit.
4. The controller according to claim 1, further comprising one or more second light emitting units arranged on the pen portion.
5. The controller according to claim 4, wherein the one or more second light emitting units include one or more third light emitting units arranged on the pen portion on the pen tip side and one or more fourth light emitting units arranged on the pen portion on the pen rear side opposite the pen tip side.
6. The controller according to claim 5, wherein the one or more third light emitting units are provided asymmetrically with respect to a plane containing the axial direction of the pen portion and the longitudinal direction of the grip portion.
7. The controller according to claim 5 or 6, wherein the one or more fourth light emitting units are provided asymmetrically with respect to a plane containing the axial direction of the pen portion and the longitudinal direction of the grip portion.
8. The controller according to claim 6, wherein the pen portion has a first side surface and a second side surface, the one or more third light emitting units are provided on the first side surface and the second side surface, and the number of the third light emitting units provided on the first side surface is the same as the number of the third light emitting units provided on the second side surface.
9. The controller according to claim 7, wherein the pen portion has a first side surface and a second side surface, the one or more fourth light emitting units are provided on the first side surface and the second side surface, and the number of light emitting units provided on the first side surface differs from the number of light emitting units provided on the second side surface.
10. A tracking system for tracking movement of the controller according to any one of claims 1 to 9, the tracking system comprising: one or more cameras each having a rolling shutter and arranged so that the sub-scanning direction of the rolling shutter coincides with the vertical direction; and a computer that tracks the movement of the controller based on images captured by the one or more cameras.
11. The tracking system according to claim 10, wherein the one or more cameras are used while fixed to a tripod provided with an upward-facing camera mounting screw, and each have a tripod-fixing screw hole whose axial direction is parallel to the sub-scanning direction of the rolling shutter.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP21933283.0A | 2021-03-22 | 2021-12-22 | Control unit and tracking system |
| JP2023508632A | 2021-03-22 | 2021-12-22 | |
| CN202180090279.5A | 2021-03-22 | 2021-12-22 | Controller and tracking system |
| US18/472,067 | 2021-03-22 | 2023-09-21 | Controller and tracking system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021047265 | 2021-03-22 | | |
| JP2021-047265 | 2021-03-22 | | |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/472,067 (Continuation, published as US20240012492A1) | Controller and tracking system | 2021-03-22 | 2023-09-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022201693A1 | 2022-09-29 |
Family
ID=83396735
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/047473 | Controller and tracking system | 2021-03-22 | 2021-12-22 |

Country Status (6)

| Country | Link |
|---|---|
| US (1) | US20240012492A1 |
| EP (1) | EP4318185A4 |
| JP (1) | JPWO2022201693A1 |
| CN (1) | CN116848494A |
| TW (1) | TW202240354A |
| WO (1) | WO2022201693A1 |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11143629A * | 1997-11-07 | 1999-05-28 | Seiko Epson Corp. | Remote coordinate input device and remote coordinate input method |
| WO2005024616A1 * | 2003-09-04 | 2005-03-17 | Matsushita Electric Industrial Co., Ltd. | Electronic device, input device, and portable electronic device using the same |
| JP2017010314A * | 2015-06-23 | 2017-01-12 | Ricoh Co., Ltd. | Image projection system, image projection device, pointing device, and video supply device |
| JP2017097696A * | 2015-11-26 | 2017-06-01 | Colopl, Inc. | Method and program for giving operation instructions to objects in a virtual space |
| WO2019044003A1 * | 2017-09-04 | 2019-03-07 | Wacom Co., Ltd. | Spatial position indication system |
| WO2019181118A1 * | 2018-03-23 | 2019-09-26 | Wacom Co., Ltd. | Three-dimensional position indicator and three-dimensional position detection system |
| WO2019225170A1 | 2018-05-21 | 2019-11-28 | Wacom Co., Ltd. | Position indicating device and spatial position indication system |
| US20200042111A1 * | 2018-08-03 | 2020-02-06 | Logitech Europe S.A. | Input device for use in an augmented/virtual reality environment |
| US20200333891A1 * | 2019-04-19 | 2020-10-22 | Apple Inc. | Stylus-based input system for a head-mounted device |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7268774B2 * | 1998-08-18 | 2007-09-11 | Candledragon, Inc. | Tracking motion of a writing instrument |
| WO2015031456A1 * | 2013-08-29 | 2015-03-05 | Interphase Corporation | Rolling shutter synchronization of a pointing device in an interactive display system |
| JP7258482B2 * | 2018-07-05 | 2023-04-17 | Canon Inc. | Electronic device |
- 2021-12-22: EP application EP21933283.0A filed (published as EP4318185A4, pending)
- 2021-12-22: PCT application PCT/JP2021/047473 filed (published as WO2022201693A1)
- 2021-12-22: CN application CN202180090279.5A filed (published as CN116848494A, pending)
- 2021-12-22: JP application JP2023508632A filed (published as JPWO2022201693A1, pending)
- 2021-12-28: TW application TW110149102A filed (published as TW202240354A, status unknown)
- 2023-09-21: US application US18/472,067 filed (published as US20240012492A1, pending)
Also Published As
Publication number | Publication date |
---|---|
CN116848494A (zh) | 2023-10-03 |
JPWO2022201693A1 (ja) | 2022-09-29 |
US20240012492A1 (en) | 2024-01-11 |
EP4318185A1 (en) | 2024-02-07 |
EP4318185A4 (en) | 2024-10-23 |
TW202240354A (zh) | 2022-10-16 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 21933283; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2023508632; Country of ref document: JP; Kind code of ref document: A |
| | WWE | WIPO information: entry into national phase | Ref document number: 202180090279.5; Country of ref document: CN |
| | WWE | WIPO information: entry into national phase | Ref document number: 2021933283; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2021933283; Country of ref document: EP; Effective date: 20231023 |