US11712619B2 - Handle controller - Google Patents

Handle controller

Info

Publication number
US11712619B2
Authority
US
United States
Prior art keywords
light
emitting
marks
emitting marks
handle controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/877,219
Other languages
English (en)
Other versions
US20220362659A1 (en
Inventor
Tao Wu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Pico Technology Co Ltd
Original Assignee
Qingdao Pico Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Pico Technology Co Ltd filed Critical Qingdao Pico Technology Co Ltd
Publication of US20220362659A1 publication Critical patent/US20220362659A1/en
Assigned to QINGDAO PICO TECHNOLOGY CO., LTD. reassignment QINGDAO PICO TECHNOLOGY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WU, TAO
Application granted granted Critical
Publication of US11712619B2 publication Critical patent/US11712619B2/en

Classifications

    • A63F 13/213: Input arrangements for video game devices characterised by their sensors, purposes or types, comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A63F 13/235: Input arrangements for interfacing with the game device, using a wireless connection, e.g. infrared or piconet
    • A63F 2300/8082: Features of games using an electronically generated display, specially adapted for executing a specific type of game: virtual reality

Definitions

  • Embodiments of the present disclosure relate to the technical field of virtual reality, and particularly to a handle controller.
  • As a user interacts with scenes of virtual reality, augmented reality and mixed reality, a handle plays an increasingly important role.
  • For example, an electromagnetic transmitter may be embedded in a handle and an electromagnetic receiver in a VR headset integrated machine, so that the position and attitude information of the handle in three-dimensional space can be calculated in real time through the principle of electromagnetic tracking.
  • Similarly, an ultrasonic transmitter may be embedded in the handle and an ultrasonic receiver in the VR headset integrated machine, so that the position and attitude information of the handle in three-dimensional space can be calculated in real time through the principle of ultrasonic tracking.
  • The electromagnetic sensor of such a handle, however, is sensitive to complex electromagnetic signals in the environment and tends to be disturbed by them, causing it to generate erroneous electromagnetic tracking data for the handle.
  • For example, the electromagnetic sensor is disturbed by other electromagnetic signals when the handle is relatively close to a computer host, audio equipment, a TV, a refrigerator, etc., degrading the tracking performance of the handle. Handles using an electromagnetic sensor are therefore greatly limited in use, as are handles using an ultrasonic sensor.
  • An object of an embodiment of the present disclosure is to provide an improved handle controller.
  • a handle controller including:
  • the light-emitting unit being provided at an end of the handle body and forming a preset angle with the handle body;
  • the light-emitting unit includes a first surface, a second surface, a plurality of first light-emitting marks and a plurality of second light-emitting marks, the second surface covering the first surface; the first light-emitting marks and the second light-emitting marks are both provided on the first surface, and the plurality of first light-emitting marks are distributed annularly;
  • the first light-emitting marks and the second light-emitting marks are configured to illuminate so as to be captured by an imaging device
  • the first light-emitting marks illuminate for a first period
  • the second light-emitting marks illuminate for a second period
  • In an embodiment, there are seventeen first light-emitting marks and two second light-emitting marks, and the two second light-emitting marks are symmetrically provided among the seventeen first light-emitting marks.
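As a rough geometric sketch of this annular layout (the exact spacing and symmetry rule are not specified in the text, so even spacing and mirror symmetry are assumptions):

```python
import math

def ring_positions(n_marks, radius=1.0):
    """Evenly space n_marks around a circle of the given radius;
    returns (x, y) coordinates in the plane of the ring."""
    step = 2 * math.pi / n_marks
    return [(radius * math.cos(i * step), radius * math.sin(i * step))
            for i in range(n_marks)]

# Seventeen first light-emitting marks distributed annularly.
first_marks = ring_positions(17)

# Two second light-emitting marks placed symmetrically about the
# ring's vertical axis (an assumed reading of "symmetrically
# provided among" the first marks).
second_marks = [(math.cos(math.pi / 3), math.sin(math.pi / 3)),
                (math.cos(2 * math.pi / 3), math.sin(2 * math.pi / 3))]

assert len(first_marks) == 17
# Mirror symmetry: equal heights, opposite x positions.
assert abs(second_marks[0][1] - second_marks[1][1]) < 1e-12
assert abs(second_marks[0][0] + second_marks[1][0]) < 1e-12
```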
  • In another embodiment, there are twenty first light-emitting marks and two second light-emitting marks; the second light-emitting marks are in a strip shape, one is provided at an upper edge of the first surface and the other at a lower edge of the first surface, and the twenty first light-emitting marks are distributed between the two second light-emitting marks.
  • the preset angle is 40°-135°.
  • the first light-emitting marks have a wavelength in a range of 450 nm-690 nm.
  • both the first surface and the second surface are annular.
  • a ratio of the radius of the first surface to the radius of the second surface is 1:1.5.
  • the plurality of first light-emitting marks are connected in series.
  • the plurality of second light-emitting marks are connected in series.
  • a wireless transmission module provided on the handle body.
  • a sensor provided on the handle body.
  • a blinking frequency of the first light-emitting marks is 30 Hz.
  • both the first period and the second period are 15 µs-100 µs.
  • A beneficial effect of the technical solutions of the embodiments of the present disclosure is that, by capturing the first light-emitting marks and the second light-emitting marks on the handle controller with the imaging device, the position and attitude information of the handle controller in three-dimensional space can be calculated accurately and in real time, making the controller very easy to operate. Moreover, the capturing process is not influenced by electromagnetic wave signals or ultrasonic signals in the surrounding environment, and thus the handle controller can be widely applied.
  • FIG. 1 is a schematic structural view of a handle controller according to an embodiment of the present disclosure.
  • FIG. 2 is a schematic view of a first surface in FIG. 1 when unfolded.
  • FIG. 3 is a schematic structural view of a handle controller according to another embodiment of the present disclosure.
  • FIG. 4 is a schematic view of a first surface in FIG. 3 when unfolded.
  • 1. handle body; 2. light-emitting unit; 3. first light-emitting mark; 4. second light-emitting mark.
  • This embodiment provides a handle controller that facilitates capturing, by an imaging device, of first light-emitting marks 3 and second light-emitting marks 4 on the handle controller, thereby enabling accurate, real-time calculation of the position and attitude information of the handle controller in three-dimensional space.
  • The handle controller includes a handle body 1 and a light-emitting unit 2; the light-emitting unit 2 is provided at an end of the handle body 1 and forms a preset angle with the handle body 1; the light-emitting unit 2 includes a first surface, a second surface, a plurality of first light-emitting marks 3, and a plurality of second light-emitting marks 4.
  • The second surface covers the first surface; the first light-emitting marks 3 and the second light-emitting marks 4 are both provided on the first surface, and the plurality of first light-emitting marks 3 are annularly distributed; the first light-emitting marks 3 and the second light-emitting marks 4 are configured to illuminate so as to be captured by the imaging device; the first light-emitting marks 3 illuminate for a first period, and the second light-emitting marks 4 illuminate for a second period.
  • The imaging device is implemented as two or more tracking cameras built into a VR headset integrated machine, which capture the first light-emitting marks 3 and second light-emitting marks 4 on the handle controller in real time.
  • By detecting the position area of the second light-emitting marks 4 on an image of the handle controller, a rough position of the light spots of the first light-emitting marks 3 can be determined on the image. Then, within that position area, the two-dimensional position coordinates of the light spots corresponding to the first light-emitting marks 3 and the second light-emitting marks 4 on the tracking-camera image are detected in real time by a computer image processing algorithm.
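A minimal sketch of this region-restricted detection step, using a NumPy-only brightness centroid in place of the unspecified image processing algorithm (the threshold and frame size are hypothetical):

```python
import numpy as np

def detect_spots(image, roi, threshold=200):
    """Return the centroid of bright pixels inside a region of
    interest roi = (x, y, w, h) of a grayscale image, or None
    when no pixel in the region exceeds the threshold."""
    x, y, w, h = roi
    patch = image[y:y + h, x:x + w]
    ys, xs = np.nonzero(patch >= threshold)
    if xs.size == 0:
        return None
    # Map the centroid back to full-image coordinates.
    return (x + xs.mean(), y + ys.mean())

# Synthetic 64x64 frame with one bright 3x3 spot centred at (20, 30).
frame = np.zeros((64, 64), dtype=np.uint8)
frame[29:32, 19:22] = 255

cx, cy = detect_spots(frame, roi=(10, 20, 30, 30))
assert (cx, cy) == (20.0, 30.0)
```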
  • the number of the first light-emitting marks 3 is seventeen, and the number of the second light-emitting marks 4 is two.
  • the two second light-emitting marks 4 are symmetrically provided among the seventeen first light-emitting marks 3 .
  • The seventeen first light-emitting marks 3 and the two second light-emitting marks 4, arranged in a certain geometric shape, illuminate synchronously according to a certain duty cycle, and the illuminating period may range from 15 µs to 100 µs.
  • the first and last frames of the illuminating period of the first light-emitting marks 3 must be the same as those of the second light-emitting marks 4 .
  • the number of the first light-emitting marks 3 is twenty, and the number of the second light-emitting marks 4 is two.
  • each of the second light-emitting marks 4 is in the shape of a strip, one of the second light-emitting marks 4 is arranged on the upper edge of the first surface, the other of the second light-emitting marks 4 is arranged on the lower edge of the first surface, and the twenty first light-emitting marks 3 are distributed between the two second light-emitting marks 4 .
  • The physical distribution on the first surface of the twenty first light-emitting marks 3 of the left-hand handle controller and that of the twenty first light-emitting marks 3 of the right-hand handle controller may be symmetric according to a certain symmetry rule.
  • the preset angle is 40°-135°, which not only conforms to the user's usage habits and is convenient for holding the handle body 1 , but also facilitates the imaging device to capture the first light-emitting marks 3 and the second light-emitting marks 4 .
  • the wavelength range of the first light-emitting marks 3 is 450 nm-690 nm, which facilitates the imaging device to capture the first light-emitting marks 3 .
  • Alternatively, the first light-emitting marks 3 may be in the infrared band, with a wavelength of 850 nm or 940 nm.
  • both the first surface and the second surface are annular.
  • a ratio of the radius of the first surface to the radius of the second surface is 1:1.5, which facilitates the arrangement of the first light-emitting marks 3 and the second light-emitting marks 4 between the first surface and the second surface, as well as the protection of the first light-emitting marks 3 and the second light-emitting marks 4 by the second surface.
  • a plurality of the first light-emitting marks 3 are connected in series and provided on the same circuit board, which facilitates simultaneous illumination of the first light-emitting marks 3 and thereby facilitating capturing of the first light-emitting marks 3 by the imaging device.
  • a plurality of the second light-emitting marks 4 are connected in series and provided on the same circuit board, which facilitates simultaneous illumination of the second light-emitting marks 4 and thereby facilitating capturing of the second light-emitting marks 4 by the imaging device.
  • a wireless transmission module is further included, which is provided on the handle body 1 .
  • The handle controller also cooperates with a wireless transmission module built into the VR headset integrated machine. A control processing unit built into the VR headset integrated machine synchronizes the blinking of each first light-emitting mark 3 and each second light-emitting mark 4 of the handle controller with the exposure shutter of the tracking camera built into the VR headset integrated machine. That is, the first light-emitting marks 3 and second light-emitting marks 4 on the handle controller illuminate during the opening period of the exposure shutter of the tracking camera in each frame. Considering power consumption and the actual use environment of the handle, among other aspects, the illuminating period of the first light-emitting marks 3 and the second light-emitting marks 4 is generally about 15 µs-100 µs.
  • The exposure period of the tracking camera is generally about 30 µs-150 µs, so that the illuminating periods of the first light-emitting marks 3 and the second light-emitting marks 4 can be fully captured within the exposure period, and the capture frequency of the tracking camera is adjusted to 60 Hz.
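A quick sanity check on these timing figures (values assumed to be in microseconds, as in the text): the LED illuminating period must fit inside the camera's exposure window, which in turn must fit inside one frame at 60 Hz.

```python
# Timing figures from the text (microseconds, assumed).
LED_PULSE_US = (15, 100)    # illuminating period of the marks
EXPOSURE_US = (30, 150)     # exposure period of the tracking camera
CAPTURE_HZ = 60             # capture frequency of the tracking camera

frame_period_us = 1_000_000 / CAPTURE_HZ   # one frame at 60 Hz

# The longest pulse still fits in the longest exposure, even the
# shortest pulse fits in the shortest exposure, and the exposure
# window is a tiny fraction of the frame period.
assert LED_PULSE_US[1] <= EXPOSURE_US[1]
assert LED_PULSE_US[0] <= EXPOSURE_US[0]
assert EXPOSURE_US[1] < frame_period_us
```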
  • the first light-emitting marks 3 and the second light-emitting marks 4 will illuminate separately.
  • the second light-emitting marks 4 illuminate at odd-numbered frames captured by the tracking Camera
  • the first light-emitting marks 3 illuminate at even-numbered frames captured by the tracking Camera.
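The alternation described above can be sketched as a simple frame-parity rule (the function name is illustrative):

```python
def illuminating_group(frame_index):
    """Which group of marks lights up in a given captured frame:
    second marks on odd-numbered frames, first marks on even ones."""
    return "second" if frame_index % 2 == 1 else "first"

assert illuminating_group(1) == "second"   # odd frame
assert illuminating_group(2) == "first"    # even frame
assert illuminating_group(3) == "second"
```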
  • both the first light-emitting marks 3 and the second light-emitting marks 4 can be LED lights.
  • At the even-numbered frames of the tracking camera, the image area corresponding to the second light-emitting marks 4 in the previous odd-numbered frame is taken, and its rectangle is extended by 1.5 times along both the long side and the short side. This forms the image area where the light spots of the first light-emitting marks 3 are expected in the current frame; the light spots of the first light-emitting marks 3 are detected only within this area, and the image outside it is not searched.
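A sketch of this 1.5x rectangle extension, growing the box about its centre and clamping to the image bounds (the clamping behaviour and image size are assumptions, not stated in the text):

```python
def expand_roi(x, y, w, h, factor=1.5, bounds=(640, 480)):
    """Grow a rectangle about its centre by `factor` along both the
    long and short sides, clamped to the image bounds (width, height)."""
    cx, cy = x + w / 2, y + h / 2
    nw, nh = w * factor, h * factor
    nx, ny = max(0.0, cx - nw / 2), max(0.0, cy - nh / 2)
    nw = min(bounds[0] - nx, nw)
    nh = min(bounds[1] - ny, nh)
    return nx, ny, nw, nh

# A 40x20 box around the second marks grows to 60x30, re-centred.
assert expand_roi(100, 100, 40, 20) == (90.0, 95.0, 60.0, 30.0)
```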
  • The human eye does not perceive flicker of light at frequencies above about 50 Hz.
  • The light spots of the first light-emitting marks 3 and the second light-emitting marks 4 are each captured by the VR head-mounted tracking camera at a frequency of 30 Hz. That is, under the above steps the blinking frequency of the first light-emitting marks 3 and the second light-emitting marks 4 on the handle controller is 30 Hz, which is unfriendly to users observing in a real environment, since the human eye perceives flicker at 30 Hz.
  • Therefore, during the period from the start of each tracking-camera frame to the moment before the exposure shutter opens, the first light-emitting marks 3 and second light-emitting marks 4 of the handle controller additionally illuminate twice on average, with each illuminating period again being 15 µs-100 µs.
  • At 33 frames per second, the first light-emitting marks 3 and second light-emitting marks 4 of the handle controller thus illuminate 99 times per second, which is above the human eye's sensitivity threshold to flicker of light.
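The flash-rate arithmetic behind this figure: two extra pulses before the shutter opens plus one pulse during the exposure gives three flashes per frame.

```python
# Flash-rate arithmetic from the text, using its 33 Hz frame figure.
frames_per_second = 33
flashes_per_frame = 2 + 1          # two pre-exposure pulses + one captured pulse
flashes_per_second = frames_per_second * flashes_per_frame

assert flashes_per_second == 99    # well above the ~50 Hz flicker threshold
assert flashes_per_second > 50
```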
  • the illuminating period of all the first light-emitting marks 3 and the second light-emitting marks 4 is the same in each frame, which facilitates accurate capturing of the first light-emitting marks 3 and the second light-emitting marks 4 by the imaging device.
  • The tracking distance of the handle controller can range from 3 cm to 150 cm, which is convenient for a user to interact with virtual reality, augmented reality and mixed reality via the handle controller.
  • a sensor provided on the handle body 1 .
  • An IMU (inertial measurement unit) sensor is embedded in the handle controller. The sensor is at least a six-axis sensor, that is, an accelerometer unit and a gyroscope unit; it may also be a nine-axis sensor, that is, an accelerometer unit, a gyroscope unit and a geomagnetic unit.
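A minimal sketch of the two IMU configurations just described, with six axes (accelerometer + gyroscope) or nine axes (adding a geomagnetic unit); the field names are illustrative, not from the source:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ImuSample:
    accel: Tuple[float, float, float]                  # accelerometer, m/s^2
    gyro: Tuple[float, float, float]                   # gyroscope, rad/s
    mag: Optional[Tuple[float, float, float]] = None   # geomagnetic, nine-axis only

def axis_count(s: ImuSample) -> int:
    """Six axes without a geomagnetic unit, nine with one."""
    return 6 if s.mag is None else 9

six = ImuSample(accel=(0.0, 0.0, 9.8), gyro=(0.0, 0.0, 0.0))
nine = ImuSample(accel=(0.0, 0.0, 9.8), gyro=(0.0, 0.0, 0.0),
                 mag=(22.0, 5.0, -40.0))
assert axis_count(six) == 6
assert axis_count(nine) == 9
```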
  • the output frequency of the IMU inertial navigation sensor is at least 90 Hz.
  • A wireless network transmission unit is embedded in the handle controller, the transmission protocol of which may be a 2.4 G network protocol or a BLE (Bluetooth Low Energy) protocol.
  • the wireless transmission frequency is 100 Hz or more.
  • the blinking frequency of the first light-emitting marks 3 is 30 Hz, which facilitates the imaging device to capture the first light-emitting marks 3 .
  • the first light-emitting marks 3 and the second light-emitting marks 4 have an illuminating period of 15 µs-100 µs, which facilitates the imaging device to accurately capture the first light-emitting marks 3 and the second light-emitting marks 4.
  • For the handle controller, by capturing the first light-emitting marks 3 and the second light-emitting marks 4 with the imaging device, the position and attitude information of the handle controller in three-dimensional space can be calculated accurately and in real time, and the controller is very easy to operate. Moreover, the capturing process is not influenced by electromagnetic wave signals and ultrasonic signals in the surrounding environment, and thus the handle controller can be widely applied.
US17/877,219 2020-11-09 2022-07-29 Handle controller Active US11712619B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202011242190.XA CN112451962B (zh) 2020-11-09 2020-11-09 Handle control tracker
CN202011242190.X 2020-11-09
PCT/CN2021/118171 WO2022095605A1 (zh) 2020-11-09 2021-09-14 Handle control tracker

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118171 Continuation WO2022095605A1 (zh) 2020-11-09 2021-09-14 Handle control tracker

Publications (2)

Publication Number Publication Date
US20220362659A1 US20220362659A1 (en) 2022-11-17
US11712619B2 true US11712619B2 (en) 2023-08-01

Family

ID=74826331

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/877,219 Active US11712619B2 (en) 2020-11-09 2022-07-29 Handle controller

Country Status (3)

Country Link
US (1) US11712619B2 (zh)
CN (1) CN112451962B (zh)
WO (1) WO2022095605A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112451962B (zh) * 2020-11-09 2022-11-29 Qingdao Pico Technology Co., Ltd. Handle control tracker
CN113225870B (zh) * 2021-03-29 2023-12-22 Qingdao Pico Technology Co., Ltd. VR device positioning method and VR device
CN113318435A (zh) * 2021-04-27 2021-08-31 Qingdao Pico Technology Co., Ltd. Control method and apparatus for a handle control tracker, and head-mounted display device
CN117017496B (zh) * 2023-09-28 2023-12-26 True Health (Beijing) Medical Technology Co., Ltd. Flexible body-surface positioning device and puncture surgery navigation and positioning system

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060277571A1 (en) * 2002-07-27 2006-12-07 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US20110081969A1 (en) 2005-08-22 2011-04-07 Akio Ikeda Video game system with wireless modular handheld controller
CN1794010A (zh) 2005-12-19 2006-06-28 北京威亚视讯科技有限公司 位置姿态跟踪系统
US20100184513A1 (en) * 2007-09-07 2010-07-22 Konami Digital Entertainment Co., Ltd. Motion determination apparatus, game apparatus therefor, and computer program therefor
US20130307772A1 (en) 2012-05-21 2013-11-21 Everest Display Inc. Interactive projection system with light spot identification and control method thereof
US20160364910A1 (en) * 2015-06-11 2016-12-15 Oculus Vr, Llc Hand-Held Controllers with Light-Emitting Diodes Synchronized to an External Camera
US11016566B1 (en) * 2015-11-05 2021-05-25 Facebook Technologies, Llc Controllers with asymmetric tracking patterns
CN105511649A (zh) 2016-01-06 2016-04-20 王帆 一种多点定位系统及多点定位方法
CN110573993A (zh) 2017-04-26 2019-12-13 脸谱科技有限责任公司 使用led追踪环的手持控制器
US20180311575A1 (en) * 2017-04-26 2018-11-01 Oculus Vr, Llc Hand-held controller using led tracking ring
US20180329484A1 (en) * 2017-05-09 2018-11-15 Microsoft Technology Licensing, Llc Object and environment tracking via shared sensor
US20180329517A1 (en) * 2017-05-09 2018-11-15 Microsoft Technology Licensing, Llc Controlling handheld object light sources for tracking
CN107219963A (zh) 2017-07-04 2017-09-29 深圳市虚拟现实科技有限公司 虚拟现实手柄图形空间定位方法和系统
US20190012835A1 (en) * 2017-07-07 2019-01-10 Microsoft Technology Licensing, Llc Driving an Image Capture System to Serve Plural Image-Consuming Processes
CN108257177A (zh) 2018-01-15 2018-07-06 天津锋时互动科技有限公司深圳分公司 基于空间标识的定位系统与方法
CN110119192A (zh) 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 视觉交互装置
US20190302903A1 (en) * 2018-03-30 2019-10-03 Microsoft Technology Licensing, Llc Six dof input device
US20190318501A1 (en) * 2018-04-16 2019-10-17 Microsoft Technology Licensing, Llc Tracking pose of handheld object
CN110837295A (zh) 2019-10-17 2020-02-25 重庆爱奇艺智能科技有限公司 一种手持控制设备及其追踪定位的方法、设备与系统
CN111354018A (zh) 2020-03-06 2020-06-30 合肥维尔慧渤科技有限公司 一种基于图像的物体识别方法、装置及系统
CN111459279A (zh) 2020-04-02 2020-07-28 重庆爱奇艺智能科技有限公司 主动式补光设备、3dof手柄、vr设备及追踪系统
CN112451962A (zh) 2020-11-09 2021-03-09 青岛小鸟看看科技有限公司 一种手柄控制追踪器

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
First Office Action issued by the Chinese Patent Office dated May 5, 2022 with respect to Chinese priority application No. 202011242190.X.

Also Published As

Publication number Publication date
CN112451962A (zh) 2021-03-09
US20220362659A1 (en) 2022-11-17
CN112451962B (zh) 2022-11-29
WO2022095605A1 (zh) 2022-05-12

Similar Documents

Publication Publication Date Title
US11712619B2 (en) Handle controller
US11460698B2 (en) Electromagnetic tracking with augmented reality systems
CN110647237B (zh) 在人工现实环境中基于手势的内容共享
US10198866B2 (en) Head-mountable apparatus and systems
WO2016184107A1 (zh) 用于视线焦点定位的可穿戴设备及视线焦点定位方法
US20180160079A1 (en) Pupil detection device
US11776242B2 (en) Augmented reality deep gesture network
US20170205903A1 (en) Systems and methods for augmented reality
JP2021530814A (ja) 位置ベクトルを使用して半球曖昧性を解決するための方法およびシステム
KR20180110051A (ko) 증강 현실을 위한 시스템들 및 방법들
KR20150093831A (ko) 혼합 현실 환경에 대한 직접 상호작용 시스템
TW202127105A (zh) 關於頭戴式顯示器的內容穩定
WO2022005698A1 (en) Visual-inertial tracking using rolling shutter cameras
JP2021060627A (ja) 情報処理装置、情報処理方法、およびプログラム
WO2021252242A2 (en) Augmented reality environment enhancement
US11216066B2 (en) Display device, learning device, and control method of display device
CN115735150A (zh) 增强现实眼戴器与3d服装
KR100871867B1 (ko) 사용자 상황 정보 인지에 기초한 개인별 콘텐츠 서비스장치 및 방법
JP5732446B2 (ja) ヘッドマウントディスプレイ、および動き検出方法
US20200159339A1 (en) Desktop spatial stereoscopic interaction system
CN108803861A (zh) 一种交互方法、设备及系统
TWI836498B (zh) 用於配件配對的方法、系統以及記錄介質
Sorger Alternative User Interfaces

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: SPECIAL NEW

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: QINGDAO PICO TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WU, TAO;REEL/FRAME:062928/0221

Effective date: 20230227

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE