WO2019033322A1 - Handheld controller, tracking and positioning method, and system - Google Patents

Handheld controller, tracking and positioning method, and system

Info

Publication number
WO2019033322A1
Authority
WO
WIPO (PCT)
Prior art keywords
handle
bracket
identification pattern
hand
handheld controller
Prior art date
Application number
PCT/CN2017/097738
Other languages
English (en)
French (fr)
Inventor
李威
饶碧晟
戴景文
贺杰
Original Assignee
广东虚拟现实科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东虚拟现实科技有限公司
Priority to US16/314,400 (US20190384419A1)
Priority to CN201780007656.8A (CN109069920B)
Priority to PCT/CN2017/097738 (WO2019033322A1)
Publication of WO2019033322A1


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245 Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/98 Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202 Constructional details or processes of manufacture of the input device
    • G06F3/0219 Special purpose keyboards
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Definitions

  • the present invention relates to the field of computer entertainment technologies, and in particular, to a handheld controller, a tracking positioning method, and a system.
  • the interactive control technology is an important application direction in the fields of virtual reality, augmented reality, and mixed reality, and has strongly driven the rapid development of the VR/AR/MR field.
  • the handheld controller (handle) is an indispensable hardware device for interactive control, providing strong support for interactive control.
  • the user can realize the human-computer interaction function by controlling the control buttons (buttons, triggers, trackpads, etc.) of the handheld controller.
  • the current tracking and positioning of the controller is basically determined by optical methods, such as by infrared or adding light spots.
  • special equipment is required for infrared tracking, and recognizing added light spots introduces a delay: a complete strobe cycle is required to recognize a light spot, and the strobe frequency of the light spots must be precisely controlled.
  • An object of the embodiments of the present invention is to provide a handheld controller, a tracking and positioning method, and a system to solve the above problems.
  • a first aspect of an embodiment of the present invention provides a hand-held controller including: a handle and a bracket.
  • the handle includes an input device for receiving an input operation of the user.
  • the bracket is coupled to the handle, and the outer surface of the bracket has an identification pattern.
  • the bracket is, for example, ring-shaped.
  • the handle includes opposing first and second ends, the first end being coupled to the bracket and the second end being remote from the bracket.
  • the input device of the handle is located at the second end, the handle being configured such that the user can grip the handle through the center of the bracket and perform an input operation through the input device.
  • the handle has a first angle with the plane in which the bracket is located.
  • the handle is located within a space defined by the inner surface of the bracket.
  • the input device of the handle is located at a first end, the handle being configured such that the user can directly grip the handle outside of the bracket and perform an input operation through the input device.
  • the bracket has a notch, the bracket being attached to the handle at one end of the notch, the handle being configured such that the user can grip the handle through the center of the bracket and perform an input operation through the input device.
  • the outer surface of the bracket includes a first surface and a second surface, the first surface intersecting the second surface at the circumscribed circle of the bracket, the identification pattern being disposed on at least one of the first surface and the second surface.
  • the identification pattern is disposed on the first surface and the second surface, and the identification pattern disposed on the first surface is different from the identification pattern disposed on the second surface.
  • the outer surface of the bracket is a circular arc surface and the identification pattern is disposed on the arcuate surface.
  • the outer surface of the bracket is formed by splicing of differently shaped unit panels, the patterns on each unit panel collectively constituting the identification pattern.
  • the unit panel comprises a hexagonal panel, a pentagonal panel, a triangular panel, or a trapezoidal panel.
  • the identification pattern includes a background and marker points distributed over the background, the colors or brightnesses of the background and the marker points being two colors or brightnesses distinguishable by the imaging device.
  • the marker points are the same size and evenly distributed throughout the background.
  • marker points of different sizes are distributed at intervals over the background.
  • the marker points are circular, polygonal or square.
  • the background is black, the marker point is white, or the background is white and the marker point is black.
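As an illustration of the pattern geometry described in these embodiments, the short Python sketch below renders a flattened strip of same-size white circular marker points evenly distributed on a black background. It is only a hedged illustration: the strip dimensions, dot count, and radius are assumptions, not values from the patent.

```python
# Illustrative sketch only: renders a flattened identification-pattern strip
# (white circular marker points evenly spaced on a black background).
import numpy as np
import cv2

def make_marker_strip(width=800, height=120, n_points=10, radius=18):
    strip = np.zeros((height, width), dtype=np.uint8)  # black background
    step = width // n_points
    for i in range(n_points):
        cx = step // 2 + i * step
        cv2.circle(strip, (cx, height // 2), radius, 255, thickness=-1)  # white dot
    return strip

cv2.imwrite("marker_strip.png", make_marker_strip())
```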
  • a second aspect of the embodiments of the present invention provides a hand-held controller, including: a handle including an input device for receiving an input operation of a user; a bracket coupled to the handle, an outer surface of the bracket having an identification pattern; and a microcontroller coupled to the input device, the microcontroller being configured to receive and process data or signals from the input device, the microcontroller being disposed within the handle or the bracket.
  • a third aspect of the embodiments of the present invention provides a tracking and positioning system, including a terminal device, an imaging device, and the above-mentioned handheld controller.
  • the imaging device is configured to collect an identification pattern.
  • a fourth aspect of the embodiments of the present invention provides a tracking and positioning method, which is applied to a tracking system.
  • the system includes a terminal device, an imaging device, and a handheld controller.
  • the handheld controller includes a handle and a bracket connected to the handle, and the handle includes an input device.
  • the outer surface of the bracket has an identification pattern
  • the method includes: the imaging device acquires an image of the identification pattern; and the terminal device performs tracking and positioning on the handheld controller according to the identification pattern.
  • the handheld controller is further provided with a sensor for collecting attitude data, and the terminal device performing tracking and positioning on the handheld controller according to the identification pattern includes: the terminal device performing tracking and positioning on the handheld controller according to the identification pattern and the attitude data collected by the sensor.
  • the terminal device performing tracking and positioning on the handheld controller according to the identification pattern includes: the terminal device obtaining the position and orientation of the imaging device relative to a specific point of the handheld controller by recognizing the marker points in the pattern and using the three-dimensional structure information of the pattern's marker points; and the terminal device tracking and positioning the handheld controller according to the position and orientation.
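The patent does not name a specific algorithm, but recovering position and orientation from known 3D marker coordinates and their 2D image projections is commonly solved as a Perspective-n-Point problem. Below is a minimal, non-authoritative sketch using OpenCV's cv2.solvePnP; the 3D marker coordinates, detected 2D points, and camera intrinsics are all placeholder assumptions.

```python
import numpy as np
import cv2

# Known 3D positions of marker points on the bracket, in the controller's
# own coordinate frame (placeholder values, not from the patent).
object_points = np.array([
    [0.00, 0.05, 0.0],
    [0.05, 0.00, 0.0],
    [0.00, -0.05, 0.0],
    [-0.05, 0.00, 0.0],
], dtype=np.float64)

# 2D pixel locations of the same marker points detected in the camera image.
image_points = np.array([
    [320.0, 200.0],
    [380.0, 240.0],
    [320.0, 280.0],
    [260.0, 240.0],
], dtype=np.float64)

# Camera intrinsics (assumed calibrated beforehand).
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # rotation of controller w.r.t. camera
    print("position:", tvec.ravel())
    print("orientation:\n", R)
```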
  • the tracking and positioning of the handheld controller can be realized by setting the identification pattern on the handheld controller.
  • compared with the prior art, the strobe frequency of light spots does not need to be precisely controlled, and the structure is simpler and the cost lower.
  • the controller design can further reduce the control requirements for the imaging device, without the need to specially adjust the imaging device parameters to match the tracking and positioning of an actively illuminated handheld controller.
  • FIG. 1 is a schematic structural diagram of a positioning and tracking system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a terminal device according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a handheld controller according to an embodiment of the present invention.
  • FIGS. 4A to 4D are schematic diagrams showing an identification pattern provided by an embodiment of the present invention.
  • FIG. 5 is an exemplary schematic diagram of another identification pattern provided by an embodiment of the present invention.
  • FIG. 6 is an exemplary schematic diagram of still another identification pattern provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of the use of a handheld controller according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of the use of another handheld controller according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of another handheld controller according to an embodiment of the present invention.
  • Figure 10 is a schematic view showing the use of the hand-held controller shown in Figure 9;
  • FIG. 11 is a schematic structural diagram of still another handheld controller according to an embodiment of the present invention.
  • Figure 12 is a schematic view showing the use of the hand-held controller shown in Figure 11;
  • FIG. 13 is an electrical block diagram of a handheld controller according to an embodiment of the present invention.
  • FIG. 14 is a schematic flowchart of a positioning and tracking method according to an embodiment of the present invention.
  • the term "horizontal" simply means that the direction is more horizontal relative to "vertical"; it does not mean that the structure must be completely horizontal, but that it may be slightly inclined.
  • FIG. 1 is an exemplary block diagram of a tracking and positioning system according to an embodiment of the present invention.
  • tracking positioning system 100 can include a handheld controller 120, an imaging device 140 having an image sensor 142, and a terminal device 160.
  • the outer surface of the hand-held controller 120 has an identification pattern.
  • the image taken by the imaging device 140 may include an identification pattern.
  • the identification pattern includes a background and marker points distributed over the background according to a specific rule, the colors or brightnesses of the background and the marker points being two colors or brightnesses distinguishable by the imaging device. For example, the background is black and the marker points are white, or the background is white and the marker points are black.
  • the terminal device 160 can track and locate the handheld controller 120 based on the identification pattern on the handheld controller 120 captured by the imaging device 140.
  • the embodiment of the present invention can track and locate the handheld controller based on the identification pattern on the handheld controller, thereby replacing an actively illuminated handheld controller and avoiding the need to control a light source and its frequency, which simplifies the structure of the handheld controller and saves costs.
  • the controller design can further reduce the control requirements for the imaging device without the need to specifically adjust the imaging device parameters to match the tracking positioning of the active illumination handheld controller.
  • Imaging device 140 can be any device capable of capturing an image of an object within its field of view.
  • imaging device 140 may not have a fixed location; for example, it may be worn by a user (e.g., as part of a head mounted display device on the user's head) and may move with the user. FIG. 1 takes as an example the imaging device 140 disposed on a head mounted display device.
  • imaging device 140 can be placed in a fixed position, for example, it can be placed on a table or shelf. Imaging device 140 can be configured to capture images of objects within its field of view at different locations.
  • Imaging device 140 can include an image sensor 142.
  • the image sensor may be a CMOS (Complementary Metal Oxide Semiconductor) sensor, a CCD (Charge-coupled Device) sensor, or the like.
  • imaging device 140 can be configured to capture multiple images at different points in time over a period of time. For example, when the handheld controller 120 moves within the field of view of the imaging device 140, the imaging device 140 may capture images of the handheld controller 120 at different locations during that period. The imaging device 140 can also obtain time information when capturing each image and may send the time information along with the images to terminal device 160 for further processing. In an embodiment of the invention, the terminal device 160 may be configured to track and locate the handheld controller 120 by identifying the identification pattern included in the images.
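A minimal sketch of the timestamped multi-image capture described above, assuming a standard OpenCV-accessible camera; the camera index and the capture duration are assumptions for illustration.

```python
import time
import cv2

cap = cv2.VideoCapture(0)  # camera index is an assumption
frames = []
t_end = time.monotonic() + 1.0  # capture for one second
while time.monotonic() < t_end:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append((time.monotonic(), frame))  # keep time info with each image
cap.release()
print(f"captured {len(frames)} timestamped frames")
```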
  • imaging device 140 may also include a position sensor (not shown) for determining the position of imaging device 140. Imaging device 140 may also be configured to transmit location data to terminal device 160. For example, imaging device 140 can include a GPS sensor configured to transmit coordinate data to terminal device 160.
  • the imaging device 140 can communicate with the terminal device 160 and transmit image data to the terminal device 160.
  • the imaging device 140 may also receive a command signal from the terminal device 160 that sets parameters for capturing an image.
  • exemplary parameters for capturing an image may include those for setting exposure time, aperture, image resolution/size, field of view (e.g., zooming in and out), and/or the color space of the image (e.g., color or black and white), and/or parameters used to perform other types of known camera functions.
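Where such parameters are exposed, they could be set through a generic camera API. The sketch below uses OpenCV capture properties (these property IDs exist in OpenCV, though whether a given camera honors them is driver-dependent); it is an illustration, not the patent's own interface.

```python
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)   # image size
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
cap.set(cv2.CAP_PROP_EXPOSURE, -6)        # exposure (units vary by backend)
```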
  • Imaging device 140 and handheld controller 120 can be connected via a network connection, bus, or other type of data link (e.g., hardwired, wireless (e.g., Bluetooth(TM)), or other connection known in the art).
  • Terminal device 160 can be a computing device, such as a general purpose or notebook computer, a mobile device, a tablet, a smart phone, a wearable device (such as a head mounted display device), a gaming machine, or any combination of these computers and/or accessory components.
  • Terminal device 160 can be configured to receive and process data/signals from other components of the system. For example, as disclosed in the present disclosure, terminal device 160 can receive and process image data and/or input data from handheld controller 120 from imaging device 140. Terminal device 160 may also transmit data/signals to other components of the system, and other components may perform certain functions based on data/signals from terminal device 160.
  • the terminal device 160 can include a processor 161, a memory 162, and a communication interface 163.
  • Processor 161 may comprise any suitable type of general purpose or special purpose microprocessor, digital signal processor or microcontroller.
  • the processor 161 can be configured as a separate processor module dedicated to locating and tracking objects. Alternatively, the processor can be configured as a shared processor module for performing other functions unrelated to tracking objects.
  • the processor 161 can be configured to receive data and/or signals from various components of the system via, for example, a network. Processor 161 can also process data and/or signals to determine one or more operating conditions in the system. For example, the processor 161 can receive an image from the imaging device 140 and determine whether the image includes the identification pattern; the processor 161 can also determine the marker points included in the identification pattern. Additionally or alternatively, the processor 161 can determine the size and number of marker points included in the identification pattern. The processor 161 can also determine the tracking target based on the determined size and/or number of the marker points.
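One plausible way to implement the marker-point detection and size/count measurement described here is blob detection. The following sketch uses OpenCV's SimpleBlobDetector; the area thresholds and the input file name are assumptions.

```python
import cv2

params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255          # detect bright (white) marker points
params.filterByArea = True
params.minArea = 20.0           # size thresholds are assumptions
params.maxArea = 5000.0
detector = cv2.SimpleBlobDetector_create(params)

gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # assumed input image
keypoints = detector.detect(gray)
# Each keypoint carries a pixel position and a size, which can be used to
# tell larger first marker points from smaller second marker points.
for kp in keypoints:
    print(kp.pt, kp.size)
print("marker points found:", len(keypoints))
```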
  • Memory 162 can include any suitable type of mass storage that provides storage for any type of information the processor may need in order to operate.
  • the memory can be a volatile or nonvolatile, magnetic, semiconductor, tape, optical, erasable, non-erasable, or other type of storage device or tangible (i.e., non-transitory) computer readable medium, including but not limited to ROM, flash memory, dynamic RAM, and static RAM.
  • Memory 162 can be configured to store one or more computer programs, executable by processor 161, that implement the exemplary object tracking and positioning functions disclosed herein. For example, memory 162 can be configured to store programs that are executable by processor 161.
  • Memory 162 can also be configured to store information and data used by processor 161.
  • memory 162 can be configured to store a lookup table that includes identification patterns and their corresponding parameters. If the identification pattern is known, the processor can determine the identity of the identification pattern by querying the lookup table.
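A lookup table of this kind could be as simple as a keyed map from a pattern identifier to its parameters. The sketch below is hypothetical; the pattern names and parameter fields are invented purely for illustration.

```python
# Hypothetical lookup table: pattern identifiers mapped to their parameters.
PATTERN_TABLE = {
    "ring_white_dots": {"marker_size_mm": 6.0, "marker_count": 24},
    "ring_checker":    {"marker_size_mm": 8.0, "marker_count": 16},
}

def identify(pattern_id):
    # Returns the stored parameters if the pattern is known, else None.
    return PATTERN_TABLE.get(pattern_id)
```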
  • Communication interface 163 can be configured to facilitate communication, for example over a network, between the controller and other components of the system.
  • terminal device 160 can receive input data/signals from the controller via a communication interface to control the characters in the game.
  • the terminal device 160 can also communicate data/signals to other displays for presenting games (images, video and/or sound signals) via the communication interface 163.
  • the network may include or partially include any one or more of various networks or other types of communication connections known to those skilled in the art.
  • the network may include network connections, buses or other types of data links, such as hardwired or other connections known in the art.
  • the network may include the Internet, an intranet, a local area network or other wireless or other hardwired connection, or other connection means (eg, Bluetooth, WiFi, 4G, LTE cellular data network, etc.) through which components of the system communicate.
  • the terminal device 160 is configured with a display device.
  • the display device can be part of a terminal device (eg, a display device in a head mounted display device, a screen of a laptop, etc.).
  • the display device may be a display device (e.g., LED, OLED, or LCD) separate from the terminal device, such as a stand-alone standard television, an HDTV, a digital television, or any type of terminal device (e.g., a gaming console).
  • the handheld controller 120 can be in communication with the terminal device 160, typically held by the user in one or both hands, to facilitate operation of the user input keys or the like on the handheld controller 120.
  • the user can interact with one or more characters in the game.
  • the handheld controller 120 can receive input from a user and transmit a signal to the terminal device 160 based on the received input, and the terminal device 160 can process the signal and/or change the game based on the signal.
  • the handheld controller 120 can receive data/signals from the terminal device 160 for controlling its components.
  • the terminal device 160 can send an interaction request or the like, and the handheld controller 120 can receive the interaction request and make corresponding feedback.
  • the user can activate a certain function of the head mounted display device through eye control, and the head mounted display device sends a corresponding request to the handheld controller 120.
  • the handheld controller 120 vibrates upon receiving the request, alerting the user to begin operation.
  • FIG. 3 illustrates a specific structure of a hand-held controller in some embodiments, the hand-held controller 120 including a handle 121 and a bracket 122.
  • the handle 121 is coupled to the bracket 122.
  • the outer surface of the bracket 122 is formed with an identification pattern.
  • the handle 121 includes an input device 1210 that can be configured to generate input data in response to a user's actions and/or inputs.
  • exemplary inputs and/or actions of the user may include touch input, gesture input (e.g., hand waving), keystrokes, force, sound, voice dialogue, facial recognition, fingerprints, palm prints, or the like, and combinations thereof.
  • Input device 1210 can include a plurality of buttons, joysticks, a touchpad, a keyboard, an imaging sensor, a sound sensor (eg, a microphone), a pressure sensor, a motion sensor or a finger/palm scanner, or the like, and combinations thereof.
  • input device 1210 includes a thumb button.
  • the input device 1210 may also include a plurality of buttons, including, for example, a main button and other buttons, and the main button may be positioned away from the other buttons to prevent erroneous operation.
  • input device 1210 can also include a touch-sensitive surface that is divided into multiple portions, each portion corresponding to an input key. In this configuration, at least one touch sensor is located below the surface of the input device 1210. When the user's touch is detected by the touch sensor, an action associated with touching the corresponding input key is performed.
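As a hedged illustration of such a partitioned touch surface, the sketch below maps normalized touch coordinates to input keys; the region layout and key names are assumptions, not the patent's design.

```python
# Sketch of a touch surface split into regions, each acting as one input key.
REGIONS = {
    "up":    lambda x, y: y < 0.33,
    "down":  lambda x, y: y > 0.66,
    "left":  lambda x, y: 0.33 <= y <= 0.66 and x < 0.5,
    "right": lambda x, y: 0.33 <= y <= 0.66 and x >= 0.5,
}

def key_for_touch(x, y):
    # x, y are normalized touch coordinates in [0, 1].
    for key, inside in REGIONS.items():
        if inside(x, y):
            return key
    return None
```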
  • the user generates input data by operating at the input device 1210.
  • a button or sensor or the like in the input device 1210 is configured to communicate with the terminal device 160 to convert the operation input by the user into a corresponding action.
  • the handle 121 is a protruding structure of the hand-held controller and may be rod-shaped, for example a flat cylinder, or any other shape that allows the user to hold the handle 121 between the palm and the fingers (e.g., three or fewer fingers) while freeing the thumb to operate the input keys; of course, other fingers can also be freed to operate at their corresponding portions.
  • the handle 121 includes a first end 1211 and a second end 1212 opposite the first end 1211.
  • the first end 1211 is coupled to the bracket 122 and the second end 1212 is remote from the bracket 122.
  • the handle 121 is detachably coupled to the bracket 122.
  • the handle 121 can also be attached to the bracket 122 by a connection method corresponding to its material, for example, by bonding or welding to the bracket 122.
  • the handle 121 and the bracket 122 may be connected to each other by a fastening structure such as a screw or a bolt, or may be engaged with each other by a buckle or the like, or may be slidably connected by a sliding groove and a protrusion.
  • the detachable connection allows the handle 121 and the bracket 122 to be separately manufactured, and it is also convenient to replace the components when damaged, thereby reducing maintenance costs.
  • the handle 121 can also be integrally formed with the bracket 122.
  • the handle 121 and/or the bracket 122 may be formed from a rubber material (eg, to provide a surface that is sufficiently rubbed with the palm of the user, thereby increasing the reliability of the grip).
  • the handle 121 and/or the bracket 122 can be formed from a hard plastic including, but not limited to, a high density polyethylene that provides increased structural rigidity.
  • any other suitable material can be used.
  • the bracket 122 may be annular or elliptical in shape, and may be a closed ring or a ring having a notch.
  • the bracket 122 includes an outer surface 1220 that faces the outer ring and an inner surface 1223 that faces the inner ring.
  • the outer surface 1220 includes a first surface 1221 and a second surface 1222, the first surface 1221 intersecting the second surface 1222 at the circumscribed circle of the bracket.
  • the inner surface 1223 connects the first surface 1221 and the second surface 1222.
  • An identification pattern 130 is disposed on at least one of the first surface 1221 or the second surface 1222.
  • the identification pattern 130 may be drawn or sprayed on the outer surface 1220, or may be attached to the outer surface 1220 in the form of an identification pattern layer.
  • other formation methods may be employed, and the specific formation method is not limited.
  • the first surface 1221 and the second surface 1222 are each provided with an identification pattern 130, and the specific pattern of the identification pattern 130 on the first surface 1221 differs from that of the identification pattern 130 on the second surface 1222. Further, the surface area of the second surface 1222 may be greater than that of the first surface 1221. The second surface 1222, having the larger surface area, is disposed toward the imaging device 140 to facilitate the imaging device 140 acquiring the identification pattern 130 on the second surface 1222.
  • the identification pattern 130 includes a background 131 and a marker point 132 distributed on the background 131.
  • the colors or brightnesses of the background 131 and the marker points 132 are two colors or brightnesses that the imaging device 140 can distinguish.
  • for example, the background 131 is black and the marker points 132 are white, or the background 131 is white and the marker points 132 are black.
  • other color combinations are also possible, for example a gray background 131 with red marker points 132, as long as the imaging device 140 can distinguish the difference in color or brightness between the background 131 and the marker points 132.
  • the shape of the marker point 132 may be a circle, a polygon (for example, a hexagon), a square, or any other shape.
  • the shapes of the marker points 132 in the same identification pattern 130 may be the same or different.
  • the marker points 132 can be the same size and, further, can be evenly or periodically distributed along the circumference of the outer surface 1220 over the background 131 to form a marker strip, as in the upper half of FIGS. 4A to 4D.
  • the identification patterns 130 on the first surface 1221 and the second surface 1222 may both be such a pattern, except that the marker points 132 on the first surface 1221 and the second surface 1222 differ in size; in FIG. 4D, for example, the marker points 132 on the first surface 1221 are larger than the marker points 132 on the second surface 1222.
  • the marker points 132 can be different in size, for example, can include a plurality of first marker points 1321 and a plurality of second marker points 1322, the first marker points 1321 being larger than the second marker points 1322.
  • the plurality of first marker points 1321 and the plurality of second marker points 1322 may alternate with each other over the background 131 to form a marker strip in which first marker points 1321 and second marker points 1322 are distributed at intervals.
  • the identification patterns 130 on the first surface 1221 and the second surface 1222 may be such a pattern, with the first marker points 1321 and the second marker points 1322 on the first surface 1221 respectively larger than those on the second surface 1222.
  • such a pattern may also be provided on only one surface, such as the first surface 1221, as shown in FIGS. 4A to 4C.
  • the background of the identification pattern on the first surface 1221 and the second surface 1222 is black and the marker points are all white.
  • the first surface 1221 includes first marker points 1321 and second marker points 1322, both circular, and the first marker points 1321 are larger than the second marker points 1322.
  • the marker points on the first surface 1221 and the marker points on the second surface 1222 are staggered in the direction in which the strips extend.
  • the identification pattern is the same as FIG. 4A, except that the background color and the color of the marked point are opposite to those in FIG. 4A.
  • the identification pattern is substantially the same as FIG. 4A, except that the marker points are not circular but hexagonal.
  • the identification patterns at the first surface 1221 and at the second surface 1222 are created by spatially staggered black and white squares in a 2*2 matrix.
  • the black and white squares on the first surface 1221 are larger than the black and white squares on the second surface 1222.
  • the identification patterns shown in FIGS. 4A to 4D are only exemplary patterns; the colors and the sizes of the marker points may be changed, and the specific implementation of the present invention is not limited. For example, the marker points on the first surface 1221 may be set to be circular while the marker points on the second surface 1222 are set as black and white interlaced squares.
  • the structure of the outer surface 1220 of the bracket 122 is not limited to the structure shown in FIGS. 4A to 4D in which the first surface 1221 and the second surface 1222 are both truncated-cone surfaces.
  • the first surface 1221 and the second surface 1222 can also be a complete curved surface, as shown in FIG.
  • FIG. 5 shows a schematic view of a bracket 122 in another example.
  • the outer surface 1220 of the bracket 122 is a curved surface, and the first surface 1221 and the second surface 1222 together form the curved surface.
  • the first surface 1221 and the second surface 1222 carry squares staggered in black and white or in black and gray. It can be understood that the size or arrangement rule of the black-and-white or black-and-gray squares on the first surface 1221 and the second surface 1222 may be the same or different, and is not limited to the pattern shown in FIG. 5.
  • the outer surface 1220 may also be formed by splicing together unit panels of different shapes, with patterns further formed on the unit panels.
  • the unit panels include hexagonal panels, pentagonal panels, triangular panels, or trapezoidal panels.
  • in FIG. 6, the outer surface 1220 of the bracket 122 is formed by splicing together hexagonal panels 1224A, quadrilateral panels 1224B, and triangular panels 1224C, and the hexagonal panels 1224A carry black and white square or triangular patterns. It can be understood that patterns of the same color can be arranged continuously or in a staggered arrangement.
  • in FIG. 6, the quadrilateral panels 1224B and the triangular panels 1224C are black; it is understood that they can also be designed in white.
  • the identification pattern can also use two other colors or brightnesses that the imaging device can distinguish, for example silver and black.
  • Imaging device 140 detects movement of bracket 122 as the user moves (e.g., swings, waves, punches, shakes, or moves in any other manner).
  • when the user holds the handle 121 in the neutral position, the bracket 122 is positioned above the user's hand; given this orientation, the identification pattern 130 on the first surface 1221 of the bracket 122 is visible to the imaging device 140 (for example, a front-facing camera on a head mounted display device).
  • the imaging device 140 is located in front of the user, and when the user holds the handle 121 in the neutral position, the identification pattern 130 on the first surface 1221 of the bracket 122 faces the imaging device 140.
  • the neutral position refers to the position of the handle 121 when the user holds the handle 121 between the palm and the fingers, holds the hand-held controller 120 in front of him or her, and relaxes the arm and wrist.
  • the input device 1210 of the handle 121 is located at the second end 1212, and the handle 121 is configured such that the user can grip the handle 121 through the center of the bracket 122 and perform an input operation through the input device 1210.
  • the handle 121 is angled at a predetermined angle relative to the plane in which the bracket 122 is located to provide the user with a comfortable posture to hold the handle 121 and operate at the input device 1210.
  • the predetermined angle may be 30 degrees to 90 degrees, such as 45 degrees, 60 degrees, 75 degrees, and the like.
  • the input device 1210 of the handle 121 faces toward the bracket 122.
  • the handle 121 is located within the space defined by the inner surface 1223 of the bracket 122.
  • the bracket 122 is positioned such that when the user's hand grips the handle 121, the hand is located at the center of the bracket 122.
  • the input device 1210 of the handle 121 is located at the first end 1211, adjoining the connection with the bracket 122, and the handle 121 is configured such that the user can directly grip the handle 121 outside the bracket 122 and perform an input operation through the input device 1210, as shown in FIG. 10.
  • the handle 121 can be angled at a predetermined angle relative to the plane in which the bracket 122 is located to provide the user with a comfortable posture to hold the handle 121 and operate at the input device 1210.
  • the bracket 122 is positioned such that when the user's hand grips the handle 121, the hand is located below the bracket 122.
  • the bracket 122 has a notch, and one end of the bracket 122 at the notch is connected to the handle 121, for example to the middle of the handle 121.
  • the input device 1210 of the handle 121 can be located at the first end 1211, and the user can grip the handle 121 through the center of the bracket 122 and perform an input operation through the input device 1210, as shown in FIG. 12.
  • the handle 121 can be disposed approximately vertically relative to the plane in which the bracket 122 is located to provide the user with a comfortable posture to hold the handle 121 and operate at the input device 1210.
  • the bracket 122 is configured such that when the user's hand grips the handle 121, the wrist is located in the center of the bracket 122, and the palm portion can pass through the notch.
  • FIG. 13 shows a block diagram of electrical connections of the handheld controller 120.
  • the handheld controller 120 includes an input device 1210 and a microcontroller 124 coupled to the input device 1210.
  • Input device 1210 can be a plurality of buttons, joysticks, touch pads, keyboards, imaging sensors, sound sensors (eg, microphones), pressure sensors, motion sensors or finger/palm scanners, or the like, and combinations thereof.
  • the handheld controller 120 can also include a microcontroller 124.
  • Microcontroller 124 can be configured to receive and process data/signals from input device 1210 and/or other components of the system.
  • the microcontroller 124 can receive input data generated in response to a user's actions and/or inputs from the input device 1210.
  • the microcontroller 124 can also generate input data based on the user's input and send the data to the terminal device 160 for further processing.
  • the microcontroller 124 can generate control signals for controlling other components.
  • the microcontroller 124 can generate control signals for controlling the imaging device.
  • the microcontroller 124 can include a microprocessor 1241, a memory 1242, an I/O interface 1243, a control interface 1244, and a communication interface 1245.
  • Microprocessor 1241 may be configured to receive, generate, and/or process data/signals to implement the functionality of handheld controller 120.
  • Microprocessor 1241 may comprise any suitable type of general purpose or special purpose microprocessor, digital signal processor or microcontroller.
  • Memory 1242 can include any suitable type of mass storage that provides storage for any type of information the microprocessor 1241 may need in order to operate.
  • the memory 1242 can be a volatile or non-volatile, magnetic, semiconductor, tape, optical, erasable, non-erasable, or other type of storage device or tangible (i.e., non-transitory) computer readable medium, including but not limited to ROM, flash memory, dynamic RAM, and static RAM.
  • the memory can be configured to store one or more computer programs, executable by the microprocessor, that implement the exemplary object tracking functions disclosed in the present invention.
  • I/O interface 1243 can be configured to facilitate communication between microprocessor 1241 and input device 1210.
  • the microprocessor 1241 can receive input data from the input device 1210 via the I/O interface 1243 in response to user input.
  • Control interface 1244 can be configured to facilitate communication between the microprocessor 1241 and the imaging device 140.
  • Communication interface 1245 can be configured to facilitate communication between handheld controller 120 and other components of the system.
  • the handheld controller 120 can communicate with the terminal device 160 via the communication interface 1245 via a network.
  • the microcontroller 124 can be disposed on the handle 121 or the bracket 122.
  • Input device 1210 of handle 121 can be configured to communicate input data to microprocessor 1241 via I/O interface 1243 for further processing.
  • input device 1210 can generate input data in response to a user's actuation button and send the input data to microprocessor 1241.
  • the microprocessor 1241 can communicate input data received from the input device 1210 to the terminal device 160 via the communication interface 1245 for further processing.
  • the handheld controller 120 may further include a sensor 1246 for collecting attitude data of the handheld controller 120. The sensor 1246 may be an attitude sensor such as an IMU, and is electrically connected to the microprocessor 1241 to transmit the collected attitude data to the microprocessor 1241.
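The patent does not specify how the IMU attitude data are processed. As one common, generic approach (a textbook formulation, not the patent's method), a complementary filter can fuse gyroscope rates with accelerometer-derived angles:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step fusing a gyroscope rate with an accelerometer angle.

    pitch: previous pitch estimate (rad); gyro_rate: angular rate (rad/s);
    accel_pitch: pitch implied by the accelerometer (rad); dt: step (s).
    """
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

def accel_to_pitch(ax, ay, az):
    # Pitch angle implied by the gravity direction in accelerometer readings.
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))
```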
  • the sensor 1246 can be disposed on the handle 121 or can be disposed on the bracket 122.
  • a tracking positioning method based on the handheld controller 120 will be described below in conjunction with the structure of the handheld controller 120.
  • the tracking and positioning method is applied to the tracking and positioning system shown in FIG. 1. As shown in FIG. 14, the method may include:
  • Step S110: acquiring an image of the identification pattern on the outer surface of the bracket.
  • imaging device 140 may capture images continuously. Additionally or alternatively, image capture may be triggered by a special event or by data/signals transmitted from the terminal device 160 or the handheld controller 120. For example, the user can perform an opening operation at the input device 1210 of the handheld controller 120, and the handheld controller 120 can transmit a signal, based on the user input, for activating the imaging device to capture one or more images. Alternatively, the handheld controller 120 can transmit the input data to the terminal device, and the terminal device 160 can activate the imaging device 140 to capture one or more images.
  • image capture may also be triggered by the imaging device 140 itself.
  • imaging device 140 may include a sensor for detecting objects within the field of view of imaging device 140.
  • an ultrasonic sensor can be used to detect one or more objects in the field of view of imaging device 140.
  • when an object is detected, the imaging device 140 can be activated to take pictures to obtain one or more images.
  • imaging device 140 may be further configured to obtain depth information indicating the position of an object included in the image.
  • the imaging device 140 can further determine its own position by its position sensor.
  • imaging device 140 can be configured to capture color or black and white images.
  • imaging device 140 can optionally process the captured image and send the processed image to terminal device 160. For example, imaging device 140 may resize, denoise, and/or sharpen the image. Imaging device 140 can also increase or decrease the contrast and/or brightness of the image.
  • the imaging device 140 can also transmit the processed image to the terminal device 160.
  • imaging device 140 may receive parameters from terminal device 160 for capturing images.
  • exemplary parameters for capturing an image may include those for setting exposure time, aperture, image resolution/size, field of view (zooming in and out), and/or the color space of the image (e.g., color or black and white), and/or parameters used to perform other types of known camera functions.
  • Step S120: tracking and positioning the handheld controller according to the identification pattern.
  • the imaging device 140 may transmit the identification pattern to the terminal device 160 through the network, or may transmit the identification pattern to the terminal device 160 through a signal circuit.
  • the imaging device 140 may store the identification pattern before transmitting to the terminal device 160.
  • terminal device 160 can selectively process the received images to increase efficiency. For example, terminal device 160 can convert a color image to a black and white image and/or resize the image to reduce the computational requirements of the method. Additionally or alternatively, noise in the image may be reduced, the image sharpened, and/or the contrast and/or brightness of the image increased (or decreased) so that the marker points in the image can be more easily detected. Of course, other types of image processing techniques are also contemplated.
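A minimal sketch of the optional preprocessing steps listed above (grayscale conversion, resizing, denoising, sharpening, contrast adjustment), using standard OpenCV calls; the target size and filter parameters are assumptions.

```python
import cv2
import numpy as np

def preprocess(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)   # color -> black and white
    gray = cv2.resize(gray, (640, 480))                  # resize to cut computation
    gray = cv2.GaussianBlur(gray, (5, 5), 0)             # reduce noise
    sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]])
    gray = cv2.filter2D(gray, -1, sharpen)               # sharpen edges
    gray = cv2.convertScaleAbs(gray, alpha=1.3, beta=10) # raise contrast/brightness
    return gray
```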
  • the terminal device 160 can obtain the position and orientation of a specific point (e.g., the center point) of the imaging device 140 relative to the handheld controller 120 by recognizing the marker points in the pattern and using the three-dimensional structure information of the encoded pattern marker points.
  • the algorithm for obtaining the position and orientation may use an existing computer vision positioning algorithm, and may also incorporate other sensors on the handheld controller 120, such as the attitude data collected by the sensor 1246, to accelerate the solution process and improve positioning precision.
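One way such sensor data could accelerate the solution, as suggested above, is to seed the iterative PnP solver with the IMU attitude and the previous translation. The sketch below passes an initial pose to cv2.solvePnP with useExtrinsicGuess=True; treating the IMU attitude as the initial rotation vector is an assumption for illustration, not the patent's stated algorithm.

```python
import cv2

def pose_with_imu_seed(obj_pts, img_pts, K, dist, rvec_imu, tvec_prev):
    # Seeding the iterative solver with the IMU attitude (rvec_imu, a 3x1
    # float64 Rodrigues vector) and the previous translation can speed up
    # convergence and stabilize the estimate.
    rvec = rvec_imu.copy()
    tvec = tvec_prev.copy()
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist,
                                  rvec, tvec, useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return ok, rvec, tvec
```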
  • the tracking and positioning of the handheld controller can be realized by providing the identification pattern on the handheld controller; compared with the prior art, the strobe frequency of light spots does not need to be precisely controlled, the structure is simpler, and the cost is lower.
  • in addition, this controller design can further reduce the control requirements on the imaging device, without the need to specially adjust the imaging device parameters to match the tracking and positioning of an actively illuminated handheld controller.
  • each block of the flowchart or block diagram can represent a module, a program segment, or a portion of code that includes one or more executable instructions.
  • the functions noted in the blocks may also occur in an order different from that illustrated in the drawings. For example, two consecutive blocks may be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functionality involved.
  • each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
  • each functional module in each embodiment of the present invention may be integrated to form a separate part, or each module may exist separately, or two or more modules may be integrated to form a separate part.
  • the functions, if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer readable storage medium.
  • the technical solution of the present invention, in essence or in the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions used to cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • the foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Abstract

The present invention discloses a handheld controller, a tracking and positioning method, and a system. The handheld controller includes a handle and a bracket. The handle includes an input device for receiving a user's input operations. The bracket is connected to the handle, and the outer surface of the bracket carries an identification pattern. In the tracking and positioning method, an imaging device captures an image of the identification pattern, and a terminal device tracks and positions the handheld controller according to the identification pattern. By providing an identification pattern on the handheld controller, the present invention can realize tracking and positioning of the handheld controller; compared with the prior art, the strobe frequency of light spots does not need to be precisely controlled, the structure is simpler, and the cost is lower. In addition, the design further reduces the control requirements on the imaging device: there is no need to specially adjust imaging device parameters to match the tracking and positioning of an actively illuminated handheld controller.

Description

Handheld Controller, Tracking and Positioning Method, and System
Technical Field
The present invention relates to the field of computer entertainment technologies, and in particular to a handheld controller, a tracking and positioning method, and a system.
Background Art
Interactive control technology is an important application direction in fields such as virtual reality, augmented reality, and mixed reality, and has strongly driven the rapid development of the VR/AR/MR field. In the VR/AR/MR field, the handheld controller (handle), as an indispensable hardware device for interactive control, provides strong support for realizing interactive control. By operating the control keys of the handheld controller (buttons, triggers, trackpads, and the like), the user can realize human-computer interaction functions.
To enhance the user's virtual reality experience, the tracking and positioning of controllers is currently determined essentially by optical methods, for example by infrared light or by adding light spots. However, infrared tracking requires dedicated equipment, and recognizing added light spots introduces a delay: a complete strobe cycle is needed to recognize a light spot, and the strobe frequency of the light spots must be precisely controlled.
Summary of the Invention
An object of the embodiments of the present invention is to provide a handheld controller, a tracking and positioning method, and a system to solve the above problems.
A first aspect of the embodiments of the present invention provides a handheld controller, including a handle and a bracket. The handle includes an input device for receiving a user's input operations. The bracket is connected to the handle, and the outer surface of the bracket carries an identification pattern.
In some embodiments, the bracket is, for example, ring-shaped.
In some embodiments, the handle includes a first end and a second end arranged opposite each other; the first end is connected to the bracket and the second end is remote from the bracket.
In some embodiments, the input device of the handle is located at the second end, and the handle is configured such that the user can grip the handle through the center of the bracket and perform input operations through the input device.
In some embodiments, there is a first angle between the handle and the plane in which the bracket lies.
In some embodiments, the handle is located within the space defined by the inner surface of the bracket.
In some embodiments, the input device of the handle is located at the first end, and the handle is configured such that the user can directly grip the handle outside the bracket and perform input operations through the input device.
In some embodiments, the bracket has a notch; one end of the bracket at the notch is connected to the handle, and the handle is configured such that the user can grip the handle through the center of the bracket and perform input operations through the input device.
In some embodiments, the outer surface of the bracket includes a first surface and a second surface; the first surface and the second surface intersect at the circumscribed circle of the bracket, and the identification pattern is disposed on at least one of the first surface and the second surface.
In some embodiments, the identification pattern is disposed on both the first surface and the second surface, and the identification pattern disposed on the first surface differs from the identification pattern disposed on the second surface.
In some embodiments, the outer surface of the bracket is a circular arc surface, and the identification pattern is disposed on the arc surface.
In some embodiments, the outer surface of the bracket is formed by splicing together unit panels of different shapes, and the patterns on the unit panels jointly constitute the identification pattern.
In some embodiments, the unit panels include hexagonal panels, pentagonal panels, triangular panels, or trapezoidal panels.
In some embodiments, the identification pattern includes a background and marker points distributed over the background; the colors or brightnesses of the background and the marker points are two colors or brightnesses distinguishable by the imaging device.
In some embodiments, the marker points are of the same size and evenly distributed over the background.
In some embodiments, marker points of different sizes are distributed at intervals over the background.
In some embodiments, the marker points are circular, polygonal, or square.
In some embodiments, the background is black and the marker points are white, or the background is white and the marker points are black.
A second aspect of the embodiments of the present invention provides a handheld controller, including: a handle including an input device for receiving a user's input operations; a bracket connected to the handle, the outer surface of the bracket carrying an identification pattern; and a microcontroller connected to the input device, the microcontroller being configured to receive and process data or signals from the input device, the microcontroller being disposed within the handle or the bracket.
A third aspect of the embodiments of the present invention provides a tracking and positioning system, including a terminal device, an imaging device, and the above handheld controller; the imaging device is used to capture the identification pattern.
A fourth aspect of the embodiments of the present invention provides a tracking and positioning method applied to a tracking system. The system includes a terminal device, an imaging device, and a handheld controller; the handheld controller includes a handle and a bracket connected to the handle; the handle includes an input device for receiving a user's input operations; and the outer surface of the bracket carries an identification pattern. The method includes: the imaging device capturing an image of the identification pattern; and the terminal device tracking and positioning the handheld controller according to the identification pattern.
In some embodiments, the handheld controller is further provided with a sensor for collecting attitude data, and the terminal device tracking and positioning the handheld controller according to the identification pattern includes: the terminal device tracking and positioning the handheld controller according to the identification pattern and the attitude data collected by the sensor.
In some embodiments, the terminal device tracking and positioning the handheld controller according to the identification pattern includes: the terminal device obtaining the position and orientation of the imaging device relative to a specific point of the handheld controller by recognizing the marker points in the pattern and using the three-dimensional structure information of the pattern's marker points; and the terminal device tracking and positioning the handheld controller according to the position and orientation.
In the embodiments of the present invention, tracking and positioning of the handheld controller can be realized by providing an identification pattern on the handheld controller. Compared with the prior art, there is no need to precisely control the strobe frequency of light spots, the structure is simpler, and the cost is lower. In addition, such a controller design further reduces the control requirements on the imaging device: there is no need to specially adjust imaging device parameters to match the tracking and positioning of an actively illuminated handheld controller.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a tracking and positioning system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a terminal device according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a handheld controller according to an embodiment of the present invention;
FIGS. 4A to 4D are exemplary schematic diagrams of identification patterns according to an embodiment of the present invention;
FIG. 5 is an exemplary schematic diagram of another identification pattern according to an embodiment of the present invention;
FIG. 6 is an exemplary schematic diagram of still another identification pattern according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of the use of a handheld controller according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of the use of another handheld controller according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of another handheld controller according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the use of the handheld controller shown in FIG. 9;
FIG. 11 is a schematic structural diagram of still another handheld controller according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of the use of the handheld controller shown in FIG. 11;
FIG. 13 is an electrical block diagram of a handheld controller according to an embodiment of the present invention;
FIG. 14 is a schematic flowchart of a tracking and positioning method according to an embodiment of the present invention.
Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. The components of the embodiments of the present invention generally described and illustrated in the drawings here may be arranged and designed in a variety of different configurations.
Therefore, the following detailed description of the embodiments of the present invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings.
In the description of the present invention, it should be noted that orientations or positional relationships indicated by terms such as "center", "upper", "lower", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings, or the orientations or positional relationships in which the product of the invention is customarily placed in use. They are used only to facilitate describing the present invention and to simplify the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the present invention. In addition, terms such as "first" and "second" are used only to distinguish descriptions and cannot be understood as indicating or implying relative importance.
In addition, terms such as "horizontal" and "vertical" do not require the components to be absolutely horizontal or vertical; they may be slightly inclined. For example, "horizontal" merely means that the direction is more horizontal relative to "vertical"; it does not mean that the structure must be completely horizontal, but that it may be slightly inclined.
In the description of the present invention, it should also be noted that, unless otherwise expressly specified and limited, the terms "disposed", "mounted", "connected", and "coupled" should be understood broadly. For example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; it may be direct, or indirect through an intermediate medium, or an internal communication between two elements. For a person of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances. The present invention is disclosed in detail below with reference to embodiments, and the listed examples are described in conjunction with the accompanying drawings. For ease of reading, the same reference numerals are used throughout the drawings to refer to the same or similar components.
FIG. 1 is an exemplary block diagram of a tracking and positioning system according to an embodiment of the present invention. As shown in FIG. 1, in some embodiments, the tracking and positioning system 100 may include a handheld controller 120, an imaging device 140 having an image sensor 142, and a terminal device 160.
The outer surface of the handheld controller 120 carries an identification pattern. An image captured by the imaging device 140 may include the identification pattern. The identification pattern includes a background and marker points distributed over the background according to a specific rule; the colors or brightnesses of the background and the marker points are two colors or brightnesses distinguishable by the imaging device. For example, the background is black and the marker points are white, or the background is white and the marker points are black. The terminal device 160 may track and position the handheld controller 120 based on the identification pattern on the handheld controller 120 captured by the imaging device 140.
Compared with existing solutions, the embodiments of the present invention can track and position the handheld controller based on the identification pattern on the handheld controller, thereby replacing actively illuminated handheld controllers and avoiding the need to control a light source and its frequency, which simplifies the structure of the handheld controller and saves costs. In addition, such a controller design further reduces the control requirements on the imaging device: there is no need to specially adjust imaging device parameters to match the tracking and positioning of an actively illuminated handheld controller.
The imaging device 140 may be any device capable of capturing images of objects within its field of view. In some embodiments, the imaging device 140 may not have a fixed position; for example, it may be worn by the user (for example, as part of a head mounted display device on the user's head) and may move with the user. FIG. 1 takes as an example the imaging device 140 disposed on a head mounted display device. In some embodiments, the imaging device 140 may be placed at a fixed position; for example, it may be placed on a table or a shelf. The imaging device 140 may be configured to capture images of objects within its field of view at different positions.
The imaging device 140 may include an image sensor 142. The image sensor may be a CMOS (Complementary Metal Oxide Semiconductor) sensor, a CCD (Charge-coupled Device) sensor, or the like.
In some embodiments, the imaging device 140 may be configured to capture multiple images at different points in time over a period of time. For example, when the handheld controller 120 moves within the field of view of the imaging device 140, the imaging device 140 may capture images of the handheld controller 120 at different positions during that period. The imaging device 140 may also obtain time information when capturing each image and may send the time information together with the images to the terminal device 160 for further processing. In the embodiments of the present invention, the terminal device 160 may be configured to track and position the handheld controller 120 by recognizing the identification pattern included in the images.
In some embodiments, the imaging device 140 may further include a position sensor (not shown) for determining the position of the imaging device 140. The imaging device 140 may also be configured to transmit position data to the terminal device 160. For example, the imaging device 140 may include a GPS sensor configured to send coordinate data to the terminal device 160.
Referring again to FIG. 1, the imaging device 140 may communicate with the terminal device 160 and send image data to the terminal device 160. The imaging device 140 may also receive command signals from the terminal device 160 that set parameters for capturing images. Exemplary parameters for capturing images may include those for setting exposure time, aperture, image resolution/size, field of view (for example, zooming in and out), and/or the color space of the image (for example, color or black and white), and/or parameters for performing other types of known camera functions. The imaging device 140 and the handheld controller 120 may be connected via a network connection, a bus, or another type of data link (for example, a hardwired connection, a wireless connection such as Bluetooth(TM), or another connection known in the art).
终端设备160可以是计算设备,例如通用或笔记本计算机、移动设备、平板电脑、智能手机、可穿戴设备(如头戴显示设备),游戏机或这些计算机和/或附属组件的任意组合。
终端设备160可以被配置为从系统的其他部件接收和处理数据/信号。例如,本发明中所公开的,终端设备160可以从成像设备140接收和处理图像数据和/或手持式控制器120的输入数据。终端设备160还可以将数据/信号发送到系统的其他组件,并且其他组件可以基于来自终端设备160的数据/信号来执行某些功能。
请参见图2,在一些实施例中,终端设备160可以包括处理器161,存储器162和通信接口163。
处理器161可以包括任何适当类型的通用或专用微处理器、数字信号处理器或微控制器。处理器161可以被配置为专用于定位跟踪对象的单独的处理器 模块。或者,处理器可以被配置为用于执行与跟踪对象无关的其他功能的共享处理器模块。处理器161可以被配置为经由例如网络从系统的各种组件接收数据和/或信号。处理器161还可处理数据和/或信号以确定系统中的一个或多个操作条件。例如,处理器161可以从成像设备140接收图像并且确定图像是否包括识别图案,处理器161还可以确定包含在识别图案中的标志点。作为附加或替代,处理器161可以确定包括在识别图案中的标志点的大小和数量。处理器161还可以基于所确定的标志点的大小和/或所确定的标志点数量来确定跟踪目标。
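One plausible way for the processor to find marker points of this kind is blob detection on a grayscale image, sketched below with OpenCV; every threshold here is an illustrative guess that would need tuning to the actual identification pattern, not a value from this disclosure:

```python
import cv2


def detect_marker_points(gray_image):
    """Detect bright, roughly circular marker points and report their
    centers and apparent sizes (a hedged sketch, not the patented method)."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 255       # bright points on a dark background;
                                 # use 0 for a black-on-white pattern
    params.filterByArea = True
    params.minArea = 20          # ignore noise speckles
    params.filterByCircularity = True
    params.minCircularity = 0.7  # keep roughly circular points
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray_image)
    # Center coordinates plus apparent size let the processor tell
    # large marker points from small ones, as described above.
    return [(kp.pt, kp.size) for kp in keypoints]
```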
The memory 162 may include any appropriate type of mass storage that stores any type of information the processor 161 may need in order to operate. The memory may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, erasable, non-erasable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium, including but not limited to ROM, flash memory, dynamic RAM, and static RAM. The memory 162 may be configured to store one or more computer programs, executable by the processor 161, that implement the exemplary object tracking and positioning functions disclosed herein. For example, the memory 162 may be configured to store programs executable by the processor 161.
The memory 162 may also be configured to store information and data used by the processor 161. For example, the memory 162 may be configured to store a lookup table containing identification patterns and their corresponding parameters. If an identification pattern is recognized, the processor can determine its identity by querying the lookup table.
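In the simplest case, such a lookup table might map a decoded pattern signature to the parameters of the surface carrying it, as in the hypothetical Python sketch below; the keys and fields are invented for illustration:

```python
# Hypothetical lookup table: a decoded pattern signature maps to the
# parameters of the surface carrying it. Keys and fields are invented,
# not values from this disclosure.
PATTERN_TABLE = {
    "ring-large-dots": {"surface": "first", "dot_diameter_mm": 8.0},
    "ring-small-dots": {"surface": "second", "dot_diameter_mm": 5.0},
}


def identify_pattern(signature):
    """Return the stored parameters, or None if the pattern is unknown."""
    return PATTERN_TABLE.get(signature)
```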
The communication interface 163 may be configured to facilitate communication between the controller and other components of the system, for example over a network. For example, the terminal device 160 may receive input data/signals from the controller via the communication interface in order to control a character in a game. The terminal device 160 may also transmit data/signals to other displays for presenting the game (image, video, and/or sound signals) via the communication interface 163.
The network may include, or partially include, any one or more of the various networks or other types of communication connections known to those skilled in the art. The network may include network connections, buses, or other types of data links, such as hard-wired or other connections known in the art. For example, the network may include the Internet, an intranet, a local area network, or other wireless or hard-wired connections, or other connection means (e.g., Bluetooth, WiFi, 4G, LTE cellular data networks, etc.) through which the components of the system communicate.
The terminal device 160 is configured with a display device. In some embodiments, the display device may be part of the terminal device (e.g., the display in a head-mounted display device, the screen of a notebook computer, etc.). In some embodiments, the display device may be a display (e.g., LED, OLED, or LCD) separate from the terminal device, such as a standalone standard television, an HDTV, a digital television, or any type of terminal device (e.g., a game console).
The handheld controller 120 may communicate with the terminal device 160 and is typically held by the user in one or both hands so that the user input keys and the like on the handheld controller 120 can be operated. While playing a game or engaging in a virtual reality activity, the user can interact with one or more characters in the game. For example, the handheld controller 120 may receive input from the user and, based on the received input, send a signal to the terminal device 160, which may process the signal and/or change the game accordingly. In some embodiments, the handheld controller 120 may receive data/signals from the terminal device 160 for controlling its components. For example, the terminal device 160 may send an interaction request, and the handheld controller 120 may receive the request and give corresponding feedback. For instance, the user may activate a function on the head-mounted display by eye control, the head-mounted display sends a corresponding request to the handheld controller 120, and the handheld controller 120 vibrates upon receiving the request, reminding the user to start the operation.
FIG. 3 shows a specific structure of the handheld controller in some embodiments. The handheld controller 120 includes a handle 121 and a bracket 122, the handle 121 being connected with the bracket 122, and an identification pattern being formed on the outer surface of the bracket 122.
The handle 121 includes an input device 1210, which may be configured to generate input data in response to the user's actions and/or input. Exemplary user inputs and/or actions may include touch input, gesture input (e.g., hand waving), keystrokes, force, sound, voice dialogue, facial recognition, fingerprints, handprints, or the like, and combinations thereof. The input device 1210 may include multiple buttons, a joystick, a touchpad, a keyboard, an imaging sensor, a sound sensor (e.g., a microphone), a pressure sensor, a motion sensor, a finger/palm scanner, or the like, or a combination thereof. In the example of FIG. 3, the input device 1210 includes a thumb button. In other embodiments, the input device 1210 may of course include multiple buttons, for example a main button and other buttons, the main button being placed away from the other buttons to prevent accidental operation. In some embodiments, the input device 1210 may include a touch-sensitive surface divided into multiple sections, each section corresponding to an input key. In this configuration, at least one touch sensor is located beneath the surface of the input device 1210; when a touch sensor detects the user's touch, the action associated with the corresponding input key is executed.
The user generates input data by operating the input device 1210. The keys or sensors in the input device 1210 are configured to communicate with the terminal device 160 so that the user's input operations are converted into corresponding actions.
In some embodiments, the handle 121 is the protruding structure of the handheld controller and may be rod-shaped, for example a flattened cylinder, or another shape that allows the user to hold the handle 121 between the palm and fingers (e.g., three or fewer fingers) while freeing the thumb to operate the input keys; other fingers may of course also be freed to operate sections corresponding to them.
The handle 121 includes a first end 1211 and a second end 1212 opposite the first end 1211. In some embodiments, the first end 1211 is connected to the bracket 122 and the second end 1212 is away from the bracket 122. In some embodiments, the handle 121 is detachably connected to the bracket 122. The handle 121 may also be connected to the bracket 122 by a connection method appropriate to its material, for example by bonding or welding. Alternatively, the handle 121 and the bracket 122 may be connected to each other by fastening structures such as screws or bolts, snapped together by clips or the like, or slidably connected via a slide groove and a protrusion. A detachable connection allows the handle 121 and the bracket 122 to be manufactured separately and makes components easy to replace in case of damage, reducing repair costs. In other embodiments, the handle 121 may be integrally formed with the bracket 122.
In some embodiments, the handle 121 and/or the bracket 122 may be formed of a rubber material (e.g., to provide a surface with sufficient friction against the user's palm, improving grip reliability). In some embodiments, the handle 121 and/or the bracket 122 may be formed of hard plastic, including but not limited to high-density polyethylene providing increased structural rigidity. Any other suitable material may also be used.
The bracket 122 may be circular or elliptical ring-shaped, and may be a closed ring or a ring with a notch. The bracket 122 includes an outer surface 1220 facing outward from the ring and an inner surface 1223 facing into the ring. The outer surface 1220 includes a first surface 1221 and a second surface 1222, the first surface 1221 and the second surface 1222 meeting at the circumscribed circle of the bracket. The inner surface 1223 joins the first surface 1221 and the second surface 1222. An identification pattern 130 is provided on at least one of the first surface 1221 and the second surface 1222. The identification pattern 130 may be drawn or sprayed onto the outer surface 1220, or applied to the outer surface 1220 as an identification-pattern layer; other forming methods may of course also be used, and the specific forming method is not limited.
In some embodiments, both the first surface 1221 and the second surface 1222 are provided with identification patterns 130, and the specific pattern of the identification pattern 130 on the first surface 1221 differs from that on the second surface 1222. Further, the surface area of the second surface 1222 may be larger than that of the first surface 1221, and the larger second surface 1222 is arranged to face the imaging device 140 so that the imaging device 140 can capture the identification pattern 130 on the second surface 1222.
FIGS. 4A to 4D are exemplary schematic diagrams of several identification patterns 130 with the outer surface 1220 unrolled. Referring to FIGS. 4A to 4D, the identification pattern 130 includes a background 131 and marker points 132 distributed over the background 131; the colors or brightnesses of the background 131 and the marker points 132 are two colors or brightness levels distinguishable by the imaging device 140. For example, the background 131 is black and the marker points 132 are white, or the background 131 is white and the marker points 132 are black. Other color combinations are of course also possible, for example a gray background 131 with red marker points 132, as long as the imaging device 140 can distinguish the color difference or brightness of the background 131 and the marker points 132. The marker points 132 may be circular, polygonal (e.g., hexagonal), square, or any other shape, and the marker points 132 within one identification pattern 130 may have the same or different shapes.
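For a concrete picture of such a pattern, the sketch below renders an unrolled band of white circular marker points on a black background, in the spirit of FIGS. 4A to 4D; all dimensions are assumptions chosen for the sketch:

```python
import cv2
import numpy as np


def make_dot_band(width=800, height=120, pitch=80, radius=18):
    """Render an unrolled strip of evenly spaced white marker points
    on a black background (dimensions are illustrative only)."""
    band = np.zeros((height, width), dtype=np.uint8)  # black background
    for x in range(pitch // 2, width, pitch):
        cv2.circle(band, (x, height // 2), radius, 255, thickness=-1)
    return band
```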
In some embodiments, the marker points 132 may all have the same size and may further be distributed over the background 131 evenly or periodically along the circumferential direction of the outer surface 1220, forming a band of marker points, as in the upper half of FIGS. 4A to 4D. The identification patterns 130 on both the first surface 1221 and the second surface 1222 may take this form, differing only in the size of the marker points 132 on the two surfaces, as in FIG. 4D. In some embodiments, for example but not limited to those in which the first surface 1221 has the larger surface area, the marker points 132 on the first surface 1221 are larger than the marker points 132 on the second surface 1222.
In some embodiments, the marker points 132 may differ in size; for example, they may include multiple first marker points 1321 and multiple second marker points 1322, the first marker points 1321 being larger than the second marker points 1322. The first marker points 1321 and second marker points 1322 may be distributed over the background 131 with alternating sizes, forming a band in which first marker point 1321, second marker point 1322, first marker point 1321, second marker point 1322, and so on alternate. In some embodiments, the identification patterns 130 on both the first surface 1221 and the second surface 1222 may take this form, the first marker points 1321 and second marker points 1322 on the first surface 1221 being respectively larger than the first marker points 1321 and second marker points 1322 on the second surface 1222. This pattern may also be provided on only one surface, for example the first surface 1221, as shown in FIGS. 4A to 4C.
In the example shown in FIG. 4A, the backgrounds of the identification patterns on both the first surface 1221 and the second surface 1222 are black and the marker points are white. The pattern on the first surface 1221 includes first marker points 1321 and second marker points 1322, both circular, the first marker points 1321 being larger than the second marker points 1322. Preferably, the marker points on the first surface 1221 and those on the second surface 1222 are staggered along the direction in which the bands extend.
In the example shown in FIG. 4B, the identification pattern is the same as in FIG. 4A, except that the colors of the background and the marker points are reversed relative to FIG. 4A.
In the example shown in FIG. 4C, the identification pattern is substantially the same as in FIG. 4A, except that the marker points are hexagonal rather than circular.
In the example shown in FIG. 4D, the identification patterns on the first surface 1221 and the second surface 1222 are produced by spatially interleaving black and white squares in 2×2 matrices, the black and white squares on the first surface 1221 being larger than those on the second surface 1222.
It can be understood that the identification patterns shown in FIGS. 4A to 4D are merely exemplary; their colors and marker-point sizes may all be varied and do not limit the specific implementation of the present invention. For example, the marker points on the first surface 1221 may be circular spots while those on the second surface 1222 are interleaved black and white squares.
It can also be understood that the structure of the outer surface 1220 of the bracket 122 is not limited to that shown in FIGS. 4A to 4D, in which the first surface 1221 and the second surface 1222 are both frustoconical surfaces. In some embodiments, the first surface 1221 and the second surface 1222 may together form a single continuous arc surface, as shown in FIG. 5.
FIG. 5 is a schematic diagram of the bracket 122 in another example. In this example, the outer surface 1220 of the bracket 122 is one arc surface, formed jointly by the first surface 1221 and the second surface 1222, and both surfaces bear interleaved black-and-white or black-and-gray squares. It can be understood that the size or arrangement of the black-and-white or black-and-gray squares on the first surface 1221 and the second surface 1222 may be the same or different, and is not limited to the pattern shown in FIG. 5.
As shown in FIG. 6, in other embodiments the outer surface 1220 may be formed by joining unit panels of different shapes, with patterns further formed on the unit panels. The unit panels include hexagonal, pentagonal, triangular, or trapezoidal panels, among others. In the example shown in FIG. 6, the outer surface 1220 of the bracket 122 is formed by joining hexagonal panels 1224A, quadrilateral panels 1224B, and triangular panels 1224C, the hexagonal panels 1224A bearing black-and-white square or triangular patterns. It can be understood that patterns of the same color may be arranged contiguously or staggered at intervals. In FIG. 6 the quadrilateral panels 1224B and the triangular panels 1224C are black, but it can be understood that they may equally be designed white. The identification pattern may of course also use any other two colors or brightness levels distinguishable by the imaging device, for example silver and black.
The imaging device 140 detects the movement of the bracket 122 as the user moves (e.g., waves, swings, punches, shakes, or moves in any other way). In some embodiments, when the user holds the handle 121 in the neutral position, the bracket 122 is positioned above the user's hand; given this orientation, the identification pattern 130 on the first surface 1221 of the bracket 122 is visible to the imaging device 140 (e.g., a front-facing camera on a head-mounted display). Alternatively, the imaging device 140 is located in front of the user, and when the user holds the handle 121 in the neutral position, the identification pattern 130 on the first surface 1221 of the bracket 122 faces the imaging device 140. The neutral position is the position of the handle 121 when the user holds the handle 121 between palm and fingers, keeps the handheld controller 120 in front of them, and relaxes the arm and wrist.
In some embodiments, as shown in FIG. 7, the input device 1210 of the handle 121 is located at the second end 1212, and the handle 121 is configured such that the user can reach through the center of the bracket 122 to hold the handle 121 and perform input operations via the input device 1210.
The handle 121 is inclined at a predetermined angle relative to the plane in which the bracket 122 lies, so as to give the user a comfortable posture for holding the handle 121 and operating the input device 1210. The predetermined angle may be 30 to 90 degrees, for example 45, 60, or 75 degrees. The input device 1210 faces the bracket 122.
As shown in FIG. 8, in other implementations the handle 121 is located within the space bounded by the inner surface 1223 of the bracket 122. When the user holds the handle 121 in the neutral position, the bracket 122 is positioned such that, with the hand holding the handle 121, the hand is at the center of the bracket 122.
In some embodiments, as shown in FIG. 9, the input device 1210 of the handle 121 is located at the first end 1211, adjacent to the connection with the bracket 122, and the handle 121 is configured such that the user can hold the handle 121 directly from outside the bracket 122 and perform input operations via the input device 1210, as shown in FIG. 10.
In this implementation, the handle 121 may be inclined at a predetermined angle relative to the plane in which the bracket 122 lies, so as to give the user a comfortable posture for holding the handle 121 and operating the input device 1210. When the user holds the handle 121 in the neutral position, the bracket 122 is positioned such that, with the hand holding the handle 121, the hand is below the bracket 122.
In some embodiments, as shown in FIG. 11, the bracket 122 has a notch, and the bracket 122 is connected to the handle 121 at one end portion of the notch, for example at the middle portion of the handle 121. The input device 1210 of the handle 121 may be located at the first end 1211, and the user can reach through the center of the bracket 122 to hold the handle 121 and perform input operations via the input device 1210, as shown in FIG. 12.
In this implementation, the handle 121 may be arranged approximately perpendicular to the plane in which the bracket 122 lies, so as to give the user a comfortable posture for holding the handle 121 and operating the input device 1210. When the user holds the handle 121 in the neutral position, the bracket 122 is configured such that, with the hand holding the handle 121, the wrist is at the center of the bracket 122 and part of the palm can pass through the notch.
FIG. 13 shows an electrical connection block diagram of the handheld controller 120. The handheld controller 120 includes the input device 1210 and a microcontroller 124 connected to the input device 1210. The input device 1210 may be multiple buttons, a joystick, a touchpad, a keyboard, an imaging sensor, a sound sensor (e.g., a microphone), a pressure sensor, a motion sensor, a finger/palm scanner, or the like, or a combination thereof.
In some embodiments, the handheld controller 120 may include a microcontroller 124. The microcontroller 124 may be configured to receive and process data/signals from the input device 1210 and/or other components of the system. For example, the microcontroller 124 may receive from the input device 1210 input data generated in response to the user's actions and/or input.
The microcontroller 124 may also generate input data based on the user's input and send the data to the terminal device 160 for further processing. In some embodiments, the microcontroller 124 may generate control signals for controlling other components; for example, it may generate a control signal for controlling the imaging device.
The microcontroller 124 may include a microprocessor 1241, a memory 1242, an I/O interface 1243, a control interface 1244, and a communication interface 1245. The microprocessor 1241 may be configured to receive, generate, and/or process data/signals to implement the functions of the handheld controller 120.
The microprocessor 1241 may include any appropriate type of general-purpose or special-purpose microprocessor, digital signal processor, or microcontroller. The memory 1242 may include any appropriate type of mass storage that stores any type of information the microprocessor 1241 may need in order to operate. The memory 1242 may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-erasable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium, including but not limited to ROM, flash memory, dynamic RAM, and static RAM. The memory may be configured to store one or more computer programs, executable by the microprocessor, that implement the exemplary object tracking functions disclosed herein.
The I/O interface 1243 may be configured to facilitate communication between the microprocessor 1241 and the input device 1210. For example, the microprocessor 1241 may receive input data from the input device 1210 via the I/O interface 1243 in response to the user's input. The control interface 1244 may be configured to facilitate communication between the microprocessor 1241 and the imaging device 140. The communication interface 1245 may be configured to facilitate communication between the handheld controller 120 and other components of the system. For example, the handheld controller 120 may communicate with the terminal device 160 over a network via the communication interface 1245.
The microcontroller 124 may be disposed in the handle 121 or the bracket 122. The input device 1210 of the handle 121 may be configured to transmit input data to the microprocessor 1241 via the I/O interface 1243 for further processing. For example, the input device 1210 may generate input data in response to the user actuating a button and send the input data to the microprocessor 1241. In some embodiments, the microprocessor 1241 may transmit the input data received from the input device 1210 to the terminal device 160 via the communication interface 1245 for further processing.
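The path from button actuation to the terminal device might look like the following minimal Python sketch; the event fields and the `transport.send` interface are assumptions made for illustration, not a format defined in this disclosure:

```python
import json
import time
from dataclasses import asdict, dataclass


@dataclass
class InputEvent:
    # Illustrative fields for what the controller might report.
    button: str
    pressed: bool
    timestamp: float


def forward_event(button, pressed, transport):
    """Package a button actuation and hand it to a transport object
    (e.g., a Bluetooth or network link); `transport` is assumed to
    expose a send(bytes) method."""
    event = InputEvent(button=button, pressed=pressed, timestamp=time.time())
    transport.send(json.dumps(asdict(event)).encode("utf-8"))
```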
Further, the handheld controller 120 may also include a sensor 1246 for acquiring attitude data of the handheld controller 120. The sensor 1246 may be an attitude sensor such as an IMU, electrically connected to the microprocessor 1241 so that the collected attitude data can be transmitted to the microprocessor 1241. The sensor 1246 may be disposed in the handle 121 or in the bracket 122.
A tracking and positioning method based on the handheld controller 120 will now be described in conjunction with the structure of the handheld controller 120. The tracking and positioning method is applied to the tracking and positioning system shown in FIG. 1 and, as shown in FIG. 14, may include:
Step S110: acquiring an image of the identification pattern on the outer surface of the bracket.
In some embodiments, the imaging device 140 may capture images continuously. Additionally or alternatively, image capture may be triggered by a special event or by data/signals sent from the terminal device 160 or the handheld controller 120. For example, the user may perform a power-on operation on the input device 1210 of the handheld controller 120, and the handheld controller 120 may transmit a signal for starting the imaging device so that one or more images are captured based on the user's input. Alternatively, the handheld controller 120 may send input data to the terminal device, and the terminal device 160 may start the imaging device 140 to capture one or more images.
In some game events, image capture may be triggered by the imaging device 140. Additionally or alternatively, the imaging device 140 may include a sensor for detecting objects within the field of view of the imaging device 140. For example, an ultrasonic sensor may be used to detect one or more objects in the field of view of the imaging device 140. In this embodiment, if an object is detected, the imaging device 140 can be started to take pictures and obtain one or more images.
In some embodiments, the imaging device 140 may also be configured to obtain depth information indicating the positions of objects contained in the images. The imaging device 140 may further determine its own position through its position sensor. In some embodiments, the imaging device 140 may be configured to capture color or black-and-white images. In some embodiments, the imaging device 140 may optionally process the captured images and send the processed images to the terminal device 160. For example, the imaging device 140 may resize, denoise, and/or sharpen the images, and may also increase or decrease their contrast and/or brightness. The imaging device 140 may then send the processed images to the terminal device 160.
In some embodiments, the imaging device 140 may receive parameters for capturing images from the terminal device 160. Exemplary parameters for capturing images may include those for setting exposure time, aperture, image resolution/size, field of view (zooming in and out), and/or the color space of the image (e.g., color or black and white), and/or parameters for performing other known camera functions.
Step S120: tracking and positioning the handheld controller according to the identification pattern.
In some embodiments, the imaging device 140 may send the identification pattern to the terminal device 160 over a network, or over a signal circuit. Before sending it to the terminal device 160, the imaging device 140 may store the identification pattern.
In some embodiments, the terminal device 160 may optionally process the received images in order to improve efficiency. For example, the terminal device 160 may convert color images to black-and-white images and/or resize the images, reducing the computational demand of the method. Additionally or alternatively, noise in the images may be reduced, the images sharpened, and/or their contrast and/or brightness increased (or decreased) so that the marker points in the identification pattern are easier to detect. Other types of image processing techniques may of course also be considered.
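A minimal preprocessing sketch along these lines, assuming OpenCV on the terminal device; the scale factor, blur kernel, and threshold mode are illustrative choices rather than prescribed steps:

```python
import cv2


def preprocess_for_detection(image_bgr, scale=0.5):
    """Reduce the computational load before marker detection."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)  # drop color
    small = cv2.resize(gray, None, fx=scale, fy=scale)  # shrink image
    smooth = cv2.GaussianBlur(small, (5, 5), 0)         # reduce noise
    # A binary image makes black/white marker points easy to separate.
    _, binary = cv2.threshold(smooth, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary
```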
The terminal device 160 can obtain the position and orientation of the imaging device 140 relative to a specific point of the handheld controller 120 (e.g., its center point) by recognizing the marker points in the identification pattern and using the three-dimensional structural information of those marker points; encoding the marker points can greatly improve the reliability and efficiency of the tracking and positioning algorithm. The algorithm for obtaining the position and orientation may be an existing computer-vision positioning algorithm, and it may also incorporate data collected by other sensors on the handheld controller 120, such as the attitude data collected by the sensor 1246, to accelerate the solving process and improve positioning accuracy.
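The pose recovery described here corresponds to the classic perspective-n-point problem. The sketch below uses OpenCV's solver as one existing computer-vision algorithm of the kind mentioned above, with an optional attitude-derived initial guess seeding the solver; function and parameter names are illustrative:

```python
import cv2
import numpy as np


def estimate_pose(object_points, image_points, camera_matrix, dist_coeffs,
                  rvec_guess=None, tvec_guess=None):
    """Recover the pattern pose in the camera frame from marker points.

    object_points: known 3D marker positions on the bracket (its 3D
    structural information); image_points: the detected 2D pixel
    locations of the same points, in matching order.
    """
    use_guess = rvec_guess is not None and tvec_guess is not None
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs,
        rvec=rvec_guess, tvec=tvec_guess,
        useExtrinsicGuess=use_guess,   # seed with IMU-derived pose if given
        flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    return rotation, tvec
```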
In summary, in the embodiments of the present invention, tracking and positioning of the handheld controller can be achieved by providing an identification pattern on the handheld controller. Compared with the prior art, there is no need to precisely control the strobe frequency of light points, so the structure is simpler and the cost lower. In addition, this controller design further relaxes the control requirements on the imaging device: its parameters need not be specially adjusted to accommodate the tracking of an actively light-emitting handheld controller.
In the embodiments provided by the present invention, it should be understood that the disclosed methods may also be implemented in other ways. The embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the drawings show possible architectures, functions, and operations of methods and computer program products according to embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the various embodiments of the present invention may be integrated together to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
If the functions are implemented in the form of software functional modules and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc. It should be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between those entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device comprising that element.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention; those skilled in the art may make various modifications and variations to the present invention. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall be included within its protection scope.
The above are only specific implementations of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution readily conceivable by those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (23)

  1. A handheld controller, comprising:
    a handle including an input device for receiving a user's input operations; and
    a bracket connected with the handle, an outer surface of the bracket bearing an identification pattern.
  2. The handheld controller according to claim 1, wherein the bracket is ring-shaped.
  3. The handheld controller according to claim 2, wherein the handle includes a first end and a second end arranged opposite each other, the first end being connected to the bracket and the second end being away from the bracket.
  4. The handheld controller according to claim 3, wherein the input device of the handle is located at the second end, and the handle is configured such that a user can reach through the center of the bracket to hold the handle and perform input operations via the input device.
  5. The handheld controller according to claim 4, wherein there is a first included angle between the handle and the plane in which the bracket lies.
  6. The handheld controller according to claim 4, wherein the handle is located within the space bounded by the inner surface of the bracket.
  7. The handheld controller according to claim 3, wherein the input device of the handle is located at the first end, and the handle is configured such that a user can directly hold the handle from outside the bracket and perform input operations via the input device.
  8. The handheld controller according to claim 2, wherein the bracket has a notch, the bracket is connected to the handle at one end portion of the notch, and the handle is configured such that a user can reach through the center of the bracket to hold the handle and perform input operations via the input device.
  9. The handheld controller according to claim 2, wherein the outer surface of the bracket includes a first surface and a second surface, the first surface and the second surface meet at the circumscribed circle of the bracket, and the identification pattern is disposed on at least one of the first surface and the second surface.
  10. The handheld controller according to claim 9, wherein the identification pattern is disposed on both the first surface and the second surface, and the identification pattern disposed on the first surface differs from the identification pattern disposed on the second surface.
  11. The handheld controller according to claim 2, wherein the outer surface of the bracket is an arc surface, and the identification pattern is disposed on the arc surface.
  12. The handheld controller according to claim 2, wherein the outer surface of the bracket is formed by joining unit panels of different shapes, the patterns on the individual unit panels together constituting the identification pattern.
  13. The handheld controller according to claim 12, wherein the unit panels include hexagonal panels, pentagonal panels, triangular panels, or trapezoidal panels.
  14. The handheld controller according to any one of claims 1 to 13, wherein the identification pattern includes a background and marker points distributed over the background, the colors or brightnesses of the background and the marker points being two colors or brightness levels distinguishable by an imaging device.
  15. The handheld controller according to claim 14, wherein the marker points are of the same size and evenly distributed over the background.
  16. The handheld controller according to claim 14, wherein marker points of different sizes are distributed over the background in an alternating manner.
  17. The handheld controller according to claim 14, wherein the marker points are circular, polygonal, or square.
  18. The handheld controller according to claim 14, wherein the background is black and the marker points are white, or the background is white and the marker points are black.
  19. A handheld controller, comprising:
    a handle including an input device for receiving a user's input operations;
    a bracket connected with the handle, an outer surface of the bracket bearing an identification pattern; and
    a microcontroller connected to the input device, the microcontroller being configured to receive and process data or signals from the input device and being disposed within the handle or the bracket.
  20. A tracking and positioning system, comprising a terminal device, an imaging device, and the handheld controller according to any one of claims 1 to 18, the imaging device being used to capture the identification pattern.
  21. A tracking and positioning method applied to a tracking system, wherein the system includes a terminal device, an imaging device, and a handheld controller, the handheld controller includes a handle and a bracket connected with the handle, the handle includes an input device for receiving a user's input operations, and an outer surface of the bracket bears an identification pattern, the method comprising:
    the imaging device acquiring an image of the identification pattern; and
    the terminal device tracking and positioning the handheld controller according to the identification pattern.
  22. The method according to claim 21, wherein the handheld controller is further provided with a sensor for collecting attitude data, and the terminal device tracking and positioning the handheld controller according to the identification pattern comprises:
    the terminal device tracking and positioning the handheld controller according to the identification pattern and the attitude data collected by the sensor.
  23. The method according to claim 21, wherein the terminal device tracking and positioning the handheld controller according to the identification pattern comprises:
    the terminal device obtaining the position and orientation of the imaging device relative to a specific point of the handheld controller by recognizing the marker points in the identification pattern and using the three-dimensional structural information of those marker points; and
    the terminal device tracking and positioning the handheld controller according to the position and orientation.
PCT/CN2017/097738 2017-08-16 2017-08-16 Handheld controller, tracking and positioning method and system WO2019033322A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/314,400 US20190384419A1 (en) 2017-08-16 2017-08-16 Handheld controller, tracking method and system using the same
CN201780007656.8A CN109069920B (zh) 2017-08-16 2017-08-16 Handheld controller, tracking and positioning method and system
PCT/CN2017/097738 WO2019033322A1 (zh) 2017-08-16 2017-08-16 Handheld controller, tracking and positioning method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/097738 WO2019033322A1 (zh) 2017-08-16 2017-08-16 Handheld controller, tracking and positioning method and system

Publications (1)

Publication Number Publication Date
WO2019033322A1 true WO2019033322A1 (zh) 2019-02-21

Family

ID=64676051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/097738 WO2019033322A1 (zh) 2017-08-16 2017-08-16 Handheld controller, tracking and positioning method and system

Country Status (3)

Country Link
US (1) US20190384419A1 (zh)
CN (1) CN109069920B (zh)
WO (1) WO2019033322A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110837295A (zh) * 2019-10-17 2020-02-25 重庆爱奇艺智能科技有限公司 Handheld control device and tracking and positioning method, device and system thereof

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11036284B2 (en) * 2018-09-14 2021-06-15 Apple Inc. Tracking and drift correction
TWI760654B (zh) * 2018-11-12 2022-04-11 宏達國際電子股份有限公司 Virtual reality controller
CN109621401A (zh) * 2018-12-29 2019-04-16 广州明朝互动科技股份有限公司 Interactive game system and control method
CN112241200A (zh) * 2019-07-17 2021-01-19 Apple Inc. Object tracking for head-mounted devices
US11845001B2 (en) * 2021-01-14 2023-12-19 Htc Corporation Calibration system and method for handheld controller

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104699247A (zh) * 2015-03-18 2015-06-10 北京七鑫易维信息技术有限公司 Machine-vision-based virtual reality interaction system and method
CN105117016A (zh) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 Interactive handle for virtual reality and augmented reality interaction control
CN106055090A (zh) * 2015-02-10 2016-10-26 李方炜 Virtual reality and augmented reality control using a mobile device
US20160357249A1 (en) * 2015-06-03 2016-12-08 Oculus Vr, Llc Hand-Held Controllers For Virtual Reality System

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764164A (en) * 1997-02-07 1998-06-09 Reality Quest Corp. Ergonomic hand-attachable controller
US8287373B2 (en) * 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
CN102279646A (zh) * 2010-06-10 2011-12-14 鼎亿数码科技(上海)有限公司 Device with handheld equipment and method for recognizing the motion of the handheld equipment
US20160232715A1 (en) * 2015-02-10 2016-08-11 Fangwei Lee Virtual reality and augmented reality control with mobile devices
US9898091B2 (en) * 2015-06-03 2018-02-20 Oculus Vr, Llc Virtual reality system with head-mounted display, camera and hand-held controllers
US9839840B2 (en) * 2015-11-05 2017-12-12 Oculus Vr, Llc Interconnectable handheld controllers
US10130875B2 (en) * 2015-11-12 2018-11-20 Oculus Vr, Llc Handheld controller with finger grip detection
US10386922B2 (en) * 2015-12-30 2019-08-20 Facebook Technologies, Llc Handheld controller with trigger button and sensor retainer assembly
US10391400B1 (en) * 2016-10-11 2019-08-27 Valve Corporation Electronic controller with hand retainer and finger motion sensing
US20180161670A1 (en) * 2016-12-12 2018-06-14 Evgeny Boev Single-Handed Input Controller and Method
US10447265B1 (en) * 2017-06-07 2019-10-15 Facebook Technologies, Llc Hand-held controllers including electrically conductive springs for head-mounted-display systems


Also Published As

Publication number Publication date
CN109069920B (zh) 2022-04-01
CN109069920A (zh) 2018-12-21
US20190384419A1 (en) 2019-12-19

Similar Documents

Publication Publication Date Title
WO2019033322A1 (zh) Handheld controller, tracking and positioning method and system
JP6669069B2 (ja) Detection device, detection method, control device, and control method
JP6747446B2 (ja) Information processing apparatus, information processing method, and program
US10015402B2 (en) Electronic apparatus
JP6658518B2 (ja) Information processing apparatus, information processing method, and program
WO2021035646A1 (zh) Wearable device and control method therefor, gesture recognition method, and control system
JP2013069224A (ja) Motion recognition device, motion recognition method, operation device, electronic apparatus, and program
US9268408B2 (en) Operating area determination method and system
US10754446B2 (en) Information processing apparatus and information processing method
JPWO2018003862A1 (ja) Control device, display device, program, and detection method
JP2020502608A (ja) Portable communication terminal, directional input unit, and method related thereto
US11023050B2 (en) Display control device, display control method, and computer program
US20220019288A1 (en) Information processing apparatus, information processing method, and program
KR20190135794A (ko) Mobile terminal
US9898183B1 (en) Motions for object rendering and selection
KR20160149066A (ko) Mobile terminal and control method thereof
JP2016058061A (ja) Electronic apparatus
JP2015052895A (ja) Information processing apparatus and information processing method
WO2018185830A1 (ja) Information processing system, information processing method, information processing apparatus, and program
JP4972013B2 (ja) Information presentation device, information presentation method, information presentation program, and recording medium recording the program
JP2013134549A (ja) Data input device and data input method
US11054941B2 (en) Information processing system, information processing method, and program for correcting operation direction and operation amount
JP5951966B2 (ja) Image processing apparatus, image processing system, image processing method, and program
US20190129609A1 (en) Electronic apparatus
Yeo et al. OmniSense: Exploring Novel Input Sensing and Interaction Techniques on Mobile Device with an Omni-Directional Camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17921964

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17921964

Country of ref document: EP

Kind code of ref document: A1