WO2018020497A1 - Hybrid tracking system for a hand-mobilized device - Google Patents

Hybrid tracking system for a hand-mobilized device

Info

Publication number
WO2018020497A1
WO2018020497A1 (PCT/IL2017/050833)
Authority
WO
WIPO (PCT)
Prior art keywords
pointing device
optically
remote pointing
motion
optical
Prior art date
Application number
PCT/IL2017/050833
Other languages
English (en)
Inventor
Rami Parham
Eyal BOUMGARTEN
Hanan KRASNOSHTEIN
Original Assignee
Muv Interactive Ltd.
Priority date
Filing date
Publication date
Application filed by Muv Interactive Ltd. filed Critical Muv Interactive Ltd.
Publication of WO2018020497A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0308Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0331Finger worn pointing device

Definitions

  • the present disclosure relates to the field of pointing devices.
  • Optical pointing devices have become increasingly popular with the advancement of wireless and mobile technology. Such devices allow users to remotely control the operation of one or more computerized applications and/or devices by directly indicating a target using the optical pointer.
  • One popular application for optical pointing devices is to remotely control the display of visual content. The user may wirelessly interact with visual content displayed on a screen by directly pointing to a target on the screen.
  • Some screens have optic sensors embedded therein, allowing them to self-detect the target indicated by the optical pointer. However, these screens are highly specialized and expensive.
  • Other implementations allow the projection of the visual content onto a generic surface, such as a wall, ceiling, or table top. In these cases, a camera is positioned to track the target indicated by the optical pointer.
  • Tracking is typically lost as soon as the camera can no longer sense the optical indicator. For example, if an individual moves and temporarily obstructs the optical path between the pointer and the screen, or between the screen and the camera, tracking will be lost. In these cases, tracking must be renewed in a process that incurs latency, which may be detrimental to the user's interactive experience.
  • a system for determining a position of a moving target pointed at by a hand-mobilized device, comprising: a hand-mobilized device, comprising: a motion and orientation sensor, an optical emitter, a radio frequency (RF) device transmitter, and a processor configured to receive motion and orientation data from the motion and orientation sensor and transmit the data via the RF device transmitter; and a tracking system, comprising: a camera, an RF tracking receiver, and a controller configured to: determine an optically-based position of a moving target pointed at by the hand-mobilized device when the determining is from an optical indication produced by the optical emitter and the camera senses the optical indication at an intensity that exceeds a threshold, receive the motion and orientation data from the RF device transmitter via the RF tracking receiver, compute a discrepancy between a position estimated using the motion and orientation data and the determined optically-based position, to produce a correlation history, and when the determination of the optically-based position ceases, estimate a position of the moving target pointed at by the hand-mobilized device based on: the correlation history, the most recently determined optically-based position, and the motion and orientation data.
  • the controller is further configured to revert to determining an optically-based position of the moving target based on an optical indication when the camera senses the optical indication at an intensity that exceeds the threshold after the optical tracking had ceased. In some embodiments, the controller is further configured to use a smoothing scheme when reverting to determining the optically-based position.
  • the hand-mobilized device further comprises an RF device receiver and wherein the tracking system further comprises an RF tracking transmitter, wherein the controller is configured to initiate the smoothing scheme by sending a command, via the RF tracking transmitter and RF device receiver, to trigger the optical emitter to emit multiple sequential optical pulses, wherein the smoothing scheme comprises: for each detected optical pulse: determining a current optically-based position of the moving target based on the detected pulse, estimating a current non-optically-based position using motion and orientation data received via the RF tracking receiver and the most recently determined optically-based position, computing a discrepancy between the current optically-based position and the current non-optically-based position, adjusting the subsequently emitted optical pulses using the discrepancy.
  • the controller is further configured to trigger the optical emitter to emit a corrective optical pulse when the motion and orientation data received via the RF tracking receiver indicates a change above an inertial threshold.
  • system further comprises a screen, wherein the optical indication is indicated on the screen.
  • the controller is further configured to use the determined position to control an application.
  • the application displays visual content on the screen.
  • the motion and orientation sensor is a multiple-axis motion tracking component.
  • the optical emitter is a laser light source coupled with a diffractive optical element (DOE). In some embodiments, the optical emitter is an array of multiple light emitting diodes.
  • the controller is further configured to: disable the optical emitter, thereby causing the determination of the optically-based position to cease; and enable the optical emitter thereby causing the determination of the optically-based position to recover.
  • a method for tracking a moving target pointed at by a hand-mobilized device comprising: emitting, by a hand-mobilized device, an optical indication to illuminate a moving target pointed at by the hand-mobilized device; capturing an image of the illuminated moving target; determining an optically-based position of the moving target from the captured image when the illuminated moving target is sensed at an intensity that exceeds a threshold; measuring motion and orientation data of the handheld device; estimating a position of the moving target using the motion and orientation data and the most recently determined optically-based position; computing a discrepancy between the estimated position and the optically determined position to produce a correlation history; when the determination of the optically-based position ceases, estimating a position of the moving target pointed at by the hand-mobilized device based on: the correlation history, the most recently determined optically-based position, and the motion and orientation data.
  • the method further comprises reverting to determining an optically-based position of the moving target based on an optical indication when the optical indication is sensed at an intensity that exceeds the threshold after the optical tracking had ceased.
  • the method further comprises using a smoothing scheme when reverting to determining the optically-based position.
  • the method further comprises initiating the smoothing scheme by triggering the emitting of multiple sequential optical pulses, wherein the smoothing scheme comprises: for each detected optical pulse: determining a current optically-based position of the moving target based on the detected pulse, receiving motion and orientation data; estimating a current non-optically-based position using the motion and orientation data and the most recently determined optically-based position, computing a discrepancy between the current optically-based position and the current non-optically-based position, adjusting the subsequently emitted optical pulses using the discrepancy.
  • adjusting comprises adjusting a period between the subsequently emitted optical pulses. In some embodiments, adjusting the period comprises increasing the length of the period when the discrepancy is small and decreasing the length of the period when the discrepancy is large. In some embodiments, adjusting comprises adjusting a duty cycle of the subsequently emitted optical pulses.
  • adjusting the duty cycle comprises increasing the duty cycle when the discrepancy is large and decreasing the duty cycle when the discrepancy is small.
  • the method further comprises triggering the emitting of a corrective optical pulse when the motion and orientation data indicates a change above an inertial threshold.
  • the method further comprises using the determined position to control an application.
  • the method further comprises using the determined position to display visual content.
  • the method further comprises ceasing the emission of the optical indication, thereby causing the determination of the optically-based position to cease; and restarting the emission of the optical indication, thereby causing the determination of the optically-based position to recover.
  • a hybrid system for tracking a hand-mobilized device comprising: a hand-mobilized device, comprising: a motion and orientation sensor, an optical emitter, a radio frequency (RF) device transmitter, and a processor configured to receive motion and orientation data from the motion and orientation sensor and transmit the data via the RF device transmitter; and a tracking system, comprising: a camera, an RF tracking receiver, and a controller configured to: determine at least two optically-based positions of a moving target pointed at by the hand-mobilized device when the determining is from an optical indication produced by the optical emitter and the camera senses the optical indication at an intensity that exceeds a threshold, receive the motion and orientation data from the RF device transmitter via the RF tracking receiver, compute a discrepancy between a position estimated using the motion and orientation data and the determined optically-based position, to produce a correlation history, and determine an optically-based absolute position of the hand-mobilized device using the at least two optically-based positions of the moving target and the motion and orientation data.
  • a method for tracking a hand-mobilized device comprising: emitting, by a hand-mobilized device, an optical indication to illuminate a moving target pointed at by the hand-mobilized device; capturing at least two images of the illuminated moving target at two positions; determining at least two optically-based positions of the moving target from the captured images when the illuminated moving target is sensed at an intensity that exceeds a threshold; measuring motion and orientation data of the handheld device; estimating a position of the moving target using the motion and orientation data and the most recently determined optically-based position; computing a discrepancy between the estimated position and the optically determined position to produce a correlation history; determining an optically-based absolute position of the hand-mobilized device using the at least two optically-based positions of the moving target and the motion and orientation data; determining an inertially-based absolute position of the hand-mobilized device using only the motion and orientation data; and computing a calibration error between the optically-based absolute position and the inertially-based absolute position.
  • a method for associating each of a plurality of locations pointed at by at least two remote pointing devices, with a respective remote pointing device comprising: receiving a plurality of locations pointed at on a surface by a first remote pointing device and a second remote pointing device; associating an unassociated location from the plurality of locations with the first remote pointing device or the second remote pointing device, wherein the associating comprises: receiving readings from a motion sensor comprised in at least one of the first remote pointing device and the second remote pointing device; and associating the unassociated location with the first remote pointing device or the second remote pointing device based on the readings from the motion sensor and the unassociated location.
  • associating the unassociated location with the first remote pointing device or the second remote pointing device optionally comprises: determining a likely area for a next location based on a trajectory determined upon the readings; and subject to the unassociated location being within the likely area, associating the location with the first remote pointing device or the second remote pointing device, respectively.
  • determining the likely area optionally comprises: determining an advancement route from the readings; transforming at least a part of the advancement route to coordinates on the surface, to obtain a transformed advancement route; and determining the likely area as an area in the vicinity of a last segment of the transformed advancement route.
  • the likely area is optionally a vicinity of a straight line. In some embodiments, the likely area is optionally an area internal to an angle.
  • associating the unassociated location with the first remote pointing device or the second remote pointing device optionally comprises: calculating correlation between the unassociated location and an advancement route based on the readings from the motion sensor; and associating the unassociated location with a remote device for which a maximal correlation is obtained.
  • receiving a radio frequency signal from the first remote pointing device or the second remote pointing device upon the first remote pointing device or the second remote pointing device starting to point at locations. In some embodiments, receiving a radio frequency signal from the first remote pointing device or the second remote pointing device, in response to sending a request to the first remote pointing device or the second remote pointing device.
  • associating an unassociated location from the plurality of locations with the first remote pointing device or the second remote pointing device subject to: a first distance between the unassociated location and a previous received location associated with the first remote pointing device or the second remote pointing device, respectively, not exceeding a first threshold; and a second distance between the unassociated location and another previous received location associated with the second remote pointing device or the first remote pointing device, respectively, exceeding a second threshold.
  • the motion sensor optionally comprises a 3-axis inertial sensor.
  • a system comprising: at least two hand-mobilized pointing devices each comprising: one or more optical emitters, an inertial sensor, and a signal transceiver for transmitting a reading of the inertial sensor; and a controller comprising: an optical capture device, a signal transceiver for receiving the reading of the inertial sensor, and a processor configured for: receiving a plurality of locations pointed at by a first remote pointing device and a second remote pointing device on a surface; associating an unassociated location from the plurality of locations with the first remote pointing device or the second remote pointing device, comprising: receiving readings from a motion sensor comprised in the first remote pointing device or the second remote pointing device; and associating the unassociated location with the first remote pointing device or the second remote pointing device based on the readings from the motion sensor and the unassociated location.
  • associating the unassociated location with the first remote pointing device or the second remote pointing device optionally comprises: determining a likely area for a next location based on a trajectory determined upon the inertial readings; and subject to the unassociated location being within the likely area, associating the location with the first remote pointing device or the second remote pointing device, respectively.
  • determining the likely area optionally comprises: determining an advancement route from the readings; transforming at least a part of the advancement route to coordinates on the surface, to obtain a transformed advancement route; and determining the likely area as an area in the vicinity of a last segment of the transformed advancement route.
  • the likely area is optionally a vicinity of a straight line. In some embodiments, the likely area is optionally an area internal to an angle.
  • associating the unassociated location with the first remote pointing device or the second remote pointing device comprises: calculating correlation between the unassociated location and an advancement route based on the readings from the motion sensor; and associating the unassociated location with a remote device for which a maximal correlation is obtained.
  • the processor is optionally further configured to receive a radio frequency signal from the first remote pointing device or the second remote pointing device upon the first remote pointing device or the second remote pointing device starting to point at locations, or in response to sending a request to the first remote pointing device or the second remote pointing device.
  • the processor is optionally further configured to associate an unassociated location from the plurality of locations with the first remote pointing device or the second remote pointing device, subject to: a first distance between the unassociated location and a previous received location associated with the first remote pointing device or the second remote pointing device, respectively, not exceeding a first threshold; and a second distance between the unassociated location and another previous received location associated with the second remote pointing device or the first remote pointing device, respectively, exceeding a second threshold.
  • the motion sensor optionally comprises a 3-axis inertial sensor.
  • a computer program product comprising a computer readable storage medium retaining program instructions, which program instructions when read by a processor, cause the processor to perform a method comprising: receiving a plurality of locations pointed at on a surface by a first remote pointing device and a second remote pointing device; associating an unassociated location from the plurality of locations with the first remote pointing device or the second remote pointing device, wherein the associating comprises: receiving readings from a motion sensor comprised in at least one of the first remote pointing device and the second remote pointing device; and associating the unassociated location with the first remote pointing device or the second remote pointing device based on the readings from the motion sensor and the unassociated location.
  • Fig. 1A is a conceptual illustration of an optical tracking system, in accordance with an embodiment;
  • Fig. 1B is a conceptual block diagram of a hybrid tracking system, in accordance with an embodiment;
  • Figs. 1C-1D, taken together, illustrate a technique for determining the absolute position of a hand-mobilized device using a combination of optical and inertial data;
  • Fig. 2A is a timeline illustrating the steps for determining the correlation history between optically-based and inertially-based tracking, in accordance with an embodiment;
  • Figs. 2B-2C show two timelines illustrating the steps for implementing a smoothing scheme when transitioning from an inertially-based tracking system to an optically-based tracking system, in accordance with an embodiment;
  • Figs. 3A-3B, taken together, illustrate a flowchart of a hybrid method for tracking a target of a hand-mobilized device;
  • Fig. 3C illustrates a flowchart of a method to smooth the transition between predicting a position of a target using inertial data and reverting to determining the position of the target using optically-based data;
  • Figs. 4A-4B illustrate exemplary situations of ambiguity in associating pointed locations with a pointing device when two or more devices point at a surface;
  • Fig. 4C illustrates an example of an area likely to contain an unassociated location, in accordance with some exemplary embodiments of the disclosed subject matter;
  • Fig. 5 shows a flowchart of steps in a method for associating a pointed-at location with a remote pointing device out of two or more remote pointing devices used simultaneously, in accordance with some exemplary embodiments of the disclosed subject matter;
  • Fig. 6 shows a block diagram of a system in which two pointing devices are used and tracked, and wherein pointed locations are associated with one of the pointing devices, in accordance with some exemplary embodiments of the disclosed subject matter.
  • a hybrid positioning system is disclosed herein to track the position of a moving target of a hand-mobilized device using an optical positioning system operating in concert with an inertial positioning system.
  • the hand-mobilized device may include an optical pointer, such as a laser, that is used to optically indicate a target that moves in accordance with a change in position and/or orientation of the pointer.
  • the moving target may indicate one or more locations on a screen, or an object in a room.
  • the hand-mobilized device may be a wearable device, worn on the user's hand, finger, or thumb, or a hand-held device, that moves with the movement of the user's hand, finger, or thumb, accordingly.
  • the system determines the position of the moving target using the optical indication. However, should the camera lose sight of the optical indication, the system continues to track the position of the moving target using inertial data provided via a motion and orientation sensor of the hand-mobilized device. Thus, tracking the user's target is not automatically lost when the optical indication is no longer sensed by the camera. The system may then revert back to optically tracking the moving target.
  • the inertial data is used to track the position of the moving target during a majority of the time, and the optical pointer of the hand-mobilized device is only activated intermittently, as a corrective measure.
  • tracking the hand-mobilized device using a hybrid tracking system may provide additional advantages, such as but not limited to: reducing power consumption, by intermittently switching off the higher power-consuming optical tracking system; improving tracking accuracy and reliability by providing a backup tracking system should one of the systems fail; addressing safety considerations by reducing the frequency of emitting the optical indicator when such emission is deemed unsafe; allowing multiple different devices to be tracked simultaneously; and the like.
  • Fig. 1A illustrates a system for tracking a moving target of a hand-mobilized device 100.
  • Device 100 may function as a wireless optical indicator, such as an electronic mouse, allowing the user to indicate a virtual or real target on a screen 102 using an illumination beam.
  • the target is shown in Fig. 1A as a four-cornered star.
  • Screen 102 may be any suitable display surface, such as a virtual screen provided with a virtual or augmented reality system, an electronic screen, such as a plasma or liquid crystal display screen, or an ad-hoc screen created by projecting an image onto a surface, such as a wall or table top.
  • Device 100 may be a wearable device and may fit on one of the user's fingers allowing him to indicate moving targets on screen 102 by pointing at them as one would naturally point with one's hand, and trigger one or more functionalities of an application using hand gestures such as by pinching and/or pressing his fingers, rotating his hand, pressing one or more control buttons 100a of device 100, and the like.
  • a camera 104 and controller 110 provided with a hybrid tracking system 106 may optically track the position of the moving target in realtime as the user changes the position by moving his hand, such as to control an application associated with device 100.
  • Hybrid system 106 additionally includes a radio frequency (RF) transceiver 118 that may communicate with device 100 to receive motion and orientation data.
  • the motion and orientation data may be used by controller 110 to estimate or predict the position of the moving target should camera 104 momentarily lose the optical signal.
  • the motion and orientation data may supplement the optical data, to 'fill in' gaps should the optical signal be obstructed or otherwise lost.
  • the motion and orientation data may be transmitted using any suitable RF communication protocol, such as WiFi, Bluetooth, ZigBee, or other.
  • Device 100 may be used to select different targets on screen 102 that trigger the application to execute different functionalities depending on the selected target, such as to display visual content on screen 102, initiate the execution of another application, activate a device, and the like, allowing the user to interact remotely with the screen via device 100.
  • screen 102 may display a control panel for various applications allowing the user to select one accordingly.
  • the user may point device 100 to select an application, in response to which a menu is displayed that allows the user to set one or more parameters for the selected application. For example, on selecting 'VIEW PHOTOS', a list of various photo albums may be displayed. The user may select the album he wishes to view.
  • the user may set one or more display parameters using any of the pointing, pinching, pressing or other gesture actions enabled by device 100.
  • the application may issue a command to display the photos on screen 102, accordingly.
  • Fig. 1B shows a block diagram of device 100 operative with hybrid tracking system 106, in accordance with an embodiment.
  • Device 100 may be provided with an optical emitter 108, such as a laser or light emitting diode (LED) light source that illuminates the target with one or more optical indications.
  • the illumination may be in the visible, near-infrared (NIR), or IR range, or any suitable wavelength detectable by a camera.
  • the optical indicator(s) may be a single point focused dot that indicates the target, or alternatively a pattern, such as a set of dots illuminating the target.
  • optical emitter 108 may be any of a laser light source coupled with a diffractive optical element (DOE), or alternatively an array of multiple LEDs that create a pattern of dots which may be used to determine the relative position of device 100.
  • Hybrid tracking system 106 may include an optical tracking subsystem 106a that optically tracks the position of the target as it moves over time by detecting the optical indicators emitted from emitter 108, and a motion/orientation subsystem 106b that tracks the motion and orientation of device 100 over time, and which is described in greater detail below.
  • Optical tracking subsystem 106a may include camera 104 and controller 110. Camera 104 may be sensitive to any of the intensity, wavelength, and other illumination characteristics of the light emitted by emitter 108, and may be positioned to capture one or more images of the illuminated moving target.
  • Camera 104 may communicate the captured images to controller 110, which may analyze the images to determine if the optical indication produced by emitter 108 was detected successfully by camera 104, such as at an intensity that exceeds a minimal threshold value.
  • the threshold value may be system related, and may depend on any of emitter 108 and camera 104. If the optical indication was successfully detected, controller 110 may determine an optically-based position of the moving target from the detected optical indication.
  • Device 100 may additionally include a processor 112, a radio frequency (RF) transmitter 114, and a motion and orientation sensor 116, such as a multiple-axis motion tracking component.
  • sensor 116 may include any of an accelerometer, a gyroscope, and a compass integrated within a single electronic component, such as the MPU-9250 9-axis motion tracking device or MPU-6500 6-axis motion tracking device by InvenSense, Inc. of San Jose, California.
  • Processor 112 may receive the motion and orientation data from sensor 116 and transmit the data via RF transmitter 114.
  • Hybrid tracking system 106 may additionally include a motion and orientation tracking subsystem 106b comprising RF receiver 118 with controller 110.
  • Controller 110 may receive the motion and orientation data from device 100 via the RF transmitter 114 and the RF receiver 118.
  • RF receiver 118, controller 110 and camera 104 may be housed separately and positioned independently of each other.
  • RF receiver 118, controller 110 and camera 104 may be housed together as a single hybrid tracking unit.
  • controller 110 may use any one of the tracking subsystems exclusively, or both tracking systems simultaneously.
  • Each tracking subsystem may provide backup tracking should the other one of the tracking subsystems fail or otherwise cease to operate.
  • the optical tracking subsystem may be temporarily or intermittently turned off either by the user or by the controller 110, and the motion and orientation tracking subsystem may provide supplemental tracking to allow for a robust tracking system. Similarly, should the motion and orientation tracking subsystem become disabled, the optical tracking subsystem may provide the tracking.
  • controller 110 may track the target using the motion and orientation data to compute a non-optically based position of the target over time.
  • controller 110 may use the most recently determined optical position as a starting point, and estimate the current position based on the most recently received motion and orientation data.
  • controller 110 may compare the estimated non-optically based position with the optically-based position and compute their discrepancy to evaluate the error between the two.
  • Controller 110 may compute the discrepancies between the positions determined by the optically based tracking of the moving target and the positions estimated using the motion and orientation data to produce a correlation history over time.
  • the correlation history may be stored at a memory unit (not shown) provided with controller 110.
  • Fig. 2A illustrates a timeline for determining the correlation history between the optically-based tracking and the motion and orientation-based tracking.
  • controller 110 receives inertial information from device 100 indicating the user's hand motion and changes to the orientation, which are expected to affect the absolute position of the moving target, accordingly. Controller 110 uses this information to estimate the position of the moving target at the next time period, T1.
  • the absolute position of the moving target is determined optically at time T1, and compared to the position estimated using the motion and orientation data. The error between the two is evaluated and stored.
  • the method repeats for subsequent time periods: inertial data is collected over the period spanning from T1 to T2, and used to estimate the position at time T2 based on the absolute position detected optically at T1. This position is compared to the absolute position determined optically at time T2, and the error is evaluated and stored. Repeating this method over multiple time periods provides a function of the error between the two positioning methods over time, which is used to create the correlation history.
  • the length of the time period may be selected in accordance with the computation load of controller 110 - a finer granularity may provide a more robust correlation history but requires more computation resources, whereas a larger time period may conserve computation resources of controller 110 at the expense of finer details.
  • the time period may be approximately 0.25 seconds (s), 0.5s, 0.75s, 1s, 1.5s, 2s, 2.5s, or 3s.
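  • By way of illustration only, the following sketch shows one way the correlation history described above might be accumulated in software; the class name, the rolling-buffer storage policy, and the 2D position tuples are assumptions made for the example and are not taken from this disclosure.

```python
from collections import deque

class CorrelationHistory:
    """Rolling record of the discrepancy between the inertially estimated
    position and the optically determined position of the moving target."""

    def __init__(self, maxlen=200):
        self.errors = deque(maxlen=maxlen)   # (time, dx, dy) samples

    def update(self, t, optical_pos, inertial_estimate):
        # Discrepancy between the two positioning methods at time period t.
        dx = inertial_estimate[0] - optical_pos[0]
        dy = inertial_estimate[1] - optical_pos[1]
        self.errors.append((t, dx, dy))

    def mean_error(self):
        """Average drift of the inertial estimate, usable as a bias correction."""
        if not self.errors:
            return 0.0, 0.0
        n = len(self.errors)
        return (sum(e[1] for e in self.errors) / n,
                sum(e[2] for e in self.errors) / n)
```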
  • Controller 110 may analyze the images received from camera 104 and may detect when the optical tracking has ceased, such as when controller 110 detects that camera 104 did not sense the optical indication at a sufficient intensity that exceeded the minimal threshold value. For example, the optical tracking may cease when controller 110 fails to identify the illuminated target in the captured images, such as if the images were captured when the illuminated target was out of range of camera 104, or if an object obstructed the optical path between the target and emitter 108 and/or the optical path between the target and camera 104. Similarly, if the optical indicator emitted by emitter 108 was of a low intensity, or if the captured image is of poor quality due to noise, distortion, or other factors, controller 110 may detect that the optical tracking has ceased.
  • controller 110 may employ the motion and orientation tracking subsystem 106b as a backup tracking system to continue tracking the moving target pointed at by device 100 using non-optical data. Additionally, or alternatively, any of controller 110 and the user may intentionally disable the optical tracking system 106a by disabling emitter 108 such as to conserve power, comply with safety regulations and the like. Similarly, any of controller 110 and the user may intentionally enable the optical tracking system 106a by enabling emitter 108 after it has been disabled.
  • device 100 may be provided with a user interface (not shown) that allows the user to alternately indicate to controller 110 to disable and enable optical emitter 108.
  • Controller 110 may determine a non-optically based position of the moving target by estimating the position based on the correlation history, the most recently determined position, and the motion and orientation data. Controller 110 may use the non-optically based position to temporarily track the position of the user's target until the optical tracking can be renewed. For example, techniques such as described in Chapter three of 'Studies of Mechatronics, Motion Tracking Systems, An Overview of Motion Tracking Method', 2011, Swiss Federal Institute of Technology, Zurich (available at http://students.asl.ethz.ch/uplj3df/308-report.pdf, last viewed June 28, 2016) may be used to determine the user's position after the optical signal is lost.
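  • As a minimal sketch of the estimation step described above, the target position might be dead-reckoned as follows; the function and parameter names are assumptions for illustration, and a Kalman-style filter (of the kind mentioned later for the smoothing scheme) would be a more complete alternative.

```python
def estimate_non_optical_position(last_optical_pos, motion_displacement, bias):
    """Estimate the target position from the most recent optical fix, the
    displacement integrated from the motion and orientation data received
    since that fix, and a bias taken from the correlation history."""
    return (last_optical_pos[0] + motion_displacement[0] - bias[0],
            last_optical_pos[1] + motion_displacement[1] - bias[1])
```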
  • Controller 110 may employ both the inertial tracking subsystem 106b and the optical tracking subsystem 106a to determine the absolute position of the user with respect to screen 102, or in a room in which screen 102 is situated.
  • Figs. 1C-1D illustrate a technique for determining the absolute position of device 100 using a combination of optical and inertial data.
  • a two-dimensional solution for the spatial position of device 100 may be determined using two points, X1 and X2 (corresponding to points A and B, respectively), indicated by emitter 108 on screen 102.
  • the orientation of screen 102 with respect to device 100 may be represented by an azimuth-to-screen normal N (perpendicular).
  • The distance, in pixels, between the two points indicated on screen 102 may be measured.
  • the relationship between pixels and spatial units is constant for any particular system and may be defined during the calibration process.
  • the magnetometer component of the inertial tracking system may provide azimuths of the different pointing directions with respect to locations X1 and X2. The following may be determined:
  • N is the normal between device 100 and screen 102;
  • AZN is the azimuth of the normal N;
  • AZA is the azimuth of the pointing direction of device 100 to X1, with incident angle β;
  • AZB is the azimuth of the pointing direction of device 100 to X2, with incident angle γ;
  • Xp and Yp are the (X, Y) coordinates of device 100, and describe the projection of the position of device 100 in 3D space onto the XY plane, where screen 102 is the XZ plane.
  • To find the Z (height) coordinate of the position of device 100, the same calculations may be repeated to find the projection onto the YZ plane.
  • the 3-dimensional azimuth data obtained from the magnetometer may be used to determine this projection, accordingly.
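  • The underlying equations are not reproduced in this text; the following is a minimal sketch of one way the two-point construction could be solved in the plane, assuming the screen lies along the X axis at Y = 0 and the magnetometer azimuths are given in degrees. The coordinate conventions and the function name are assumptions for illustration, not the disclosed formulation.

```python
import math

def device_position_2d(x_a, x_b, az_n, az_a, az_b):
    """Estimate the (Xp, Yp) projection of device 100 from two pointed screen
    locations x_a and x_b (screen-plane X coordinates) and the azimuths
    az_n (screen normal), az_a and az_b (pointing directions), in degrees."""
    # Angles of the two pointing directions measured from the screen normal.
    beta = math.radians(az_a - az_n)
    gamma = math.radians(az_b - az_n)
    # For a device at (Xp, Yp):  x_a - Xp = Yp*tan(beta),  x_b - Xp = Yp*tan(gamma)
    # (degenerate if beta == gamma, i.e. the two pointed locations coincide).
    y_p = (x_b - x_a) / (math.tan(gamma) - math.tan(beta))
    x_p = x_a - y_p * math.tan(beta)
    return x_p, y_p
```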
  • controller 110 may use the previously determined absolute position relative to screen 102, calculated using the method above, together with information received from inertial tracking subsystem 106b indicating the user's hand motion to estimate the hypothetical projection of the target on screen 102.
  • the projection may be affected by the incident angle of the hypothetical beam that would be emitted from emitter 108 if it were enabled, and which may be determined by the orientation information received from inertial tracking subsystem 106b as well as the estimated distance from the screen, which may be determined from an accelerometer of inertial tracking subsystem 106b.
  • a change in the orientation of the user's hand may change the size of the projection.
  • the changes to the distance and/or orientation of the user's hand with respect to screen 102 may be estimated from the information received from motion and orientation tracking subsystem 106b and the resulting hypothetical projection may be determined accordingly and used as though it were optically tracked using optical tracking subsystem 106a.
  • the combination of the optical and inertial tracking subsystems 106a and 106b may be used to measure a current distance R from screen 102 and an incident angle θ between the beam emitted by emitter 108 and screen 102.
  • a rotation of the user's hand that sweeps an arc of X° facing the screen may sweep across Y screen pixels, corresponding to a ratio of Y pixels per X°.
  • a translational shift of emitter 108 of Zmm may sweep across W screen pixels, corresponding to a ratio of W pixels per Zmm of translational movement at distance R.
  • controller 110 may use the information received from motion and orientation tracking subsystem 106b to estimate the current hand distance R' and incident angle θ' between emitter 108 and the screen, and adjust these ratios accordingly, to estimate that a rotation of the user's hand that sweeps the same arc of X° facing the screen may sweep across Y' screen pixels, and the same translational shift of Zmm may sweep across W' screen pixels. Controller 110 may use these estimations and calculations to control one or more applications, such as the display of visual content on the screen.
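  • The exact rescaling relationship is not stated here; as an illustrative small-angle approximation only (an assumption, not the disclosed method), the pixels-per-degree ratio might be recomputed from the estimated distance and incident angle as follows.

```python
import math

def pixels_per_degree(r_mm, theta_rad, px_per_mm):
    """Approximate on-screen sweep (in pixels) produced by a 1-degree rotation
    of the pointer at perpendicular distance r_mm and incidence angle
    theta_rad from the screen normal (in-plane direction, small angles)."""
    dphi = math.radians(1.0)
    # Spot displacement along the screen: ds ≈ r * dphi / cos^2(theta)
    ds_mm = r_mm * dphi / (math.cos(theta_rad) ** 2)
    return ds_mm * px_per_mm
```

Recomputing the ratio for the estimated R' and θ' would yield the adjusted sweep Y' described above.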
  • controller 110 may continually receive images captured by camera 104 and analyze them in an attempt to renew the optical tracking of the target.
  • controller 110 may revert to determining an optically-based position of the moving target indicated by device 100 based on the detected optical indication.
  • controller 110 may use a smoothing scheme to smooth a discrepancy between the non-optically based position and the optically based position.
  • the smoothing scheme may include a Kalman filter, or other suitable smoothing function. Since the non-optically based position is less accurate than the optically based position, the discrepancy between the two may depend on a variety of factors, such as how much time elapsed between when the optical tracking ceased and was renewed, how far and/or fast the user moved his hand during the non- optical tracking.
  • the smoothing scheme may allow breaking up a large discrepancy into several smaller steps, such that the user does not experience a sudden 'jump' in the tracking, but rather, segues smoothly from the non-optical tracking into the optical tracking.
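  • A minimal sketch of such a smoothing step is shown below, breaking the discrepancy into equal intermediate steps; the step count and the simple linear interpolation are assumptions, and a Kalman filter (as mentioned above) could be used instead.

```python
def smooth_transition(inertial_pos, optical_pos, steps=5):
    """Return intermediate positions that segue from the non-optically based
    position to the renewed optically based position without a sudden jump."""
    (x0, y0), (x1, y1) = inertial_pos, optical_pos
    return [(x0 + (x1 - x0) * k / steps, y0 + (y1 - y0) * k / steps)
            for k in range(1, steps + 1)]
```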
  • controller 110 may compute the absolute position based on information received from the optical tracking system and the inertial tracking system. Additionally, controller 110 may separately compute an absolute position based only on information received from each component of the inertial tracking system.
  • the (X, Y, Z) position determined for each component may be compared to the (X, Y, Z) absolute position determined optically, and an error may be computed for each inertial component, along each axis, i.e. when using a 9-axis tracking system, 9 errors will be determined, and when using a 6-axis tracking system, 6 errors will be determined.
  • the inertial errors may be included when calculating the position based on the inertial data.
  • the inertially calculated positions may include an up-to-date compensating factor for any variations to the inertial tracking components.
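  • For illustration only, the per-component compensation described above might be organized as follows; the dictionary layout and function names are assumptions made for the sketch.

```python
def per_axis_errors(optical_xyz, inertial_xyz_by_component):
    """Compute the (X, Y, Z) error of each inertial component (e.g.
    accelerometer, gyroscope, compass) against the optically determined
    absolute position, yielding 9 errors for a 9-axis tracker."""
    return {name: tuple(i - o for i, o in zip(xyz, optical_xyz))
            for name, xyz in inertial_xyz_by_component.items()}

def compensate(inertial_xyz, error_xyz):
    """Apply a stored per-axis error as an up-to-date compensating factor
    when computing a position from inertial data alone."""
    return tuple(i - e for i, e in zip(inertial_xyz, error_xyz))
```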
  • the hybrid tracking system 106 may be provided with an RF transmitter integrated with the RF receiver 118 as an RF transceiver 118.
  • device 100 may be provided with an RF receiver integrated with RF transmitter 114 as RF transceiver 114.
  • Controller 110 may initiate the smoothing scheme by sending a command from transceiver 118 to transceiver 114 that triggers optical emitter 108 to emit multiple sequential optical pulses.
  • transceiver 114 may provide the command to processor 112 which may trigger emitter 108 accordingly.
  • the sequence of the optical pulses may have a dynamic periodicity and/or duty cycle that varies according to the size of the discrepancy.
  • the smoothing scheme may begin with the emission of illumination pulses of a duration such as 75 milliseconds (ms), or 80ms, or 85ms, or 90ms, or 95ms, or 100ms, or 105ms, or 110ms, or 115ms, or 120ms, or 125ms, separated by relatively short intervals, such as 75ms, or 80ms, or 85ms, or 90ms, or 95ms, or 100ms, or 105ms, or 110ms, or 115ms, or 120ms, or 125ms.
  • Camera 104 may detect the target illuminated by the pulse and provide an image of the illuminated target to controller 110.
  • Controller 110 may determine the current optically-based position of the target based on the image.
  • controller 110 may compute the current non-optically-based position of the target using the motion and orientation data received from device 100 via RF tracking transceiver 118, as described above. Controller 110 may compute the discrepancy between the current optically-based position and the current non-optically-based position, and may adjust the subsequently emitted optical pulses using the discrepancy.
  • If the discrepancy is large, the length of the period between any two sequential optical pulses may be decreased, and if the discrepancy is small, the length of the period between any two sequential optical pulses may be increased.
  • Similarly, if the discrepancy is small, the duty cycle of any of the subsequently emitted pulses may be decreased, and if the discrepancy is large, the duty cycle of any of the subsequently emitted pulses may be increased. This scheme may result in longer and more frequent pulses when the discrepancy is large, indicating a weak correlation, and shorter and less frequent pulses when the discrepancy is small, indicating a strong correlation.
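  • The following sketch illustrates this adjustment rule; the discrepancy threshold, step sizes, and the 75-125 ms bounds echo the values mentioned above but are otherwise illustrative assumptions rather than disclosed parameters.

```python
def adjust_pulse_schedule(period_ms, duty_cycle, discrepancy_px, threshold_px=10.0):
    """Large discrepancy (weak correlation): longer, more frequent pulses.
    Small discrepancy (strong correlation): shorter, less frequent pulses."""
    if discrepancy_px > threshold_px:
        period_ms = max(75, period_ms - 10)       # more frequent pulses
        duty_cycle = min(0.9, duty_cycle + 0.1)   # longer pulses
    else:
        period_ms = min(125, period_ms + 10)      # less frequent pulses
        duty_cycle = max(0.1, duty_cycle - 0.1)   # shorter pulses
    return period_ms, duty_cycle
```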
  • When the motion and orientation data received via the RF tracking receiver indicates a change above an inertial threshold, controller 110 may trigger emitter 108 to emit a corrective optical pulse to prevent losing the optical tracking.
  • the inertial threshold may range from 42° to 47°, or 40° to 50°, or 35° to 55°, and may be approximately 45° ⁇ 10%.
  • Timeline 200 shows multiple laser pulses emitted over time.
  • the length of any given pulse, T1-T0, T3-T2, T5-T4, and T7-T6, is dynamic and varies over time according to the discrepancy.
  • the periods between any two sequential laser pulses, T2-T1, T4-T3, and T6-T5, are also dynamic and vary in accordance with the discrepancy.
  • Timeline 200 shows the periods between sequential pulses increasing, and the length of the pulses decreasing, indicating that the correlation between the position detected using the motion and orientation sensing and the optically-determined position is strong and the discrepancy is small, allowing the system to reduce the amount of time emitter 108 emits a pulse, to conserve power.
  • Timeline 202 spans the same duration as timeline 200; however, three longer pulses are emitted, compared with the four shorter pulses of Fig. 2B.
  • timeline 202 shows the periods between the subsequent pulses decreasing, and the length of the pulses increasing, indicating that the correlation between the position detected using the motion and orientation sensing and the optically-determined position is weak, and the discrepancy between the two is large. Thus, longer and more frequent pulses are required to correct the inertial positioning.
  • Figs. 3A-3B illustrate a flowchart of a hybrid method for tracking a target of a hand-mobilized device.
  • An optical indication may be emitted by a hand-mobilized device to illuminate a target pointed at by the device (Step 300).
  • An image of the illuminated target may be captured (Step 302).
  • An optically-based position of the target may be determined from the captured image when the illuminated target is sensed at an intensity that exceeds a threshold (Step 304).
  • Motion and orientation data of the handheld device may be measured (Step 306).
  • a position of the target may be estimated using the motion and orientation data and the most recently determined optically-based position (Step 308).
  • a discrepancy between the estimated position and the optically determined position may be computed over time, to produce a correlation history (Step 310).
  • a position of a target pointed at by the device may be estimated based on the correlation history, the most recently determined optically-based position, and the motion and orientation data when an optical indication is not sensed at an intensity that exceeds the threshold (Step 312).
  • the method may revert to determining an optically-based position of a target based on an optical indication when the optical sensing is regained, such as when the optical indication is sensed at an intensity that exceeds the threshold (Step 314).
  • optical tracking may be ceased responsive to receiving an indication from any of the user and controller to disable the optical emitter.
  • optical tracking may be enabled after being disabled by receiving an indication from any of the user and controller.
  • the absolute positioning of the hand-mobilized device may be determined.
  • a hypothetical projection of the target may be estimated using the current motion and orientation received via the inertial tracking subsystem and the absolute positioning of the hand-mobilized device.
  • the absolute positioning may be determined by the following steps: i) the distance between two different target points indicated by the emitter may be calculated, ii) the motion and orientation data may be used to calculate an angle swept when moving from one of the target points to the other target point, iii) a radius defined by the distance and the angle may be calculated, and iv) a third dimension, such as may be received from the compass component configured with the inertial tracking system and included in the motion and orientation data, may be used as a third dimension positioning parameter.
  • a calibration error of one or more of the inertial components may be calculated and may be used to estimate the absolute position of the device using only the motion and orientation data.
  • an inertially-based absolute position of the hand-mobilized device may be determined using only the motion and orientation data.
  • the absolute position calculated using the optical tracking system may be compared to the inertially-based absolute position, and a calibration error between them may be calculated.
  • the absolute position of the device may be estimated using the most recently determined absolute position, the motion and orientation data, and the calibration error.
  • a smoothing scheme may be used when reverting to determining the position of the target using optically-based data (Step 316), such as by using the smoothing method described by Fig. 3C.
  • the determined position may be used to control an application (Step 318).
  • the determined position may be used to display visual content on the screen that displays the target.
  • Fig. 3C illustrates a flowchart of a method to smooth the transition between estimating a position of a target using inertial data and reverting to determining the position of the target using optically-based data.
  • the smoothing scheme may be initiated by triggering the emitting of multiple sequential optical pulses (Step 330).
  • a current optically-based position of the target may be determined based on the detected pulse (Step 332).
  • Motion and orientation data may be received (Step 334).
  • a current non-optically-based position may be estimated using the motion and orientation data and the most recently determined optically-based position (Step 336).
  • a discrepancy between the current optically-based position and the current non-optically-based position may be computed (Step 338).
  • the subsequently emitted optical pulses may be adjusted using the discrepancy (Step 340), such as by adjusting the period between the subsequently emitted optical pulses, and/or adjusting a duty cycle of the subsequently emitted optical pulses and/or adjusting the duration of the emitted pulse.
  • the length of the period may be increased when the discrepancy is small, and the length of the period may be decreased when the discrepancy is large.
  • the duty cycle may be increased when the discrepancy is large and the duty cycle may be decreased when the discrepancy is small.
  • When the motion and orientation data indicates a change above an inertial threshold, the emission of a corrective optical pulse may be triggered.
  • each such pointing device may emit a characteristic radio frequency (RF) signal when a tracking session is initiated, such that the initial location pointed at by the device and recognized by the tracking system, can be associated with the particular device.
  • the session may then continue wherein the tracking system continuously tracks the locations pointed at by the participating devices.
  • one or more of the devices may transmit a characteristic RF signal at predetermined time intervals, or upon specific events such as receiving a request from the tracking system, or the like.
  • the locations are received from a camera or another device optionally comprising or associated with a processor, in accordance with where the light emitted by a device hits the surface.
  • other methods for identifying the locations may be used as well.
  • Points being far enough may refer to the points being at least a predetermined number of centimeters or pixels from each other.
  • the predetermined distance may be relative to the velocity at which the pointed-at locations change, such that rapid movement may allow a larger distance between the points in order to continue associating each location with a particular device.
  • If the devices are different, for example in their wavelength, intensity, emitted light pattern, or another characteristic, then determining which device emitted which location may be accurate and efficient, but such a requirement may make the system more complex and more expensive. However, when the devices are substantially identical, such differentiation cannot be done based only on the locations as captured.
  • The term 'location' may refer to a point expressed as a coordinate set in a two or three dimensional space. In further embodiments, the term may refer to a larger area, such as a collection of coordinates shaped as a rectangle, a circle, or any other shape.
  • Fig. 4A shows a sequence of three states in a system in which two devices 402, 404 emit lights on a surface 408.
  • At time t1, device 402 points at location 412 and device 404 points at location 416.
  • At time t2, device 402 points at location 420 and device 404 points at location 422, wherein locations 420 and 422 are close to each other.
  • Arrangements 424 and 426 present two possible situations at time t3: in arrangement 424, device 402 points at location 428 while device 404 points at location 432, while in arrangement 426, device 402 points at location 432 while device 404 points at location 428. In the two situations depicted in arrangements 424 and 426, locations 428 and 432 are pointed at; thus the situation is ambiguous, as both arrangements are valid.
  • Pane 436 shows the locations pointed at during the three points in time: t1, t2, and t3.
  • Pane 440 demonstrates one option in which path 444 connects locations pointed at by one of the devices, such as device 402, and path 448 connects the other locations, pointed at by the other device, such as device 404, which corresponds to arrangement 424.
  • Pane 452 demonstrates the other option in which path 456 connects locations pointed at by one of the devices, such as device 402, and path 460 connects locations pointed at by the other device, such as device 404, which corresponds to arrangement 426.
  • Motion history can, in some situations, help resolve the ambiguity if the devices point at locations which are far apart from each other. However, when such history is unavailable or useless, the ambiguity cannot be resolved based only on the pointed locations.
  • Pane 460 presents a collection of locations pointed at by two devices at times t1..tn+1.
  • the locations received for times t1..tn are all close by; however, the locations received for time tn+1 are far apart from the locations of t1..tn and from each other.
  • For n>1 there may be motion history of one or both devices; however, this history is useless, as all locations are close by and thus cannot assist in resolving the ambiguity.
  • each such device may comprise a sensor providing motion data, such as orientation or inertial data, of the device.
  • the sensor may comprise one or more accelerometers, gyroscopes, compasses integrated within a single electronic component, such as the MPU-9250 9-axis motion tracking device or MPU-6500 6-axis motion tracking device by InvenSense, Inc. of San Jose, California, or the like, such that an inertial based tracking system can track the motion of the device from the inertial data.
  • motion performed by the hand-mobilized devices can be determined in accordance with readings received from inertial sensors incorporated into the device. It will be appreciated that in situations in which it is obvious which locations are pointed at by which device, there is no need to combine the optical tracking information with motion-related information of the device. However, when there is ambiguity, the motions tracked by the device may be determined, and the pointed locations may be associated with the devices in accordance with the motions. Thus, in the situation of Fig. 4A, a horseshoe motion by any of the devices can indicate that the situation is as shown in pane 440, while if the devices performed a substantially straight line motion, the situation is as depicted in pane 452.
  • a signal such as an RF signal may be received from either device when the device starts pointing at locations on the surface.
  • the signal may comprise a characteristic, such as a unique ID that may differentiate between devices.
  • Such transmission may provide an initial association between the location and the device, after which the locations pointed at by the device can be tracked and associated with the device. The initial association may be possible since the devices are used by human users, and the probability of two humans starting transmission at exactly the same time (at the time resolution used by the system) is negligible.
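A minimal sketch of the initial association described above, assuming, as an illustration only, that each device announces the start of pointing over RF with a unique ID and that the first unassociated dot seen shortly afterwards belongs to that device; the class, the method names, and the 0.2-second window are not taken from the disclosure.

```python
# Illustrative sketch only; the API and the timing window are assumptions.
import time

class InitialAssociator:
    def __init__(self, max_delay_s=0.2):
        self.max_delay_s = max_delay_s   # how close in time the RF signal and the first dot must be
        self.pending = {}                # device_id -> timestamp of the "started pointing" signal

    def on_rf_start_signal(self, device_id):
        """Record that a device announced (via its unique-ID RF signal) that it started pointing."""
        self.pending[device_id] = time.monotonic()

    def on_new_unassociated_location(self, location):
        """Return the device_id to seed a new track with, or None if no recent announcement."""
        now = time.monotonic()
        candidates = [(now - t, dev) for dev, t in self.pending.items() if now - t <= self.max_delay_s]
        if not candidates:
            return None
        _, device_id = min(candidates, key=lambda c: c[0])
        del self.pending[device_id]
        return device_id
```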
  • a plurality of locations on a surface which are pointed at by any of two or more pointing devices may be received.
  • the locations may be received as 2-dimensional or 3-dimensional coordinate sets, and may be absolute or relative to some coordinate system, for example a coordinate system starting at a corner of the surface.
  • the locations may be received from a camera capturing the points of light on the surface.
  • a newly received unassociated location may be associated with the first or the second pointing device, wherein the locations may be received from a camera capturing light on the surface.
  • Stage 504 may comprise stage 508, in which it may be determined whether the location can be associated with one of the pointing devices without requiring the information related to the inertia of the devices. For example, it may be determined whether the unassociated location is at a first distance not exceeding a first threshold from a previously received location associated with the first pointing device, and at a second distance exceeding a second threshold from a previously received location associated with the second pointing device, or vice versa. In such a case, the location may be associated with the respective pointing device, and the process may continue to the next point.
  • the first and second thresholds may be absolute, for example 1 and 5 cm, respectively, or numbers of pixels, or relative to a distance between consecutive points previously assigned to the same device (thus relating to the velocity at which the points appear on the surface), or the like.
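The unambiguous case of stage 508 can be illustrated by the following sketch, in which a new dot is claimed by a device only when it lies within the first (near) threshold of that device's last dot and beyond the second (far) threshold of the other device's last dot; the 1 cm and 5 cm defaults simply echo the example above, and all names are illustrative.

```python
# Illustrative sketch only; function names and default thresholds are assumptions.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def associate_without_inertia(new_loc, last_loc_a, last_loc_b,
                              near_threshold=1.0, far_threshold=5.0):
    """Return 'A', 'B', or None when the case is ambiguous (thresholds in surface units, e.g. cm)."""
    d_a = distance(new_loc, last_loc_a)
    d_b = distance(new_loc, last_loc_b)
    if d_a <= near_threshold and d_b >= far_threshold:
        return "A"
    if d_b <= near_threshold and d_a >= far_threshold:
        return "B"
    return None  # fall back to the inertia-based stages described below
```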
  • readings may be received from the first or the second pointing devices.
  • the readings may be sensed and collected within a storage device of the pointing device for a predetermined time window, for example about 1 to about 20 seconds, wherein the reading sequence may be transmitted to a controller upon request.
  • the readings may be continuously transmitted from one or both devices to a controller and stored therein for a predetermined time window.
  • an advancement route of the mobile device may be determined based on two or more inertial readings.
  • the advancement route may be a substantially straight line, an angle, or more complex or even arbitrary shapes, depending also on the number of previous points upon which the route is determined.
  • the advancement route may be transformed into surface coordinates, or surface directions.
  • the transformation may utilize an assumption that a user holding the device is facing the surface, such that, for example, a motion of the device along a positive direction of an X axis is transformed to an advancement on the surface along a positive direction of the X axis as well.
  • Such a transformed advancement route 480 is shown between points T1, T2, and T3 of Fig. 4C on surface 476.
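The transformation of an inertial advancement route into surface coordinates might, under the stated assumption that the user faces the surface, be sketched as below; the axis convention, the scale factor, and the function name are illustrative assumptions rather than the disclosed implementation.

```python
# Illustrative sketch only; coordinate conventions and scaling are assumptions.
def inertial_route_to_surface(device_steps, scale=1.0):
    """device_steps: list of (dx, dy, dz) displacements in the device's own coordinate system.
    Returns cumulative surface-plane points, ignoring the axis pointing toward the surface,
    so that motion along the device's positive X axis advances along the surface's positive X axis."""
    route = [(0.0, 0.0)]
    for dx, dy, _dz in device_steps:
        x, y = route[-1]
        route.append((x + scale * dx, y + scale * dy))
    return route
```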
  • each pointed location may be associated with the trajectories of the pointing devices, which can be performed in a multiplicity of ways.
  • a likely area for a pointed location may be determined as an area proximate to a last segment of the advancement route, as transformed to the surface.
  • the area may be shaped substantially as an angle, wherein the vertex of the angle is the previous point associated with the device, and the angle bisector is a ray starting at the previous point associated with the device and advancing in accordance with the advancement of the inertial motion.
  • the angle may be of a predetermined size, for example 45° on either side of the bisector. Alternatively, if there are multiple points to be associated in the vicinity of the current point, the angle may be made narrower, or the like.
  • the area is indicated in Fig. 4C by angle 492 having rays 484 and 488.
  • the size of the area defined by the angle may be determined, for example, in accordance with the velocity of the pointing device, such that faster motions provide for determining the area as a sector of a larger circle.
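One possible, purely illustrative realization of the likely-area test is sketched below: the area is an angular sector whose vertex is the device's previous dot, whose bisector follows the transformed inertial direction, and whose radius grows with the device's speed. The 45° half-angle mirrors the example above, while the radius scaling and all names are assumptions.

```python
# Illustrative sketch only; the radius scaling and parameter names are assumptions.
import math

def in_likely_area(candidate, prev_point, motion_dir, speed,
                   half_angle_deg=45.0, radius_per_speed=3.0):
    """candidate, prev_point: (x, y) surface points; motion_dir: (dx, dy) taken from the
    transformed inertial route; speed: magnitude of the device's recent motion."""
    vx, vy = candidate[0] - prev_point[0], candidate[1] - prev_point[1]
    dist = math.hypot(vx, vy)
    if dist == 0.0:
        return True                                   # the candidate sits on the vertex itself
    if dist > radius_per_speed * speed:
        return False                                  # too far away for the observed speed
    norm = dist * math.hypot(motion_dir[0], motion_dir[1])
    if norm == 0.0:
        return False                                  # no usable inertial direction
    cos_a = max(-1.0, min(1.0, (vx * motion_dir[0] + vy * motion_dir[1]) / norm))
    return math.degrees(math.acos(cos_a)) <= half_angle_deg
```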
  • the location may be associated with the device for which the likely area has been determined.
  • likely areas may be determined for the two pointing devices, and the location may be associated with the pointing device for which the likely area has a higher probability of containing the location, for example when the location is closer to the bisector, closer to the angle vertex, or the like.
  • likely areas may be determined for the two devices, and the combination of points and likely areas that yields the better match may be selected.
  • a correlation function may be calculated between the advancement routes as calculated based on the inertial readings, and an optical route based on the trajectory formed by the pointed locations including the unassociated locations.
  • the unassociated location may be associated with the pointing device for which the correlation function yields maximal correlation.
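The correlation-based variant could be sketched as follows, tentatively using a cosine similarity of step vectors as the correlation function (the disclosure does not fix a particular one); the unassociated dot is appended to each device's optical track in turn and assigned to the device whose inertial route matches better. All names are illustrative.

```python
# Illustrative sketch only; the choice of cosine similarity and all names are assumptions.
import math

def step_vectors(points):
    return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(points, points[1:])]

def cosine(u, v):
    nu, nv = math.hypot(*u), math.hypot(*v)
    return 0.0 if nu == 0.0 or nv == 0.0 else (u[0] * v[0] + u[1] * v[1]) / (nu * nv)

def route_correlation(optical_points, inertial_points):
    """Average directional agreement between an optical trajectory and an inertial route."""
    pairs = list(zip(step_vectors(optical_points), step_vectors(inertial_points)))
    return sum(cosine(u, v) for u, v in pairs) / len(pairs) if pairs else 0.0

def assign_by_correlation(new_loc, track_a, track_b, inertial_a, inertial_b):
    """track_*: dots already associated with each device; inertial_*: their transformed routes."""
    score_a = route_correlation(track_a + [new_loc], inertial_a)
    score_b = route_correlation(track_b + [new_loc], inertial_b)
    return "A" if score_a >= score_b else "B"
```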
  • a controller may request one or the two devices to transmit a characteristic signal, similar to the signal sent at the initial stage, which may provide for continuing to associate each point with one of the devices.
  • FIG. 6 showing a block diagram of a system in which two pointing devices are used and tracked, and wherein pointed locations are associated with one of the pointing devices, in accordance with some exemplary embodiments of the disclosed subject matter.
  • the system may comprise two or more remote pointing devices 600, adapted to be used by a user for pointing at areas such as a screen, a table, or the like, upon which content may be displayed. By pointing at locations, a user may take an action such as making a selection, marking a path, or the like.
  • Pointing device 600 may comprise a processor 604, such as a Central Processing Unit (CPU), a microprocessor, an electronic circuit, an Integrated Circuit (IC) or the like.
  • processor 604 can be utilized to perform computations required by pointing device 600.
  • Pointing device 600 may comprise one or more pushbuttons 608 which may be used for controlling actions of device 600, optionally comprising sensors 612 for sensing the push of button 608.
  • Pointing device 600 may comprise a signal transceiver 616, such as a radio frequency (RF) signal transmitter/receiver.
  • Pointing device 600 may comprise an optical emitter 620, which may include any suitable emitter of optical signals, such as, but not limited to, a laser light source or a light emitting diode (LED).
  • Pointing device 600 may comprise motion sensor 624, such as one or more accelerometers, one or more gyroscopes, or one or more compasses integrated within a single electronic component, such as the MPU-9250 9-axis motion tracking device or MPU-6500 6-axis motion tracking device by InvenSense, Inc. of San Jose, California. Pointing device 600 may transmit motion and orientation data sensed by sensor 624 via signal transceiver 616.
  • Pointing device 600 may comprise additional components, such as indicators 628, for example LEDs, for indicating a state of the pointing device, a proximity sensor 632 for activating signal transceiver 616 or optical emitter 620 when in proximity to a user's finger or to another object, or other components.
  • Two or more pointing devices 600 may be operated and communicate with controller 630.
  • Controller 630 may comprise an optical capture device 644 such as a camera adapted to sense a point or another area formed on the surface by the signal emitted by optical emitter 620. Camera 644 may capture a stream of images of a target and provide the image stream to controller 630, in order to track the pointed locations.
  • Controller 630 may comprise a signal transceiver 648 for receiving signals, such as orientation or acceleration data sensed by motion sensor 624 and transmitted by signal transceiver 616, and for transmitting signals, for example a request to pointing device 600 to transmit a characteristic signal.
  • Controller 630 may comprise a processor 652, such as a CPU, a microprocessor, an electronic circuit, an Integrated Circuit (IC) or the like.
  • Processor 652 can be utilized to perform computations required by controller 630.
  • processor 652 can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processor.
  • Processor 652 may comprise a standard location association module 656 for associating a location with a pointing device, when there is no ambiguity, as disclosed in association with stage 508 above.
  • Processor 652 may comprise an advancement route determination module 660, for determining an advancement route based on orientation or acceleration data of a device, as disclosed in association with stage 520 above.
  • Processor 652 may comprise a route transformation module 664, for transforming a route determined upon acceleration or orientation data received in the pointing device environment, such as the pointing device coordinate system, to surface coordinates or environment, as disclosed in association with stage 524 above.
  • Processor 652 may comprise a location-pointing device matching module 668 for matching a pointed location with one of the pointing devices based on the inertial route calculated for one of the devices.
  • location-pointing device matching module 668 may comprise a likely area determination module, for determining an area on the surface likely to contain a next point based upon a determined route and location, as disclosed in association with stage 528 above, and a likely area match assessment module, for assessing whether a location is in a likely area, or a probability that a location is indeed in the likely area, as disclosed in association with stage 532 above.
  • location-pointing device matching module 668 may comprise a correlation determination function for determining the correlation between the pointed locations and the inertial routes of the devices, and selecting the pointing device for which the correlation is higher.
  • Processor 652 may comprise a control and data flow module 672, for activating the modules above as required, and providing each module with the required input.
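As a non-authoritative illustration of how a control and data flow module such as 672 might invoke modules such as 656-668, consider the sketch below; the class, the method names (try_associate, from_inertia, to_surface, match), and the overall API are assumptions introduced only to show one plausible composition.

```python
# Illustrative composition sketch; the API and method names are assumptions, not the disclosed design.
class ControlAndDataFlow:
    """Plausible orchestration of the processing described for controller 630."""

    def __init__(self, standard_assoc, route_determiner, route_transformer, matcher):
        self.standard_assoc = standard_assoc        # cf. module 656
        self.route_determiner = route_determiner    # cf. module 660
        self.route_transformer = route_transformer  # cf. module 664
        self.matcher = matcher                      # cf. module 668

    def handle_new_location(self, new_loc, devices):
        # Try the unambiguous, distance-threshold association first (cf. stage 508).
        device = self.standard_assoc.try_associate(new_loc, devices)
        if device is not None:
            return device
        # Otherwise derive each device's inertial route, map it to surface coordinates,
        # and let the matching module pick the device (likely-area or correlation based).
        routes = {d: self.route_transformer.to_surface(self.route_determiner.from_inertia(d.readings))
                  for d in devices}
        return self.matcher.match(new_loc, routes)
```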
  • the present invention may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
  • the computer readable storage medium can be a non-transitory, tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random-access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, or any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Position Input By Displaying (AREA)

Abstract

A system for determining the location of a target pointed at by a hand-operated device, comprising: a hand-operated device comprising an inertial sensor, an optical emitter, an RF transmitter, and a processor configured to receive inertial data from the sensor and transmit the data via the RF transmitter; and a tracking system comprising a camera, an RF receiver, and a controller configured to: determine an optically-based location of a target pointed at by the hand-operated device in accordance with an indication emitted by the emitter, receive the inertial data, compute a divergence between a location estimated using the inertial data and the determined location to produce a correlation history, and, upon termination of optical tracking, determine a location of a target pointed at by the device based on the correlation history, the most recent location, and the inertial data.
PCT/IL2017/050833 2016-07-25 2017-07-25 Système de suivi hybride destiné à un dispositif manipulé à la main WO2018020497A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662366214P 2016-07-25 2016-07-25
US62/366,214 2016-07-25
US201762456721P 2017-02-09 2017-02-09
US62/456,721 2017-02-09

Publications (1)

Publication Number Publication Date
WO2018020497A1 true WO2018020497A1 (fr) 2018-02-01

Family

ID=61017162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2017/050833 WO2018020497A1 (fr) 2016-07-25 2017-07-25 Système de suivi hybride destiné à un dispositif manipulé à la main

Country Status (2)

Country Link
TW (1) TW201807477A (fr)
WO (1) WO2018020497A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10589182B2 (en) * 2018-05-24 2020-03-17 Universal City Studios Llc Water attraction dispatch system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7696980B1 (en) * 2006-06-16 2010-04-13 Logitech Europe S.A. Pointing device for use in air with improved cursor control and battery life
US20110193777A1 (en) * 2008-06-05 2011-08-11 Smart Technologies Ulc Multiple pointer ambiguity and occlusion resolution
US20130328770A1 (en) * 2010-02-23 2013-12-12 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US8743055B2 (en) * 2011-10-13 2014-06-03 Panasonic Corporation Hybrid pointing system and method
WO2015067962A1 (fr) * 2013-11-08 2015-05-14 University Of Newcastle Upon Tyne Désambiguïsation de stylets par mise en corrélation d'une accélération sur des entrées tactiles


Also Published As

Publication number Publication date
TW201807477A (zh) 2018-03-01

Similar Documents

Publication Publication Date Title
US10403047B1 (en) Information handling system augmented reality through a virtual object anchor
US10521011B2 (en) Calibration of inertial measurement units attached to arms of a user and to a head mounted device
US9405384B2 (en) Computer system and control method for same
US9538147B2 (en) Method and system for determining proper positioning of an object
EP2590057B1 (fr) Système 3D d'entrée de données et méthode associée
US10962631B2 (en) Method for operating a laser distance measuring device
WO2015021084A1 (fr) Détection de taches pour suivi de mouvement
KR20210010437A (ko) 광학 위치 추적 장치를 위한 전력 관리
US11989355B2 (en) Interacting with a smart device using a pointing controller
EP2392991A1 (fr) Dispositif de pointage portatif, système de contrôle de curseur logiciel et procédé de contrôle d'un mouvement d'un curseur logiciel
WO2018020497A1 (fr) Système de suivi hybride destiné à un dispositif manipulé à la main
US9013404B2 (en) Method and locating device for locating a pointing device
KR101956035B1 (ko) 인터랙티브 디스플레이 디바이스 및 그 제어 방법
JP2017162054A (ja) 情報処理装置及び出力装置
CN212906218U (zh) 一种分离式手柄、虚拟现实设备及虚拟现实追踪系统
EP3175327A1 (fr) Positionnement de précision d'instruments
CN111443811A (zh) 一种分离式手柄、虚拟现实设备及虚拟现实追踪系统
JP7148713B2 (ja) 位置合わせされた基準フレームを用いた物理空間における物体追跡のためのシステム
Scheibert Overcoming the problem of uncertain tracking errors in an AR navigation application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17833700

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17833700

Country of ref document: EP

Kind code of ref document: A1