US20140168081A1 - 3D remote control system employing absolute and relative position detection - Google Patents

3D remote control system employing absolute and relative position detection

Info

Publication number
US20140168081A1
Authority
US
United States
Prior art keywords
remote control
absolute position
axis
image
position detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/185,147
Inventor
Duncan Robert Kerr
Chad A. Bronstein
Wing Kong Low
Nicholas Vincent King
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US14/185,147
Publication of US20140168081A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 Details of the operation on graphic patterns
    • G09G5/373 Details of the operation on graphic patterns for modifying the size of the graphic pattern

Definitions

  • FIGS. 6A-6C illustrate one application of the present invention that can utilize the absolute z-position of remote control 16.
  • remote control 16 is positioned at position Z1 from light transmitter 20.
  • Controller 32 or 42 can detect position Z1 and generate signals for rendering at least a portion of the image shown on the display (e.g., object 28) in a size that corresponds to position Z1.
  • the controller can scale an image of object 28 by a factor that correlates to position Z1 in a predetermined relationship.
  • the controller can detect new position Z2 and generate signals for rendering object 28 in a larger size that correlates to position Z2.
  • the controller can detect new position Z3 and generate signals for rendering at least a portion of the image shown on the display (e.g., object 28) in a smaller size that correlates to position Z3.
  • the image of object 28 may have a reference size that may be scaled up or down depending on the position of remote control 16 in the z-axis.
  • controller 32 or 42 can enlarge or zoom in on at least a portion of an image shown on the display (e.g., object 28) when the remote control is moved away from the display or transmitter 20 in the z-axis. Controller 32 or 42 also can reduce the size or zoom out of at least a portion of the image shown on the display (e.g., object 28) when the remote control is moved towards the display or transmitter 20 in the z-axis.
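
As a rough illustration of the scaling behavior described for FIGS. 6A-6C, the sketch below maps the remote's absolute z-position to a scale factor applied to the reference size of object 28. The linear mapping, the reference distance z_ref, and the clamping limits are illustrative assumptions; the patent does not specify a particular formula.

```python
def scale_factor_for_z(z, z_ref=1.5, min_scale=0.25, max_scale=4.0):
    """Map the remote's absolute z-position (here assumed to be meters from
    transmitter 20) to a zoom factor applied to the reference size of the
    displayed object. Moving away from the display enlarges the image;
    moving toward it shrinks the image, per FIGS. 6A-6C."""
    if z <= 0:
        raise ValueError("z must be positive")
    scale = z / z_ref                       # assumed proportional mapping
    return max(min_scale, min(max_scale, scale))

# Example: pulling the remote back from 1.5 m to 3.0 m doubles the size.
print(scale_factor_for_z(1.5))   # 1.0 (reference size)
print(scale_factor_for_z(3.0))   # 2.0 (zoomed in)
print(scale_factor_for_z(0.75))  # 0.5 (zoomed out)
```
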
  • FIGS. 7A-7C illustrate a second application that uses the zooming function described with respect to FIGS. 6A-6C to zoom into and out of at least a portion of an image (e.g., pictures or videos) shown on display 30.
  • FIG. 7A illustrates display 30 showing an image of a triangle and cursor 28.
  • the user can point remote control 16 to the desired area on the display (e.g., a corner of the triangle). Responsive thereto, remote control system 10 can detect this action and move cursor 28 to the location at which the remote control is pointed (see FIG. 7B).
  • remote control system 10 can zoom in on or enlarge the corner of the triangle at which cursor 28 is disposed (see FIG. 7C). Accordingly, the location in the x- and y-axes at which remote control 16 is pointing may be the focal point about which the image is zoomed in or out. Alternatively, remote control system 10 can be configured to zoom out or shrink an image shown on the display when the user moves the remote control closer to display 30 or transmitter 20 in the z-axis.
  • FIG. 8 illustrates one embodiment of the present invention for performing the zoom function described with respect to FIGS. 7A-7C.
  • controller 32 or 42 can generate signals to render an image on display 30. Controller 32 or 42 initially can render the image in a reference size.
  • the controller can accept signals from user input component 38 that indicate the user is requesting that remote control system 10 take action.
  • the controller can accept data from the absolute and relative position detection sub-systems as described above.
  • the controller can determine the absolute x- and y-positions to which remote control 16 is pointing and the z-position of the remote control.
  • the controller can correlate the x- and y-positions to which remote control 16 is pointing to coordinates on the displayed image.
  • in step 110, the controller can determine how much the displayed image needs to be translated in the x- and y-directions so that the resulting image rendered in step 114 shows the desired feature at which the remote control is pointed.
  • the image rendered in step 114 can be centered about the location in the x- and y-axes to which the remote control is pointing.
  • in step 112, the controller can determine how much to scale the displayed image from its reference size in accordance with the z-position of remote control 16.
  • in step 114, the controller can generate signals for rendering display 30 with an image that is translated and scaled in accordance with the translation and scaling factors determined in steps 110 and 112.
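
A minimal sketch of the FIG. 8 computation (steps 108-114), assuming pixel display coordinates, a proportional z-to-scale law, and that the rendered image is centered about the pointed-at location; none of these specifics come from the patent itself.

```python
from dataclasses import dataclass

@dataclass
class RenderParams:
    tx: float     # image translation in x (pixels)
    ty: float     # image translation in y (pixels)
    scale: float  # scale factor relative to the reference-size image

def zoom_render_params(point_x, point_y, z, display_w, display_h, z_ref=1.5):
    """Steps 108-114 in miniature: scale the reference-size image according
    to the remote's z-position (step 112) and translate it so the feature
    the remote is pointing at lands at the center of display 30 (step 110)."""
    scale = z / z_ref                        # assumed z-to-scale law
    tx = display_w / 2.0 - point_x * scale   # translation so the pointed-at
    ty = display_h / 2.0 - point_y * scale   # feature is centered on screen
    return RenderParams(tx=tx, ty=ty, scale=scale)

# Pointing at a corner of the triangle at (1200, 200) on a 1920x1080
# display while pulling the remote back zooms in on that corner (FIG. 7C).
print(zoom_render_params(1200, 200, z=3.0, display_w=1920, display_h=1080))
```
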
  • predetermined light sources can be disposed in a remote control and a photodetector can be disposed in a display, in a frame disposed proximate to the display, or at any location proximate to, on, or near a display.
  • a controller in the display can perform some or all of the processing described above for controllers 32 and/or 42.
  • multiple controllers may be used to control remote control systems of the present invention.
  • a remote control of the present invention can be any electronic device in a system that may need to determine the absolute positions of the electronic device with respect to one or more reference locations.
  • the remote control can be any portable, mobile, hand-held, or miniature consumer electronic device.
  • Illustrative electronic devices can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistants, programmable remote controls, pagers, laptop computers, printers, or combinations thereof.
  • Miniature electronic devices may have a form factor that is smaller than that of hand-held devices.
  • Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or combinations thereof.
  • photodetector 26 may be integrated with relative motion sensor 34.
  • Controller 32 also may be integrated with photodetector 26 and relative motion sensor 34.
  • the absolute and relative position detection sub-systems can share components, e.g., controller 32.
  • While the illustrative remote control systems described above may have included predetermined light sources that output light waves, one or more of the predetermined light sources can be replaced with component(s) that output or reflect other types of energy waves, either alone or in conjunction with light waves. For example, the component(s) can output radio waves.

Abstract

The present invention can include three-dimensional remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes and an absolute position of the remote control in a third orthogonal axis. Remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control and relative position detection can indicate changes in the position of the remote control. By combining absolute and relative position detection, remote control systems of the present invention can track remote controls more precisely than systems that only employ absolute position detection. The present invention also can include methods and apparatus for zooming in and out of an image shown on a display based on the absolute position of the remote control in the third axis.

Description

    FIELD OF THE INVENTION
  • The present invention can relate to multi-dimensional remote control systems.
  • BACKGROUND OF THE INVENTION
  • Some electronic systems can permit a user to interact with software applications, e.g., video games, by manipulating a remote control. For example, the systems can permit a user to interact with an image shown on a display by pointing a remote control at desired locations on or proximate to the display. Using infrared (IR) sources and photodetectors, the remote control systems can detect light produced or reflected by the light sources. The systems then can determine the location to which the remote control is pointing based on the detected light. The remote control systems or electronic devices coupled thereto can then perform one or more predetermined actions.
  • SUMMARY OF THE INVENTION
  • The present invention can include multi-dimensional (e.g., 2-D or 3-D) remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes (e.g., the x- and y-axes). Remote control systems of the present invention also can detect the absolute position of the remote control in a third orthogonal axis (e.g., the z-axis).
  • To determine the absolute position of the remote control, remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control. Relative position detection can indicate changes in the position of the remote control. When the initial absolute position is combined with a change in the position of the remote control, an updated absolute position can be determined. Because relative position detection can provide greater resolution than some techniques used in absolute position detection, the updated absolute position can be more precise than the initial absolute position determined for the remote control.
  • The remote control system of the present invention also can zoom into and out of an image or a portion thereof based on the absolute position of the remote control in the third axis.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other advantages of the present invention will be apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
  • FIG. 1 illustrates one embodiment of a remote control system of the present invention;
  • FIG. 2 illustrates interaction of one embodiment of a remote control system of the present invention with an image shown on a display;
  • FIG. 3 illustrates a process for determining absolute positions of a remote control in x-, y-, and z-axes in accordance with one embodiment of the present invention;
  • FIG. 4 illustrates a process for determining an absolute position of a remote control in the z-axis in accordance with one embodiment of the present invention;
  • FIGS. 5A-5B illustrate alternative processes for determining an average absolute position of a remote control in the z-axis in accordance with one embodiment of the present invention;
  • FIGS. 6A-6C and 7A-7C illustrate embodiments of a zooming feature of the present invention; and
  • FIG. 8 illustrates one embodiment of the present invention for performing the zoom function described with respect to FIGS. 7A-7C.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present invention can incorporate a three-dimensional remote control system that can detect an absolute location to which a remote control is pointing in x- and y-axes and can detect the absolute position of the remote control in the z-axis with respect to one or more reference locations. The remote control system of the present invention can employ absolute position detection with relative position detection.
  • FIGS. 1 and 2 illustrate one embodiment of remote control system 10 of the present invention. Remote control system 10 can include remote control 16, absolute position detection sub-system 12, and relative position detection sub-system 14. Remote control system 10 can permit a user to interact with an image shown on display 30 using remote control 16. Display 30 can show an image substantially defined by orthogonal x- and y-axes. Display 30 can have any shape or configuration. For example, display 30 can be a television, a computer monitor, a surface upon which images are projected, or any combination thereof. The display can have a flat screen or a screen with a nominal curvature. The display also can be any other type of display known in the art or otherwise.
  • Remote control system 10 can permit a user to move object 28 (e.g., a cursor) displayed on display 30 in the x- and y-axes by pointing remote control 16 at desired locations on display 30. Ray R in FIG. 2 can indicate the location at which remote control 16 is pointing. Remote control system 10 can determine the absolute x- and y-positions of the location to which the remote control is pointing (relative to one or more reference locations). Remote control system 10 then can move object 28 to the location to which the remote control is pointing. Thus, when the user moves remote control 16 in the x- and y-axes, display 30 can show a corresponding movement of object 28 in the x- and y-axes.
  • Remote control system 10 also can permit a user to control other parameters of the image shown on display 30 (e.g., size of object 28) by moving remote control 16 in a z-axis that may be orthogonal to the x- and y-axes. Remote control system 10 can determine the absolute position of remote control 16 in the z-axis with respect to a reference location and correlate one or more parameters of the image thereto. Thus, for example, as a user moves remote control 16 towards or away from display 30 in the z-axis, remote control system 10 can enlarge or reduce at least a portion of the image shown on display 30 (e.g., the size of object 28). In one embodiment of the present invention, the reference location in the z-axis may be substantially co-planar with a screen of display 30 on which an image is shown. As used herein, the position of the remote control in the x-, y- and z-axes also may be referred to as the x-, y- and z-positions of the remote control (respectively).
  • Absolute position detection sub-system 12 can detect one or more of the following absolute positions with respect to one or more reference locations: (1) the x- and y-positions of remote control 16; (2) the x- and y-positions of the location on or proximate to display 30 to which the remote control is pointing; and (3) the z-position of remote control 16. Relative position detection sub-system 14 can detect changes in the position of remote control 16 as the user manipulates the remote control. For example, relative position detection sub-system 14 can detect the direction in which remote control 16 is moving and/or the speed at which remote control 16 is moving.
  • To detect the x-, y-, and z-positions, absolute position detection sub-system 12 can include one or more electro-optical components, e.g., one or more light sources and/or a photodetector. For example, as illustrated in FIG. 2, remote control system 10 can include a plurality of individual predetermined light sources 22. One or more predetermined light sources 22 can be disposed on frame 24 to form light transmitter 20 or integrated with display 30. One or more predetermined light sources 22 also can be disposed anywhere proximate to, on, or near display 30. As used herein, the predetermined light sources can either generate light or reflect light shined thereon. If predetermined light source(s) act as reflector(s), another light source can project light towards the reflector(s). The reflector(s) can reflect the light back to a photodetector. For example, the photodetector and the other light source can be disposed on remote control 16, whereas the reflector(s) can be disposed proximate to, near, on, or in display 30.
  • Predetermined light sources 22 can emit, e.g., infrared (IR) light 24 to remote control 16, which can detect the emitted light using photodetector 26. Photodetector 26 can include CCD arrays, CMOS arrays, two-dimensional position sensitive photodiode arrays, other types of photodiode arrays, other types of light detection devices known in the art or otherwise, or any combination thereof.
  • In one embodiment of the present invention, transmitter 20 can be disposed such that predetermined light sources 22 are substantially co-planar with the screen of display 30. In alternative embodiments of the present invention, transmitter 20 and/or predetermined light sources 22 can be disposed at another location near or on display 30. In one embodiment of the present invention, remote control system 10 can be configured to determine the absolute z-position of remote control 16 with respect to the light transmitter and/or one or more predetermined light sources. That is, the light transmitter and/or one or more predetermined light sources may serve as the reference location in the z-axis. One of the predetermined light sources also may serve as the reference location in the x- and y-axes.
  • Controller 32, which may be disposed within remote control 16, can determine the x- and y-positions of the display location to which a user is pointing remote control 16 based on the IR light detected by photodetector 26. Controller 32 also can be configured to generate signals for rendering display 30 that move object 28 to the determined x- and y-positions. Based on the IR light detected by photodetector 26, controller 32 also can be configured to determine an absolute z-position of remote control 16 with respect to a reference location. The controllers described herein may include processors, memory, ASICs, circuits and/or other electronic components.
  • Relative position detection system 14 can include relative motion sensor 34 disposed within remote control 16. Relative motion sensor 34 can include any sensor that can detect relative motion or change in position of an object to which it is coupled. Controller 32 can incorporate data from relative motion sensor 34 in calculating the absolute z-position of remote control 16. This can provide additional resolution of the determined z-position and can permit remote control system 10 to more accurately track movement of remote control 16.
  • In one embodiment of the present invention, relative motion sensor 34 can include a single or multi-dimensional accelerometer. In alternative embodiments of the present invention, relative motion sensor 34 can include a gyroscope, an accelerometer, any other sensor that can detect relative motion, or any combination thereof.
  • Remote control 16 can incorporate user input component 38. A user may actuate user input component 38 when the user wants remote control system 10 to perform an action. For example, a user may actuate user input component 38 when the user is pointing to a location on display 30 to which the user wants object 28 to be moved or when the user moves remote control 16 in the z-axis to, e.g., zoom in on or zoom out of the image shown on display 30. When the user is not actuating user input component 38, remote control system 10 can be configured to take no action.
  • User input component 38 can be a scrollwheel similar to that incorporated by a portable media player sold under the trademark iPod™ by Apple Computer, Inc. of Cupertino, Calif. The scrollwheel can include one or more buttons and a capacitive touchpad. The touchpad can permit a user to scroll through software menus by running the user's finger around the track of the scrollwheel. User input component 38 also can include, for example, one or more buttons, a touchpad, a touchscreen display, or any combination thereof.
  • Remote control system 10 also can include optional console 40. Console 40 can have controller 42 that can perform some or all of the processing described for controller 32. For example, remote control 16 can be configured to transmit data representing detected IR light 24 to console 40. Controller 42 in console 40 then can (1) determine the absolute x-, y-, and z-positions described above; and (2) generate signals for rendering display 30 based on the determined x-, y-, and z-positions. Alternatively, controller 32 can determine the absolute x-, y-, and z-positions described above and controller 42 can generate signals for rendering display 30 based on the determined x-, y-, and z-positions.
  • In one embodiment of the present invention, console 40 can communicate with remote control 16 using cable 44 and/or one or more wireless communication protocols known in the art or otherwise. Console 40 also can communicate with display 30 using cable 46 and/or one or more wireless communication protocols known in the art or otherwise. Alternatively, console 40 can be integrated with display 30 as one unit.
  • Console 40 also can have one or more connectors 43 to which accessories can be coupled. Accessories can include cables 44 and/or 46, game cartridges, portable memory devices (e.g., memory cards, external hard drives, etc.), adapters for interfacing with another electronic device (e.g., computers, camcorders, cameras, media players, etc.), or combinations thereof.
  • FIG. 3 illustrates one embodiment of a position detection process in accordance with the present invention. In step 50, controller 32 or 42 can accept data from photodetector 26 of absolute position detection sub-system 12. The accepted data may be representative of detected light 24. In step 52, controller 32 or 42 can use the data from photodetector 26 to determine the absolute x- and y-positions of the location to which remote control 16 is pointing and/or the absolute x- and y-positions of remote control 16. The absolute x- and y-positions of remote control 16 can be used, for example, in video games to position a user's character or to otherwise track the movement of the remote control in a user's environment.
  • Techniques for determining the x- and y-positions may be known in the art. For example, U.S. Pat. No. 6,184,863 to Sibert et al., issued on Feb. 6, 2001, and U.S. Pat. No. 7,053,932 to Lin et al, issued on May 30, 2006, the entireties of which are incorporated herein by reference, describe two techniques that can be employed by controller 32 or 42. U.S. Patent Application Publication No. 2004/0207597 to Marks, published on Oct. 21, 2004; No. 2006/0152489 to Sweetser et al., published on Jul. 13, 2006; No. 2006/0152488 to Salsman et al., published on Jul. 13, 2006; and No. 2006/0152487 to Grunnet-Jepsen et al., published on Jul. 13, 2006, the entireties of which also are incorporated herein by reference, describe additional techniques that can be employed by controller 32 or 42. Remote control system 10 also can employ other techniques known in the art or otherwise.
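
The incorporated references describe the actual pointing techniques; the sketch below is only a simplified stand-in for step 52 that estimates the pointed-at display location from the centroid of the detected light-source images on the photodetector array. The sensor resolution, the linear sensor-to-display mapping, and the placement of the sources centered on the display are all assumptions.

```python
def pointing_position(spot_coords, sensor_w=128, sensor_h=96,
                      display_w=1920, display_h=1080):
    """Simplified stand-in for step 52: estimate the display location the
    remote points at from the centroid of the predetermined light sources
    as imaged on photodetector 26. Assumes the sources are centered on the
    display and that sensor offsets map linearly to display offsets; the
    incorporated references describe more complete techniques."""
    if not spot_coords:
        return None  # no sources in view; no absolute fix this frame
    cx = sum(x for x, _ in spot_coords) / len(spot_coords)
    cy = sum(y for _, y in spot_coords) / len(spot_coords)
    # Normalized offset of the source centroid from the sensor center.
    nx = (cx - sensor_w / 2.0) / sensor_w
    ny = (cy - sensor_h / 2.0) / sensor_h
    # Pointing right of the sources shifts their image left on the sensor,
    # hence the sign flip when mapping back to display coordinates.
    px = display_w / 2.0 - nx * display_w
    py = display_h / 2.0 - ny * display_h
    return px, py

# Two IR spots imaged slightly off the sensor center.
print(pointing_position([(70.0, 40.0), (80.0, 44.0)]))
```
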
  • In step 54, controller 32 or 42 can use the data from photodetector 26 to determine an initial absolute z-position of remote control 16 using, e.g., an averaging technique. One embodiment of an averaging technique can include accepting multiple frames of data collected by photodetector 26 and determining an average absolute z-position based on the multiple frames of data. More details about one embodiment of the averaging technique are discussed below with respect to FIGS. 4-5B.
  • In step 56, controller 32 or 42 can accept data or signals from accelerometer 34. Based on the accelerometer data/signals, controller 32 or 42 can extract information about changes in the z-position of remote control 16 (if any). For example, the sign of the slope of a signal waveform derived from accelerometer data can indicate whether a user is moving remote control 16 in the positive or negative z-direction with respect to a reference condition. The magnitude of signals derived from accelerometer data can indicate the rate at which the user is moving remote control 16. The controller can extract this information from the accelerometer signals and correlate the information to the direction and rate of change of remote control 16 in the z-axis. Given the direction, rate of change, and amount of time elapsed, controller 32 or 42 can determine changes in the position of remote control 16 in the z-axis.
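
A minimal sketch of steps 56-58, assuming the relative motion sensor is a z-axis accelerometer sampled at a fixed rate; the rectangular double integration and the units are illustrative, and a real implementation would first remove gravity and sensor bias.

```python
def delta_z_from_accel(accel_z_samples, dt):
    """Sketch of steps 56-58: integrate the z-axis accelerometer signal from
    relative motion sensor 34 twice to estimate the change in the remote's
    z-position over a sampling window. Rectangular integration and m/s^2
    units are assumptions; gravity and bias removal are omitted."""
    velocity = 0.0
    displacement = 0.0
    for a in accel_z_samples:
        velocity += a * dt             # rate of motion along z (direction and speed)
        displacement += velocity * dt  # accumulated change in z-position
    return displacement

# A brief push away from the display followed by a symmetric deceleration.
samples = [0.0, 2.0, 2.0, 0.0, -2.0, -2.0, 0.0]  # m/s^2 sampled at 100 Hz
print(delta_z_from_accel(samples, dt=0.01))
```
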
  • In step 60, controller 32 or 42 can combine the average absolute z-position determined in step 54 with the change in z-position determined in step 58 to provide an updated absolute z-position. For example, controller 32 or 42 can add the average absolute z-position determined in step 54 with the change in z-position determined in step 58. Controller 32 or 42 also can weight either the average absolute z-position determined in step 54 or the change in z-position determined in step 58 before combining the values, e.g., to account for differences in accuracy, error rates, characteristics of the hardware, etc.
  • The value resulting from the combination can be a more precise indication of the absolute z-position of remote control 16 as compared to the average z-position determined in step 54. The updated z-position determined in step 60 can provide additional resolution and thereby permit remote control system 10 to more accurately track movement of remote control 16.
  • Controller 32 or 42 can be configured to perform steps 50-54 simultaneously with steps 56-60. The controller also can continuously reiterate steps 50-60, thereby continuously updating the absolute z-position of remote control 16.
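
The combination in step 60 could take many forms; the sketch below shows one complementary-filter-style blend of the averaged absolute z-position with the accelerometer-derived change in z. The particular weighting is an assumption; the patent only says that either term may be weighted.

```python
def updated_absolute_z(avg_abs_z, delta_z, prev_estimate=None, weight=0.8):
    """Sketch of step 60: blend the averaged absolute z-position from the
    optical sub-system with the change in z derived from the accelerometer.
    The complementary-filter form and the 0.8/0.2 weighting are assumptions;
    the patent only states that either term may be weighted."""
    if prev_estimate is None:
        prev_estimate = avg_abs_z
    relative_estimate = prev_estimate + delta_z   # dead-reckoned position
    return weight * relative_estimate + (1.0 - weight) * avg_abs_z

# Three successive iterations of steps 50-60.
z = None
for avg_abs, dz in [(1.50, 0.00), (1.52, 0.05), (1.55, 0.04)]:
    z = updated_absolute_z(avg_abs, dz, prev_estimate=z)
    print(round(z, 3))
```
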
  • In the embodiment of FIG. 3, remote control system 10 can perform additional processing. For example, data from photodetector 26 can be processed by a hardware or software low pass filter (not shown). Also, data from accelerometer 34 can be processed by a hardware or software high pass filter (not shown). Controller 32 or 42 also can use data from relative motion sensor 34 to determine roll of remote control 16. For example, if a remote control system employs a symmetrical pattern of IR emitters, the controller may not be able to distinguish whether the remote control is disposed with, e.g., user input component 38 pointing in the positive y-direction or in the negative y-direction due to the symmetry. By incorporating an accelerometer, for example, a controller of the present invention can distinguish between these configurations by analyzing accelerometer data. Controller 32 or 42 also can use data from the relative motion sensor to determine pitch and yaw of remote control 16 with respect to a reference configuration.
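
The filtering and roll determination mentioned above might look like the following sketch: a first-order low-pass for the photodetector data, a first-order high-pass for the accelerometer data, and a roll estimate taken from the gravity components measured in the remote's x-y plane. Filter orders, coefficients, and sign conventions are assumptions.

```python
import math

def low_pass(samples, alpha=0.2):
    """First-order IIR low-pass, e.g. for smoothing photodetector data.
    The filter order and coefficient are illustrative assumptions."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out

def high_pass(samples, alpha=0.8):
    """First-order IIR high-pass, e.g. for removing slow drift and the
    static gravity offset from accelerometer data."""
    out, y, prev = [], 0.0, samples[0]
    for x in samples:
        y = alpha * (y + x - prev)
        prev = x
        out.append(y)
    return out

def roll_from_gravity(ax, ay):
    """Estimate roll about the pointing (z) axis from the gravity components
    measured by a multi-axis accelerometer while the remote is held still,
    resolving the ambiguity of a symmetrical IR emitter pattern. The zero
    reference and sign convention are assumptions."""
    return math.atan2(ax, -ay)

print([round(v, 2) for v in low_pass([0, 0, 10, 10, 10])])
print([round(v, 2) for v in high_pass([0, 0, 10, 10, 10])])
print(round(roll_from_gravity(0.0, -9.81), 3))  # 0.0 rad in the assumed reference pose
print(round(roll_from_gravity(9.81, 0.0), 3))   # ~1.571 rad: rolled a quarter turn
```
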
  • While FIG. 3 shows a remote control system of the present invention using data from the relative position detection sub-system to determine only the changes in the absolute z-position of a remote control, data from the relative position detection sub-system also can be used to determine changes in the x- and y-positions of the remote control. This information then can be combined with the x- and y-positions determined in step 52 to determine the location to which the remote control is pointing and/or the absolute x- and y-positions of remote control 16.
  • FIG. 4 illustrates techniques that absolute position detection sub-system 12 of remote control system 10 can employ in step 54 of FIG. 3 to determine an initial absolute z-position of remote control 16. Remote control system 10 can be configured to determine the absolute position of a remote control in the z-direction by analyzing light signals 24.1, 24.2 from at least two predetermined light sources 22 to determine perceived distance D between the predetermined light sources. For example, as a user moves remote control 16 from position Z(a) to Z(b), angle Θ between light rays 24.1 and 24.2 may decrease from Θ(a) to Θ(b). As a result, remote control 16 can perceive distance D between predetermined light sources 22 to become smaller. Accordingly, to determine the absolute z-position of remote control 16 with respect to, e.g., predetermined light sources 22, controller 32 or 42 can correlate angle Θ and/or perceived distance D to a z-position. For example, controller 32 or 42 can calculate the z-position using angle Θ and/or perceived distance D in one or more formulas based on principles of geometry. Alternatively, remote control system 10 can have a database that associates perceived distances D and/or angles Θ to predetermined z-positions. Controller 32 or 42 can be configured to access this database to determine the z-position of remote control 16.
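
Under a simple pinhole-camera model, the perceived distance D (or the subtended angle Θ) maps to a z-position as sketched below. The 0.20 m separation between predetermined light sources 22 and the focal length are made-up calibration values; the patent leaves the correlation to a formula based on principles of geometry or to a lookup database.

```python
import math

def z_from_perceived_distance(d_pixels, source_separation_m=0.20,
                              focal_length_pixels=1300.0):
    """Estimate the z-position from the perceived distance D between two
    predetermined light sources 22 on the photodetector array, using a
    pinhole-camera model. The 0.20 m physical separation and the focal
    length are made-up calibration values."""
    if d_pixels <= 0:
        raise ValueError("perceived distance must be positive")
    return source_separation_m * focal_length_pixels / d_pixels

def z_from_angle(theta_rad, source_separation_m=0.20):
    """Equivalent estimate from the angle Theta subtended by light rays
    24.1 and 24.2 at photodetector 26."""
    return (source_separation_m / 2.0) / math.tan(theta_rad / 2.0)

# The sources appear closer together (smaller D, smaller Theta) as the
# remote moves from Z(a) to the more distant Z(b).
print(round(z_from_perceived_distance(260.0), 2))  # ~1.0 m
print(round(z_from_perceived_distance(130.0), 2))  # ~2.0 m
print(round(z_from_angle(math.radians(11.4)), 2))  # ~1.0 m
```
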
  • Remote control system 10 also can compare the signal intensities of light rays 24.1 and 24.2 received by photodetector 26 to determine the absolute z-position of remote control 16. For example, as a user moves remote control 16 from position Z(a) to Z(b), the intensities of light rays 24.1 and 24.2 received by photodetector 26 may decrease. Also, as a user moves remote control 16 from side to side in the x-axis, the intensity of light 24.1 received by photodetector 26 may differ from that received from light 24.2. To determine the absolute z-position of remote control 16, controller 32 or 42 can correlate the detected intensities of light rays 24.1 and 24.2 to a z-position for the remote control. For example, controller 32 or 42 can be configured to calculate the z-position using formula(s) that are function(s) of the intensities of the detected light rays. The formulas can be determined by empirical testing or by using principles of light propagation. Alternatively, remote control system 10 can have a database that associates detected intensities to predetermined z-positions. Controller 32 or 42 can be configured to access this database to determine the z-position of remote control 16. In one embodiment of the present invention, controller 32 or 42 can correlate the z-position of remote control 16 using perceived distance D, angle Θ, intensities of light detected by photodetector 26, or any combination thereof. While FIG. 4 illustrates an exemplary system having two predetermined light sources, these techniques can be employed to detect an absolute z-position of remote control 16 in systems having more than two predetermined light sources.
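
An intensity-based correlation could, for example, assume an inverse-square falloff calibrated at a known reference distance, as in the sketch below; the model and the calibration constants are assumptions, and the patent equally allows an empirically derived formula or a database lookup.

```python
def z_from_intensity(i1, i2, i_ref=900.0, z_ref=1.0):
    """Estimate the z-position from the detected intensities of light rays
    24.1 and 24.2, assuming an inverse-square falloff calibrated against a
    reference intensity i_ref measured at reference distance z_ref. The
    model and constants are assumptions."""
    avg = (i1 + i2) / 2.0      # average the two detected intensities
    if avg <= 0:
        raise ValueError("intensity must be positive")
    return z_ref * (i_ref / avg) ** 0.5

print(round(z_from_intensity(900.0, 900.0), 2))  # 1.0 m (reference)
print(round(z_from_intensity(230.0, 220.0), 2))  # 2.0 m (farther away)
```
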
  • Remote control system 10 also can employ other techniques known in the art or otherwise for determining initial absolute z-positions of remote control 16. For example, U.S. Patent Application Publication Nos. 2006/0152489 to Sweetser et al.; 2006/0152488 to Salsman et al.; and 2006/0152487 to Grunnet-Jepsen et al., the entireties of which are incorporated herein by reference above, describe techniques that can be employed by controller 32 or 42 to determine the z-position of a remote control when two, three, or four predetermined light sources are provided.
  • In one embodiment of the present invention, remote control system 10 can use only the techniques described above to determine the initial absolute z-position of remote control 16 in step 54 of FIG. 3. In an alternative embodiment of the present invention, remote control system 10 can employ additional processing in step 54 to determine an average z-position for remote control 16. That is, the initial absolute position determined in step 54 can be an average position of the remote control over a predetermined amount of time or predetermined number of frames of data collected by the photodetector. The latter embodiment can reduce the effect of jitter in the collected data. Jitter can result, for example, from decreased resolution in the z-direction when the distance between predetermined light sources 22 and photodetector 26 increases.
  • FIGS. 5A-5B illustrate averaging techniques for determining an average absolute z-position for remote control 16. Of course, remote control system 10 can use other averaging techniques known in the art or otherwise for determining an average absolute z-position for remote control 16.
  • In the embodiment of FIG. 5A, controller 32 or 42 can be configured to determine an absolute z-position for each frame of data collected by photodetector 26 and then average the determined z-positions over a predetermined number of frames (e.g., 30 frames). In step 70, controller 32 or 42 can accept data from photodetector 26 that may be representative of IR light 24 from predetermined light sources 22. Using the techniques described with respect to FIG. 4, for example, controller 32 or 42 can determine an absolute z-position of remote control 16 based on the accepted data (step 72). In step 74, controller 32 or 42 can store the z-position determined in step 72 in memory, e.g., buffer memory. In step 76, controller 32 or 42 can check whether z-positions have been determined for a predetermined number of frames. If not, the controller can revert back to step 70. In step 78, controller 32 or 42 can determine an average z-position for remote control 16 by averaging some or all of the z-positions stored in step 74. In one embodiment of the present invention, the controller can perform additional processing before the controller determines an average z-position. For example, the controller can eliminate extreme or outlying z-position values from the set of values used in the averaging process. Thereafter, controller 32 or 42 can revert back to step 70.
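A minimal sketch of the FIG. 5A approach, assuming each buffered entry is a per-frame z-position and that outliers are rejected by trimming the extremes; the helper below is illustrative, not part of the specification.

```python
def average_z_per_frame(frame_z_values, trim=1):
    # FIG. 5A style: each entry is a z-position computed from one frame.
    # Drop `trim` extreme values from each end (optional outlier rejection),
    # then return the mean of what remains.
    ordered = sorted(frame_z_values)
    if len(ordered) > 2 * trim:
        ordered = ordered[trim:len(ordered) - trim]
    return sum(ordered) / len(ordered)
```

With a 30-frame window, the controller would append one per-frame estimate each frame and call a helper like this once the window fills.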
  • In the embodiment of FIG. 5B, controller 32 or 42 can be configured to average data collected from photodetector 26 over a predetermined number of frames (e.g., 30 frames) and then determine a z-position based on the averaged data. In step 82, controller 32 or 42 can accept data from photodetector 26 that may be representative of IR light 24 from predetermined light sources 22. In step 84, controller 32 or 42 can store the accepted data in memory, e.g., buffer memory. In step 86, controller 32 or 42 can check whether data from photodetector 26 has been accepted for a predetermined number of frames. If not, the controller can revert back to step 82. In step 88, controller 32 or 42 can determine average value(s) of the stored data, e.g., average intensity for light ray 24.1 and average intensity for light ray 24.2. Using the techniques described with respect to FIG. 4, for example, controller 32 or 42 can determine the average absolute z-position of remote control 16 based on the average value(s) determined in step 88 (step 90). In one embodiment of the present invention, the controller can perform additional processing before the controller determines an average z-position in step 90. Thereafter, the process can revert back to step 82.
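For comparison, a sketch of the FIG. 5B approach, which averages the raw photodetector samples first and converts once; the tuple layout of the frame data and the inverse-square conversion are assumptions carried over from the earlier intensity sketch.

```python
import math

def average_then_convert(frames, reference_intensity, reference_z):
    # FIG. 5B style: `frames` is assumed to be a list of (intensity_a, intensity_b)
    # samples, one per frame. Average the raw samples first, then convert the
    # averaged intensity to a single z-position (inverse-square assumption again).
    mean_a = sum(a for a, _ in frames) / len(frames)
    mean_b = sum(b for _, b in frames) / len(frames)
    mean_intensity = (mean_a + mean_b) / 2.0
    return reference_z * math.sqrt(reference_intensity / mean_intensity)
```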
  • FIGS. 6A-6C illustrate one application of the present invention that can utilize the absolute z-position of remote control 16. In FIG. 6A, remote control 16 is positioned at position Z1 from light transmitter 20. Controller 32 or 42 can detect position Z1 and generate signals for rendering at least a portion of the image shown on the display (e.g., object 28) in a size that corresponds to position Z1. For example, the controller can scale an image of object 28 by a factor that correlates to position Z1 in a predetermined relationship. When a user moves remote control 16 closer to IR transmitter 20 as shown in FIG. 6B, the controller can detect new position Z2 and generate signals for rendering object 28 in a larger size that correlates to position Z2. When a user moves remote control 16 farther away from IR transmitter 20 as shown in FIG. 6C, the controller can detect new position Z3 and generate signals for rendering at least a portion of the image shown on the display (e.g., object 28) in a smaller size that correlates to position Z3. Thus, the image of object 28 may have a reference size that may be scaled up or down depending on the position of remote control 16 in the z-axis.
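One possible mapping from z-position to a scale factor consistent with FIGS. 6A-6C (closer remote, larger image) is sketched below; the reciprocal relationship and the clamp limits are assumptions, since the specification only requires that the factor correlate to the z-position in a predetermined relationship.

```python
def scale_for_z(z_current, z_reference, min_scale=0.25, max_scale=4.0):
    # Reciprocal mapping: moving the remote closer (smaller z) enlarges the
    # rendered object, as in FIGS. 6A-6C. The clamp limits are illustrative.
    scale = z_reference / z_current
    return max(min_scale, min(max_scale, scale))
```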
  • In an alternative embodiment of the present invention, controller 32 or 42 can enlarge or zoom in on at least a portion of an image shown on the display (e.g., object 28) when the remote control is moved away from the display or transmitter 20 in the z-axis. Controller 32 or 42 also can reduce the size or zoom out of at least a portion of the image shown on the display (e.g., object 28) when the remote control is moved towards the display or transmitter 20 in the z-axis.
  • FIGS. 7A-7C illustrate a second application that uses the zooming function described with respect to FIGS. 6A-6C to zoom into and out of at least a portion of an image (e.g., pictures or videos) shown on display 30. FIG. 7A illustrates display 30 showing an image of a triangle and cursor 28. When a user wishes to zoom into or enlarge a particular area of the triangle, the user can point remote control 16 to the desired area on the display (e.g., a corner of the triangle). Responsive thereto, remote control system 10 can detect this action and move cursor 28 to the location at which the remote control is pointed (see FIG. 7B). When the user moves the remote control closer to display 30 or transmitter 20 in the z-axis, remote control system 10 can zoom in on or enlarge the corner of the triangle at which cursor 28 is disposed (see FIG. 7C). Accordingly, the location in the x- and y-axes at which remote control 16 is pointing may be the focal point about which the image is zoomed in or out. Alternatively, remote control system 10 can be configured to zoom out or shrink an image shown on the display when the user moves the remote control closer to display 30 or transmitter 20 in the z-axis.
  • FIG. 8 illustrates one embodiment of the present invention for performing the zoom function described with respect to FIGS. 7A-7C. In step 100, controller 32 or 42 can generate signals to render an image on display 30. Controller 32 or 42 initially can render the image in a reference size. In step 102, the controller can accept signals from user input component 38 that indicate the user is requesting that remote control system 10 take action. In step 104, the controller can accept data from absolute and relative position detection sub-systems as described above. In step 106, the controller can determine the absolute x- and y-positions to which remote control 16 is pointing and the z-position of the remote control. In step 108, the controller can correlate the x- and y-positions to which remote control 16 is pointing to coordinates on the displayed image.
  • In step 110, the controller can determine how much the displayed image needs to be translated in the x- and y-directions so that the resulting image rendered in step 114 shows the desired feature at which the remote control is pointed. For example, the image rendered in step 114 can be centered about the location in the x- and y-axes to which the remote control is pointing.
  • In step 112, the controller can determine how much to scale the displayed image from its reference size in accordance with the z-position of remote control 16. In step 114, the controller can generate signals for rendering display 30 with an image that is translated and scaled in accordance with the translation and scaling factors determined in steps 110 and 112.
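Steps 110-114 can be summarized as a translate-then-scale transform about the pointed-at coordinates; the sketch below is one way to compute that transform and is not drawn from the specification. A hypothetical renderer would draw the image scaled by `scale` with its origin at the returned offset.

```python
def zoom_about_point(point_x, point_y, scale, view_width, view_height):
    # After scaling, the pointed-at image pixel lands at
    # (point_x * scale, point_y * scale); translating by the difference from
    # the view center re-centers the rendered image about that pixel.
    tx = view_width / 2.0 - point_x * scale
    ty = view_height / 2.0 - point_y * scale
    return tx, ty, scale
```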
  • Although particular embodiments of the present invention have been described above in detail, it will be understood that this description is merely for purposes of illustration. Alternatives to the embodiments described hereinabove also are within the scope of the present invention. For example, predetermined light sources can be disposed in a remote control and a photodetector can be disposed in a display, in a frame disposed proximate to the display, or at any location proximate to, on, or near a display.
  • Also, a controller in the display can perform some or all of the processing described above for controllers 32 and/or 42. Thus, multiple controllers may be used to control remote control systems of the present invention.
  • A remote control of the present invention can be any electronic device in a system that may need to determine the absolute positions of the electronic device with respect to one or more reference locations. For example, the remote control can be any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistants, programmable remote controls, pagers, laptop computers, printers, or combinations thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or combinations thereof.
  • While the above description may have described certain components as being physically separate from other components, one or more of the components may be integrated into one unit. For example, photodetector 26 may be integrated with relative motion sensor 34. Controller 32 also may be integrated with photodetector 26 and relative motion sensor 34. Furthermore, the absolute and relative position detection sub-systems can share components, e.g., controller 32.
  • Furthermore, while the illustrative remote control systems described above may have included predetermined light sources that output light waves, one or more of the predetermined light sources can be replaced with component(s) that output or reflect other types of energy waves either alone or in conjunction with light waves. For example, the component(s) can output radio waves.
  • The above described embodiments of the present invention are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims which follow.

Claims (21)

1-33. (canceled)
34. A remote control operative for use with a display, the remote control comprising:
an absolute position detection component comprising at least one electro-optical component, wherein the absolute position detection component is operative to detect an initial absolute position of the remote control;
a relative position detection component operative to detect a change in a position of the remote control; and
a controller in communication with the absolute and relative position detection components, wherein the controller is operative to:
detect that the remote control is pointing at a particular location on the display based on at least one of data from the absolute position detection component comprising the initial absolute position of the remote control and data from the relative position detection component, wherein the particular location on the display is defined by first and second orthogonal axes;
detect an updated absolute position of the remote control in a third axis based on the initial absolute position of the remote control and additional data from the relative position detection component, wherein the third axis is orthogonal to the first and second axes; and
transmit signals operative to change a portion of an image on the display corresponding to the particular location based on the updated absolute position of the remote control in the third axis.
35. The remote control of claim 34, wherein the transmitted signals are operative to move a cursor on the display to the particular location.
36. The remote control of claim 34, wherein the particular location is a focal point about which the portion of the image is operative to be changed by the transmitted signals.
37. The remote control of claim 34, wherein the initial absolute position of the remote control comprises an initial absolute position of the remote control in the third axis.
38. The remote control of claim 37, wherein the controller is operative to:
detect that the updated absolute position of the remote control in the third axis is closer to a reference location than the initial absolute position of the remote control in the third axis; and
transmit the signals to the display to at least one of zoom in on and enlarge the portion of the image corresponding to the particular location.
39. The remote control of claim 37, wherein the controller is operative to:
detect that the updated absolute position of the remote control in the third axis is further from a reference location than the initial absolute position of the remote control in the third axis is to the reference location; and
transmit the signals operative to at least one of zoom out of and shrink the portion of the image corresponding to the particular location.
40. A method for use with a system having a remote control and a display, the method comprising:
rendering an image defined by first and second orthogonal axes on the display;
determining an initial absolute position of the remote control in a third axis, wherein the third axis is orthogonal to the first and second axes;
receiving at least one signal from a user input component to change the image, wherein the at least one signal comprises data associated with a change in position of the remote control;
in response to receiving the at least one signal, determining an updated absolute position of the remote control based on the initial absolute position and the data associated with the change in position of the remote control, wherein the updated absolute position comprises an updated absolute position in the third axis;
determining a scale factor for changing the rendered image based on the updated absolute position in the third axis;
changing the rendered image by the scale factor; and
rendering the changed image on the display.
41. The method of claim 40, wherein the updated absolute position further comprises updated absolute positions in the first and second axes.
42. The method of claim 41, further comprising correlating the updated absolute positions in the first and second axes with coordinates on the rendered image.
43. The method of claim 42, further comprising determining a portion of the rendered image to translate based on the coordinates on the rendered image.
44. The method of claim 43, wherein the determining the portion of the rendered image to translate further comprises determining a first amount of the rendered image to translate in a first direction corresponding to the first axis and a second amount of the rendered image to translate in a second direction corresponding to the second axis.
45. The method of claim 43, wherein the determining the portion of the rendered image to translate further comprises generating a translation factor.
46. The method of claim 42, wherein the changed image is centered about the coordinates on the rendered image.
47. The method of claim 43, wherein the changing the rendered image further comprises translating the portion of the rendered image.
48. A method for use with a system comprising a remote control and a display, wherein the display is defined by first and second orthogonal axes and shows an image, the method comprising:
detecting a first absolute position of the remote control in a third axis with respect to a reference location based on absolute position data output by a light detector, wherein the third axis is orthogonal to the first and second axes;
rendering at least a first portion of the image in a first reference size on the display based on the first absolute position of the remote control in the third axis with respect to the reference location;
detecting a second absolute position of the remote control in the third axis based on the detected first absolute position and based on relative position data output by at least one of an accelerometer and a gyroscope; and
rendering at least a second portion of the image in a second reference size on the display based on the second absolute position of the remote control in the third axis.
49. The method of claim 48, wherein the second reference size is scaled with respect to the first reference size based on a difference between:
the first absolute position of the remote control in the third axis with respect to the reference location; and
the second absolute position of the remote control in the third axis with respect to the reference location.
50. The method of claim 48, wherein the rendering the at least a second portion of the image comprises one of zooming in on and zooming out from a focal point of the image.
51. The method of claim 50, wherein the focal point is based on an absolute position of the remote control in the first and second axes with respect to the reference location.
52. A remote control for use with a display defined by first and second orthogonal axes and operative to show an image, the remote control comprising:
an absolute position detection component of an absolute position detection sub-system, wherein the absolute position detection component comprises at least one photodetector that enables the absolute position detection sub-system to detect an initial absolute position of the remote control in a third axis orthogonal to the first and second axes;
a relative position detection component of a relative position detection sub-system, wherein the relative position detection component comprises at least one of an accelerometer and a gyroscope that enables the relative position detection sub-system to detect a change in a position of the remote control in the third axis; and
a controller communicatively coupled to the absolute position detection component and to the relative position detection component, wherein the controller:
detects an updated absolute position of the remote control in the third axis by combining the initial absolute position detected by the absolute position detection sub-system with the change in the position detected by the relative position detection sub-system; and
changes the scale factor of the image shown on the display based on the difference between the initial absolute position of the remote control in the third axis and the updated absolute position of the remote control in the third axis.
53. The remote control of claim 52, wherein:
the change of the scale factor comprises one of zooming in on and zooming out from a focal point of the image; and
the focal point is based on an absolute position of the remote control in the first and second axes.
US14/185,147 2006-11-07 2014-02-20 3d remote control system employing absolute and relative position detection Abandoned US20140168081A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US11/594,342 US8291346B2 (en) 2006-11-07 2006-11-07 3D remote control system employing absolute and relative position detection
US13/647,088 US8689145B2 (en) 2006-11-07 2012-10-08 3D remote control system employing absolute and relative position detection
US14/185,147 US20140168081A1 (en) 2006-11-07 2014-02-20 3d remote control system employing absolute and relative position detection

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/647,088 Continuation US8689145B2 (en) 2006-11-07 2012-10-08 3D remote control system employing absolute and relative position detection

Publications (1)

Publication Number Publication Date
US20140168081A1 true US20140168081A1 (en) 2014-06-19

Family

ID=39359325

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/594,342 Active 2028-04-13 US8291346B2 (en) 2006-11-07 2006-11-07 3D remote control system employing absolute and relative position detection
US13/647,088 Expired - Fee Related US8689145B2 (en) 2006-11-07 2012-10-08 3D remote control system employing absolute and relative position detection
US14/185,147 Abandoned US20140168081A1 (en) 2006-11-07 2014-02-20 3d remote control system employing absolute and relative position detection


Country Status (1)

Country Link
US (3) US8291346B2 (en)

Also Published As

Publication number Publication date
US20130027297A1 (en) 2013-01-31
US8291346B2 (en) 2012-10-16
US8689145B2 (en) 2014-04-01
US20080106517A1 (en) 2008-05-08


Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION