US20070007442A1 - Light modulating input device for capturing user control inputs - Google Patents
- Publication number
- US20070007442A1 (application No. US 11/175,757)
- Authority
- US
- United States
- Prior art keywords
- light
- light source
- relative
- input device
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05G—CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
- G05G9/00—Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
- G05G9/02—Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only
- G05G9/04—Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously
- G05G9/047—Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
- G05G2009/0474—Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks characterised by means converting mechanical movement into electric signals
- G05G2009/04759—Light-sensitive detector, e.g. photoelectric
Definitions
- a hand-operated pointing device for use with a computer and its display has become almost universal.
- One form of the various types of pointing devices is the optical pointing device.
- a light source within an optical pointing device illuminates a navigation surface, such as a finger or a desktop, that is external to the pointing device.
- Light is reflected off the navigation surface and onto an image sensor within the optical pointing device for detection as relative movement information as the pointing device is moved relative to the navigation surface. This information is used to direct a corresponding movement of a screen pointer.
- pointing devices are often too large for smaller applications, such as personal digital assistants, mobile phones, handheld computers, portable audio players, etc.
- a conventional pointing device is poorly adapted to execute new functions of evolving electronic devices, as well as to accommodate the widely varying shapes and configurations of this new class of electronic devices.
- Embodiments of the invention are directed to a light modulating input device for capturing user control inputs.
- an input device comprises a light source, a light modulator, and a sensor module.
- the light source is configured to emit a path of light.
- the light modulator is arranged relative to the light source transversely across the path of light to produce a light contrast pattern from the light modulator.
- the sensor module is configured to detect relative motion between the light contrast pattern and the sensor module.
- FIG. 1A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 1B is a block diagram illustrating an optical input device, according to an embodiment of the present invention.
- FIG. 2 is a flow diagram illustrating a method for generating movement data, according to one embodiment of the present invention.
- FIG. 3 is a perspective view of an input device, according to one embodiment of the present invention.
- FIG. 4 is a sectional view of the input device of FIG. 3 , as taken along lines 4 - 4 , according to one embodiment of the present invention.
- FIG. 5A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 5B is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 5C is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 6A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 6B is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 7A is a top plan view illustrating an optical input device, according to an embodiment of the present invention.
- FIG. 7B is a sectional view of the input device of FIG. 7A , as taken along lines 7 B- 7 B, according to one embodiment of the present invention.
- Embodiments of the invention are directed to an input device comprising a light source for illuminating a light modulator in which motion of the illuminated light modulator relative to a sensor is detected at the sensor to capture human control inputs associated with the relative motion.
- the light modulator is interposed between the light source and the sensor to introduce a predefined pattern in the light as it passes through the light modulator on its way to the sensor.
- the light modulator comprises a diffraction element, an interference element, or an element producing a pattern of varying light intensities. As a result, a unique pattern of differing light intensities is received at the sensor in the form of lines, speckles, singular dots, etc.
- the light modulator comprises a navigation surface that is illuminated (e.g., backlit) by the light source, with the sensor receiving light patterns transmitted from the backlit navigation surface. Different positions of the light modulator relative to the sensor cause differing light patterns to be received at the sensor, which are compared over time to capture human control inputs. Light patterns received at the sensor for comparison are produced via a light modulator comprising an at least partially opaque pattern on a lens or generally transparent surface through which the light is transmitted. These transmissive light patterns differ from the reflected images used in conventional optical pointing devices, in which an image received at a sensor array is produced by light reflected and/or scattered off of a navigation surface, such as a mouse pad.
- the light source and the light modulator are fixed relative to each other to operate as a single unit that projects a light contrast pattern of varying light intensities onto a sensor.
- the light source and the light modulator function as a contrast pattern projecting mechanism arranged to directly transmit a light contrast pattern onto a sensor that is positioned generally perpendicular to the transmitted light path from the projecting mechanism. Movement of the projecting mechanism, and therefore movement of the projected contrast pattern, occurs in a plane that is generally parallel to the sensor and generally perpendicular to the light transmitted from the projecting mechanism.
- the light source and light modulator are embodied in a mobile housing which is configured for lateral movement relative to a sensor array.
- the mobile housing comprises a puck that is slidably movable relative to a support structure with the support structure also suspending the light modulator of the puck in a spaced relationship relative to the sensor array to enable light to be transmitted through the light modulator for reception at the sensor array.
- a constraining mechanism limits lateral movement of the mobile housing within a predefined field of motion relative to the sensor array.
- Embodiments of the invention are directed to generating relative motion information to capture human control inputs. These inputs are used for controlling a screen pointer, selecting or manipulating symbols visible on a display device, video game controllers, for direct control of a mechanical or electrical system such as speed and turn inputs for controlling an automobile or toy vehicle, or any other application where one-dimensional or two-dimensional control inputs are useful. Additional examples include portable electronic devices, such as mobile phones, portable audio devices, personal digital assistants, electronic cameras, etc.
- These embodiments, and others, are described and illustrated in greater detail in association with FIGS. 1-7B .
- FIG. 1A is a block diagram illustrating major components of an optical input device 10 according to one embodiment of the present invention.
- optical input device 10 comprises mobile housing 12 and sensor module 30 including photodetector array 32 .
- Mobile housing 12 comprises light source 20 and light modulator 24 with first surface 25 .
- first surface 25 defines a navigation surface.
- optical input device 10 also comprises lens 50 disposed between first surface 25 of mobile housing 12 and sensor module 30 , serving to image first surface 25 upon photodetector array 32 .
- This embodiment is employed when there is a relatively large distance (H 1 ) between first surface 25 and photodetector array 32 of sensor module 30 .
- lens 50 is omitted when height (H 1 ) between first surface 25 and photodetector array 32 of sensor module 30 is relatively small or negligible. This small height enables first surface 25 to be imaged at photodetector array 32 at an adequate resolution without an intervening lens.
- input device 10 is in electrical communication with and supports operation of display 12 to serve as a pointing device for controlling screen pointer 14 on display 12 .
- input device 10 is not restricted to controlling a screen pointer, and is used more generally to capture user inputs for a wide variety of devices and applications as previously described.
- mobile housing 12 is configured for lateral movement (as indicated by directional arrow M) between boundaries 40 A and 40 B, while also being maintained in a generally vertical position above and spaced apart from sensor module 30 (as indicated by height H 1 ).
- First surface 25 of mobile housing 12 is maintained in a generally parallel position relative to photodetector array 32 of sensor module 30 .
- boundaries 40 A and 40 B are separated by a width W 1 and together generally define a field of motion through which housing 12 is permitted to move laterally relative to sensor module 30 .
- a width of photodetector array 32 generally corresponds to the width W 1 between boundaries 40 A and 40 B, and therefore generally corresponds to the field of motion of housing 12 .
- light modulator 24 comprises an at least partially transparent member (e.g., a surface, lens, or other member) that modulates or alters light from light source 20 that is transmitted through light modulator 24 to produce an image suitable for detection at photodetector array 32 .
- light modulator 24 comprises a distinctive opaque pattern on a generally transparent member (or conversely, a transparent pattern on a generally opaque member) suitable for comparison with other like images received at photodetector array 32 . Additional aspects of contrast patterns (i.e., light patterns having light and dark portions of varying intensity) produced via light modulator 24 are described and illustrated in greater detail in association with FIGS. 2-6B .
- first surface 25 of housing 12 effectively acts as a navigation surface
- movement information generated via photodetector array 32 based on first surface 25 is highly accurate because the features of this navigation surface are known and relatively stable.
- first surface 25 is generally shielded from dust, markings, etc. that otherwise can cause noise or bad pixels in digital images corresponding to the navigation surface. Accordingly, embodiments of the invention enable control over the type and quality of the navigation surface, as well as protection of the navigation surface based on its general exclusion from ambient conditions (external to input device 10 ), as will be further illustrated in association with FIGS. 3-4 .
- sensor module 30 forms a portion of optical navigation sensor integrated circuit (IC) 60 .
- optical navigation sensor 60 includes digital input/output circuitry 66 , navigation processor 68 , analog to digital converter (ADC) 72 , photodetector array 32 (of sensor module 30 ), and light source driver circuit 86 .
- sensor 60 comprises an electronic image sensor, including but not limited to, a charge coupled device (CCD) or a complementary metal oxide semiconductor—active pixel sensor (CMOS—APS). Both types of sensors contain an array of photodetectors (e.g., pixels), arranged in a pattern.
- light source 20 emits light (A) through light modulator 24 (which also defines navigation surface 25 ), and illuminated images (B) are generated.
- light source 20 is a light emitting diode.
- light source 20 is a coherent light source or an at least partially coherent light source.
- light source 20 is a laser.
- light source 20 is a vertical cavity surface emitting laser (VCSEL) diode.
- light source 20 is an edge emitting laser diode.
- Light source 20 is controlled by driver circuit 86 , which is controlled by navigation processor 68 via control line 70 .
- control line 70 is used by navigation processor 68 to cause driver circuit 86 to be powered on and off, and correspondingly cause light source 20 to be powered on and off.
- Images (B) emanating from first surface 25 of mobile housing 12 are projected directly onto photodetector array 32 without interaction (e.g., reflection) with any intervening structures.
- Each photodetector in photodetector array 32 provides a signal that varies in magnitude based upon the intensity of light incident on the photodetector.
- the signals from photodetector array 32 are output to analog to digital converter (ADC) 72 , which converts the signals into digital values of a suitable resolution (e.g., six bits).
- the digital values provide a digital representation of the illuminated pattern on photodetector array 32 produced via transmission of light through navigation surface 25 , which acts as a light modulator.
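The digitization step described above can be sketched as a simple uniform quantizer. The function name, reference voltage, and input range below are illustrative assumptions; the text specifies only a suitable resolution such as six bits:

```python
def quantize(signal_volts, vref=3.3, bits=6):
    """Map an analog photodetector voltage onto an n-bit digital code,
    as ADC 72 would; vref and the 6-bit default are illustrative."""
    levels = (1 << bits) - 1                     # 63 codes above zero for 6 bits
    clamped = max(0.0, min(signal_volts, vref))  # assume an ADC input range of [0, vref]
    return round(clamped / vref * levels)
```

A full-scale input maps to the top code (63 for six bits), and intermediate voltages land on the nearest of the 64 levels.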
- the digital values generated by analog to digital converter 72 are output to navigation processor 68 .
- the digital values received by navigation processor 68 are stored as a frame within memory 69 . Different frames of light patterns detected at photodetector array 32 are compared over time. In one aspect, successive frames are compared while in other aspects, non-sequential frames are compared.
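The frame comparison described above, in which frames stored in memory 69 are compared over time, can be sketched as a small block-matching search for the displacement that best aligns two frames. The function name, search window, and error metric are illustrative assumptions, not details from the patent:

```python
import numpy as np

def estimate_shift(ref_frame, new_frame, max_shift=3):
    """Return the (dy, dx) displacement of new_frame relative to ref_frame
    that minimizes mean squared error over the overlapping region."""
    h, w = ref_frame.shape
    best_err, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Regions of each frame that overlap under this trial shift
            r = ref_frame[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            n = new_frame[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            err = np.mean((r.astype(float) - n.astype(float)) ** 2)
            if best_err is None or err < best_err:
                best_err, best_shift = err, (dy, dx)
    return best_shift
```

Comparing successive frames yields per-frame motion, while comparing against an older reference frame yields accumulated displacement, corresponding to the sequential and non-sequential comparisons mentioned above.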
- light modulator 24 has width (W 2 ) and photodetector array 32 has a width (W 3 ).
- the width of each of light modulator 24 and photodetector array 32 inferentially corresponds to the surface area of the respective component.
- light modulator 24 has a surface area (inferentially represented by width W 2 ) that is substantially larger than a surface area of photodetector array 32 (inferentially represented by width W 3 ).
- light modulator 24 comprises a plurality of opaque features spaced apart on a generally transparent member so that as light modulator 24 is moved relative to photodetector array 32 , the light contrast pattern detected at photodetector array 32 generally corresponds to a unique position of light modulator 24 relative to photodetector array 32 . Frames of newly detected light contrast patterns are compared with reference frames of detected light contrast patterns to thereby determine a direction, amount and speed of relative motion of the light contrast pattern relative to the photodetector array 32 .
- light modulator 24 has a surface area (inferentially represented by width W 2 ) that is substantially less than a surface area of photodetector array 32 (inferentially represented by width W 3 ).
- light modulator 24 comprises a single opaque feature centrally arranged on a generally transparent member so that as light modulator 24 is moved relative to photodetector array 32 , the single opaque feature generally corresponds to a unique absolute position of light modulator 24 relative to photodetector array 32 .
- the single opaque feature enables direct tracking of the absolute position of the opaque feature without using frame comparison techniques. This direct tracking mechanism indicates a direction and magnitude of relative motion of the light contrast pattern relative to photodetector array 32 .
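The direct-tracking alternative above, in which a single opaque feature's absolute position is read off the array without frame comparison, can be sketched as an intensity-weighted centroid. The helper name and pixel values are illustrative assumptions:

```python
import numpy as np

def dark_spot_centroid(frame):
    """Locate the single dark (shadowed) feature on an otherwise bright frame
    as the centroid of darkness (brightest level minus each pixel)."""
    frame = frame.astype(float)
    darkness = frame.max() - frame    # bright background contributes ~0 weight
    ys, xs = np.indices(frame.shape)
    total = darkness.sum()
    return (ys * darkness).sum() / total, (xs * darkness).sum() / total
```

Subtracting the centroids found in two frames gives the direction and magnitude of relative motion directly, with no block-matching search.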
- relative movement between light modulator 24 and the sensor module 30 occurs in a generally singular plane along two dimensions. In another embodiment, relative movement between the light modulator 24 and the sensor module 30 also occurs along a generally singular plane but is restricted to movement along a single dimension.
- FIG. 1B is a block diagram illustrating an input device 90 , according to an embodiment of the invention.
- input device 90 comprises light source 92 , light modulator 94 and sensor 96 , which have substantially the same features and attributes as previously described for light source 20 , light modulator 24 , and sensor module 30 of input device 10 in association with FIG. 1A .
- light modulator 94 is positioned transversely across a path of light emitted from light source 92 to produce a light contrast pattern (i.e., a pattern of varying light intensities) that is directly transmitted to sensor 96 . Relative motion between the light contrast pattern and sensor 96 is detected at sensor 96 .
- FIG. 1B illustrates different arrangements of enabling relative motion among light source 92 , light modulator 94 and sensor 96 to produce changing light patterns at sensor 96 to capture human control inputs.
- light source 92 and light modulator 94 are fixed relative to each other but movable as a single unit relative to a fixed sensor 96 .
- light source 92 is fixed relative to a fixed sensor 96
- light modulator 94 is movable relative to both light source 92 and sensor 96 .
- light modulator 94 and sensor 96 are fixed relative to each other and light source 92 is movable relative to light modulator 94 .
- other combinations are possible such as light source 92 and light modulator 94 being fixed relative to each other, but sensor 96 being movable relative to both light modulator 94 and light source 92 .
- each arrangement illustrated in FIG. 1B uses light transmitted through light modulator 94 to produce a changing pattern of light detected at sensor 96 over time, thereby capturing human control inputs associated with the motion of one or more of light source 92 , light modulator 94 , and sensor 96 relative to each other.
- FIG. 2 illustrates a method 100 of optical navigation using a pointing device, according to an embodiment of the invention.
- a path of light is directed from a light source through a generally transparent first surface that includes at least one opaque region.
- the at least one opaque region comprises a plurality of opaque portions spaced apart from each other over the generally transparent surface to form a known or predefined pattern.
- the plurality of opaque portions comprises a pseudo-random pattern with each of the opaque portions varying in size and shape, and with varying spacing between adjacent opaque portions.
- the first surface conversely comprises a generally opaque member including a plurality of transparent regions or portions arranged in a manner substantially the same as the previously described plurality of opaque portions (on the generally transparent member).
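One way to realize the pseudo-random pattern described above, with opaque portions of varying size and varying spacing between them, is sketched below. The rectangle primitive, size range, and rejection test are illustrative assumptions:

```python
import random

def pseudo_random_mask(width, height, n_features, seed=42):
    """Place n_features non-overlapping opaque rectangles of varying size
    at pseudo-random positions on a generally transparent field.
    Returns (x, y, w, h) tuples in mask coordinates."""
    rng = random.Random(seed)  # fixed seed: the pattern must be known/predefined
    features = []
    while len(features) < n_features:
        w, h = rng.randint(2, 6), rng.randint(2, 6)
        x, y = rng.randint(0, width - w), rng.randint(0, height - h)
        # Reject overlaps so adjacent features keep distinct, varying spacing
        if all(x + w <= fx or fx + fw <= x or y + h <= fy or fy + fh <= y
               for fx, fy, fw, fh in features):
            features.append((x, y, w, h))
    return features
```

Fixing the seed makes the pattern "known or predefined" as the text requires, while the varied sizes and spacings keep each local view of the mask distinctive for frame comparison.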
- the illuminated surface is moved relative to the sensor module in a generally lateral orientation.
- this lateral movement occurs in two dimensions that are generally horizontal relative to the sensor array.
- this relative movement of the illuminated surface relative to the sensor array is constrained to a single dimension that is generally horizontal relative to the sensor array.
- movement information is generated based on the relative movement between the first surface and the sensor module.
- illuminated light contrast patterns (e.g., light contrasting images) are received at the sensor array at differing positions as the first surface moves relative to the sensor array.
- Comparison of these differently positioned light contrast patterns enables determining a magnitude and direction of movement of the first surface to generate relative motion information for human control inputs.
- light contrast patterns received at the sensor array for comparison are images of an illuminated surface through which light was transmitted on its path to the sensor array.
- These transmissive images or transmissive light patterns (i.e., images or patterns created by transmission of light through a transparent member) differ from the reflected images used in conventional optical pointing devices, in which an image received at a sensor array is produced by reflection off of a surface.
- method 100 is performed using input device 10 as previously described and illustrated in association with FIG. 1A , as well as pointing devices 90 , 150 , 250 , or 275 , as described in association with FIGS. 1B, 4 , 5 A- 5 B, 6 A- 6 B, and 7 A- 7 B.
- FIG. 3 illustrates a pointing device 150 , according to an embodiment of the invention.
- pointing device 150 comprises puck 152 , support structure 160 , and sensor module 170 .
- puck 152 and sensor module 170 have substantially the same features and attributes as mobile housing 12 and sensor module 30 , as previously described in association with FIG. 1A .
- puck 152 comprises first portion 154 and second portion 156 with the second portion 156 extending generally outward, and generally perpendicular relative to first portion 154 .
- first portion 154 comprises a generally disc-shaped member while second portion 156 comprises a generally cylindrically shaped or tubular shaped member.
- first portion 154 and/or second portion 156 comprises light source 155 for illuminating first surface 158 (i.e., a navigation surface) of second portion 156 .
- First portion 154 of puck 152 also comprises upper surface 178 and a bottom surface 180 .
- first portion 154 comprises other shaped members, such as a generally rectangular shaped member while second portion 156 also comprises other shaped members such as a generally rectangular shaped member.
- light source 155 comprises a light emitting diode (LED).
- light source 155 alternatively comprises light conveyed from an external location, such as a remote LED, and fed via a light guide into puck 152 to act as light source 155 within the puck 152 .
- This latter embodiment is substantially the same as the embodiment later described in association with FIGS. 7A-7B .
- support structure 160 comprises top surface 161 , opening 162 , and bottom surface 163 .
- Opening 162 has a shape (e.g., generally circular) and a width that enables constrained movement of second portion 156 of puck 152 within opening 162 . Accordingly, second portion 156 is allowed to move freely in a generally unlimited manner, such as side-to-side motion, spiral motion, etc., within the boundary defined by the sidewalls of opening 162 of support structure 160 .
- opening 162 defines an elongate slot to enable slidable movement of second portion 156 of puck 152 along a single dimension relative to support structure 160 .
- top surface 161 of support structure 160 is arranged for sliding contact with bottom surface 180 of first portion 154 .
- First portion 154 is sized and shaped to enable a finger to engage upper surface 178 for laterally moving first portion 154 of puck 152 relative to support structure 160 . Movement of first portion 154 relative to support structure 160 causes movement of second portion 156 relative to sensor array 172 of sensor module 170 , thereby producing relative motion of a light contrast pattern (projected from second portion 156 ) and sensor array 172 to enable capturing user control inputs.
- FIG. 4 is a sectional view of FIG. 3 , as taken along lines 4 - 4 , according to an embodiment of the invention.
- pointing device 150 comprises the features and attributes illustrated in FIG. 3 , and further comprises retaining mechanism 190 for preventing second portion 156 of puck 152 from being lifted out of (or falling out of) opening 162 .
- retaining mechanism 190 comprises a generally disc shaped member or a pair of finger-like projections or wings 192 that loosely engage bottom surface 163 of support structure 160 .
- retaining mechanism 190 does not restrict or limit lateral movement of puck 152 relative to support structure 160 .
- retaining mechanism 190 permits lateral movement of puck 152 relative to support structure 160 but is also biased against support structure 160 to maintain puck 152 in a stationary position to prevent unintended lateral movement of puck 152 until or unless a force is applied by the user to move puck 152 laterally relative to support structure 160 .
- retaining mechanism 190 , when biased against support structure 160 , acts to maintain a constant height between first surface 158 and sensor array 172 .
- pointing device 150 includes other mechanisms, such as re-centering mechanisms, for controlling the motion of puck 152 relative to support structure 160 , as later described in association with FIGS. 7A-7B .
- controller module 194 is a simplified representation of sensor integrated circuit 60 ( FIG. 1A ) and is in electrical communication with light source 155 via lines 196 .
- control lines 196 extend into and within second portion 156 for connection with light source 155 within puck 152 .
- movement of first portion 154 of puck 152 causes movement of navigation surface 158 (as indicated by arrow M 2 ) relative to sensor array 172 to generate relative motion information to capture human control inputs.
- FIG. 5A is a block diagram illustrating a pointing device 250 , according to an embodiment of the invention.
- Pointing device 250 comprises substantially the same features and attributes as input device 10 , 90 and 150 , as previously described in association with FIGS. 1A-4 .
- pointing device 250 comprises mobile housing 252 and optical navigation sensor 254 with mobile housing 252 comprising light source 260 and light modulator 262 .
- light source 260 comprises a light emitting diode (LED).
- light source 260 comprises other light sources, such as those producing substantially coherent light or at least partially coherent light.
- light modulator 262 comprises a mask for modulating light emitted from light source 260 prior to reception at optical navigation sensor 254 .
- light modulator 262 comprises a shadow mask, aperture grill, or related structure including a transparent member with a predefined pattern of opaque portions to cause absorption, scattering, etc. of light, thereby serving to form an illumination pattern of light modulator 262 at navigation sensor 254 .
- FIG. 5B is a plan view of a mask 270 , according to an embodiment of the invention.
- mask 270 comprises a generally disc shaped member (having a generally circularly shaped appearance when seen in a plan view) that generally corresponds to a cross-sectional shape of mobile housing 252 .
- mask 270 comprises a generally rectangular shaped member or other shape suitable for serving as a transverse member to modulate a path of light.
- mask 270 is a generally transparent member 271 that includes a plurality of opaque portions 272 spaced apart from each other and arranged in a distinctive, fixed pattern.
- opaque portions 272 are arranged in a pattern generally corresponding to a pseudo-random pattern.
- the plurality of opaque portions vary in size and/or shape with the pattern defining varying spacing between adjacent opaque portions 272 .
- a surface area occupied by the plurality of opaque portions 272 is substantially larger than a surface area of a sensor array of navigation sensor 254 to thereby enable unique comparisons of light patterns at navigation sensor 254 as relative motion occurs between mask 270 and a sensor array of navigation sensor 254 .
- the unique comparisons are based on variations in the size, shape, and position of the opaque portions 272 .
- mask 270 is a generally opaque member 271 that includes a plurality of transparent portions 272 .
- mask 270 is a semi-transparent member 271 that includes a plurality of opaque portions 272 .
- the pattern on mask 270 is arranged to enable comparison of a sequence of images or light patterns of the illuminated mask 270 .
- Each image or different light pattern in the comparison generally corresponds to a different position of mask 270 relative to optical navigation sensor 254 as mask 270 (as part of housing 252 ) is moved relative to optical navigation sensor 254 .
- Comparison of these differing images or differing light patterns enables determining a speed and/or direction of motion of housing 252 and enables movement information to be generated to capture user control inputs associated with optical navigation sensor 254 .
- FIG. 5C is a plan view of a mask 273 , according to an embodiment of the invention.
- mask 273 comprises a generally disc shaped member (having a generally circularly shaped appearance when seen in a plan view) that generally corresponds to a cross-sectional shape of mobile housing 252 .
- Mask 273 is a generally transparent member 271 that includes a single centrally located opaque portion 274 .
- Opaque portion 274 includes a generally circular shape.
- opaque portion 274 comprises any one of a generally rectangular shape, a generally triangular shape, or other shape suitable for direct tracking by optical navigation sensor 254 .
- mask 273 comprises a generally rectangular shaped member or other shaped member.
- mask 273 comprises a generally opaque portion 271 with a centrally located transparent portion 274 .
- a surface area of opaque portion 274 is substantially smaller than a surface area of a sensor array of navigation sensor 254 to thereby enable direct tracking of the speed and/or direction of opaque portion 274 at navigation sensor 254 to capture user control inputs as relative motion occurs between mask 273 and a sensor array of navigation sensor 254 .
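Direct tracking of this kind can be sketched as follows (illustrative code only; names are hypothetical): because the single opaque portion casts a shadow smaller than the sensor array, its centroid can be located in each frame and followed from frame to frame without comparing whole images.

```python
def centroid(frame, threshold=0.5):
    """Mean coordinates of all pixels darker than the threshold --
    the location of the shadow cast by the single opaque portion."""
    dark = [(x, y) for y, row in enumerate(frame)
                   for x, v in enumerate(row) if v < threshold]
    if not dark:
        return None  # no shadow in view
    n = len(dark)
    return (sum(x for x, _ in dark) / n, sum(y for _, y in dark) / n)

def track(prev_frame, next_frame):
    """Displacement of the shadow between two frames: the user input."""
    x0, y0 = centroid(prev_frame)
    x1, y1 = centroid(next_frame)
    return (x1 - x0, y1 - y0)
```

Since the shadow position is absolute within the sensor array, this mode also recovers absolute position, not just displacement.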
- FIG. 6A is a block diagram illustrating a pointing device 275 , according to an embodiment of the invention.
- Pointing device 275 comprises substantially the same features and attributes as input device 10 , 90 and 150 , as previously described in association with FIGS. 1A-4 .
- pointing device 275 comprises mobile housing 277 and optical navigation sensor 279 .
- mobile housing 277 comprises light source 280 , light pipe 282 , diffuser 284 , and light modulator 285 .
- light source 280 is a light emitting diode (LED) with light emitted from the LED traveling through light pipe 282 and diffuser 284 to and through light modulator 285 .
- light modulator 285 comprises a grid or other mechanism for modulating light emitted from light source 280 (and which travels through light pipe 282 and diffuser 284 ) prior to reception at photocell array 279 .
- photocell array 279 comprises photocell A and photocell B arranged in a generally side-by-side manner.
- Light modulator 285 is configured so that, when illuminated via backlighting relative to photocell sensor array 279 , a phase-shift detection algorithm or quadrature-type detection algorithm can be used to detect motion of light modulator 285 and generate relative motion information for capturing user control inputs.
- pointing device 275 comprises an external portion 287 , external to and/or an extension of mobile housing 277 , at which light source 280 is positioned outside of mobile housing 277 .
- light pipe 282 extends outside of mobile housing 277 to receive light from light source 280 , so that light source 280 need not be contained within mobile housing 277 .
- light pipe 282 comprises a length that enables light to travel from light source 280 positioned outside mobile housing 277 into mobile housing 277 and then into diffuser 284 and through light modulator 285 .
- FIG. 6B comprises a plan view of a grid 290 as one embodiment of light modulator 285 , according to an embodiment of the invention.
- grid 290 comprises an elongate, generally rectangular shaped member, with only a portion of grid 290 shown for illustrative purposes.
- Grid 290 is a generally transparent member including an array of modules 295 arranged side-by-side in series, which further define channels A and B.
- Channel A is defined by a plurality of marked portions 292 A and 292 B which are arranged in series, with at least one unmarked portion 296 disposed between adjacent marked portions 292 A and 292 B.
- Channel B is defined by a plurality of marked portions 294 A and 294 B arranged in series, with at least one unmarked portion 296 interposed between adjacent marked portions 294 A and 294 B.
- Marked portion 292 A of channel A has a position that overlaps marked portion 294 A of channel B.
- marked portion 292 B of channel A has a position generally overlapping marked portion 294 B of channel B.
- grid 290 is arranged as a light modulator such as light modulator 285 shown in FIG. 6A as positioned over sensor array 279 .
- grid 290 is positioned so that channel A of grid 290 (including marked portions 292 A, 292 B) is positioned over photocell A and channel B of grid 290 (including marked portions 294 A, 294 B) is positioned over photocell B.
- Comparing images detected at photocell A relative to channel A with images detected at photocell B relative to channel B, as grid 290 moves along a single direction of motion, enables generating movement information based on movement of mobile housing 277 relative to photocell array 279 .
- Movement information is generated via comparison of images via a phase-shift detection algorithm or quadrature-type detection algorithm.
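The two-channel arrangement lends itself to classic quadrature decoding, sketched below (an illustrative aid; the patent names the technique but gives no implementation). Binarized outputs of photocells A and B form a two-bit state, and the order in which the states change gives the direction of motion:

```python
# One electrical cycle of (A, B) states for motion in the forward direction.
_FORWARD = [(0, 0), (1, 0), (1, 1), (0, 1)]

def decode_step(prev, curr):
    """+1 or -1 for a valid single transition, 0 for no change or an
    invalid (skipped) transition."""
    if prev == curr:
        return 0
    i = _FORWARD.index(prev)
    if _FORWARD[(i + 1) % 4] == curr:
        return +1
    if _FORWARD[(i - 1) % 4] == curr:
        return -1
    return 0

def decode(samples):
    """Net displacement accumulated from a stream of (A, B) samples."""
    return sum(decode_step(p, c) for p, c in zip(samples, samples[1:]))
```

Because the two channels are 90 degrees out of phase, each state transition unambiguously encodes one step and its direction, which is the essence of quadrature-type detection.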
- FIG. 7A is a diagram illustrating a top view of a pointing device 300 , according to another embodiment of the present invention.
- FIG. 7B is a diagram illustrating a cross-sectional view along section line 7 B- 7 B of the pointing device 300 shown in FIG. 7A according to one embodiment of the present invention.
- pointing device 300 is configured in substantially the same manner as pointing device 150 ( FIGS. 3-4 ), but puck 152 of pointing device 300 does not include a light source 155 contained directly within puck housing 154 . Rather, pointing device 300 includes at least one of four light sources 302 A- 302 D positioned about a periphery of the field of motion 319 .
- Each of the light sources 302 A- 302 D is positioned adjacent to one of the springs 313 .
- springs 313 are formed from plastic, and are configured as light pipes to guide visible light from light sources 302 A- 302 D to puck 152 .
- fiber optic lines are attached to or inserted through springs 313 , to guide light from light sources 302 A- 302 D to puck 152 .
- light sources 302 A- 302 D are LEDs. Accordingly, springs 313 act to bring light from a remote source to puck 152 for transmission through light modulator 158 (shown in FIG. 7B ) for reception at sensor module 170 (which is not seen in FIG. 7A , but shown in FIG. 7B ) to enable image comparisons to generate relative motion information to capture user control inputs.
- springs 313 also act as a re-centering mechanism to further constrain motion of puck 152 to a field of motion.
- when the user moves puck 152 (using finger 316 ) laterally outward from a center region of slide surface 153 (e.g., top surface 161 in FIGS. 3-4 ) and then releases puck 152 by removing finger 316 from puck 152 , puck 152 is returned to its centered position by springs 313 , which connect puck 152 to side 314 of the puck field of motion 319 .
- user control inputs associated with puck 152 are deactivated when puck 152 is released for automatic re-centering. This aspect mimics the action of a conventional mouse when lifting the mouse from an edge of a navigation surface (e.g. a mouse pad) and replacing the mouse at the center of the navigation surface.
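The mouse-lift analogy can be sketched as a small state machine (hypothetical code; the patent describes the behavior, not an implementation): while the puck is released, motion reports are suppressed, so the spring-driven return to center does not move the screen pointer.

```python
class PuckReporter:
    """Gates motion reports so re-centering does not register as input."""

    def __init__(self):
        self.engaged = False  # is a finger on the puck?

    def touch(self):
        self.engaged = True

    def release(self):
        self.engaged = False  # springs now re-center the puck

    def report(self, dx, dy):
        """Forward motion only while the user is holding the puck."""
        return (dx, dy) if self.engaged else (0, 0)
```

The same gating gives the device an effectively unlimited navigation range despite its small physical field of motion, since each release-and-recenter cycle resets the puck without moving the pointer.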
- springs 313 do not act as light pipes to convey light into puck housing 154 but act only as a re-centering mechanism to re-center puck 152 within puck field of motion 319 .
- puck 152 includes its own light source contained within puck housing 154 .
- Embodiments of the invention are directed to an input device that uses an illuminated light modulator to serve as a navigation surface to enable an optical navigation sensor to base its movement calculations on transmissive images, i.e., light contrast patterns created by transmission of light through the navigation surface.
- the navigation surface is conveniently arranged in direct proximity to the sensor array with no intervening structures therebetween and with little risk for contamination from ambient sources.
- the navigation surface is arranged in a puck that is slidably movable over a support structure (e.g., slide pad) that also encloses the sensor array and the navigation surface from the ambient environment. Movement information for the input device is based on relative movement between the illuminated navigation surface and a sensor array. In this manner, the nature and type of the navigation surface is tightly controlled to produce highly accurate movement information for capturing user control inputs.
- optical pointing devices 10 , 90 , 150 , 250 , 275 , and 300 may be implemented in hardware, software, firmware, or any combination thereof.
- the implementation may be via a microprocessor, a programmable logic device, a state machine, or combinations thereof.
- Components of the present invention may reside in software on one or more computer-readable media.
- the term computer-readable medium as used herein is defined to include any kind of memory, volatile or non-volatile, such as floppy disks, hard disks, CD-ROMs, flash memory, read-only memory (ROM), and random access memory.
Abstract
Description
- The use of a hand operated pointing device with a computer and its display has become almost universal. One form of the various types of pointing devices is the optical pointing device. In one form of an optical pointing device, a light source within the optical pointing device illuminates a navigation surface, such as a finger or a desktop, that is external to the pointing device. Light is reflected off the navigation surface and onto an image sensor within the optical pointing device for detection as relative movement information as the pointing device is moved relative to the navigation surface. This information is used to direct a corresponding movement of a screen pointer.
- Conventional imaging surfaces, such as fingertips, desktops, mouse pads, etc. typically have some noise associated with them due to dirt, surface irregularities, etc. that can interfere with generating accurate movement information. While a pointing device can control the light source and the sensor package, the quality and type of navigation surface is typically chosen by the user of the pointing device. Despite many approaches to compensating for varying types of navigation surfaces (e.g., highly reflective, highly absorbing, dirty, etc.), the quality of the navigation surface still affects performance of the optical pointing devices.
- Conventional pointing devices are often too large for smaller applications, such as personal digital assistants, mobile phones, handheld computers, portable audio players, etc. In some instances, a conventional pointing device is poorly adapted to execute new functions of evolving electronic devices, as well as to accommodate the widely varying shapes and configurations of this new class of electronic devices.
- Accordingly, manufacturers and designers of electronic devices still face challenges in reducing the size of pointing devices while enhancing the accuracy, adaptability, and effectiveness of those pointing devices.
- Embodiments of the invention are directed to a light modulating input device for capturing user control inputs. In one embodiment, an input device comprises a light source, a light modulator, and a sensor. The light source is configured to emit a path of light. The light modulator is arranged relative to the light source, transversely across the path of light, to produce a light contrast pattern from the light modulator. The sensor is configured to detect relative motion between the light contrast pattern and the sensor.
- FIG. 1A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 1B is a block diagram illustrating an optical input device, according to an embodiment of the present invention.
- FIG. 2 is a flow diagram illustrating a method for generating movement data, according to one embodiment of the present invention.
- FIG. 3 is a perspective view of an input device, according to one embodiment of the present invention.
- FIG. 4 is a sectional view of the input device of FIG. 3, as taken along lines 4-4, according to one embodiment of the present invention.
- FIG. 5A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 5B is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 5C is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 6A is a block diagram illustrating major components of an optical input device, according to one embodiment of the present invention.
- FIG. 6B is a plan view illustrating a light modulator, according to one embodiment of the present invention.
- FIG. 7A is a top plan view illustrating an optical input device, according to an embodiment of the present invention.
- FIG. 7B is a sectional view of the input device of FIG. 7A, as taken along lines 7B-7B, according to one embodiment of the present invention.
- In the following Detailed Description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. In this regard, directional terminology, such as “top,” “bottom,” “front,” “back,” “leading,” “trailing,” etc., is used with reference to the orientation of the Figure(s) being described. Because components of embodiments of the present invention can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration and is in no way limiting. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present invention. The following Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
- Embodiments of the invention are directed to an input device comprising a light source for illuminating a light modulator in which motion of the illuminated light modulator relative to a sensor is detected at the sensor to capture human control inputs associated with the relative motion. In one embodiment, the light modulator is interposed between the light source and the sensor to introduce a predefined pattern in the light as it passes through the light modulator on its way to the sensor. The light modulator comprises a diffraction element, an interference element, or element producing a pattern of varying light intensities. As a result, a unique pattern of differing light intensities is received at the sensor in the form of lines, speckles, or singular dots, etc.
- In one aspect, the light modulator comprises a navigation surface that is illuminated (e.g., backlit) by the light source with the sensor receiving light patterns transmitted from the backlit navigation surface. Different positions of the light modulator relative to the sensor cause differing light patterns to be received at the sensor that are compared over time to capture human control inputs. Light patterns received at the sensor for comparison are produced via a light modulator, which comprises an at least partially opaque pattern of a lens or generally transparent surface through which the light is transmitted. These transmissive light patterns are different than reflected images used in conventional optical pointing devices in which an image received at a sensor array is produced by light reflected and/or scattered off of a navigation surface, such as a mouse pad.
- In one embodiment, the light source and the light modulator are fixed relative to each other to operate as a single unit that projects a light contrast pattern of varying light intensities onto a sensor. Accordingly, in this embodiment, the light source and the light modulator function as a contrast pattern projecting mechanism arranged to directly transmit a light contrast pattern onto a sensor that is positioned generally perpendicular to the transmitted light path from the projecting mechanism. Movement of the projecting mechanism, and therefore movement of the projected contrast pattern, occurs in a plane that is generally parallel to the sensor and generally perpendicular to the light transmitted from the projecting mechanism.
- In one embodiment, the light source and light modulator are embodied in a mobile housing which is configured for lateral movement relative to a sensor array. In one aspect, the mobile housing comprises a puck that is slidably movable relative to a support structure with the support structure also suspending the light modulator of the puck in a spaced relationship relative to the sensor array to enable light to be transmitted through the light modulator for reception at the sensor array. In one embodiment, a constraining mechanism limits lateral movement of the mobile housing within a predefined field of motion relative to the sensor array.
- Embodiments of the invention are directed to generating relative motion information to capture human control inputs. These inputs are used for controlling a screen pointer, selecting or manipulating symbols visible on a display device, video game controllers, for direct control of a mechanical or electrical system such as speed and turn inputs for controlling an automobile or toy vehicle, or any other application where one-dimensional or two-dimensional control inputs are useful. Additional examples include portable electronic devices, such as mobile phones, portable audio devices, personal digital assistants, electronic cameras, etc.
- These embodiments, and others, are described and illustrated in greater detail in association with FIGS. 1-7B. -
FIG. 1A is a block diagram illustrating major components of an optical input device 10 according to one embodiment of the present invention. As shown in FIG. 1A, optical input device 10 comprises mobile housing 12 and sensor module 30 including photodetector array 32. Mobile housing 12 comprises light source 20 and light modulator 24 with first surface 25. In one embodiment, first surface 25 defines a navigation surface.
- In one embodiment, optical input device 10 also comprises lens 50 disposed between first surface 25 of mobile housing 12 and sensor module 30, serving to image first surface 25 upon photodetector array 32. This embodiment is employed when there is a relatively large distance (H1) between first surface 25 and photodetector array 32 of sensor module 30. In another embodiment, lens 50 is omitted when the height (H1) between first surface 25 and photodetector array 32 of sensor module 30 is relatively small or negligible. This small height enables first surface 25 to be imaged at photodetector array 32 at an adequate resolution without an intervening lens.
- In one embodiment, input device 10 is in electrical communication with and supports operation of display 12 to serve as a pointing device for controlling screen pointer 14 on display 12. In another embodiment, input device 10 is not restricted to controlling a screen pointer, and is used more generally to capture user inputs for a wide variety of devices and applications as previously described.
- As shown in FIG. 1A, in one embodiment mobile housing 12 is configured for lateral movement (as indicated by directional arrow M) between boundaries. First surface 25 of mobile housing 12 is maintained in a generally parallel position relative to photodetector array 32 of sensor module 30. In one embodiment, the boundaries define a width (W1) within which housing 12 is permitted to move laterally relative to sensor module 30. In one aspect, a width of photodetector array 32 generally corresponds to the width W1 between the boundaries of housing 12.
- In one embodiment, light modulator 24 comprises an at least partially transparent member (e.g., a surface, lens, or other member) that modulates or alters light from light source 20 that is transmitted through light modulator 24 to produce an image suitable for detection at photodetector array 32. In one aspect, light modulator 24 comprises a distinctive opaque pattern on a generally transparent member (or conversely, a transparent pattern on a generally opaque member) suitable for comparison with other like images received at photodetector array 32. Additional aspects of contrast patterns (i.e., light patterns having light and dark portions of varying intensity) produced via light modulator 24 are described and illustrated in greater detail in association with FIGS. 2-6B.
- In one aspect, because first surface 25 of housing 12 effectively acts as a navigation surface, movement information generated via photodetector array 32 based on first surface 25 is highly accurate because the features of this navigation surface are known and relatively stable. In one embodiment, first surface 25 is generally excluded from dust, markings, etc. that otherwise can cause noise or bad pixels in digital images corresponding to the navigation surface. Accordingly, embodiments of the invention enable control over the type and quality of the navigation surface as well as protection of the navigation surface based on its general exclusion from ambient conditions (external to input device 10), as will be further illustrated in association with FIGS. 3-4.
- In one embodiment, sensor module 30 forms a portion of optical navigation sensor integrated circuit (IC) 60. As shown in FIG. 1A, optical navigation sensor 60 includes digital input/output circuitry 66, navigation processor 68, analog-to-digital converter (ADC) 72, photodetector array 32 (of sensor module 30), and light source driver circuit 86. In one embodiment, sensor 60 comprises an electronic image sensor, including but not limited to a charge coupled device (CCD) or a complementary metal oxide semiconductor active pixel sensor (CMOS-APS). Both types of sensors contain an array of photodetectors (e.g., pixels) arranged in a pattern.
- In operation, according to one embodiment, light source 20 emits light (A) through light modulator 24 (which also defines navigation surface 25), and illuminated images (B) are generated. In one embodiment, light source 20 is a light emitting diode. In one embodiment, light source 20 is a coherent light source or an at least partially coherent light source. In one embodiment, light source 20 is a laser. In one form of the invention, light source 20 is a vertical cavity surface emitting laser (VCSEL) diode. In another form of the invention, light source 20 is an edge emitting laser diode. Light source 20 is controlled by driver circuit 86, which is controlled by navigation processor 68 via control line 70. In one embodiment, control line 70 is used by navigation processor 68 to cause driver circuit 86 to be powered on and off, and correspondingly cause light source 20 to be powered on and off.
- Images (B) emanating from first surface 25 of mobile housing 12 are projected directly onto photodetector array 32 without interaction (e.g., reflection) with any intervening structures. Each photodetector in photodetector array 32 provides a signal that varies in magnitude based upon the intensity of light incident on the photodetector. The signals from photodetector array 32 are output to analog-to-digital converter (ADC) 72, which converts the signals into digital values of a suitable resolution (e.g., six bits). The digital values provide a digital representation of the illuminated pattern on photodetector array 32 produced via transmission of light through navigation surface 25, which acts as a light modulator. The digital values generated by analog-to-digital converter 72 are output to navigation processor 68. The digital values received by navigation processor 68 are stored as a frame within memory 69. Different frames of light patterns detected at photodetector array 32 are compared over time. In one aspect, successive frames are compared, while in other aspects, non-sequential frames are compared.
- In another aspect, as shown in FIG. 1A, light modulator 24 has a width (W2) and photodetector array 32 has a width (W3). In one aspect, the width of each of light modulator 24 and photodetector array 32 inferentially corresponds to a surface area of the respective light modulator 24 and sensor array. Accordingly, in one embodiment, light modulator 24 has a surface area (inferentially represented by width W2) that is substantially larger than a surface area of photodetector array 32 (inferentially represented by width W3). In this embodiment, light modulator 24 comprises a plurality of opaque features spaced apart on a generally transparent member so that as light modulator 24 is moved relative to photodetector array 32, the light contrast pattern detected at photodetector array 32 generally corresponds to a unique position of light modulator 24 relative to photodetector array 32. Frames of newly detected light contrast patterns are compared with reference frames of detected light contrast patterns to thereby determine a direction, amount, and speed of relative motion of the light contrast pattern relative to photodetector array 32.
- In another embodiment, light modulator 24 has a surface area (inferentially represented by width W2) that is substantially less than a surface area of photodetector array 32 (inferentially represented by width W3). In this embodiment, light modulator 24 comprises a single opaque feature centrally arranged on a generally transparent member so that as light modulator 24 is moved relative to photodetector array 32, the single opaque feature generally corresponds to a unique absolute position of light modulator 24 relative to photodetector array 32. The single opaque feature enables direct tracking of the absolute position of the opaque feature without using frame comparison techniques. This direct tracking mechanism indicates a direction and magnitude of relative motion of the light contrast pattern relative to photodetector array 32.
- In one embodiment, relative movement between light modulator 24 and sensor module 30 occurs in a generally singular plane along two dimensions. In another embodiment, relative movement between light modulator 24 and sensor module 30 also occurs along a generally singular plane but is restricted to movement along a single dimension.
- FIG. 1B is a block diagram illustrating an input device 90, according to an embodiment of the invention. As shown in FIG. 1B, input device 90 comprises light source 92, light modulator 94, and sensor 96, which have substantially the same features and attributes as previously described for light source 20, light modulator 24, and sensor module 30 of input device 10 in association with FIG. 1A.
- As shown in FIG. 1B, light modulator 94 is positioned transversely across a path of light emitted from light source 92 to produce a light contrast pattern (i.e., a pattern of varying light intensities) that is directly transmitted to sensor 96. Relative motion between the light contrast pattern and sensor 96 is detected at sensor 96. FIG. 1B illustrates different arrangements for enabling relative motion among light source 92, light modulator 94, and sensor 96 to produce changing light patterns at sensor 96 to capture human control inputs.
- In one embodiment, as already represented in FIG. 1A, light source 92 and light modulator 94 are fixed relative to each other but movable as a single unit relative to a fixed sensor 96. In another embodiment, light source 92 is fixed relative to a fixed sensor 96, and light modulator 94 is movable relative to both light source 92 and sensor 96. In another embodiment, light modulator 94 and sensor 96 are fixed relative to each other and light source 92 is movable relative to light modulator 94. In addition, other combinations are possible, such as light source 92 and light modulator 94 being fixed relative to each other but sensor 96 being movable relative to both light modulator 94 and light source 92. Each of these different embodiments represented by FIG. 1B uses light transmitted through light modulator 94 to produce a changing pattern of light detected at sensor 96 over time to thereby capture human control inputs associated with the motion of one or more of light source 92, light modulator 94, and sensor 96 relative to each other.
- Different aspects of these embodiments of input device 90 are described and illustrated in further detail in association with FIGS. 1 and 2-7B. -
FIG. 2 illustrates a method 100 of optical navigation using a pointing device, according to an embodiment of the invention. As shown in FIG. 2, at 102, a path of light is directed from a light source through a generally transparent first surface that includes at least one opaque region. In one aspect, the at least one opaque region comprises a plurality of opaque portions spaced apart from each other over the generally transparent surface to form a known or predefined pattern. In one aspect, the plurality of opaque portions comprises a pseudo-random pattern, with each of the opaque portions varying in size and shape and with varying spacing between adjacent opaque portions. In another aspect, the first surface conversely comprises a generally opaque member including a plurality of transparent regions or portions arranged in a manner substantially the same as the previously described plurality of opaque portions (on the generally transparent member).
- At 104, the illuminated surface is moved relative to the sensor module in a generally lateral orientation. In one aspect, this lateral movement occurs in two dimensions that are generally horizontal relative to the sensor array. In another aspect, this relative movement of the illuminated surface relative to the sensor array is constrained to a single dimension that is generally horizontal relative to the sensor array.
- At 106, movement information is generated based on the relative movement between the first surface and the sensor module. In one aspect, although the pattern of opaque portions (or, conversely, transparent portions) is fixed on the first surface, illuminated light contrast patterns (e.g., light contrasting images) received at the sensor array are differentiated by their relative position as detected on the sensor array as the first surface is moved relative to the sensor array. Comparison of these differently positioned light contrast patterns (relative to the sensor array) enables determining a magnitude and direction of movement of the first surface to generate relative motion information for human control inputs.
- In one aspect, light contrast patterns received at the sensor array for comparison are images of an illuminated surface through which light was transmitted on its path to the sensor array. These transmissive images or transmissive light patterns (i.e., images or patterns created by transmission of light through a transparent member) are different than reflected images used in conventional optical pointing devices, in which an image received at a sensor array is produced by reflection off of a surface.
- In one embodiment,
method 100 is performed using input device 10 as previously described and illustrated in association with FIG. 1A, as well as the pointing devices previously described and illustrated in association with FIGS. 1B, 4, 5A-5B, 6A-6B, and 7A-7B. -
FIG. 3 illustrates apointing device 150, according to an embodiment of the invention. As shown inFIG. 3 ,pointing device 150 comprisespuck 152,support structure 160, andsensor module 170. In one embodiment,puck 152 andsensor module 170 have substantially the same features and attributes asmobile housing 12 andsensor module 30, as previously described in association withFIG. 1A . - In one embodiment,
puck 152 comprisesfirst portion 154 andsecond portion 156 with thesecond portion 156 extending generally outward, and generally perpendicular relative tofirst portion 154. In one embodiment,first portion 154 comprises a generally disc-shaped member whilesecond portion 156 comprises a generally cylindrically shaped or tubular shaped member. In one aspect,first portion 154 and/orsecond portion 156 compriseslight source 155 for illuminating first surface 158 (i.e., a navigation surface) ofsecond portion 156.First portion 154 ofpuck 152 also comprisesupper surface 178 and abottom surface 180. - In another embodiment,
first portion 154 comprises another shaped member, such as a generally rectangular member, while second portion 156 likewise comprises another shaped member, such as a generally rectangular member. - In one embodiment,
light source 155 comprises a light emitting diode (LED). In another embodiment, light source 155 comprises light that is conveyed to puck 152 from an external location, such as a remote LED, and then fed via a light guide into puck 152 to produce light source 155 within puck 152. This latter embodiment is substantially the same as the embodiment later described in association with FIGS. 7A-7B. - As shown in
FIG. 3, support structure 160 comprises top surface 161, opening 162, and bottom surface 163. Opening 162 has a shape (e.g., generally circular) and a width that enables constrained movement of second portion 156 of puck 152 within opening 162. Accordingly, second portion 156 is allowed to move freely in a generally unlimited manner, such as side-to-side motion, spiral motion, etc., within the boundary defined by the sidewalls of opening 162 of support structure 160. In another embodiment, opening 162 defines an elongate slot to enable slidable movement of second portion 156 of puck 152 along a single dimension relative to support structure 160. - As shown in
FIG. 3, top surface 161 of support structure 160 is arranged for sliding contact with bottom surface 180 of first portion 154. First portion 154 is sized and shaped to enable a finger to engage upper surface 178 for laterally moving first portion 154 of puck 152 relative to support structure 160. Movement of first portion 154 relative to support structure 160 causes movement of second portion 156 relative to sensor array 172 of sensor module 170, thereby producing relative motion between a light contrast pattern (projected from second portion 156) and sensor array 172 to enable capturing user control inputs. -
FIG. 4 is a sectional view of FIG. 3, taken along lines 4-4, according to an embodiment of the invention. As shown in FIG. 4, pointing device 150 comprises the features and attributes illustrated in FIG. 3, and further comprises retaining mechanism 190 for preventing second portion 156 of puck 152 from being lifted out of (or falling out of) opening 162. In one embodiment, retaining mechanism 190 comprises a generally disc-shaped member or a pair of finger-like projections or wings 192 that loosely engage bottom surface 163 of support structure 160. In one aspect, retaining mechanism 190 does not restrict or limit lateral movement of puck 152 relative to support structure 160. - In another embodiment, retaining
mechanism 190 permits lateral movement of puck 152 relative to support structure 160 but is also biased against support structure 160 to maintain puck 152 in a stationary position, preventing unintended lateral movement of puck 152 unless a force is applied by the user to move puck 152 laterally relative to support structure 160. In one aspect, retaining mechanism 190, when biased against support structure 160, acts to maintain a constant height between first surface 158 and sensor array 172. - In other embodiments, pointing
device 150 includes other mechanisms, such as re-centering mechanisms, for controlling the motion of puck 152 relative to support structure 160, as later described in association with FIGS. 7A-7B. - As shown in
FIG. 4, in one embodiment, controller module 194 is a simplified representation of sensor integrated circuit 60 (FIG. 1) and is in electrical communication with light source 155 via lines 196. In one aspect, control lines 196 extend into and within second portion 156 for connection with light source 155 within puck 152. - Accordingly, movement of
first portion 154 of puck 152 (as indicated by arrow M1) causes movement of navigation surface 158 (as indicated by arrow M2) relative to sensor array 172 to generate relative motion information to capture human control inputs. -
FIG. 5A is a block diagram illustrating a pointing device 250, according to an embodiment of the invention. Pointing device 250 comprises substantially the same features and attributes as the input devices of FIGS. 1A-4. As shown in FIG. 5A, pointing device 250 comprises mobile housing 252 and optical navigation sensor 254, with mobile housing 252 comprising light source 260 and light modulator 262. In one embodiment, light source 260 comprises a light emitting diode (LED). In another embodiment, light source 260 comprises another light source, such as one producing substantially coherent or at least partially coherent light. - In one embodiment,
light modulator 262 comprises a mask for modulating light emitted from light source 260 prior to reception at optical navigation sensor 254. In one aspect, light modulator 262 comprises a shadow mask, aperture grille, or related structure including a transparent member with a predefined pattern of opaque portions to cause absorption, scattering, etc. of light, thereby forming an illumination pattern of light modulator 262 at navigation sensor 254. -
FIG. 5B is a plan view of a mask 270, according to an embodiment of the invention. As shown in FIG. 5B, mask 270 comprises a generally disc-shaped member (having a generally circular appearance when seen in plan view) that generally corresponds to a cross-sectional shape of mobile housing 252. In other embodiments, mask 270 comprises a generally rectangular member or another shape suitable for serving as a transverse member to modulate a path of light. - In one embodiment,
mask 270 is a generally transparent member 271 that includes a plurality of opaque portions 272 spaced apart from each other and arranged in a distinctive, fixed pattern. In one embodiment, opaque portions 272 are arranged in a pattern generally corresponding to a pseudo-random pattern. In one aspect, the plurality of opaque portions vary in size and/or shape, with the pattern defining varying spacing between adjacent opaque portions 272. In one aspect, a surface area occupied by the plurality of opaque portions 272 (including the spacing between opaque portions 272) is substantially larger than a surface area of a sensor array of navigation sensor 254, thereby enabling unique comparisons of light patterns at navigation sensor 254 as relative motion occurs between mask 270 and a sensor array of navigation sensor 254. The unique comparisons are based on variations in the size, shape, and position of the opaque portions 272. - In another embodiment,
mask 270 is a generally opaque member 271 that includes a plurality of transparent portions 272. In another embodiment, mask 270 is a semi-transparent member 271 that includes a plurality of opaque portions 272. - As previously described, the pattern on
mask 270 is arranged to enable comparison of a sequence of images or light patterns of the illuminated mask 270. Each image or different light pattern in the comparison generally corresponds to a different position of mask 270 relative to optical navigation sensor 254 as mask 270 (as part of housing 252) is moved relative to optical navigation sensor 254. Comparison of these differing images or differing light patterns enables determining a speed and/or direction of motion of housing 252 and enables movement information to be generated to capture user control inputs associated with optical navigation sensor 254. -
FIG. 5C is a plan view of a mask 273, according to an embodiment of the invention. As shown in FIG. 5C, mask 273 comprises a generally disc-shaped member (having a generally circular appearance when seen in plan view) that generally corresponds to a cross-sectional shape of mobile housing 252. Mask 273 is a generally transparent member 271 that includes a single centrally located opaque portion 274. Opaque portion 274 has a generally circular shape. In other embodiments, opaque portion 274 comprises any one of a generally rectangular shape, a generally triangular shape, or another shape suitable for direct tracking by optical navigation sensor 254. In other embodiments, mask 273 comprises a generally rectangular member or other shaped member. In another embodiment, mask 273 comprises a generally opaque member 271 with a centrally located transparent portion 274. - In one aspect, a surface area of
opaque portion 274 is substantially smaller than a surface area of a sensor array of navigation sensor 254, thereby enabling direct tracking of the speed and/or direction of opaque portion 274 at navigation sensor 254 to capture user control inputs as relative motion occurs between mask 273 and a sensor array of navigation sensor 254. -
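The direct tracking described above can be sketched as follows (illustrative code only; the function names, threshold, and frame format are assumptions, not part of the patent): because the shadow of the single opaque portion occupies a small area of the sensor array, its position can be estimated as the centroid of the dark pixels, and speed and direction follow from the change in centroid between successive frames.

```python
def spot_centroid(frame, threshold=0.5):
    """Return the (row, col) centroid of pixels darker than `threshold`.

    The shadow cast by the single opaque portion appears as a cluster of
    low-intensity pixels on the sensor array; the centroid of that
    cluster approximates the spot position.
    """
    ys = xs = count = 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value < threshold:
                ys += y
                xs += x
                count += 1
    if count == 0:
        raise ValueError("no pixels below threshold")
    return ys / count, xs / count

def spot_velocity(prev_frame, next_frame):
    """Per-frame (d_row, d_col) motion of the tracked spot."""
    (y0, x0), (y1, x1) = spot_centroid(prev_frame), spot_centroid(next_frame)
    return y1 - y0, x1 - x0
```

This directness is what the small opaque portion buys: no pattern comparison is needed, since a single feature can be located and followed frame to frame.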
FIG. 6A is a block diagram illustrating a pointing device 275, according to an embodiment of the invention. Pointing device 275 comprises substantially the same features and attributes as the input devices of FIGS. 1A-4. As shown in FIG. 6A, pointing device 275 comprises mobile housing 277 and optical navigation sensor 279. In one embodiment, mobile housing 277 comprises light source 280, light pipe 282, diffuser 284, and light modulator 285. In one aspect, light source 280 is a light emitting diode (LED), with light emitted from the LED traveling through light pipe 282 and diffuser 284 to and through light modulator 285. - In one embodiment,
light modulator 285 comprises a grid or other mechanism for modulating light emitted from light source 280 (and which travels through light pipe 282 and diffuser 284) prior to reception at photocell array 279. In one aspect, photocell array 279 comprises photocell A and photocell B arranged in a generally side-by-side manner. Light modulator 285 is configured so that, when illuminated via backlighting relative to photocell array 279, a phase-shift detection algorithm or quadrature-type detection algorithm can be used to detect motion of light modulator 285 and generate relative motion information for capturing user control inputs. - In one embodiment, pointing device 275 comprises an
external portion 287 that is external to and/or an extension of mobile housing 277, at which light source 280 is positioned outside of mobile housing 277. In this embodiment, light pipe 282 extends outside of mobile housing 277 to receive light from light source 280, so that light source 280 need not be contained within mobile housing 277. In one aspect, light pipe 282 comprises a length that enables light to travel from light source 280 positioned outside mobile housing 277 into mobile housing 277 and then into diffuser 284 and through light modulator 285. -
FIG. 6B is a plan view of a grid 290 as one embodiment of light modulator 285, according to an embodiment of the invention. As shown in FIG. 6B, grid 290 comprises an elongate, generally rectangular member, with only a portion of grid 290 shown for illustrative purposes. Grid 290 is a generally transparent member including an array of modules 295 arranged side-by-side in series, which further define channels A and B. Channel A is defined by a plurality of marked portions 292A, 292B with an unmarked portion 296 disposed between adjacent marked portions 292A, 292B, while channel B is defined by a plurality of marked portions 294A, 294B with an unmarked portion 296 interposed between adjacent marked portions 294A, 294B. Marked portion 292A of channel A has a position that overlaps marked portion 294A of channel B, and marked portion 292B of channel A has a position generally overlapping marked portion 294B of channel B. - In one embodiment,
grid 290 is arranged as a light modulator, such as light modulator 285 shown in FIG. 6A, positioned over sensor array 279. In particular, grid 290 is positioned so that channel A of grid 290 (including marked portions 292A, 292B) is aligned with photocell A and channel B of grid 290 (including marked portions 294A, 294B) is aligned with photocell B. - In use, images detected at photocell A relative to channel A and images detected at photocell B relative to channel B caused by movement of
grid 290 along a single direction of motion (as indicated by directional arrow E) enable generating movement information based on movement of mobile housing 277 relative to photocell array 279. Movement information is generated via comparison of images using a phase-shift detection algorithm or quadrature-type detection algorithm. -
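A generic quadrature-type detection algorithm of the kind referred to above can be sketched as follows (illustrative code; the quarter-cycle phase relationship, two-bit state encoding, and function name are assumptions rather than details taken from the patent): when the photocell A and photocell B signals toggle a quarter cycle apart, the order in which they toggle reveals the direction of grid travel along arrow E.

```python
def decode_quadrature(samples):
    """Accumulate signed steps from a sequence of (A, B) photocell bits.

    Assumes channels A and B are a quarter cycle out of phase, so a
    forward move walks the states in the order below and a backward
    move walks them in reverse.
    """
    order = [(0, 0), (1, 0), (1, 1), (0, 1)]  # one full forward cycle
    position = 0
    prev = samples[0]
    for current in samples[1:]:
        if current == prev:
            continue  # no transition on this sample
        step = (order.index(current) - order.index(prev)) % 4
        if step == 1:
            position += 1  # A leads B: forward
        elif step == 3:
            position -= 1  # B leads A: backward
        # step == 2 means both bits flipped at once (invalid); ignored
        prev = current
    return position
```

Only two photocells are needed for this scheme, which is why photocell array 279 can be far simpler than the two-dimensional sensor arrays of the earlier embodiments.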
FIG. 7A is a diagram illustrating a top view of a pointing device 300, according to another embodiment of the present invention. FIG. 7B is a diagram illustrating a cross-sectional view along section line 7B-7B of the pointing device 300 shown in FIG. 7A, according to one embodiment of the present invention. In the illustrated embodiment, pointing device 300 is configured in substantially the same manner as pointing device 150 (FIGS. 3-4), but puck 152 of pointing device 300 does not include a light source 155 contained directly within puck housing 154. Rather, pointing device 300 includes at least one of four light sources 302A-302D positioned about a periphery of the field of motion 319. Each of the light sources 302A-302D is positioned adjacent to one of the springs 313. In one form of the invention, springs 313 are formed from plastic and are configured as light pipes to guide visible light from light sources 302A-302D to puck 152. In another form of the invention, fiber optic lines are attached to or inserted through springs 313 to guide light from light sources 302A-302D to puck 152. In one embodiment, light sources 302A-302D are LEDs. Accordingly, springs 313 act to bring light from a remote source to puck 152 for transmission through light modulator 158 (shown in FIG. 7B) for reception at sensor module 170 (not seen in FIG. 7A, but shown in FIG. 7B) to enable image comparisons to generate relative motion information to capture user control inputs. - In one aspect, springs 313 also act as a re-centering mechanism to further constrain motion of
puck 152 to a field of motion. In particular, when the user moves puck 152 (using finger 316) laterally outward from a center region of slide surface 153 (e.g., top surface 161 in FIGS. 3-4) and then releases puck 152 by removing finger 316 from puck 152, puck 152 is returned to its centered position by the springs 313 that connect puck 152 to the side 314 of the puck field of motion 319. In one aspect, user control inputs associated with puck 152 are deactivated when puck 152 is released for automatic re-centering. This aspect mimics the action of a conventional mouse when the user lifts the mouse from an edge of a navigation surface (e.g., a mouse pad) and replaces it at the center of the navigation surface. - In another embodiment, springs 313 do not act as light pipes to convey light into
puck housing 154 but act only as a re-centering mechanism to re-center puck 152 within puck field of motion 319. In this case, puck 152 includes its own light source contained within puck housing 154. - Embodiments of the invention are directed to an input device that uses an illuminated light modulator to serve as a navigation surface to enable an optical navigation sensor to base its movement calculations on transmissive images, i.e., light contrast patterns created by transmission of light through the navigation surface. In one aspect, the navigation surface is conveniently arranged in direct proximity to the sensor array, with no intervening structures therebetween and with little risk of contamination from ambient sources. In one embodiment, the navigation surface is arranged in a puck that is slidably movable over a support structure (e.g., slide pad) that also encloses the sensor array and the navigation surface from the ambient environment. Movement information for the input device is based on relative movement between the illuminated navigation surface and a sensor array. In this manner, the nature and type of the navigation surface is tightly controlled to produce highly accurate movement information for capturing user control inputs.
- It will be understood by a person of ordinary skill in the art that functions performed by
optical pointing devices - Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the scope of the present invention. This application is intended to cover any adaptations or variations of the specific embodiments discussed herein. Therefore, it is intended that this invention be limited only by the claims and the equivalents thereof.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/175,757 US7161136B1 (en) | 2005-07-06 | 2005-07-06 | Light modulating input device for capturing user control inputs |
Publications (2)
Publication Number | Publication Date |
---|---|
US7161136B1 US7161136B1 (en) | 2007-01-09 |
US20070007442A1 true US20070007442A1 (en) | 2007-01-11 |
Family
ID=37617462
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI289708B (en) * | 2002-12-25 | 2007-11-11 | Qualcomm Mems Technologies Inc | Optical interference type color display |
US7342705B2 (en) | 2004-02-03 | 2008-03-11 | Idc, Llc | Spatial light modulator with integrated optical compensation structure |
US7706050B2 (en) * | 2004-03-05 | 2010-04-27 | Qualcomm Mems Technologies, Inc. | Integrated modulator illumination |
US7750886B2 (en) * | 2004-09-27 | 2010-07-06 | Qualcomm Mems Technologies, Inc. | Methods and devices for lighting displays |
US7766498B2 (en) | 2006-06-21 | 2010-08-03 | Qualcomm Mems Technologies, Inc. | Linear solid state illuminator |
US7845841B2 (en) * | 2006-08-28 | 2010-12-07 | Qualcomm Mems Technologies, Inc. | Angle sweeping holographic illuminator |
US8107155B2 (en) * | 2006-10-06 | 2012-01-31 | Qualcomm Mems Technologies, Inc. | System and method for reducing visual artifacts in displays |
EP2069838A2 (en) * | 2006-10-06 | 2009-06-17 | Qualcomm Mems Technologies, Inc. | Illumination device with built-in light coupler |
WO2008045224A2 (en) * | 2006-10-06 | 2008-04-17 | Qualcomm Mems Technologies, Inc | Thin light bar and method of manufacturing |
EP2366943B1 (en) | 2006-10-06 | 2013-04-17 | Qualcomm Mems Technologies, Inc. | Optical loss structure integrated in an illumination apparatus of a display |
US7855827B2 (en) * | 2006-10-06 | 2010-12-21 | Qualcomm Mems Technologies, Inc. | Internal optical isolation structure for integrated front or back lighting |
US7864395B2 (en) * | 2006-10-27 | 2011-01-04 | Qualcomm Mems Technologies, Inc. | Light guide including optical scattering elements and a method of manufacture |
US7777954B2 (en) * | 2007-01-30 | 2010-08-17 | Qualcomm Mems Technologies, Inc. | Systems and methods of providing a light guiding layer |
US7733439B2 (en) * | 2007-04-30 | 2010-06-08 | Qualcomm Mems Technologies, Inc. | Dual film light guide for illuminating displays |
US7949213B2 (en) * | 2007-12-07 | 2011-05-24 | Qualcomm Mems Technologies, Inc. | Light illumination of displays with front light guide and coupling elements |
JP2011508337A (en) * | 2007-12-31 | 2011-03-10 | ユイ・ジン・オ | Data input device |
US20090189858A1 (en) * | 2008-01-30 | 2009-07-30 | Jeff Lev | Gesture Identification Using A Structured Light Pattern |
WO2009102731A2 (en) | 2008-02-12 | 2009-08-20 | Qualcomm Mems Technologies, Inc. | Devices and methods for enhancing brightness of displays using angle conversion layers |
US8654061B2 (en) * | 2008-02-12 | 2014-02-18 | Qualcomm Mems Technologies, Inc. | Integrated front light solution |
WO2009129264A1 (en) | 2008-04-15 | 2009-10-22 | Qualcomm Mems Technologies, Inc. | Light with bi-directional propagation |
US8118468B2 (en) * | 2008-05-16 | 2012-02-21 | Qualcomm Mems Technologies, Inc. | Illumination apparatus and methods |
JP2011526053A (en) * | 2008-06-04 | 2011-09-29 | クォルコム・メムズ・テクノロジーズ・インコーポレーテッド | Reduction method of edge shadow for prism front light |
US8451224B2 (en) * | 2008-07-23 | 2013-05-28 | Sony Corporation | Mapping detected movement of an interference pattern of a coherent light beam to cursor movement to effect navigation of a user interface |
US8324602B2 (en) * | 2009-04-14 | 2012-12-04 | Intersil Americas Inc. | Optical sensors that reduce specular reflections |
US8232541B2 (en) * | 2009-04-14 | 2012-07-31 | Intersil Americas Inc. | Optical sensors that reduce specular reflections |
US9121979B2 (en) * | 2009-05-29 | 2015-09-01 | Qualcomm Mems Technologies, Inc. | Illumination devices and methods of fabrication thereof |
US8125449B2 (en) * | 2009-12-16 | 2012-02-28 | Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. | Movable touchpad with high sensitivity |
GB2486000A (en) | 2010-11-30 | 2012-06-06 | St Microelectronics Res & Dev | Optical proximity detectors with arrangements for reducing internal light propagation from emitter to detector |
GB2485996A (en) | 2010-11-30 | 2012-06-06 | St Microelectronics Res & Dev | A combined proximity and ambient light sensor |
GB2485998A (en) * | 2010-11-30 | 2012-06-06 | St Microelectronics Res & Dev | A single-package optical proximity detector with an internal light baffle |
US8902484B2 (en) | 2010-12-15 | 2014-12-02 | Qualcomm Mems Technologies, Inc. | Holographic brightness enhancement film |
GB2558667A (en) * | 2017-01-17 | 2018-07-18 | T Phy Ltd | Optical input devices |
US11523722B2 (en) | 2019-05-28 | 2022-12-13 | Pixart Imaging Inc. | Dirtiness level determining method and electronic device applying the dirtiness level determining method |
US11493336B2 (en) * | 2020-06-22 | 2022-11-08 | Pixart Imaging Inc. | Optical navigation device which can determine dirtiness level of cover or fix multi light pattern issue |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6333733B1 (en) * | 1996-09-04 | 2001-12-25 | Trioc Ab | Position-sensing unit and multidimensional pointer comprising one or more such units |
US20030151594A1 (en) * | 2002-02-08 | 2003-08-14 | Huo-Lu Tsai | Mouse with an optical encoder wheel for a computer |
US20050110755A1 (en) * | 2003-11-24 | 2005-05-26 | Jonah Harley | Compact pointing device |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070268398A1 (en) * | 2006-05-17 | 2007-11-22 | Ramesh Raskar | Apparatus and method for illuminating a scene with multiplexed illumination for motion capture |
US7957007B2 (en) * | 2006-05-17 | 2011-06-07 | Mitsubishi Electric Research Laboratories, Inc. | Apparatus and method for illuminating a scene with multiplexed illumination for motion capture |
US8605035B2 (en) | 2010-04-30 | 2013-12-10 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Backlighting for optical finger navigation |
US20150224924A1 (en) * | 2012-07-11 | 2015-08-13 | Ulrich Backes | Method for controlling an interior lighting system in a vehicle and interior lighting system |
US9758093B2 (en) * | 2012-07-11 | 2017-09-12 | Trw Automotive Electronics & Components Gmbh | Method for controlling an interior lighting system in a vehicle and interior lighting system |
WO2017147534A1 (en) * | 2016-02-26 | 2017-08-31 | Magic Leap, Inc. | Display system having a plurality of light pipes for a plurality of light emitters |
US10775545B2 (en) | 2016-02-26 | 2020-09-15 | Magic Leap, Inc. | Display system having a plurality of light pipes for a plurality of light emitters |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AGILENT TECHNOLOGIES, INC., COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WENSTRAND, JOHN STEWART;HARTLOVE, JASON T;REEL/FRAME:016395/0491 Effective date: 20050629 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD.,SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666 Effective date: 20051201 Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666 Effective date: 20051201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.,S Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518 Effective date: 20060127 Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD., Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518 Effective date: 20060127 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.;REEL/FRAME:030369/0528 Effective date: 20121030 |
|
AS | Assignment |
Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT, NEW YORK Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001 Effective date: 20140506 Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AG Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001 Effective date: 20140506 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001 Effective date: 20160201 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001 Effective date: 20160201 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001 Effective date: 20160201 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001 Effective date: 20160201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE NAME PREVIOUSLY RECORDED AT REEL: 017206 FRAME: 0666. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:038632/0662 Effective date: 20051201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001 Effective date: 20170119 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001 Effective date: 20170119 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553) Year of fee payment: 12 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITE Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:047196/0097 Effective date: 20180509 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITE Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE EXECUTION DATE PREVIOUSLY RECORDED AT REEL: 047196 FRAME: 0097. ASSIGNOR(S) HEREBY CONFIRMS THE MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:048555/0510 Effective date: 20180905 |