US20080291167A1 - Pointing device and displacement estimation method - Google Patents


Info

Publication number
US20080291167A1
US20080291167A1 (application Ser. No. US12/217,356)
Authority
US
United States
Prior art keywords
frame
block
pointing device
displacement
predicted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/217,356
Inventor
Chun-Huang Lin
Jeng-Feng Lan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/832,203 (now U.S. Pat. No. 7,388,997)
Application filed by Pixart Imaging Inc
Priority to US12/217,356
Publication of US20080291167A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/223 Analysis of motion using block-matching
    • G06T 7/238 Analysis of motion using block-matching using non-full search, e.g. three-step search

Definitions

  • the present invention relates to displacement estimation, and more specifically, to displacement estimation for an optical mouse using a predicted velocity.
  • a pointing device such as a mouse is a standard peripheral for a computer system.
  • a mechanical mouse typically has rollers, wheels, or the like, that contact a rubber-surfaced steel ball at the equator thereof and convert the rotation of the ball into electrical signals.
  • the mechanical mouse has a number of shortcomings such as deterioration or damage to the surface of the mouse ball and mouse pad, resulting from the mechanical construction and operation thereof that rely to a significant degree on a fairly delicate compromise among the mechanical forces involved.
  • An optical mouse utilizes optical and electronic methods to compute the movement of the mouse, and is a popular replacement for the mechanical mouse. Compared with the conventional mechanical mouse, an optical mouse offers better reliability and performance. Thus optical pointing devices have captured a significant share of the mouse market.
  • An optical mouse typically has a logical operation circuit and a Complementary Metal Oxide Semiconductor (CMOS) photosensing array comprising photo detectors.
  • the CMOS photosensing array sequentially captures images of the area in which the optical mouse moves and generates digital signals representing the captured image.
  • the digital image varies with the movement of the optical mouse.
  • the logical operation circuit computes the displacement of the optical mouse according to the dynamic change in the digital image, and directs the computer to control the pointer (cursor) on the screen in accordance with movement of the mouse.
  • Block matching is accomplished by comparing a newly captured sample frame (current frame) with a previously captured reference frame (template frame) to ascertain the direction and amount of movement.
  • Conventional block matching performs a full search using a block of predetermined size.
  • the template frame 100 and current frame 104 are digital images of a 6*6 pixel area. If these two digital images are fully searched using a 2*2 pixel block, the computer must perform 25*25 (625) correlation computations. If two digital images of 16*16 pixels are fully searched with an 8*8 pixel block, the computer must perform 81*81 (6561) correlation computations.
  • the number of correlation computations can be greatly reduced if a template block in the template frame is used for block matching instead of the full searching process.
  • a 2*2 template block 102 is located in the center of the template frame 100 . Accordingly, the computer only needs to perform 5*5 (25) correlation computations for the current frame 104 when using the 2*2 block as a search unit.
  • Block matching determines a block with the least difference (greatest correlation) after comparing all the 2*2 blocks (from block 106 to block 108 ) in the current frame 104 with the template block 102 . Note that the searching order is not limited to the described order.
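The template-block search described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function names and the use of mean absolute difference (MAD) as the correlation measure are assumptions, not the patent's actual circuit implementation.

```python
import numpy as np

def mad(block_a, block_b):
    """Mean absolute difference between two equal-sized blocks."""
    return np.abs(block_a.astype(int) - block_b.astype(int)).mean()

def full_block_search(template_block, current_frame):
    """Slide the template block over every position in the current frame
    and return the (row, col) offset of the best (lowest-MAD) match."""
    bh, bw = template_block.shape
    fh, fw = current_frame.shape
    best, best_pos = float("inf"), (0, 0)
    for y in range(fh - bh + 1):      # 5 positions per axis for a 6*6 frame, 2*2 block
        for x in range(fw - bw + 1):
            d = mad(template_block, current_frame[y:y+bh, x:x+bw])
            if d < best:
                best, best_pos = d, (y, x)
    return best_pos

# 6*6 current frame in which a known 2*2 pattern sits at row 3, column 1
frame = np.zeros((6, 6), dtype=np.uint8)
template = np.array([[200, 50], [50, 200]], dtype=np.uint8)
frame[3:5, 1:3] = template
print(full_block_search(template, frame))  # (3, 1)
```

For the 6*6 frame this evaluates 5*5 (25) candidate positions per frame, which matches the count given above for a single template block.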
  • the object of the present invention is to provide a displacement prediction method and a pointing device used in a computer system.
  • the pointing device of the present invention comprises photo detectors that capture images sequentially; the device obtains a template frame and a current frame, and predicts a moving velocity vector to calculate the image displacement.
  • Another object of the present invention is to reduce the search range in the current frame for block matching according to the predicted displacement.
  • Yet another object of the present invention is to determine the location of the template block in the template frame according to the predicted displacement in order to extend the valid period of the template frame, hence decreasing the frequency of template frame updates.
  • the present invention provides a hand held pointing device for a computer system, which comprises a photo capture device and a displacement detection circuit.
  • the photo capture device captures images sequentially to produce a first frame and a second frame, whereas the displacement detection circuit predicts a velocity of the pointing device, and compares the first frame to the second frame according to the predicted velocity to obtain a displacement of the pointing device.
  • the photo capture device of the present invention comprises a photosensing array with a plurality of photo detectors, and the size of each photo detector in the photosensing array is between 20 μm*20 μm and 40 μm*40 μm.
  • the present invention further comprises a method for estimating the displacement of a pointing device, wherein the pointing device comprises a photo capture device for capturing images sequentially to produce corresponding frames.
  • a velocity of the pointing device is predicted, and according to the predicted velocity, a first frame is compared with a second frame produced by the photo capture device to obtain a displacement of the pointing device.
  • the velocity is predicted according to at least two previous velocity data.
  • the first and second frames are compared by first defining a template block in the first frame, defining a search range in the second frame according to the predicted velocity, and lastly, searching a matched block in the search range to output the displacement.
  • the matched block is the block in the search range that has the smallest difference when compared to the template block.
  • the method of the present invention further comprises computing the acceleration of the pointing device, and predicting the velocity according to the acceleration and at least one previous velocity data.
  • the method of defining the search range comprises computing a predicted block with the same size as the template block according to the predicted velocity, wherein the predicted block is the block in the second frame with the most possible locations for the template block under the predicted velocity of the pointing device.
  • the predicted block is then extended to form the search range.
  • the predicted block is extended either symmetrically or asymmetrically.
  • the method provided in the present invention further comprises defining a template block according to the predicted velocity, wherein the template block is not in the center of the first frame.
  • the displacement vector between the template block and the center of the first frame is proportional to the predicted velocity.
  • the search algorithm for block matching is typically related to correlation computation, for example, mean square error (MSE) and mean absolute difference (MAD). These algorithms compute the error or difference between two blocks using specific formulas, and the smaller the error, the larger the correlation between the two blocks.
  • a matched block is defined as a block that has the greatest correlation to the template block (or reference block).
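The two correlation measures named above can be sketched as follows. This is a minimal illustration assuming the standard per-pixel formulas; the patent does not fix a specific implementation, and the function names are hypothetical.

```python
import numpy as np

def mse(a, b):
    """Mean square error: a smaller value means a larger correlation."""
    d = a.astype(float) - b.astype(float)
    return (d * d).mean()

def mad(a, b):
    """Mean absolute difference: cheaper than MSE (no multiplication)."""
    return np.abs(a.astype(float) - b.astype(float)).mean()

a = np.array([[10, 20], [30, 40]])
b = np.array([[12, 18], [33, 39]])
print(mse(a, b))  # (4 + 4 + 9 + 1) / 4 = 4.5
print(mad(a, b))  # (2 + 2 + 3 + 1) / 4 = 2.0
```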
  • FIG. 1 illustrates the conventional full search method for block matching.
  • FIG. 2 a is a diagram illustrating a typical optical mouse.
  • FIG. 2 b is a block diagram of the integrated circuit chip shown in FIG. 2 a.
  • FIG. 3 illustrates searching the matched block within a specific searching range according to an embodiment of the present invention.
  • FIG. 4 illustrates an example of changing the search range according to the principles of the present invention.
  • FIG. 5 illustrates an example of a current frame with several repeating features.
  • FIG. 6 illustrates an example of changing the position of the template block according to the principles of the present invention.
  • FIG. 2 a is an optical mouse for implementing the present invention
  • FIG. 2 b is a block diagram of the integrated circuit chip shown in FIG. 2 a
  • the optical mouse 20 comprises a lens 21 and an integrated circuit (IC) chip 22 .
  • the integrated circuit chip 22 is generally divided into two regions.
  • One is a photo capture device 221 , for example, an array composed of Complementary Metal Oxide Semiconductor (CMOS) photo detectors.
  • the lens 21 projects an image of the surface under the optical mouse 20 onto the photo capture device 221 .
  • the photo capture device captures images sequentially according to a preset frame rate to generate corresponding images, the so-called “frames”.
  • the captured frame varies with movement of the optical mouse 20 .
  • a displacement detection circuit 222 for processing and computing the captured frames.
  • the displacement detection circuit 222 tracks the movement of the optical mouse 20 by comparing a plurality of captured frames to determine the displacement of the optical mouse 20 .
  • the displacement detection circuit 222 then informs other devices, such as the computer, of the displacement of the optical mouse 20 .
  • the present invention provides a method utilizing the predicted velocity for comparing frames in a block matching operation.
  • the search range can be reduced according to the predicted velocity, thus reducing the number of required computations in the displacement detection circuit.
  • the present invention extends the lifetime of the template frame (due to lower frequency of template frame updates), and increases the resolution of the output signals from the optical mouse.
  • the photo capture device 221 captures the digital image according to a preset rate, and selects a digital image as a new template frame.
  • a template block in the template frame is then selected and compared to the searching range in the current frame to obtain a matched block that has the smallest difference when compared to the template block.
  • the displacement is calculated by comparing the location of the template block in the template frame with the location of the matched block in the current frame.
  • the calculated displacement represents the movement of the optical mouse or the photo capture device during the photo capture period. If the photo capture period is set as a unit time, the calculated displacement is the instantaneous velocity of the image. Since the displacement is a directional vector, it can be represented by two components, x and y.
  • Instantaneous velocity V is also a vector, and can thus be represented as (Vx,Vy).
  • An average velocity Vm (Vmx,Vmy) is derived by accumulating M instantaneous velocities, and similarly, an average velocity Vn (Vnx,Vny) is derived by accumulating N instantaneous velocities. If M is greater than N, the predicted velocity Vp of the images estimated by the photo capturing device is obtained according to Equation (1). Equation (1) can be decomposed into Equations (2) and (3) for calculating the x and y components individually.
  • Vp = (Vm + Vn)/2 + (Vm - Vn)    Equation (1)
  • Vpx = (Vmx + Vnx)/2 + (Vmx - Vnx)    Equation (2)
  • Vpy = (Vmy + Vny)/2 + (Vmy - Vny)    Equation (3)
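A sketch of Equations (1) through (3) follows. The text says the average velocities are "derived by accumulating" M and N instantaneous velocities; this sketch interprets that as averaging the last M and last N samples, which is an assumption, and the function name and history format are illustrative.

```python
def predict_velocity(history, m, n):
    """Predict the next per-frame displacement from the averages of the
    last m and last n instantaneous velocities (m > n), per Equation (1):
    Vp = (Vm + Vn)/2 + (Vm - Vn), applied to x and y independently."""
    assert m > n and len(history) >= m
    vm_x = sum(v[0] for v in history[-m:]) / m
    vm_y = sum(v[1] for v in history[-m:]) / m
    vn_x = sum(v[0] for v in history[-n:]) / n
    vn_y = sum(v[1] for v in history[-n:]) / n
    vpx = (vm_x + vn_x) / 2 + (vm_x - vn_x)   # Equation (2)
    vpy = (vm_y + vn_y) / 2 + (vm_y - vn_y)   # Equation (3)
    return vpx, vpy

# The last four frames each moved right by 1 pixel: steady motion predicts
# the same displacement for the next frame.
print(predict_velocity([(1, 0), (1, 0), (1, 0), (1, 0)], m=4, n=2))  # (1.0, 0.0)
```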
  • the predicted velocity Vp is also the predicted displacement of the photo capture device during the next capture period, which indicates the predicted location of the matched block with respect to the location of the template block.
  • the location of the matched block is predicted according to the predicted velocity Vp.
  • the size of each frame is 6*6 pixels
  • the size of the block for block matching is 2*2 pixels.
  • the block matching algorithm proposed in the present invention does not perform a full search of the current frame, but only searches the blocks in a search range derived from the predicted velocity.
  • the search time, as well as the computational load on the processor can be significantly reduced by reducing the search range.
  • the search range is restricted by the predicted velocity, so only the blocks in the search range require correlation computation.
  • the predicted velocity and the displacement are all zero in the initial state.
  • the predicted location of the matched block 302 (i.e. the predicted block) therefore coincides with the template block location.
  • the template block is assumed to be at the center of the template frame.
  • the estimated search range 304 is derived by extending one pixel in all directions from the predicted matched block 302 . After a period of time, the image shifts to the right, and the predicted matched block shifts right by one pixel. The predicted matched block is now at location 308 in the current frame 306 , thus the corresponding search range shifts to location 310 .
  • the computational load of the present invention is far less than that of the conventional full search block matching method.
  • a 16*16 digital image is searched using an 8*8 template block.
  • the conventional full search requires 81*81 (6561) computations, and searching using one template block requires 9*9 (81) computations, whereas the present invention only requires 5*5 (25) or 3*3 (9) correlation computations depending on the size of the predetermined search range.
  • the search algorithm for block matching is typically related to correlation computation, for example, mean square error (MSE) and mean absolute difference (MAD).
  • the search range can be defined by extending N pixels symmetrically around the predicted matched block, or the extended distance can be asymmetrical.
  • the search range 404 of the current frame 400 is derived by extending the expected matched block 402 two pixels in each direction.
  • the initial predicted velocity is zero, so the predicted matched block 402 is in the center of the current frame 400 .
  • a predicted velocity can be estimated when the image changes, and is used to estimate the location of the predicted matched block in the next frame, as shown in FIG. 4 .
  • the search range 410 is set based on an acceleration of the photo capture device, and can be either symmetrical or asymmetrical. By doing so, the search range is further reduced in size, and the number of correlation computations is also reduced.
  • the size of the search range as previously described is determined by the maximum acceleration of the images. Assuming the photo capture device captures 1000 digital images per second, the search range is derived by extending one pixel from the expected matched block in all directions, and the size of a pixel is 30 μm, the maximum acceleration of the images derived from Equation (4) is 30 m/s^2, which is equivalent to about 3.06 times the gravitational acceleration G. At a capture rate of 1000 images per second, a search range derived by extending one pixel from the predicted matched block is therefore sufficient for most applications, as the maximum acceleration of the movement would not exceed 30 m/s^2.
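The numbers quoted above can be checked directly. Equation (4) itself is not reproduced in this excerpt; the sketch below assumes the bound a = d/T^2, where d is one pixel of tolerated drift (30 μm) and T is one capture period (1/1000 s), which reproduces the stated 30 m/s^2 and roughly 3.06 G.

```python
# Maximum acceleration tolerated by a one-pixel search-range extension,
# under the assumed bound a = d / T**2.
pixel_size = 30e-6        # m, drift of one pixel per frame beyond the prediction
frame_rate = 1000         # frames per second, so T = 1/1000 s
a_max = pixel_size * frame_rate ** 2
print(round(a_max, 6))          # 30.0  (m/s^2)
print(round(a_max / 9.8, 2))    # 3.06  (multiples of G)
```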
  • the advantages of using the predicted velocity to reduce the search range comprise reducing the number of correlation computations, and increasing the accuracy of block matching.
  • the size of each photo detector in the photo capture device 221 is typically 60 μm*60 μm, and the lens 21 has a magnification ratio of 1.
  • if the array in the photo capture device comprises 16*16 CMOS photo detectors and the template block is 8*8, the visible range of the photo capture device 221 is 960 μm*960 μm, and the size of the template block is 480 μm*480 μm.
  • the conventional method of block matching searches for the matched block in the current frame using a template block. Because one frame can contain many repeating patterns, the template block may cover more than one feature, which causes more than one matched block to be found in the current frame and complicates the determination of optical mouse movement.
  • the template block of the template frame might find several identical matched blocks in the current frame 500 , and the optical mouse may be unable to determine the displacement, or make a wrong determination.
  • the present invention reduces the search range according to the predicted velocity, which also reduces the chance of finding more than one identical matched block. As shown in FIG. 5 , if the searching range 502 is a 10*10 block covering the matched block 504 in the center, the number of features in the search range 502 is reduced to 4, and the probability of finding more than one matched block is reduced. Therefore, the present invention improves the reliability of block matching by reducing the search range for block matching.
  • One way to reduce the number of features in a frame is to reduce the size of the image captured by the photo capturing device.
  • the present invention proposes a photo capture device capturing images with a size between 20 μm*20 μm and 40 μm*40 μm.
  • Another way to reduce the number of features in a frame is to increase the magnification ratio of the lens, so that the lens magnifies the surface under the optical mouse and feeds the magnified images to the photo capture device.
  • the present invention proposes a method for changing the location of the template block according to the predicted velocity of the optical mouse or the photo capture device.
  • the template block in the template frame 600 moves from the center 602 to the location 604 to the left when the optical mouse detects movement to the right.
  • Moving the template block in the opposite direction of the image extends the lifetime of the template frame, thus minimizing the frequency of frame updates.
  • the block 602 in the center will leave the current frame before the block 604 on the left side, so the block 604 at the left has a higher chance of appearing in the current frame.
  • the present invention changes the location of the template block as well as sets the search range according to predicted velocity.
  • the template block can be relocated when the predicted velocity reaches a specific threshold, and the search range can then be set when the predicted velocity reaches another threshold. It is also possible to set the search range first, then change the location of the template block.
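The two predicted-velocity mechanisms described above, shrinking the search range around the predicted block and relocating the template block opposite to the motion, can be sketched as follows. The coordinate conventions, rounding, and the `gain` parameter are illustrative assumptions rather than the patent's specification.

```python
def predicted_search_range(frame_size, block_size, predicted_v, extend=1):
    """Corners of the reduced search range: the block position where the
    match is expected under the predicted velocity, extended by `extend`
    pixels in every direction and clipped to the frame."""
    center = (frame_size - block_size) // 2
    px = center + int(round(predicted_v[0]))   # predicted block position, x
    py = center + int(round(predicted_v[1]))   # predicted block position, y
    x0, y0 = max(0, px - extend), max(0, py - extend)
    x1 = min(frame_size - block_size, px + extend)
    y1 = min(frame_size - block_size, py + extend)
    return (x0, y0), (x1, y1)

def relocated_template(frame_size, block_size, predicted_v, gain=1.0):
    """Place the template block off-center, opposite the predicted motion,
    so it stays visible longer and the template frame needs fewer updates."""
    center = (frame_size - block_size) // 2
    return (center - int(round(gain * predicted_v[0])),
            center - int(round(gain * predicted_v[1])))

# 6*6 frame, 2*2 block, image drifting right by 1 pixel per frame
print(predicted_search_range(6, 2, (1, 0)))   # ((2, 1), (4, 3))
print(relocated_template(6, 2, (1, 0)))       # (1, 2)
```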
  • the embodiment of the present invention utilizes illumination as an example of the image feature, while other image features such as chrominance are also applicable to the feature enhancement method proposed in the present invention.


Abstract

A method of estimating image displacement by block matching for a pointing device used in a computer system. The method provided in the present invention comprises capturing digital images sequentially using a photo capture device, obtaining a template frame and a current frame, estimating a predicted velocity vector, and computing the displacement of the image. The efficiency and reliability of block match searching are improved by reducing the search range according to the calculated displacement. The template block can be defined anywhere on the template frame according to the predicted velocity to extend the lifetime of the template frame, and the displacement vector between the template block and the center of the template frame is proportional to the predicted velocity.

Description

  • This application is a continuation of U.S. patent application Ser. No. 10/832,203, filed Apr. 26, 2004.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to displacement estimation, and more specifically, to displacement estimation for an optical mouse using a predicted velocity.
  • 2. Description of the Related Art
  • A pointing device such as a mouse is a standard peripheral for a computer system. A mechanical mouse typically has rollers, wheels, or the like, that contact a rubber-surfaced steel ball at the equator thereof and convert the rotation of the ball into electrical signals. The mechanical mouse has a number of shortcomings such as deterioration or damage to the surface of the mouse ball and mouse pad, resulting from the mechanical construction and operation thereof that rely to a significant degree on a fairly delicate compromise among the mechanical forces involved. An optical mouse utilizes optical and electronic methods to compute the movement of the mouse, and is a popular replacement for the mechanical mouse. Compared with the conventional mechanical mouse, an optical mouse offers better reliability and performance. Thus optical pointing devices have captured a significant share of the mouse market.
  • An optical mouse typically has a logical operation circuit and a Complementary Metal Oxide Semiconductor (CMOS) photosensing array comprising photo detectors. The CMOS photosensing array sequentially captures images of the area in which the optical mouse moves and generates digital signals representing the captured image. The digital image varies with the movement of the optical mouse. The logical operation circuit computes the displacement of the optical mouse according to the dynamic change in the digital image, and directs the computer to control the pointer (cursor) on the screen in accordance with movement of the mouse.
  • The displacement of the optical mouse is commonly tracked and estimated by block matching. Block matching is accomplished by comparing a newly captured sample frame (current frame) with a previously captured reference frame (template frame) to ascertain the direction and amount of movement. Conventional block matching performs a full search using a block of predetermined size. As shown in FIG. 1, the template frame 100 and current frame 104 are digital images of a 6*6 pixel area. If these two digital images are fully searched using a 2*2 pixel block, the computer must perform 25*25 (625) correlation computations. If two digital images of 16*16 pixels are fully searched with an 8*8 pixel block, the computer must perform 81*81 (6561) correlation computations.
  • The number of correlation computations can be greatly reduced if a template block in the template frame is used for block matching instead of the full searching process. As shown in FIG. 1, a 2*2 template block 102 is located in the center of the template frame 100. Accordingly, the computer only needs to perform 5*5 (25) correlation computations for the current frame 104 when using the 2*2 block as a search unit. Block matching determines a block with the least difference (greatest correlation) after comparing all the 2*2 blocks (from block 106 to block 108) in the current frame 104 with the template block 102. Note that the searching order is not limited to the described order. If two digital images with 16*16 pixels are searched with an 8*8 pixel block, a total of 9*9 (81) correlation computations must be performed. Although the use of a template block significantly reduces the number of searching and computation cycles, a large image still requires a long computation time to compare each block with the template block. Furthermore, performing a full search using a template block also requires a large amount of memory and processor capacity for large digital images.
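The correlation-computation counts quoted in this paragraph can be verified with a few lines of arithmetic; the 5*5 and 3*3 windows are the reduced, prediction-centered search ranges the invention introduces for the same 16*16 frame and 8*8 block.

```python
# Counts for a 16*16 frame searched with an 8*8 block.
frame, block = 16, 8
positions = frame - block + 1                  # 9 candidate offsets per axis
full_search = (positions ** 2) ** 2            # compare every block with every block
template_search = positions ** 2               # one template block against the frame
predicted_5x5 = 5 * 5                          # prediction-centered 5*5 search range
predicted_3x3 = 3 * 3                          # tighter 3*3 search range
print(full_search, template_search, predicted_5x5, predicted_3x3)  # 6561 81 25 9
```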
  • SUMMARY OF THE INVENTION
  • Accordingly, the object of the present invention is to provide a displacement prediction method and a pointing device used in a computer system. The pointing device of the present invention comprises photo detectors that capture images sequentially; the device obtains a template frame and a current frame, and predicts a moving velocity vector to calculate the image displacement.
  • Another object of the present invention is to reduce the search range in the current frame for block matching according to the predicted displacement.
  • Yet another object of the present invention is to determine the location of the template block in the template frame according to the predicted displacement in order to extend the valid period of the template frame, hence decreasing the frequency of template frame updates.
  • The present invention provides a hand held pointing device for a computer system, which comprises a photo capture device and a displacement detection circuit. The photo capture device captures images sequentially to produce a first frame and a second frame, whereas the displacement detection circuit predicts a velocity of the pointing device, and compares the first frame to the second frame according to the predicted velocity to obtain a displacement of the pointing device. The photo capture device of the present invention comprises a photosensing array with a plurality of photo detectors, and the size of each photo detector in the photosensing array is between 20 μm*20 μm and 40 μm*40 μm.
  • The present invention further comprises a method for estimating the displacement of a pointing device, wherein the pointing device comprises a photo capture device for capturing images sequentially to produce corresponding frames. First, a velocity of the pointing device is predicted, and according to the predicted velocity, a first frame is compared with a second frame produced by the photo capture device to obtain a displacement of the pointing device. The velocity is predicted according to at least two previous velocity data. The first and second frames are compared by first defining a template block in the first frame, defining a search range in the second frame according to the predicted velocity, and lastly, searching a matched block in the search range to output the displacement. The matched block is the block in the search range that has the smallest difference when compared to the template block. The method of the present invention further comprises computing the acceleration of the pointing device, and predicting the velocity according to the acceleration and at least one previous velocity data. The method of defining the search range comprises computing a predicted block with the same size as the template block according to the predicted velocity, wherein the predicted block is the block in the second frame with the most possible locations for the template block under the predicted velocity of the pointing device. The predicted block is then extended to form the search range. The predicted block is extended either symmetrically or asymmetrically.
  • The method provided in the present invention further comprises defining a template block according to the predicted velocity, wherein the template block is not in the center of the first frame. The displacement vector between the template block and the center of the first frame is proportional to the predicted velocity.
  • The search algorithm for block matching is typically related to correlation computation, for example, mean square error (MSE) and mean absolute difference (MAD). These algorithms compute the error or difference between two blocks using specific formulas, and the smaller the error, the larger the correlation between the two blocks. A matched block is defined as a block that has the greatest correlation to the template block (or reference block).
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention can be more fully understood by reading the subsequent detailed description in conjunction with the examples and references made to the accompanying drawings, wherein:
  • FIG. 1 illustrates the conventional full search method for block matching.
  • FIG. 2 a is a diagram illustrating a typical optical mouse.
  • FIG. 2 b is a block diagram of the integrated circuit chip shown in FIG. 2 a.
  • FIG. 3 illustrates searching the matched block within a specific searching range according to an embodiment of the present invention.
  • FIG. 4 illustrates an example of changing the search range according to the principles of the present invention.
  • FIG. 5 illustrates an example of a current frame with several repeating features.
  • FIG. 6 illustrates an example of changing the position of the template block according to the principles of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 2 a is an optical mouse for implementing the present invention, and FIG. 2 b is a block diagram of the integrated circuit chip shown in FIG. 2 a. The optical mouse 20 comprises a lens 21 and an integrated circuit (IC) chip 22. As shown in FIG. 2 b, the integrated circuit chip 22 is generally divided into two regions. One is a photo capture device 221, for example, an array composed of Complementary Metal Oxide Semiconductor (CMOS) photo detectors. The lens 21 projects an image of the surface under the optical mouse 20 onto the photo capture device 221. The photo capture device captures images sequentially according to a preset frame rate to generate corresponding images, the so-called “frames”. The captured frame varies with movement of the optical mouse 20. Another region in the integrated circuit chip 22 is a displacement detection circuit 222 for processing and computing the captured frames. The displacement detection circuit 222 tracks the movement of the optical mouse 20 by comparing a plurality of captured frames to determine the displacement of the optical mouse 20. The displacement detection circuit 222 then informs other devices, such as the computer, of the displacement of the optical mouse 20.
  • The present invention provides a method utilizing the predicted velocity for comparing frames in a block matching operation. The search range can be reduced according to the predicted velocity, thus reducing the number of required computations in the displacement detection circuit. Furthermore, the present invention extends the lifetime of the template frame (due to lower frequency of template frame updates), and increases the resolution of the output signals from the optical mouse.
  • The photo capture device 221 captures digital images at a preset rate, and one digital image is selected as a new template frame. A template block in the template frame is then selected and compared against a search range in the current frame to obtain a matched block, that is, the block with the smallest difference from the template block. The displacement is calculated by comparing the location of the template block in the template frame with the location of the matched block in the current frame, and represents the movement of the optical mouse, or the photo capture device, during one capture period. If the capture period is taken as a unit time, the calculated displacement is the instantaneous velocity of the image. Since the displacement is a directional vector, it can be represented by two components, x and y. The instantaneous velocity V is also a vector, and can thus be represented as (Vx, Vy). An average velocity Vm (Vmx, Vmy) is derived by averaging M instantaneous velocities, and similarly, an average velocity Vn (Vnx, Vny) is derived by averaging N instantaneous velocities. If M is greater than N, the predicted velocity Vp of the images estimated by the photo capture device is obtained according to Equation (1). Equation (1) can be decomposed into Equations (2) and (3) for calculating the x and y components individually.
  • Vp = (Vm + Vn)/2 + (Vm − Vn)   Equation (1)
    Vpx = (Vmx + Vnx)/2 + (Vmx − Vnx)   Equation (2)
    Vpy = (Vmy + Vny)/2 + (Vmy − Vny)   Equation (3)
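For illustration, Equations (1) through (3) can be sketched in code as follows. This is a minimal sketch only; the function and parameter names are illustrative and are not part of the disclosed embodiment.

```python
def predicted_velocity(vm, vn):
    """Predicted velocity per Equations (1)-(3): Vp = (Vm + Vn)/2 + (Vm - Vn).

    vm: average velocity (Vmx, Vmy) over the longer window of M samples
    vn: average velocity (Vnx, Vny) over the shorter window of N samples, M > N
    """
    vmx, vmy = vm
    vnx, vny = vn
    vpx = (vmx + vnx) / 2 + (vmx - vnx)  # Equation (2)
    vpy = (vmy + vny) / 2 + (vmy - vny)  # Equation (3)
    return (vpx, vpy)

# When both averages agree (constant velocity), the prediction equals them:
print(predicted_velocity((2.0, 0.0), (2.0, 0.0)))  # (2.0, 0.0)
```

Note that when the shorter-window average differs from the longer-window average, the second term extrapolates the trend, so the prediction runs ahead of the most recent average.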
  • The predicted velocity Vp is also the predicted displacement of the photo capture device during the next capture period, which indicates the predicted location of the matched block with respect to the location of the template block.
  • The location of the matched block is predicted according to the predicted velocity Vp. In the example shown in FIG. 3, the size of each frame is 6*6 pixels, and the size of the block used for block matching is 2*2 pixels. The block matching algorithm proposed in the present invention does not perform a full search of the current frame, but only searches the blocks within a search range derived from the predicted velocity. The search time, as well as the computational load on the processor, can be significantly reduced by reducing the search range: since the search range is restricted by the predicted velocity, only the blocks within it require correlation computation.
  • As shown in FIG. 3, the predicted velocity and the displacement are both zero in the initial state. The predicted location of the matched block 302 (i.e., the predicted block) in the current frame 300 is then most likely the same as the location of the template block 102, which is assumed to be at the center of the template frame. The estimated search range 304 is derived by extending one pixel in all directions from the predicted matched block 302. After a period of time, the image shifts to the right, and the predicted matched block shifts right by one pixel to location 308 in the current frame 306; the corresponding search range thus shifts to location 310.
  • If the current frame 300 in the example shown in FIG. 3 is fully searched using the 2*2 template block 102, 5*5 (25) correlation computations are required to obtain the matched block. Only 3*3 (9) computations, however, are required when searching within the search range 304 according to the present invention. When the present invention is applied to a larger image frame, or an image of any size, the number of computations remains 9 if the search range is derived by extending one pixel from the predicted matched block. If the search range is defined by extending two pixels from the predicted matched block, the number of correlation computations is always 25, regardless of the sizes of the image frame and template block; the computational load depends only on the distance the search range is extended. This is far less than the full search of the conventional block matching method. For example, when a 16*16 digital image is searched using an 8*8 template block, the conventional full search requires 9*9 (81) correlation computations, whereas the present invention only requires 5*5 (25) or 3*3 (9), depending on the size of the predetermined search range. The search algorithm for block matching typically relies on a correlation computation, for example, the mean square error (MSE) or the mean absolute difference (MAD).
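The restricted search described above can be sketched as follows, using MAD as the correlation measure. This is a minimal sketch under the FIG. 3 assumptions (a small frame, a 2*2 template block, a search range extended one pixel around the predicted block); the function names are illustrative.

```python
import numpy as np

def mad(a, b):
    """Mean absolute difference between two equally sized blocks."""
    return np.abs(a.astype(int) - b.astype(int)).mean()

def match_in_range(template, frame, predicted_pos, extend=1):
    """Search only within `extend` pixels of the predicted block position.

    template:      block taken from the template frame
    frame:         current frame (2-D array)
    predicted_pos: (row, col) of the predicted matched block
    Returns the (row, col) of the best-matching block in the search range.
    """
    h, w = template.shape
    pr, pc = predicted_pos
    best_score, best_pos = None, None
    # Only (2*extend + 1)^2 candidate positions, independent of frame size.
    for r in range(max(0, pr - extend), min(frame.shape[0] - h, pr + extend) + 1):
        for c in range(max(0, pc - extend), min(frame.shape[1] - w, pc + extend) + 1):
            score = mad(template, frame[r:r + h, c:c + w])
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A 6*6 frame whose content shifted one pixel right of the prediction:
frame = np.arange(36).reshape(6, 6)
template = frame[2:4, 3:5]
print(match_in_range(template, frame, (2, 2), extend=1))  # (2, 3)
```

With extend=1 the loop visits at most 9 positions, matching the 3*3 figure given above, and with extend=2 it visits at most 25, regardless of frame size.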
  • The search range can be defined by extending N pixels symmetrically around the predicted matched block, or the extended distance can be asymmetrical. As shown in FIG. 4, the search range 404 of the current frame 400 is derived by extending the expected matched block 402 two pixels in each direction. The initial predicted velocity is zero, so the predicted matched block 402 is in the center of the current frame 400. A predicted velocity can be estimated when the image changes, and is used to estimate the location of the predicted matched block in the next frame. As shown in FIG. 4, if the location of the predicted matched block 408 in the current frame 406 shifts right by one pixel from the previous predicted matched block 402, the image is assumed to have a higher probability of having shifted right than in other directions, which means that the next matched block is most likely found on the right side of the predicted matched block 408, rather than on the left side. Therefore, as shown in the Figure, the left side of the search range can be reduced from two pixels to one pixel apart from the predicted matched block 408. The search range 410 is set based on an acceleration of the photo capture device, and can be either symmetrical or asymmetrical. By doing so, the search range is further reduced in size, and the number of correlation computations is also reduced.
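The asymmetric narrowing of FIG. 4 can be sketched as a small helper that trims the side of the search window opposite the most recent shift. The function name, the `base_extend`/`shrink` parameters, and the inclusive-bounds convention are illustrative assumptions, not part of the disclosed embodiment.

```python
def search_bounds(predicted_pos, last_shift, base_extend=2, shrink=1):
    """Asymmetric search window around the predicted matched block.

    predicted_pos: (row, col) of the predicted matched block
    last_shift:    (drow, dcol) between the two most recent predicted blocks
    Returns ((row_min, row_max), (col_min, col_max)), inclusive bounds.
    """
    def axis(p, d):
        # Motion in the positive direction trims the negative side, and
        # vice versa; no motion on this axis keeps the window symmetric.
        neg = base_extend - shrink if d > 0 else base_extend
        pos = base_extend - shrink if d < 0 else base_extend
        return (p - neg, p + pos)

    (pr, pc), (dr, dc) = predicted_pos, last_shift
    return (axis(pr, dr), axis(pc, dc))

# Predicted block at (3, 3), image just shifted one pixel right:
print(search_bounds((3, 3), (0, 1)))  # ((1, 5), (2, 5)) -- left side trimmed
```

As in FIG. 4, the window spans two pixels on the side the image is moving toward and only one pixel on the side it is moving away from, reducing the number of candidate positions.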
  • The size of the search range, as previously described, is determined by the maximum acceleration of the images. Assuming that the photo capture device captures 1000 digital images per second, the search range is derived by extending one pixel from the predicted matched block in all directions, and the size of a pixel is 30 μm, then the maximum acceleration of the images is derived from Equation (4) as 30 m/s², which is equivalent to 3.06 times the gravitational acceleration G. At a capture rate of 1000 images per second, a search range derived by extending one pixel from the predicted matched block is therefore sufficient in most applications, as the maximum acceleration of the movement would not exceed 30 m/s².
  • a = Δv / T_frame = d_pixel / T_frame² = (30×10⁻⁶ m) / (1/1000 s)² = 30 m/s² ≈ 3.06 G   Equation (4)
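Equation (4) is straightforward to verify numerically. The sketch below is illustrative; the function name and parameters are not from the disclosure.

```python
def max_trackable_acceleration(pixel_size_m, frame_rate_hz, extend_pixels=1):
    """Equation (4): a = d_pixel / T_frame^2, for a search range
    extended by `extend_pixels` around the predicted matched block."""
    t_frame = 1.0 / frame_rate_hz          # capture period in seconds
    return extend_pixels * pixel_size_m / t_frame ** 2

a = max_trackable_acceleration(30e-6, 1000)
print(a, a / 9.8)  # ~30.0 m/s^2, i.e. about 3.06 G
```

Doubling the extension to two pixels doubles the trackable acceleration at the same frame rate, at the cost of the larger 5*5 search window discussed above.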
  • The advantages of using the predicted velocity to reduce the search range comprise reducing the number of correlation computations, and increasing the accuracy of block matching.
  • The size of each photo detector in the photo capture device 221 is typically 60 μm*60 μm, and the lens 21 has a magnification ratio of 1. If, for example, the array in the photo capture device comprises 16*16 CMOS photo detectors and the template block is 8*8, the visible range of the photo capture device 221 is 960 μm*960 μm, and the size of the template block is 480 μm*480 μm. Current printing resolution is around 150˜200 Dots Per Inch (DPI); a resolution of 150 DPI is used in the following example. If one dot represents a feature, there will be approximately (150/inch × 60 μm × 16)² = 5.67² ≈ 32 features on a frame captured by the photo capture device 221, and an 8*8 template block covers roughly 8 features.
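The feature-count estimate above can be reproduced with a short calculation. This is an illustrative sketch of the arithmetic only; the function name and parameters are assumptions.

```python
def features_per_frame(detector_size_m, array_n, dpi, magnification=1.0):
    """Approximate number of printed dots ("features") visible in one frame.

    detector_size_m: side length of one photo detector, in metres
    array_n:         detectors per side of the square array (or block)
    dpi:             printing resolution in dots per inch
    magnification:   lens magnification ratio (>1 shrinks the imaged area)
    """
    inch_m = 0.0254
    surface_side_m = detector_size_m * array_n / magnification
    features_per_side = dpi / inch_m * surface_side_m
    return features_per_side ** 2

print(features_per_frame(60e-6, 16, 150))  # ~32 features for the 16*16 frame
print(features_per_frame(60e-6, 8, 150))   # ~8 features for the 8*8 template block
```

The same function reproduces the later example with 30 μm detectors, where the frame covers roughly 8 features and the template block roughly 2.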
  • The conventional block matching method searches for the matched block in the current frame using a template block. Because there are many repeating patterns in one frame, the template block may cover more than one feature, which can cause more than one matched block to be found in the current frame and complicates the determination of the optical mouse movement. As shown in FIG. 5, the template block of the template frame might match several identical blocks in the current frame 500, and the optical mouse may be unable to determine the displacement, or may determine it incorrectly.
  • The present invention reduces the search range according to the predicted velocity, which also reduces the chance of finding more than one identical matched block. As shown in FIG. 5, if the searching range 502 is a 10*10 block covering the matched block 504 in the center, the number of features in the search range 502 is reduced to 4, and the probability of finding more than one matched block is reduced. Therefore, the present invention improves the reliability of block matching by reducing the search range for block matching.
  • This improvement in reliability is more pronounced when the number of features captured by the photo capture device is reduced. Assume that the size of each photo detector is 30 μm*30 μm, the magnification ratio of the lens is unity (one), and the printing resolution is 150 DPI. The number of features on a frame captured by the photo capture device is then approximately (150/inch × 30 μm × 16)² = 2.83² ≈ 8. The 8*8 template block covers approximately 2 features, and the search range covers around 3 features. Therefore, a single matched block can be found more easily when the number of repeating features is reduced.
  • One way to reduce the number of features in a frame is to reduce the size of the image captured by the photo capture device. The present invention proposes a photo capture device capturing images with a size between 20 μm*20 μm and 40 μm*40 μm. Another way to reduce the number of features in a frame is to increase the magnification ratio of the lens, so that the lens magnifies the surface under the optical mouse and feeds the magnified images to the photo capture device.
  • Furthermore, the present invention proposes a method for changing the location of the template block according to the predicted velocity of the optical mouse or the photo capture device. As shown in FIG. 6, the template block in the template frame 600 moves from the center 602 to the location 604 on the left when the optical mouse detects movement to the right. Moving the template block in the direction opposite the image motion extends the lifetime of the template frame, thus minimizing the frequency of template frame updates. As the image moves to the right, the block 602 in the center will leave the current frame before the block 604 on the left side; the block 604 on the left thus has a higher chance of still appearing in the current frame. Assuming the template frame still overlaps the current frame, so that the matched block remains searchable in the current frame, changing the template block to block 604 reduces the minimum detectable shift angle 608 to tan⁻¹(¼) ≈ 14 degrees, compared with the minimum detectable shift angle 606 of tan⁻¹(½) ≈ 26.5 degrees for the original template block 602. Movement tracking has higher accuracy when the minimum detectable shift angle is smaller.
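The two shift angles quoted above follow directly from the geometry of FIG. 6: the template block can travel a certain number of pixels horizontally before leaving the frame, during which a one-pixel vertical deviation must be detectable. A sketch of that arithmetic, with illustrative names:

```python
import math

def min_shift_angle_deg(horizontal_margin_px, vertical_shift_px=1):
    """Smallest detectable direction change before the template block
    exits the frame, given how many pixels it can travel horizontally."""
    return math.degrees(math.atan(vertical_shift_px / horizontal_margin_px))

print(min_shift_angle_deg(2))  # ~26.6 deg for the centered block 602
print(min_shift_angle_deg(4))  # ~14.0 deg for the relocated block 604
```

Relocating the template block against the direction of motion doubles the usable horizontal margin, roughly halving the minimum detectable shift angle.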
  • The present invention both changes the location of the template block and sets the search range according to the predicted velocity. The template block can be changed when the predicted velocity reaches a first threshold, and the search range can then be set when the predicted velocity reaches another threshold. It is also possible to set the search range first, and then change the location of the template block.
  • The embodiments of the present invention use illumination as an example of the feature; other image features, such as chrominance, are also applicable to the feature enhancement method proposed in the present invention.
  • Finally, while the invention has been described by way of example and in terms of the above, it is to be understood that the invention is not limited to the disclosed embodiment. On the contrary, it is intended to cover various modifications and similar arrangements as would be apparent to those skilled in the art. Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.

Claims (19)

1. A pointing device for a computer system, comprising:
a photo capture device, capturing images sequentially to produce a first frame and a second frame; and
a displacement detection circuit, predicting a velocity of the pointing device, comparing the first frame and the second frame according to the predicted velocity, and obtaining a displacement of the pointing device according to the comparison.
2. The pointing device according to claim 1, wherein the photo capture device comprises a photosensing array with a plurality of photo detectors.
3. The pointing device according to claim 2, wherein an area of each photo detector in the photosensing array is between 20 μm*20 μm and 40 μm*40 μm.
4. The pointing device according to claim 1, wherein the predicted velocity is derived from at least two previous velocities of the pointing device.
5. (canceled)
6. The pointing device according to claim 1, wherein the displacement detection circuit computes the predicted velocity according to an acceleration of the pointing device.
7. The pointing device according to claim 1, wherein the displacement detection circuit defines a template block in the first frame, defines a searching range in the second frame that is smaller than the second frame according to the predicted velocity, and outputs the displacement by searching a matched block in the search range, wherein the matched block is the block with the greatest correlation to the template block in the search range.
8. The pointing device according to claim 1, wherein the displacement detection circuit defines a template block in the first frame according to the predicted velocity, and searches a matched block with the greatest correlation to the template block in the second frame.
9. A displacement estimation method for a pointing device, the pointing device comprising a photo capture device for capturing images sequentially to produce corresponding frames, the method comprising the following steps:
estimating a predicted velocity of the pointing device;
comparing a first frame and a second frame produced by the photo capture device according to the predicted velocity, and obtaining a displacement of the pointing device according to the comparison.
10. The displacement estimation method according to claim 9, wherein the predicted velocity is estimated according to at least two previous velocities.
11. The displacement estimation method according to claim 9, wherein the step of estimating the predicted velocity further comprises estimating the predicted velocity according to an estimated acceleration of the pointing device and at least one of the previous velocities.
12. The displacement estimation method according to claim 9, wherein the step of comparing the first frame and the second frame further comprises:
defining a template block in the first frame;
defining a search range in the second frame according to the predicted velocity; and
searching a matched block in the searching range in order to output the displacement, wherein the matched block is the block with the greatest correlation to the template block.
13. The displacement estimation method according to claim 12, wherein the step of defining the searching range further comprises the following steps:
computing a predicted block according to the predicted velocity, wherein the predicted block is the same size as the template block, and is the most probable location in the second frame for the template block according to the predicted velocity of the pointing device; and
extending the predicted block to set the search range.
14. The displacement estimation method according to claim 12, further comprising:
changing a location of the template block when the predicted velocity exceeds a first velocity; and
setting the search range based on an acceleration of the photo capture device.
15-16. (canceled)
17. The displacement estimation method according to claim 9, wherein the step of comparing the first frame and the second frame further comprises:
defining a template block in the first frame according to the predicted velocity; and
searching a matched block in the second frame, wherein the matched block is the block with the greatest correlation to the template block.
18. The displacement estimation method according to claim 9, wherein the photo capture device comprises a photosensing array with a plurality of photo detectors.
19. The displacement estimation method according to claim 18, wherein an area of each photo detector in the photosensing array is between 20 μm*20 μm and 40 μm*40 μm.
20. A pointing device for a computer system, comprising:
a photo capture device, capturing images sequentially to produce a first frame and a second frame; and
a displacement detection circuit, predicting a velocity of the pointing device according to an acceleration of the pointing device and comparing the first frame and the second frame according to the predicted velocity to obtain a displacement of the pointing device.
US12/217,356 2003-12-29 2008-07-05 Pointing device and displacement estimation method Abandoned US20080291167A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/217,356 US20080291167A1 (en) 2003-12-29 2008-07-05 Pointing device and displacement estimation method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
TW92137319 2003-12-29
TW092137319A TWI230890B (en) 2003-12-29 2003-12-29 Handheld pointing device and method for estimating a displacement
US10/832,203 US7388997B2 (en) 2003-12-29 2004-04-26 One dimensional feature enhancement
US10/875,346 US7417623B2 (en) 2003-12-29 2004-06-24 Pointing device and displacement estimation method
US12/217,356 US20080291167A1 (en) 2003-12-29 2008-07-05 Pointing device and displacement estimation method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/875,346 Continuation US7417623B2 (en) 2003-12-29 2004-06-24 Pointing device and displacement estimation method

Publications (1)

Publication Number Publication Date
US20080291167A1 2008-11-27

Family

ID=34738147

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/875,346 Active 2026-06-03 US7417623B2 (en) 2003-12-29 2004-06-24 Pointing device and displacement estimation method
US12/217,356 Abandoned US20080291167A1 (en) 2003-12-29 2008-07-05 Pointing device and displacement estimation method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/875,346 Active 2026-06-03 US7417623B2 (en) 2003-12-29 2004-06-24 Pointing device and displacement estimation method

Country Status (2)

Country Link
US (2) US7417623B2 (en)
TW (1) TWI230890B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW530489B (en) * 2001-09-11 2003-05-01 Pixart Imaging Inc Moving distance detection method of image sensor
SE0300913D0 (en) * 2003-03-31 2003-03-31 Print Dreams Europe Ab Method for navigation with optical sensors, and a device utilizing the method
TWI236289B (en) * 2004-08-11 2005-07-11 Pixart Imaging Inc Interactive device capable of improving image processing
US9024880B2 (en) * 2004-08-11 2015-05-05 Pixart Imaging Inc. Interactive system capable of improving image processing
KR100628101B1 (en) * 2005-07-25 2006-09-26 엘지전자 주식회사 Mobile telecommunication device having function for inputting letters and method thereby
US7733329B2 (en) * 2005-10-19 2010-06-08 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Pattern detection using an optical navigation device
KR100845321B1 (en) * 2006-08-18 2008-07-10 주식회사 애트랩 Optical navigation device and method for compensating an offset in optical navigation device.
US8169420B2 (en) * 2008-02-05 2012-05-01 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Reporting optical tracking data based on integrated resolution switching and surface detection
US8509544B2 (en) * 2008-03-13 2013-08-13 Kabushiki Kaisha Toshiba Template matching apparatus and method thereof
US8611584B2 (en) * 2009-08-17 2013-12-17 Avago Technologies General Ip (Singapore) Pte. Ltd. System and method for performing optical navigation using portions of captured frames of image data
US9367146B2 (en) 2011-11-14 2016-06-14 Logiteh Europe S.A. Input device with multiple touch-sensitive zones
US9733727B2 (en) * 2012-12-07 2017-08-15 Wen-Chieh Geoffrey Lee Optical mouse with cursor rotating ability
US9927884B2 (en) * 2015-11-06 2018-03-27 Pixart Imaging (Penang) Sdn. Bhd. Non transitory computer readable recording medium for executing image processing method, and image sensing device applying the image processing method
CN109360523B (en) * 2018-12-12 2020-11-27 惠科股份有限公司 Display panel driving method and driving device and display device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5994710A (en) * 1998-04-30 1999-11-30 Hewlett-Packard Company Scanning mouse for a computer system
US6078618A (en) * 1997-05-28 2000-06-20 Nec Corporation Motion vector estimation system
US6084574A (en) * 1992-10-05 2000-07-04 Logitech, Inc. Compact cursor pointing device utilizing photodetector array
US6263088B1 (en) * 1997-06-19 2001-07-17 Ncr Corporation System and method for tracking movement of objects in a scene
US20020109668A1 (en) * 1995-12-13 2002-08-15 Rosenberg Louis B. Controlling haptic feedback for enhancing navigation in a graphical environment
US20030058218A1 (en) * 2001-07-30 2003-03-27 Crane Randall T. Tracking pointing device motion using a single buffer for cross and auto correlation determination
US20030095140A1 (en) * 2001-10-12 2003-05-22 Keaton Patricia (Trish) Vision-based pointer tracking and object classification method and apparatus
US20040095323A1 (en) * 2002-11-15 2004-05-20 Jung-Hong Ahn Method for calculating movement value of optical mouse and optical mouse using the same
US6950094B2 (en) * 1998-03-30 2005-09-27 Agilent Technologies, Inc Seeing eye mouse for a computer system
US7042439B2 (en) * 2001-11-06 2006-05-09 Omnivision Technologies, Inc. Method and apparatus for determining relative movement in an optical mouse
US7050502B2 (en) * 2001-09-18 2006-05-23 Matsushita Electric Industrial Co., Ltd. Method and apparatus for motion vector detection and medium storing method program directed to the same
US7633489B2 (en) * 2002-04-12 2009-12-15 Samsung Electro-Mechanics Co., Ltd. Navigation system and navigation method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6029833A (en) * 1983-07-28 1985-02-15 Canon Inc Image display device
US5621434A (en) * 1993-08-11 1997-04-15 Object Technology Licensing Corp. Cursor manipulation system and method


Also Published As

Publication number Publication date
US20050151724A1 (en) 2005-07-14
TWI230890B (en) 2005-04-11
US7417623B2 (en) 2008-08-26
TW200521810A (en) 2005-07-01


Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION