US20020012449A1 - Method and apparatus for tracking an object using a continuously adapting mean shift - Google Patents

Method and apparatus for tracking an object using a continuously adapting mean shift Download PDF

Info

Publication number
US20020012449A1
US20020012449A1 (application US09/079,917; granted as US6394557B2)
Authority
US
United States
Prior art keywords
search window
probability distribution
calculating
resizing
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/079,917
Other versions
US6394557B2 (en)
Inventor
Gary Rost Bradski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp filed Critical Intel Corp
Priority to US09/079,917 priority Critical patent/US6394557B2/en
Assigned to INTEL CORPORATION reassignment INTEL CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BRADSKI, GARY ROST
Publication of US20020012449A1 publication Critical patent/US20020012449A1/en
Application granted granted Critical
Publication of US6394557B2 publication Critical patent/US6394557B2/en
Adjusted expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/277Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/162Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

A tracking method is disclosed. The method of the present invention tracks an object using a probability distribution of the desired object. The method operates by first calculating a mean location of a probability distribution within a search window. Next, the search window is centered on the calculated mean location. The steps of calculating a mean location and centering the search window may be performed until convergence. The search window may then be resized. Successive iterations of calculating a mean, centering on the mean, and resizing the search window track an object represented by the probability distribution. In one embodiment, a flesh hue probability distribution is generated from an input video image. The flesh hue probability distribution is used to track a human head within the video image.

Description

    FIELD OF THE INVENTION
  • The present invention relates to the field of image processing, computer vision, and computer graphical user interfaces. In particular, the present invention discloses a video image based tracking system that allows a computer to identify and track the location of a moving object within a sequence of video images. [0001]
  • BACKGROUND OF THE INVENTION
  • There are many applications of object tracking in video images. For example, a security system can be created that tracks people that enter a video image. A user interface can be created wherein a computer tracks the gestures and movements of a person in order to control some activity. [0002]
  • However, traditional object tracking systems are computationally expensive and difficult to use. One example of a traditional method of tracking objects in a scene uses object pattern recognition and edge detection. Such methods are very computationally intensive. Furthermore, such systems are notoriously difficult to train and calibrate. The results produced by such methods often contain a significant amount of jitter, such that the results must be filtered before they can be used for a practical purpose. This additional filtering adds more computation work that must be performed. It would therefore be desirable to have a simpler, more elegant method of visually tracking a dynamic object. [0003]
  • SUMMARY OF THE INVENTION
  • A method of tracking a dynamically changing probability distribution is disclosed. The method operates by first calculating a mean location of a probability distribution within a search window. Next, the search window is centered on the calculated mean location and the search window is then resized. Successive iterations of calculating a mean, centering on the mean, and resizing the search window track an object represented by the probability distribution. [0004]
  • Other objects, features, and advantages of the present invention will be apparent from the accompanying drawings and from the detailed description that follows below. [0005]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The objects, features and advantages of the present invention will be apparent to one skilled in the art, in view of the following detailed description in which: [0006]
  • FIG. 1 illustrates an example computer workstation that may use the teachings of the present invention. [0007]
  • FIG. 2 illustrates a pixel sampling of a human face. [0008]
  • FIG. 3A illustrates a small portion of a sample image being converted into a flesh hue histogram. [0009]
  • FIG. 3B illustrates a normalized flesh hue histogram created by sampling a human face. [0010]
  • FIG. 4 illustrates a probability distribution of flesh hues of an input image. [0011]
  • FIG. 5 illustrates a flow diagram describing the operation of the mean shift method. [0012]
  • FIG. 6 illustrates an example of a continuously adaptive mean shift method applied to one dimensional data. [0013]
  • FIG. 7 illustrates a flow diagram describing the operation of the continuously adaptive mean shift method. [0014]
  • FIG. 8 illustrates an example of the continuously adaptive mean shift method applied to one dimensional data. [0015]
  • FIG. 9 illustrates a flow diagram describing the operation of a head tracker using the continuously adaptive mean shift method. [0016]
  • FIG. 10A illustrates a first diagram of a head within a video frame, a head tracker search window, and a calculation area used by the search window. [0017]
  • FIG. 10B illustrates a second diagram of a head within a video frame that is very close to the camera, a head tracker search window, and a calculation area used by the search window. [0018]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • A method and apparatus for object tracking using a continuous mean shift method is disclosed. In the following description, for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required in order to practice the present invention. For example, the present invention has been described with reference to an image flesh hue probability distribution. However, the same techniques can easily be applied to other types of dynamically changing probability distributions. [0019]
  • The Overall Object Tracking System
  • A method of tracking objects using a continuously adaptive mean shift method on a probability distribution is disclosed. To simplify the disclosure of the invention, one embodiment is presented wherein a human head is located and tracked within a flesh hue probability distribution created from a video image. However, the present invention can easily be used to track other types of objects using other types of probability distribution data. For example, the present invention could be used to track heat emitting objects using an infrared detection system. The present invention can also be used to track objects that are described using non image data such as population distributions. [0020]
  • The disclosed embodiment operates by first capturing a “talking head” video image wherein the head and shoulders of the target person are within the video frame. Next, the method creates a two dimensional flesh hue probability distribution using a preset flesh hue histogram. Then, the location of the target person's head is determined by locating the center of the flesh hue probability distribution. To determine the orientation of the target person's head, the major and minor axes of the flesh hue probability distribution are calculated. [0021]
  • Capturing the Data
  • Example Hardware [0022]
  • FIG. 1 illustrates one possible system for using the teachings of the present invention. In the illustration of FIG. 1, a user 110 is seated in front of a video camera 120. The video camera 120 is used to capture a “talking head” image of the user 110. In the embodiment of FIG. 1, the user is using a computer workstation that comprises a visual display monitor 151, a keyboard 153 for alphanumeric input, a mouse 155 for cursor positioning, and a computer system 157. [0023]
  • Generating a Flesh Hue Histogram [0024]
  • The computer system 157 digitizes the “talking head” image of the user 110 captured by video camera 120. To build a flesh hue histogram, the user positions himself such that the user's head fills a sample area of an image captured by video camera 120. Specifically, referring to FIG. 2, a “talking head” image is displayed where the user's head substantially or completely fills the sample area. The pixels in the sample area are then used to build a flesh hue histogram. [0025]
  • In one embodiment, each pixel in the video image is converted to or captured in a hue (H), saturation (S), and value (V) color space. Certain hue values in the sample region are accumulated into a flesh hue histogram. FIG. 3A illustrates a small nine by nine pixel block, divided into its hue (H), saturation (S), and value (V) components, being converted into a flesh hue histogram. In the embodiment of FIG. 3A, the hue values are grouped into bins wherein each bin comprises five consecutive hue values. Hue values are only accumulated if their corresponding saturation (S) and value (V) values are above respective saturation (S) and value (V) thresholds. Referring to the example of FIG. 3A, the S threshold is 20 and the V threshold is 15 such that a pixel will only be added to the flesh hue histogram if the pixel's S value exceeds 20 and the pixel's V value exceeds 15. Starting at the upper left pixel, this first pixel is added to the flesh hue histogram since the pixel's S value exceeds 20 and the pixel's V value exceeds 15. Thus, a marker 391 is added to the 20 to 24 hue value bin. Similarly, the center pixel of the top row will be added to the 20 to 24 hue value bin as illustrated by marker 392. The center pixel of the right most column will not be added to the flesh hue histogram since its saturation (S) value does not exceed 20. [0026]
  • After sampling all the pixels in the sample area, the flesh hue histogram is normalized such that the maximum value in the histogram is equal to a probability value of one (“1”). In the percentage embodiment illustrated in FIG. 3B, the histogram bins instead contain flesh hue probability values between zero (“0”) and one hundred (“100”) percent. Thus, in the normalized flesh hue probability histogram illustrated in FIG. 3B, pixel hues that are likely to be flesh hues are given high percentage values and pixel hues that are not likely to be flesh hues are given low probability values. [0027]
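  • The histogram-building step described above can be sketched in a few lines of code. The following Python/NumPy fragment is a minimal illustration rather than the patented implementation: the five-value hue bins, the S threshold of 20, the V threshold of 15, and the percentage normalization follow the text, while the function name, the 0 to 179 hue range, and the array layout are assumptions made for the example.

```python
import numpy as np

def build_flesh_hue_histogram(hsv_sample, s_thresh=20, v_thresh=15, bin_width=5):
    """Accumulate hue values from an HSV sample region into a binned histogram.

    hsv_sample: (rows, cols, 3) array of H, S, V values, with H assumed in 0..179.
    Only pixels whose S and V exceed the thresholds contribute a marker to a bin.
    The result is normalized so that the largest bin equals 100 (percent).
    """
    h, s, v = hsv_sample[..., 0], hsv_sample[..., 1], hsv_sample[..., 2]
    valid = (s > s_thresh) & (v > v_thresh)        # skip washed-out or dark pixels
    n_bins = int(np.ceil(180 / bin_width))
    hist = np.zeros(n_bins)
    bins = (h[valid] // bin_width).astype(int)     # group hues into five-value bins
    np.add.at(hist, bins, 1)                       # add one marker per qualifying pixel
    if hist.max() > 0:
        hist *= 100.0 / hist.max()                 # normalize the peak to 100 percent
    return hist
```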
  • Generating Flesh Hue Probability Images [0028]
  • Once a flesh hue probability histogram has been created, the computer system 157 can quickly convert video images captured from the video camera into flesh hue probability distributions. This is performed by replacing the pixels in a video image with their respective flesh hue probability values, using the flesh hue histogram of FIG. 3B as a look-up table. FIG. 4 illustrates an example of a two dimensional image of a talking head wherein the pixels have been replaced with a percentage probability value that specifies the probability of the pixel being flesh. As apparent in FIG. 4, the pixels that comprise the person's face are given high probabilities of being flesh. [0029]
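  • Converting a captured frame into a flesh hue probability image is then a table look-up, as described above. Below is a minimal sketch under the same assumptions as the previous example; zeroing pixels whose saturation or value falls below the thresholds is an illustrative choice, not a requirement stated in the text.

```python
import numpy as np

def flesh_probability_image(hsv_frame, hist, s_thresh=20, v_thresh=15, bin_width=5):
    """Replace each pixel with its flesh hue probability (0..100) via the histogram."""
    h, s, v = hsv_frame[..., 0], hsv_frame[..., 1], hsv_frame[..., 2]
    prob = hist[(h // bin_width).astype(int)]          # histogram used as a look-up table
    prob[(s <= s_thresh) | (v <= v_thresh)] = 0.0      # low S or V: treat as non-flesh
    return prob
```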
  • Object Tracking Using Mean Shift
  • Once a probability distribution has been created, the teachings of the present invention can be used to locate the center of an object and to track the object. An early embodiment of the present invention uses a standard mean shift method to track objects that have been converted into probability distributions. [0030]
  • FIG. 5 graphically illustrates how the standard mean shift method operates. Initially, at steps 510 and 520, an initial search window size and initial search window location are selected. The method then computes the “mean” location of the search window at step 530. At step 540, the center of the search window is moved onto the mean location that was computed in step 530. At step 550, the method determines if it has converged upon the center of the probability distribution. This can be done by determining if the search window was moved by a distance less than a preset threshold value. If the mean shift method has converged, then it is done. If the mean shift method has not converged, then the method returns to step 530 where the mean of the new search window location is calculated. [0031]
  • An example of the mean shift method in operation is presented in FIG. 6. To simplify the explanation, the example is provided using a one dimension slice of a two dimensional probability distribution. However, the same principles apply for a two or more dimensional probability distribution. Referring to step 0 of FIG. 6, a five sample wide search window is placed at an initial location. After a first iteration of the method, the search window is moved to the left as illustrated in step 1 of FIG. 6. The search window was moved left since the mean location of the search window samples was left of the initial search window location. After a second iteration, the search window is again moved left as illustrated in step 2. The search window was moved left since the mean location of the search window samples in step 1 is left of the center of the search window in step 1. After a third iteration, the search window again moves left. However, for all subsequent iterations, the search window will remain stationary (provided the distribution data does not change). Thus, by the third iteration, the mean shift method has converged. [0032]
  • To use the mean shift method for two dimensional image data, the following procedures are followed: [0033]
  • Find the zeroth moment: [0034]

    M_{00} = \sum_x \sum_y I(x, y) \qquad (1)

  • Find the first moment for x & y: [0035]

    M_{10} = \sum_x \sum_y x \, I(x, y); \qquad M_{01} = \sum_x \sum_y y \, I(x, y) \qquad (2)

  • Then the mean location (the centroid) is: [0036]

    x_c = \frac{M_{10}}{M_{00}}; \qquad y_c = \frac{M_{01}}{M_{00}} \qquad (3)
  • Where I(x, y) is the image value at position (x, y) in the image, and x and y range over the search window. [0037]
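  • Equations 1 through 3 translate directly into code. The sketch below runs the mean shift loop of FIG. 5 over a probability image; the iteration cap and the one-pixel convergence threshold are illustrative choices rather than values taken from the patent.

```python
import numpy as np

def mean_shift(prob, cx, cy, w, h, max_iter=20, eps=1.0):
    """Move a w-by-h search window centered at (cx, cy) onto the mean location of
    the probability mass inside it, repeating until the move is smaller than eps."""
    rows, cols = prob.shape
    m00 = 0.0
    for _ in range(max_iter):
        x0, x1 = int(max(cx - w // 2, 0)), int(min(cx + w // 2 + 1, cols))
        y0, y1 = int(max(cy - h // 2, 0)), int(min(cy + h // 2 + 1, rows))
        window = prob[y0:y1, x0:x1]
        m00 = window.sum()                                    # zeroth moment (eq. 1)
        if m00 == 0:
            break                                             # no probability mass here
        ys, xs = np.mgrid[y0:y1, x0:x1]
        m10, m01 = (xs * window).sum(), (ys * window).sum()   # first moments (eq. 2)
        new_cx, new_cy = m10 / m00, m01 / m00                 # centroid (eq. 3)
        moved = abs(new_cx - cx) + abs(new_cy - cy)
        cx, cy = new_cx, new_cy
        if moved < eps:
            break                                             # converged (step 550)
    return cx, cy, m00
```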
  • The mean shift method disclosed with reference to FIG. 5 and FIG. 6 provides relatively good results, but it does have a few flaws. For example, for dynamically changing and moving probability distributions such as probability distributions derived from video sequences, there is no proper fixed search window size. Specifically, a small window might get caught tracking a user's nose or get lost entirely for large movements. A large search window might include a user and his hands as well as people in the background. Thus, if the distribution dynamically changes in time, then a static search window will not produce optimal results. [0038]
  • Object Tracking Using Continuously Adaptive Mean Shift
  • To improve upon the mean shift method, the present invention introduces a continuously adaptive mean shift method referred to as a CAMSHIFT method. The CAMSHIFT method dynamically adjusts the size of the search window to produce improved results. The dynamically adjusting search window allows the mean shift method to operate better in environments where the data changes dynamically. [0039]
  • FIG. 7 graphically illustrates how the CAMSHIFT method of the present invention operates. At steps 710 and 720, an initial search window size and initial search window location are selected. The CAMSHIFT method performs one or more iterations of the mean shift method to move the search window at step 730. At step 750, the method adjusts the size of the search window. The size of the search window may be dependent upon information gathered about the data. Next, at step 760, the method determines if it has converged upon the center of the probability distribution. If the mean shift method has converged, then the method is done. If the CAMSHIFT method has not converged, then the method returns to step 730 where the mean shift method is performed using the new search window location and the new search window size. [0040]
  • An example of the continuously adaptive mean shift method in operation is presented in FIG. 8. Again, to simplify the explanation, the example is provided using a one dimension slice of a two dimensional probability distribution. However, the same principles apply for a two or more dimensional distribution. In the example of FIG. 8, the continuously adaptive mean shift method adjusts the search window to a size that is proportional to the square root of the zeroth moment. Specifically, the continuously adaptive mean shift method in the example of FIG. 8 in two dimensions adjusts the search window to have a width and height of: [0041]

    w = h = 2\sqrt{M_{00}} \qquad (4)

  • wherein M_{00} is the zeroth moment of the data within the search window (see equation 1). For N dimensional distributions, where N ranges from 1 to infinity, each side of the search window would be set to [0042]

    w = \alpha_i \cdot M_{00}^{1/N} \qquad (5)

  • where \alpha_i is a positive constant. However, other embodiments may use other methods of determining the search window size. [0043]
  • Referring to step 0 of FIG. 8, a three sample wide search window is placed at an initial location. After a first iteration of the method, the search window is moved to the left as illustrated in step 1 of FIG. 8. The search window was moved left since the mean location of the search window samples was left of the initial search window location. After a second iteration, the search window is again moved left as illustrated in step 2 since the mean location of the search window samples in step 1 is left of the center of the search window in step 1. However, it should also be noted that the size of the search window increased since the amount of data in the search window has increased. After a third iteration, the center of the search window again moves left and the search window size again increases. It can be seen in the subsequent iterations that the adaptive mean shift method adjusts the window size as it converges upon the mean of the data. Referring to step 7 of FIG. 8, the continuously adaptive mean shift method converges upon the mean of the contiguous data. It has been found that the continuously adaptive mean shift method with a search window width and height set equal to 2\sqrt{M_{00}} will typically find the center of the largest connected region of a probability distribution, a great benefit for tracking one of multiple confusable objects. [0044]
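  • A sketch of the CAMSHIFT loop of FIG. 7 is shown below, built on the mean_shift helper from the sketch following equation 3. The resizing step applies equation 4 (w = h = 2\sqrt{M_{00}}); note that the scale of M_{00} depends on whether the probability image stores values in 0..1 or 0..100, a practical detail this illustration glosses over, and the outer convergence test is likewise an assumption.

```python
import numpy as np

def camshift(prob, cx, cy, w, h, max_iter=10, eps=1.0):
    """Continuously adaptive mean shift: alternate mean shift moves (step 730)
    with resizing the search window from the zeroth moment (step 750)."""
    for _ in range(max_iter):
        new_cx, new_cy, m00 = mean_shift(prob, cx, cy, w, h)   # one or more mean shift moves
        side = 2.0 * np.sqrt(m00)                              # equation 4
        new_w = new_h = max(3, int(round(side)) | 1)           # force an odd size, minimum 3
        converged = (abs(new_cx - cx) < eps and abs(new_cy - cy) < eps
                     and (new_w, new_h) == (w, h))
        cx, cy, w, h = new_cx, new_cy, new_w, new_h
        if converged:
            break                                              # step 760
    return cx, cy, w, h
```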
  • Head Tracking Using Continuously Adaptive Mean Shift
  • To provide an example usage of the continuously adaptive mean shift method, one specific embodiment of a head tracking system is provided. However, many variations exist. The example is provided with reference to FIG. 9, FIG. 10A and FIG. 10B. [0045]
  • To reduce the amount of data that needs to be processed, the head tracker uses a limited “calculation region” that defines the area that will be examined closely. Specifically, the calculation region is the area for which a probability distribution is calculated. Area outside of the calculation region is not operated upon. Referring to step 907 of FIG. 9, the head tracker initially sets the calculation region to include the entire video frame. The initial search window size and location are selected to capture a head that is substantially centered in the video frame. Again, the initial search window size can include the entire video frame. Thus, the present invention can be implemented without any initial search window size or search window location parameters that need to be determined. [0046]
  • Next, at step 914, the flesh hue probability distribution is calculated in the calculation region. During the first iteration, the probability distribution is calculated for the entire video frame. The continuously adaptive mean shift method is applied by steps 921, 935, and 942 to locate the region with the highest probability density. At step 949, the size and location of the search window are reported. This information can be used for tracking the head location and size. The size parameter can be used to determine a distance of the head from the camera. [0047]
  • At step 956, the size and location of the search window are used to determine a new calculation region. The calculation region is the area 1030 centered around the search window 1020 as illustrated in FIG. 10A, such that a flesh hue probability distribution will only be calculated for the area around the head. By using the size and location of the search window, the “lock” of the motion-tracker on the object of interest is reinforced. In the example of a head tracker, when a person is close to the camera as illustrated in FIG. 10B, the flesh probability distribution will be large, and any movements the person makes will also be large in the absolute number of pixels translated, so the calculation region must be large. But when the person is far from the camera as illustrated in FIG. 10A, the flesh probability distribution will be small, and even if the person moves quite fast, the number of pixels that the person translates will be small since the person is so far from the camera, so the calculation region can be small. After determining the location of the new calculation region, the method returns to step 914 where the method calculates the probability distribution in the new calculation area. The method then proceeds to search for the area with the greatest probability density. [0048]
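  • The per-frame loop of FIG. 9 can be outlined as below, reusing the flesh_probability_image and camshift helpers sketched earlier. The margin used to grow the calculation region around the search window is an illustrative parameter; the text states that the calculation region is centered on the search window and must be large enough to contain the head's likely motion, but it does not give a formula for its size.

```python
import numpy as np

def track_head(hsv_frames, hist):
    """Yield (cx, cy, w, h) per frame, computing the flesh hue probability only
    inside a calculation region centered on the previous search window.

    hsv_frames: a sequence of HSV frames of identical size.
    """
    rows, cols = hsv_frames[0].shape[:2]
    cx, cy, w, h = cols // 2, rows // 2, cols, rows         # step 907: whole-frame start
    x0, y0, x1, y1 = 0, 0, cols, rows                       # initial calculation region
    for frame in hsv_frames:
        prob = np.zeros((rows, cols))
        prob[y0:y1, x0:x1] = flesh_probability_image(frame[y0:y1, x0:x1], hist)  # step 914
        cx, cy, w, h = camshift(prob, cx, cy, w, h)         # steps 921, 935, 942
        yield cx, cy, w, h                                  # step 949: report size/location
        mx, my = w, h                                       # illustrative margin: one window
        x0, y0 = int(max(cx - w // 2 - mx, 0)), int(max(cy - h // 2 - my, 0))
        x1, y1 = int(min(cx + w // 2 + mx, cols)), int(min(cy + h // 2 + my, rows))  # step 956
```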
  • A Kickstart Method [0049]
  • To initially determine the size and location of the search window, other methods of object detection and tracking may be used. For example, in one embodiment, a motion difference is calculated for successive video frames. The center of the motion difference is then selected as the center of the search window since the center of the motion difference is likely to be a person in the image. [0050]
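  • The kickstart described above amounts to differencing two frames and taking the centroid of the pixels that changed. Below is a minimal sketch; the grayscale input and the difference threshold are assumptions made for the example.

```python
import numpy as np

def kickstart_center(gray_prev, gray_curr, diff_thresh=25):
    """Return the centroid of the motion difference between two grayscale frames,
    for use as the initial search window center (None if nothing moved)."""
    moving = np.abs(gray_curr.astype(int) - gray_prev.astype(int)) > diff_thresh
    if not moving.any():
        return None
    ys, xs = np.nonzero(moving)
    return xs.mean(), ys.mean()
```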
  • Search Window Sizing [0051]
  • In a digital embodiment such as digitized video images, the probability distributions are discrete. Since the methods of the present invention climb the gradient of a probability distribution, the minimum search window size must be greater than one in order to detect a gradient. Furthermore, in order to center the search window, the search window should be of odd size. Thus, for discrete distributions, the minimum window size is set at three. Also, as the method adapts the search window size, the size of the search window is rounded to the nearest odd number greater than or equal to three in order to be able to center the search window. For tracking colored objects in video sequences, we adjust the search window size as described in equation 4. [0052]
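  • The sizing rule of the preceding paragraph reduces to a small helper; the function name is chosen for illustration only, and the camshift sketch above forces odd sizes in the same spirit.

```python
def round_to_valid_window_size(size):
    """Round a proposed search window side to the nearest odd integer, minimum 3."""
    odd = 2 * int((size - 1) / 2.0 + 0.5) + 1   # nearest odd integer for size >= 1
    return max(3, odd)
```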
  • Determining Orientation [0053]
  • After the probability distribution has been located by the search window, the orientation of the probability distribution can be determined. In the example of a flesh hue tracking system used to locate a human head, the orientation of the head can be determined. To determine the probability distribution orientation, the second moments of the probability distribution are calculated. Equation 6 describes how the second moments are calculated. [0054]
  • Second moments are: [0055]

    M_{20} = \sum_x \sum_y x^2 I(x, y); \qquad M_{02} = \sum_x \sum_y y^2 I(x, y) \qquad (6)
  • After determining the second moments of the probability distribution (together with the mixed moment M_{11} = \sum_x \sum_y x \, y \, I(x, y), which equation 7 uses), the orientation of the probability distribution (the angle of the head) can be determined. [0056]
  • Then the object orientation (major axis) is: [0057]

    \theta = \frac{1}{2} \arctan\left( \frac{2 \left( \frac{M_{11}}{M_{00}} - x_c y_c \right)}{\left( \frac{M_{20}}{M_{00}} - x_c^2 \right) - \left( \frac{M_{02}}{M_{00}} - y_c^2 \right)} \right) \qquad (7)
  • In the embodiment of a head tracker, the orientation of the probability distribution is highly correlated with the orientation of the person's head. [0058]
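  • Equations 6 and 7 can be evaluated directly over the located search window. A minimal sketch follows; it also computes the mixed moment M_{11} that equation 7 requires, and it uses arctan2 rather than a plain arctangent so that a zero denominator does not cause an error, which is an implementation convenience rather than something the text prescribes.

```python
import numpy as np

def window_orientation(window, x0=0, y0=0):
    """Return the major-axis angle (radians) of the probability mass in the window,
    from its first and second moments (equations 6 and 7). x0, y0 give the window's
    offset in the full image if absolute coordinates are desired."""
    ys, xs = np.mgrid[y0:y0 + window.shape[0], x0:x0 + window.shape[1]]
    m00 = window.sum()
    if m00 == 0:
        return 0.0                              # empty window: orientation undefined
    xc, yc = (xs * window).sum() / m00, (ys * window).sum() / m00
    m20 = (xs ** 2 * window).sum()              # equation 6
    m02 = (ys ** 2 * window).sum()              # equation 6
    m11 = (xs * ys * window).sum()              # mixed second moment
    num = 2.0 * (m11 / m00 - xc * yc)
    den = (m20 / m00 - xc ** 2) - (m02 / m00 - yc ** 2)
    return 0.5 * np.arctan2(num, den)           # equation 7
```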
  • The foregoing has described a method of tracking objects by tracking probability densities. It is contemplated that changes and modifications may be made, by one of ordinary skill in the art, to the materials and arrangements of elements of the present invention without departing from the scope of the invention. [0059]

Claims (20)

We claim:
1. A method of tracking a probability distribution, said method comprising:
calculating a mean location of a probability distribution within a search window;
centering said search window onto said mean location;
resizing said search window; and
repeating said steps of calculating, resizing and centering.
2. The method as claimed in claim 1 wherein said steps of calculating and centering are performed until said search window converges.
3. The method as claimed in claim 1 wherein resizing said search window comprises resizing said search window dependent upon data located within said search window.
4. The method as claimed in claim 1 wherein resizing said search window comprises resizing said search window to a size that is proportional to the Nth root of a zeroth moment of said data wherein N is a dimension of said data located within said search window.
5. The method as claimed in claim 1 wherein calculating said mean location comprises calculating a zeroth and first moment of data within said search window.
6. The method as claimed in claim 1, said method further comprising:
determining an orientation of said probability distribution.
7. A method of tracking an object within a video image, said method comprising:
converting said video image into a probability distribution;
calculating a mean location of a probability distribution within a search window;
centering said search window onto said mean location; and
repeating said steps of calculating and centering.
8. The method as claimed in claim 7, said method further comprising:
resizing said search window.
9. The method as claimed in claim 8 wherein resizing said search window comprises resizing said search window dependent upon data located within said search window.
10. The method as claimed in claim 8 wherein resizing said search window comprises resizing said search window to a length and a width that are proportional to the square root of a zeroth moment of said data located within said search window.
11. The method as claimed in claim 7 wherein calculating said mean location comprises calculating a zeroth and first moment of data within said search window.
12. The method as claimed in claim 7, said method further comprising the step of:
determining an orientation of said probability distribution.
13. The method as claimed in claim 12 wherein determining said orientation comprises determining a major axis of said probability distribution.
14. The method as claimed in claim 7 wherein said probability distribution comprises a flesh hue probability distribution.
15. An apparatus for tracking an object, said apparatus comprising:
a video camera, said video camera capturing an image of said object;
a video digitizer, said video digitizer digitizing said image; and
a computer system, said computer system converting said image into a probability distribution, said computer system iteratively calculating a mean location of a probability distribution within a subarea of said image and centering said subarea onto said mean location.
16. The apparatus as claimed in claim 15 wherein calculating said mean location comprises calculating a zeroth and first moment of data within said search window.
17. The apparatus as claimed in claim 15 wherein said computer system resizes said subarea dependent upon data located within said subarea.
18. The apparatus as claimed in claim 17 wherein resizing said subarea comprises resizing said subarea to a size that is proportional to the Nth root of a zeroth moment of said data wherein N is a dimension of said data located within said subarea.
19. The apparatus as claimed in claim 15 wherein calculating said mean location comprises calculating a zeroth and first moment of data within said search window.
20. The apparatus as claimed in claim 15 wherein said computer determines an orientation of said probability distribution.
US09/079,917 1998-05-15 1998-05-15 Method and apparatus for tracking an object using a continuously adapting mean shift Expired - Fee Related US6394557B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/079,917 US6394557B2 (en) 1998-05-15 1998-05-15 Method and apparatus for tracking an object using a continuously adapting mean shift

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/079,917 US6394557B2 (en) 1998-05-15 1998-05-15 Method and apparatus for tracking an object using a continuously adapting mean shift

Publications (2)

Publication Number Publication Date
US20020012449A1 true US20020012449A1 (en) 2002-01-31
US6394557B2 US6394557B2 (en) 2002-05-28

Family

ID=22153633

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/079,917 Expired - Fee Related US6394557B2 (en) 1998-05-15 1998-05-15 Method and apparatus for tracking an object using a continuously adapting mean shift

Country Status (1)

Country Link
US (1) US6394557B2 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070014471A1 (en) * 2005-07-18 2007-01-18 Sergey Simanovsky Method of and system for splitting compound objects in multi-energy computed tomography images
US20090238406A1 (en) * 2006-09-29 2009-09-24 Thomson Licensing Dynamic state estimation
CN101894378A (en) * 2010-06-13 2010-11-24 南京航空航天大学 Moving target visual tracking method and system based on double ROI (Region of Interest)
CN102074000A (en) * 2010-11-23 2011-05-25 天津市亚安科技电子有限公司 Tracking method for adaptively adjusting window width by utilizing optimal solution of variance rate
US8139896B1 (en) * 2005-03-28 2012-03-20 Grandeye, Ltd. Tracking moving objects accurately on a wide-angle video
US20130113940A1 (en) * 2006-09-13 2013-05-09 Yoshikazu Watanabe Imaging device and subject detection method
US20130148850A1 (en) * 2011-12-13 2013-06-13 Fujitsu Limited User detecting apparatus, user detecting mehtod, and computer-readable recording medium storing a user detecting program
CN103426185A (en) * 2013-08-09 2013-12-04 北京博思廷科技有限公司 Method and device for adjusting target scale in pan-tilt-zoom (PTZ) tracking process
US20150243049A1 (en) * 2012-10-22 2015-08-27 Nokia Technologies Oy Classifying image samples
US20160366308A1 (en) * 2015-06-12 2016-12-15 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic device and image tracking method thereof
CN110532432A (en) * 2019-08-21 2019-12-03 深圳供电局有限公司 A kind of personage's trajectory retrieval method and its system, computer readable storage medium
CN111010590A (en) * 2018-10-08 2020-04-14 传线网络科技(上海)有限公司 Video clipping method and device

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6757422B1 (en) * 1998-11-12 2004-06-29 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US7787017B2 (en) * 2000-05-18 2010-08-31 OptigraP Sagl Digital camera and method for identification of objects
US6788809B1 (en) * 2000-06-30 2004-09-07 Intel Corporation System and method for gesture recognition in three dimensions using stereo imaging and color vision
US6920237B2 (en) * 2000-12-19 2005-07-19 Eastman Kodak Company Digital image processing method and computer program product for detecting human irises in an image
US6760465B2 (en) 2001-03-30 2004-07-06 Intel Corporation Mechanism for tracking colored objects in a video sequence
US20020140705A1 (en) * 2001-03-30 2002-10-03 Frazer Matthew E. Automated Calibration for colored object tracking
US6985179B2 (en) * 2001-03-30 2006-01-10 Intel Corporaiton Determining image quality for improving object trackability
US7274803B1 (en) 2002-04-02 2007-09-25 Videomining Corporation Method and system for detecting conscious hand movement patterns and computer-generated visual feedback for facilitating human-computer interaction
JP4009851B2 (en) * 2002-05-20 2007-11-21 セイコーエプソン株式会社 Projection-type image display system, projector, program, information storage medium, and image projection method
JP4009850B2 (en) * 2002-05-20 2007-11-21 セイコーエプソン株式会社 Projection-type image display system, projector, program, information storage medium, and image projection method
US6925122B2 (en) * 2002-07-25 2005-08-02 National Research Council Method for video-based nose location tracking and hands-free computer input devices based thereon
US7317812B1 (en) 2002-11-15 2008-01-08 Videomining Corporation Method and apparatus for robustly tracking objects
WO2005031552A2 (en) * 2003-09-30 2005-04-07 Koninklijke Philips Electronics, N.V. Gesture to define location, size, and/or content of content window on a display
US7590291B2 (en) 2004-12-06 2009-09-15 Intel Corporation Method and apparatus for non-parametric hierarchical clustering
US7246100B2 (en) * 2004-12-06 2007-07-17 Intel Corporation Classifying an analog voltage in a control system using binary classification of time segments determined by voltage level
US7272583B2 (en) * 2004-12-06 2007-09-18 Intel Corporation Using supervised classifiers with unsupervised data
FR2885719B1 (en) * 2005-05-10 2007-12-07 Thomson Licensing Sa METHOD AND DEVICE FOR TRACKING OBJECTS IN AN IMAGE SEQUENCE
US7835542B2 (en) * 2005-12-29 2010-11-16 Industrial Technology Research Institute Object tracking systems and methods utilizing compressed-domain motion-based segmentation
US8374388B2 (en) * 2007-12-28 2013-02-12 Rustam Stolkin Real-time tracking of non-rigid objects in image sequences for which the background may be changing
US8098888B1 (en) 2008-01-28 2012-01-17 Videomining Corporation Method and system for automatic analysis of the trip of people in a retail space using multiple cameras
US8284258B1 (en) 2008-09-18 2012-10-09 Grandeye, Ltd. Unusual event detection in wide-angle video (based on moving object trajectories)
GB0818561D0 (en) * 2008-10-09 2008-11-19 Isis Innovation Visual tracking of objects in images, and segmentation of images
US9417700B2 (en) * 2009-05-21 2016-08-16 Edge3 Technologies Gesture recognition systems and related methods
US8396252B2 (en) 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles
WO2012030872A1 (en) 2010-09-02 2012-03-08 Edge3 Technologies Inc. Method and apparatus for confusion learning
US8655093B2 (en) 2010-09-02 2014-02-18 Edge 3 Technologies, Inc. Method and apparatus for performing segmentation of an image
US8666144B2 (en) 2010-09-02 2014-03-04 Edge 3 Technologies, Inc. Method and apparatus for determining disparity of texture
US8582866B2 (en) 2011-02-10 2013-11-12 Edge 3 Technologies, Inc. Method and apparatus for disparity computation in stereo images
US8970589B2 (en) 2011-02-10 2015-03-03 Edge 3 Technologies, Inc. Near-touch interaction with a stereo camera grid structured tessellations
US9672609B1 (en) 2011-11-11 2017-06-06 Edge 3 Technologies, Inc. Method and apparatus for improved depth-map estimation
US10721448B2 (en) 2013-03-15 2020-07-21 Edge 3 Technologies, Inc. Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
US10115016B2 (en) 2017-01-05 2018-10-30 GM Global Technology Operations LLC System and method to identify a vehicle and generate reservation
US10229601B2 (en) 2017-01-30 2019-03-12 GM Global Technology Operations LLC System and method to exhibit vehicle information

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8703931D0 (en) * 1987-02-19 1993-05-05 British Aerospace Tracking systems
US4907169A (en) * 1987-09-30 1990-03-06 International Technical Associates Adaptive tracking vision and guidance system
US5267329A (en) * 1990-08-10 1993-11-30 Kaman Aerospace Corporation Process for automatically detecting and locating a target from a plurality of two dimensional images
GB9019538D0 (en) * 1990-09-07 1990-10-24 Philips Electronic Associated Tracking a moving object
DE69233722T2 (en) * 1991-09-12 2009-02-12 Fujifilm Corp. Method for determining object images and method for determining the movement thereof
JP3318680B2 (en) * 1992-04-28 2002-08-26 サン・マイクロシステムズ・インコーポレーテッド Image generation method and image generation device
US5287437A (en) 1992-06-02 1994-02-15 Sun Microsystems, Inc. Method and apparatus for head tracked display of precomputed stereo images
US5550928A (en) * 1992-12-15 1996-08-27 A.C. Nielsen Company Audience measurement system and method
US5394202A (en) 1993-01-14 1995-02-28 Sun Microsystems, Inc. Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
US5961571A (en) * 1994-12-27 1999-10-05 Siemens Corporated Research, Inc Method and apparatus for automatically tracking the location of vehicles

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8139896B1 (en) * 2005-03-28 2012-03-20 Grandeye, Ltd. Tracking moving objects accurately on a wide-angle video
US7539337B2 (en) * 2005-07-18 2009-05-26 Analogic Corporation Method of and system for splitting compound objects in multi-energy computed tomography images
US20070014471A1 (en) * 2005-07-18 2007-01-18 Sergey Simanovsky Method of and system for splitting compound objects in multi-energy computed tomography images
US20130113940A1 (en) * 2006-09-13 2013-05-09 Yoshikazu Watanabe Imaging device and subject detection method
US8830346B2 (en) * 2006-09-13 2014-09-09 Ricoh Company, Ltd. Imaging device and subject detection method
US20090238406A1 (en) * 2006-09-29 2009-09-24 Thomson Licensing Dynamic state estimation
CN101894378A (en) * 2010-06-13 2010-11-24 南京航空航天大学 Moving target visual tracking method and system based on double ROI (Region of Interest)
CN102074000A (en) * 2010-11-23 2011-05-25 天津市亚安科技电子有限公司 Tracking method for adaptively adjusting window width by utilizing optimal solution of variance rate
US20130148850A1 (en) * 2011-12-13 2013-06-13 Fujitsu Limited User detecting apparatus, user detecting mehtod, and computer-readable recording medium storing a user detecting program
US9280649B2 (en) * 2011-12-13 2016-03-08 Fujitsu Limited Apparatus and method for detecting an object from an image
US10096127B2 (en) * 2012-10-22 2018-10-09 Nokia Technologies Oy Classifying image samples
US20150243049A1 (en) * 2012-10-22 2015-08-27 Nokia Technologies Oy Classifying image samples
CN103426185A (en) * 2013-08-09 2013-12-04 北京博思廷科技有限公司 Method and device for adjusting target scale in pan-tilt-zoom (PTZ) tracking process
US10015371B2 (en) * 2015-06-12 2018-07-03 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic device and image tracking method thereof
US20160366308A1 (en) * 2015-06-12 2016-12-15 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic device and image tracking method thereof
CN111010590A (en) * 2018-10-08 2020-04-14 传线网络科技(上海)有限公司 Video clipping method and device
CN110532432A (en) * 2019-08-21 2019-12-03 深圳供电局有限公司 A kind of personage's trajectory retrieval method and its system, computer readable storage medium

Also Published As

Publication number Publication date
US6394557B2 (en) 2002-05-28

Similar Documents

Publication Publication Date Title
US6394557B2 (en) Method and apparatus for tracking an object using a continuously adapting mean shift
US6363160B1 (en) Interface using pattern recognition and tracking
Greiffenhagen et al. Statistical modeling and performance characterization of a real-time dual camera surveillance system
US8369574B2 (en) Person tracking method, person tracking apparatus, and person tracking program storage medium
CN109344702B (en) Pedestrian detection method and device based on depth image and color image
CN110780739B (en) Eye control auxiliary input method based on gaze point estimation
US8577151B2 (en) Method, apparatus, and program for detecting object
US8374392B2 (en) Person tracking method, person tracking apparatus, and person tracking program storage medium
US7650015B2 (en) Image processing method
US8553931B2 (en) System and method for adaptively defining a region of interest for motion analysis in digital video
US20040213460A1 (en) Method of human figure contour outlining in images
JP2003526841A (en) Face extraction system and method based on biometrics
WO2009109127A1 (en) Real-time body segmentation system
US20090245575A1 (en) Method, apparatus, and program storage medium for detecting object
US7181047B2 (en) Methods and apparatus for identifying and localizing an area of relative movement in a scene
WO1998043105A2 (en) Object tracking system using statistical modeling and geometric relationship
KR100777199B1 (en) Apparatus and method for tracking of moving target
EP1105842B1 (en) Image processing apparatus
US20090245576A1 (en) Method, apparatus, and program storage medium for detecting object
CN111291701A (en) Sight tracking method based on image gradient and ellipse fitting algorithm
US20030052971A1 (en) Intelligent quad display through cooperative distributed vision
CN113608663A (en) Fingertip tracking method based on deep learning and K-curvature method
CN113485615A (en) Method and system for making typical application intelligent image-text tutorial based on computer vision
CN111145216B (en) Tracking method of video image target
JP4559375B2 (en) Object position tracking method, apparatus, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BRADSKI, GARY ROST;REEL/FRAME:009179/0726

Effective date: 19980513

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20100528