US20180300539A1 - Method and Apparatus for Pattern Tracking


Info

Publication number
US20180300539A1
Authority
US
United States
Prior art keywords
hand
pill
region
determined
images
Legal status
Abandoned
Application number
US15/894,073
Inventor
Lei Guan
Adam Hanina
Current Assignee
AIC Innovations Group Inc
Original Assignee
AIC Innovations Group Inc
Application filed by AIC Innovations Group Inc
Priority to US15/894,073
Assigned to AIC INNOVATIONS GROUP, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: AI Cure Technologies, Inc.
Assigned to AI Cure Technologies, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GUAN, Lei, HANINA, ADAM
Publication of US20180300539A1
Assigned to WESTERN ALLIANCE BANK. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AIC INNOVATIONS GROUP, INC.

Classifications

    • G06K9/00355
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/66Trinkets, e.g. shirt buttons or jewellery items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

A method and apparatus for pattern tracking. The method includes the steps of performing a foreground detection process to determine a hand-pill-hand region, performing image segmentation to separate the determined hand portion of the hand-pill-hand region from the pill portion thereof, building three reference models, one for each hand region and one for the pill region, initializing a dynamic model for tracking the hand-pill-hand region, determining N possible next positions for the hand-pill-hand region, for each such determined position, determining various features, building a new model for that region in accordance with the determined position, for each position, comparing the new model and a reference model, determining a position whose new model generates a highest similarity score, determining whether that similarity score is greater than a predetermined threshold, and wherein if it is determined that the similarity score is greater than the predetermined threshold, the object is tracked.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/447,243 to Guan et al., filed Feb. 28, 2011, titled “Method and Apparatus for Pattern Tracking”, the entire contents thereof being incorporated herein by reference.
  • FIELD OF THE INVENTION
  • This invention relates generally to the tracking of objects through time and space, and more particularly to the recognition and tracking of a particular object being held by a particular holder by a single web camera or the like employing still or sequential video images. The invention also relates to the tracking of a pill or other medication held between the fingers of a medication administrator.
  • BACKGROUND OF THE INVENTION
  • Automatic identification and tracking of objects in three dimensional space utilizing only a standard webcam is difficult in that there is no simple way of determining distance from the camera. Furthermore, selecting a particular object from a plurality of objects may be difficult in that the lack of depth perception does not allow for differentiation of these objects based on their position along the axis of distance from the camera (the z-axis). Complicated images may therefore result in an unacceptable number of false positive or false negative responses.
  • Application of such a tracking scheme to a pill management system may be particularly troublesome in that a pill or other medication may be small, and may be colored similarly to a background object such as the shirt of a user, wall or other object. Furthermore, the user may move the pill or other medication quickly through the field of view of the camera. If implemented on a mobile device or the like, movement of the device in addition to movement of the pill may contribute to tracking difficulties, as may various environmental difficulties, such as poor lighting, background noise and the like. These variables may contribute to a very challenging situation for pill identification and tracking over time.
  • Therefore, it would be desirable to provide a method and apparatus that overcome the drawbacks of the prior art.
  • SUMMARY OF THE INVENTION
  • In U.S. patent application Ser. No. 12/620,686 filed Nov. 18, 2009 titled Method and Apparatus for Verification of Medication Administration Adherence; Ser. No. 12/646,383 filed Dec. 23, 2009 titled Method and Apparatus for Verification of Clinical Trial Adherence; Ser. No. 12/646,603 filed Dec. 23, 2009 titled Method and Apparatus for Management of Clinical Trials; and Ser. No. 12/728,721 filed Mar. 22, 2010 titled Apparatus and Method for Collection of Protocol Adherence Data, the entire contents of each of these applications being incorporated herein by reference, as well as in other co-owned applications, the inventors of the present invention describe a system for automatically monitoring patient adherence to a medication protocol. As part of that application, determination of when a user places a pill in their mouth is an important step. Such a determination further requires that such a pill is first determined to be present in a field of view, and thereafter tracked through that field of view.
  • As noted above, such a determination may be made particularly difficult when employing a simple webcam that does not include the ability to determine distance, such as with a time of flight camera or a stereo camera pair, and in particular if the determination is to be made on a small pill with a potentially close proximity to the webcam. In such a situation, an image captured by the webcam comprises a two dimensional picture of a scene without the ability to differentiate between near and far objects. Various complications as noted above may make such determinations even more difficult.
  • In accordance with various embodiments of the invention, known patterns may be exploited in order to track one or more objects. Thus, in one preferred embodiment of the present invention, knowledge of the color of the skin of a user, or use of a range of possible or potential skin tones, may allow for the tracking of a pill or other medication by searching for a combination of colors including “skin-pill-skin”, thus allowing for differentiation of such a pill even from a background with a color similar to that of the pill. Once identified, the color sequence may be tracked through various images over time. If the image is lost, prompting may be provided to the user to place the pill at a particular location in the display to allow for re-identification and continued tracking of the pill. Such a tracking scheme may be extended to tracking any number of types of objects, and in particular any such objects being held by a user in a manner in which skin tone from the user's hand or the like is visible in a relatively fixed relationship to the object to be tracked.
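  • To make the “skin-pill-skin” search concrete, the following is a minimal sketch of scanning image columns for that vertical color sequence. The reference colors, the tolerance, and the helper names are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

# Assumed reference colors (RGB) and matching tolerance; real values would
# be calibrated from the user's fingertips and the known pill.
SKIN_REF = np.array([190.0, 140.0, 110.0])
PILL_REF = np.array([250.0, 250.0, 250.0])
TOLERANCE = 60.0

def classify(pixel: np.ndarray) -> str:
    """Label a pixel 'skin', 'pill', or 'other' by distance to a reference color."""
    if np.linalg.norm(pixel - SKIN_REF) < TOLERANCE:
        return "skin"
    if np.linalg.norm(pixel - PILL_REF) < TOLERANCE:
        return "pill"
    return "other"

def find_skin_pill_skin(image: np.ndarray):
    """Return the first column index whose top-to-bottom labels contain the
    run sequence skin -> pill -> skin, or None if no column matches."""
    height, width, _ = image.shape
    for x in range(width):
        labels = [classify(image[y, x].astype(float)) for y in range(height)]
        # Keep the first label of each run, ignoring unmatched pixels.
        runs = [lab for i, lab in enumerate(labels)
                if lab != "other" and (i == 0 or labels[i - 1] != lab)]
        for i in range(len(runs) - 2):
            if runs[i:i + 3] == ["skin", "pill", "skin"]:
                return x
    return None
```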
  • The inventive system may also be extended to use in auditing of various desired action sequences, and in particular to a sequence to be performed by, for example, a surgeon, where the skin tone and medical device color combination may be identified and tracked to provide an automated audit of actions performed. A similar audit may be performed of other actions of doctors, nurses, or other healthcare providers. The system may also be applicable to other, non-medical applications.
  • Still other objects and advantages of the invention will in part be obvious and will in part be apparent from the specification and drawings.
  • The invention accordingly comprises the several steps and the relation of one or more of such steps with respect to each of the others, and the apparatus embodying features of construction, combinations of elements and arrangement of parts that are adapted to effect such steps, all as exemplified in the following detailed disclosure, and the scope of the invention will be indicated in the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a more complete understanding of the invention, reference is made to the following description and accompanying drawings, in which:
  • FIG. 1 is a flowchart diagram depicting an embodiment of the invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The invention will now be described making reference to the following drawings in which like reference numbers denote like structure or steps.
  • In accordance with one or more embodiments of the present invention, when attempting to track a pill being held and administered by a user, an assumption may be made that the pill is always held between two fingers, or otherwise held in the palm of the user. Employing this assumption, a basis for tracking the pill in accordance with embodiments of the invention may comprise tracking a specific color and/or shape pattern (“fingertip-pill-fingertip”, “palm-pill-palm”, or “cup-pill-cup”) after determining a fingertip or palm color and/or shape of the user, or the color of a holding device such as a cup (use of the fingertips and color will continue to be assumed through the remainder of the description, although any of the alternative embodiments may be employed), and preferably knowing the color of the pill in advance, although determination of the pill color in real or near-real time may also be provided. As noted above, this sequence may also be applicable to objects other than a pill, such as a medical instrument or other device to be tracked. In one embodiment of the invention, the user may be asked to place the pill so that, when imaged by a webcam, it fills a particular portion of a display. A determination of the pill color may be made, and the colors immediately above and below the pill may further be determined to be the fingertips, and thus comprise the fingertip color, of the user. Alternatively, a user may be asked to place their fingers or palm, or both, in a specific location to be imaged so that the color thereof may be determined. Additional characteristics of the pill or other objects to be tracked, such as shape, color combinations, markings or the like may also be employed. The inventive system thereafter may split the imaged region into three sub-regions with individual reference color models for each region (i.e. a model for the top fingertip, a model for the pill and a model for the bottom fingertip), generating a pattern signature that may be distinguishable from other objects or sequences of objects that may enter the field of view of the webcam, as sketched below. Alternatively, as will be described below, a single reference color model may be used for the entire fingertip-pill-fingertip area.
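  • As a sketch of the three-sub-region idea, the snippet below splits a fingertip-pill-fingertip region into equal vertical bands and models each band's color by its mean and covariance. The equal-thirds split and the model form are assumptions for illustration; the disclosure leaves the segmentation and model details open.

```python
import numpy as np

def color_model(patch: np.ndarray):
    """Mean color and color covariance of an (h, w, 3) image patch."""
    pixels = patch.reshape(-1, 3).astype(float)
    return pixels.mean(axis=0), np.cov(pixels, rowvar=False)

def build_reference_models(region: np.ndarray) -> dict:
    """Split a fingertip-pill-fingertip region into three horizontal bands
    and build one reference color model per band."""
    h = region.shape[0]
    top = region[: h // 3]
    middle = region[h // 3: 2 * h // 3]
    bottom = region[2 * h // 3:]
    return {
        "top_fingertip": color_model(top),
        "pill": color_model(middle),
        "bottom_fingertip": color_model(bottom),
    }
```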
  • In order to aid in determining whether a pill is a correct pill, a comparison of the imaged pill with prior reference images of one or more known pills is made. In this particular instance, features including the color of the fingers and pill, as well as any other characteristic features of the pill and/or fingers, may be relied upon. The features of these images are then each preferably represented by a feature vector. In order to determine similarity between the images, a distance between each pair of image vectors may be calculated, using one or more known techniques, such as Euclidean distance, Mahalanobis distance or one or more other known machine learning methods. Each distance value is then preferably fit into a predetermined Gaussian distribution to generate a probability value which may be used to evaluate similarity. This predetermined Gaussian distribution may be populated by acquisition of various images and processing thereof to generate the distribution. A later acquired image with its feature vector is then compared to the library of acquired images to determine a distance between its feature vector and the distribution of feature vectors, thus providing a probability value indicative of the likelihood of a match between two images. Thus, through this process, confidence rates, indicative of confidence of a match between an acquired set of images and a reference set of images, may be calculated for each region (finger, pill, finger, for example) by measuring the similarity of an observed color and other noted characteristics, and comparing with the predetermined characteristics of each of the one or more reference images. In accordance with an embodiment of the invention, score level fusion, decision fusion, or other appropriate processing may be applied to those confidence rates to aid in determining if the region contains the desired pill or not. As a result, if a high level of assurance can be achieved that a newly acquired image is in fact a desired image, it may be possible to reduce further processing requirements, by reducing resolution, frame rate, etc., requiring only enough processing power to confirm that the object has not changed. Such features may be beneficial when employing the system on a mobile or other reduced-processing-power device.
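  • A minimal sketch of this scoring pipeline follows, assuming features have already been reduced to vectors: distances between known matching pairs are fit to a Gaussian, and a newly observed distance is mapped to a confidence. The constants and the normalized-likelihood form are illustrative assumptions.

```python
import numpy as np

def fit_distance_gaussian(match_distances: np.ndarray):
    """Estimate the mean and standard deviation of distances observed
    between feature vectors of known matching image pairs."""
    return float(match_distances.mean()), float(match_distances.std())

def match_confidence(distance: float, mu: float, sigma: float) -> float:
    """Gaussian likelihood of the distance under the matching-pair model,
    scaled so the most typical matching distance scores 1.0."""
    return float(np.exp(-0.5 * ((distance - mu) / sigma) ** 2))

# Usage with made-up numbers: a library of distances between known matches,
# then a new Euclidean distance between two feature vectors.
library = np.array([0.8, 1.1, 0.9, 1.3, 1.0])
mu, sigma = fit_distance_gaussian(library)
new_distance = float(np.linalg.norm(np.array([2.0, 1.5]) - np.array([2.6, 2.2])))
print(match_confidence(new_distance, mu, sigma))  # near 1.0 suggests a match
```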
  • In addition to comparing any such resulting score or confidence level to one or more thresholds to determine one or more levels of similarity, these confidence levels may be stored and compared over multiple medication administrations to determine whether the user is improving over time in taking their medication, and may be used to flag situations where the user may be trying to trick the system, or may benefit from additional training regarding the process of medication administration. Other factors such as lighting, background colors and other conditions may impact a particular acquired color or the acquisition of other desired characteristics, and thus may also be employed to suggest a change of location, lighting, hardware, or other adjustments that may affect the ability of the system to properly acquire images.
  • Advantages of the inventive method include, among others: 1) it is relatively robust to variations in shape, rotation, scale, lighting, movement of a mobile platform and the like; 2) it is relatively robust to changes in background, including a background that may have a color similar to that of the pill or another object; 3) it exploits prior information in the form of pattern identification assumptions.
  • As noted above, in accordance with various embodiments of the invention, a reference color model for the fingertip regions and pill region may be determined. Separating the colors in this region for proper separate acquisition, while potentially improving subsequent recognition of those elements, presents another challenge. In accordance with this particular embodiment of the invention, no previous knowledge or information regarding the pill, including its color, need be known. This information is preferably captured during run-time (i.e., as a first step before the object is tracked). Therefore, the system preferably determines when a hand-pill-hand combination is within a region of interest, and then differentiates the two hand portions and the pill for the purposes of building reference color models. Embodiments of the invention may preferably employ motion information to determine whether the “hand-pill-hand” is within a predetermined region of interest of the image. If large areas of motion are measured, image segmentation techniques may be used to further analyze that region of the image. If that region is able to be segmented into three parts (hand, pill, hand), the inventive system is then able to build the color model for each region. If it is determined that the region cannot be segmented, the system may further determine that the image of the region being investigated may not include a proper hand-pill-hand combination. Alternatively, if the region can be determined to include the hand-pill-hand combination but for some reason cannot be properly segmented, the entire region may be used and treated as a single entity.
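  • The following sketch illustrates this motion-gated check under simplifying assumptions: frame differencing flags a moving region, a crude equal-thirds split stands in for true segmentation, and the middle band must differ in mean color from the outer bands before three separate models are accepted. All thresholds are illustrative.

```python
import numpy as np

MOTION_THRESH = 25.0     # per-pixel grayscale change counted as motion
MIN_MOTION_FRAC = 0.05   # fraction of moving pixels that triggers analysis
BAND_DISTINCT = 30.0     # mean-color distance separating pill from hands

def motion_region(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Bounding box (y0, y1, x0, x1) of changed pixels, or None if quiet."""
    moving = np.abs(curr_gray.astype(float) - prev_gray.astype(float)) > MOTION_THRESH
    if moving.mean() < MIN_MOTION_FRAC:
        return None
    ys, xs = np.nonzero(moving)
    return ys.min(), ys.max(), xs.min(), xs.max()

def segment_or_fallback(region: np.ndarray) -> dict:
    """Build hand/pill/hand models when the region splits cleanly,
    otherwise fall back to a single model for the whole region."""
    h, _, c = region.shape
    bands = [region[: h // 3], region[h // 3: 2 * h // 3], region[2 * h // 3:]]
    means = [band.reshape(-1, c).astype(float).mean(axis=0) for band in bands]
    if (np.linalg.norm(means[1] - means[0]) > BAND_DISTINCT
            and np.linalg.norm(means[1] - means[2]) > BAND_DISTINCT):
        return {"top_hand": means[0], "pill": means[1], "bottom_hand": means[2]}
    return {"whole_region": region.reshape(-1, c).astype(float).mean(axis=0)}
```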
  • Therefore, referring first to FIG. 1, processing begins at step 100 and at step 105 a user is asked to place a pill in a particular portion of a field of view of a webcam, as may be indicated to them on a display including their image and a graphical or other locator for pill placement. Once placed in an appropriate location, a foreground detection process is performed at step 110 to determine a hand-pill-hand region in a manner as described above. Thereafter, at step 115, image segmentation may be performed to separate the determined hand portion of the hand-pill-hand region from the pill portion thereof. Then, at step 120, three reference models may be built, one for each hand region and one for the pill region based upon their appearances. Such appearances may include, but are not limited to, shape, color, texture, gray-scale intensity, histogram of colors, or other determinable attributes of the hand and pill. Alternatively, a single reference model for the entire hand-pill-hand region may be built. Next, at step 125, a dynamic model for tracking the hand-pill-hand region is initialized.
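  • For the dynamic model initialized at step 125, one common choice (assumed here; the disclosure does not specify one) is a constant-velocity state that predicts where the hand-pill-hand window will appear next:

```python
import numpy as np

class ConstantVelocityModel:
    """Minimal dynamic model: state is (position, velocity) of the
    hand-pill-hand window center, updated from successive detections."""

    def __init__(self, y: float, x: float):
        self.pos = np.array([y, x], dtype=float)
        self.vel = np.zeros(2)

    def predict(self) -> np.ndarray:
        """Predicted next center, used to seed the N candidate positions."""
        return self.pos + self.vel

    def update(self, y: float, x: float) -> None:
        """Fold in a confirmed detection (step 160 builds on this idea)."""
        new_pos = np.array([y, x], dtype=float)
        self.vel = new_pos - self.pos
        self.pos = new_pos
```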
  • Thereafter, based upon that dynamic model, N possible next positions for the hand-pill-hand region may be determined at step 130, and at step 135, for each such determined position, various features may be determined and a new model for that region may be built in accordance with the determined position. Next, at step 140, for each such position a comparison is made between the new model and a reference model, and at step 145 the position whose new model generates the highest similarity score (thus, having the smallest feature vector difference) relative to the current reference model is determined. It is then inquired at step 150 whether that similarity score is greater than a predetermined threshold. If it is determined that the similarity score is not greater than the predetermined threshold, and thus the inquiry is answered in the negative, processing returns to step 130, and an additional N possible hand positions may be presented. If, on the other hand, the inquiry at step 150 is answered in the affirmative, and it is determined that the similarity score for one of the determined new model positions is greater than the predetermined threshold, processing continues with step 155, where a new position of the pill (and thus the hand-pill-hand combination) is determined to have been found. Processing then moves to step 160, where a new dynamic model is determined, and processing then passes to step 130, where N possible new hand-pill-hand positions are determined.
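  • A compact sketch of the candidate-search loop of steps 130 through 160 appears below. The mean-color feature, the exponential similarity score, and all constants are stand-ins chosen for illustration; the disclosure does not fix these.

```python
import numpy as np

N_CANDIDATES = 16    # N possible next positions (step 130)
SIM_THRESHOLD = 0.6  # predetermined threshold (step 150)
HALF = 24            # half-size of the tracked window, in pixels
STEP = 10            # search radius around the previous position

def patch_feature(image, y, x):
    """Mean color of the window centered at (y, x): a stand-in feature."""
    return image[y - HALF: y + HALF, x - HALF: x + HALF].reshape(-1, 3).mean(axis=0)

def similarity(feature, reference):
    """Map feature distance to a (0, 1] similarity score."""
    return float(np.exp(-np.linalg.norm(feature - reference) / 50.0))

def track_step(image, last_pos, ref_feature, rng):
    """One iteration: score N candidates against the reference model and
    accept the best only if it clears the threshold (steps 130-155)."""
    h, w, _ = image.shape
    best_score, best_pos = -1.0, None
    for _ in range(N_CANDIDATES):
        y = int(np.clip(last_pos[0] + rng.integers(-STEP, STEP + 1), HALF, h - HALF - 1))
        x = int(np.clip(last_pos[1] + rng.integers(-STEP, STEP + 1), HALF, w - HALF - 1))
        score = similarity(patch_feature(image, y, x), ref_feature)
        if score > best_score:
            best_score, best_pos = score, (y, x)
    if best_score > SIM_THRESHOLD:
        return best_pos, best_score   # step 155: new position found
    return None, best_score           # step 150 answered in the negative
```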
  • As is noted in the description of FIG. 1, both motion tracking and image segmentation techniques may be employed. Of course, while only one of these techniques may be employed in isolation, another advantage of the embodiments of the invention that employ motion and image segmentation techniques in combination is that the ratios of the three segmented sub-regions (top hand, pill, bottom hand) can be obtained. The ratios not only help to divide the tracking window into three sub-regions but also help with subsequent tracking. Although the size of the tracking window may change over time, the approximate ratio of these segmented sub-regions is likely to remain the same, providing a useful reference point for tracking.
  • In accordance with another preferred embodiment of the invention, a dynamic feature selection optimization scheme based on one or more decision fusion confidence levels for pill or other object recognition and tracking may be employed. As is well known, and as has been recognized by the inventors of the present invention, processing power is generally considered to be expensive, especially in mobile devices. It has further been determined by the inventors of the present invention that when confidence levels of object or pill identification or tracking are high, the need to perform high levels of processing may be significantly reduced. This may allow the system to switch to lower-performance machine vision computations, resulting in better performance and potentially lower bandwidth requirements. A similar approach may be to downsample images based on confidence levels and multiscaling. For example, an original image could be downsampled to 1/20, 1/10, 1/5 or the like. If the confidence is low at 1/20 scale, then the system may preferably utilize the next level until an acceptable confidence is achieved. Such adjustments may be made in a dynamic manner because the performance is continually monitored over time and within multiple frames to optimize confidence levels.
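  • A sketch of this coarse-to-fine scheme follows, assuming a hypothetical `analyze` callable that returns a confidence in [0, 1] for whatever recognizer is in use; the scale ladder mirrors the 1/20, 1/10, 1/5 example above.

```python
import numpy as np

SCALES = [20, 10, 5, 1]   # downsampling factors, cheapest (coarsest) first
CONFIDENCE_OK = 0.8       # illustrative acceptance threshold

def downsample(image: np.ndarray, factor: int) -> np.ndarray:
    """Naive decimation; a production system would low-pass filter first."""
    return image[::factor, ::factor]

def multiscale_confidence(image: np.ndarray, analyze):
    """Run the recognizer at increasingly fine scales, stopping at the
    first scale whose confidence clears the threshold."""
    conf, factor = 0.0, SCALES[0]
    for factor in SCALES:
        conf = analyze(downsample(image, factor))
        if conf >= CONFIDENCE_OK:
            break
    return conf, factor
```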
  • In addition, a process of feature selection may be optimized due to the fact that some features require higher computational power than others: a) in predictive color scanning, the inventive system may discard non-core pixel colors that do not match expected results to avoid wasting computational bandwidth; b) when performing predictive shape scanning, the system may employ search algorithms based on identifying known corners or image features; and c) the system may apply a similar methodology to unique markings on the pill or shape of interest instead of identifying points in a procedural manner. The system therefore may automatically narrow confidence levels to optimize confirmation, and in each case reduce the number of points needed for each feature to optimize accuracy.
  • In accordance with another embodiment of the invention, upon tracking of a pill or other object, if identification of such an object is difficult, the user may be encouraged by a proximity indicator to bring the pill close enough to the camera to ensure the correct amount of data is captured to both confirm likeness and to ensure the proper metrics are tracked. The system may then guide the user for correct proximity placement to maximize confidence levels of object verification.
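  • One way such a proximity indicator might be implemented is to use the pill's apparent pixel area as a proxy for its distance from the camera, since a webcam cannot measure depth directly; the area threshold and message below are illustrative assumptions.

```python
MIN_PILL_AREA_PX = 1200  # assumed minimum apparent area for reliable verification

def proximity_prompt(pill_area_px: int):
    """Return a user-facing prompt when the tracked pill appears too small,
    or None when enough pixels are available for verification."""
    if pill_area_px < MIN_PILL_AREA_PX:
        return "Please bring the pill closer to the camera."
    return None
```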
  • Other objects may be tracked with a simple webcam by the use of similar image segmentation and other aspects of the invention. For example, a brand of a pair of shoes may be identified, and the wearer may be offered a discount when walking by. Based upon known patterns, cans of soda or other food items may be similarly tracked, perhaps requesting that the user bring such a can closer to the webcam to interact with the user on a new sales promotion. The system may further be applied to injectable medications and the like, inhalers, or other medication delivery systems, and in particular may be employed to confirm activation and/or positioning thereof.
  • Therefore, in accordance with various embodiments of the invention, determination of a pill being in a patient's mouth may be tracked and confirmed. Because webcams cannot see depth, as long as the pill or other object is substantially surrounded by an unbroken ring of red color (the inner parts of the mouth), one can safely assume that the pill is no longer held by the fingers or the hand and has been placed in the mouth. The unbroken circle is thus another pattern usable to determine placement, and may be determined and tracked in place of the fingertip or palm as noted above. Identification sequences may be similar to those noted above with respect to the finger-pill-finger combination, but instead employing a color determination of mouth-pill-mouth.
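  • A sketch of the “unbroken ring of red” test: sample points on a circle around the tracked pill position and require every sample to look like mouth interior. The redness heuristic, radius, and sample count are illustrative assumptions.

```python
import numpy as np

def looks_red(pixel: np.ndarray) -> bool:
    """Crude test for mouth-interior red: red channel dominates green and blue."""
    r, g, b = pixel.astype(float)
    return r > 90 and r > 1.5 * g and r > 1.5 * b

def pill_in_mouth(image: np.ndarray, cy: int, cx: int,
                  radius: int = 20, samples: int = 36) -> bool:
    """True when a full circle of samples around (cy, cx) is red, i.e. the
    pill is surrounded by the inner mouth rather than by fingers."""
    for theta in np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False):
        y = int(round(cy + radius * np.sin(theta)))
        x = int(round(cx + radius * np.cos(theta)))
        if not (0 <= y < image.shape[0] and 0 <= x < image.shape[1]):
            return False   # ring leaves the frame: cannot confirm
        if not looks_red(image[y, x]):
            return False   # ring is broken: pill not fully inside the mouth
    return True
```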
  • It will thus be seen that the objects set forth above, among those made apparent from the preceding description, are efficiently attained and, because certain changes may be made in carrying out the above method and in the construction(s) set forth without departing from the spirit and scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
  • It is also to be understood that this description is intended to cover all of the generic and specific features of the invention herein described and all statements of the scope of the invention which, as a matter of language, might be said to fall therebetween.

Claims (20)

What is claimed:
1. A method for tracking a pill in a patient's hand, comprising:
performing a foreground detection process to determine a hand-pill-hand region;
performing image segmentation to separate the determined hand portion of the hand-pill-hand region from the pill portion thereof;
building three reference models, one for each hand region and one for the pill region;
initializing a dynamic model for tracking the hand-pill-hand region;
determining N possible next positions for the hand-pill-hand region;
for each such determined position, determining various features to be employed in a comparison to a reference model;
building a new model for that region in accordance with the determined position;
for each position, comparing the new model and the reference model;
determining a position whose new model generates a highest similarity score;
determining whether that similarity score is greater than a predetermined threshold;
wherein if it is determined that the similarity score is greater than the predetermined threshold, the object is tracked.
2. The method of claim 1, wherein the reference models are combined into a single reference model.
3. The method of claim 1, wherein the feature may be selected from the group of:
shape, color, texture, gray-scale intensity and histogram of colors.
4. A method for tracking a pill in a patient's hand, comprising the steps of:
performing a foreground detection process of an image or series of images to determine a hand-pill-hand region;
performing image segmentation of the image or sequence of images to separate the determined hand portion of the hand-pill-hand region from the pill portion thereof;
building one or more reference models including at least the hand region and the pill region;
determining a feature vector of the hand-pill-hand region;
comparing the determined feature vector to one or more reference feature vectors;
determining a distance between the determined feature vector and each of the one or more reference feature vectors; and
determining an image or set of images corresponding to the determined feature vector to be similar to an image or set of images corresponding to one of the reference feature vectors when the determined distance between those feature vectors is less than a predetermined threshold.
5. The method of claim 4, further comprising the step of fitting each determined distance into a Gaussian distribution to determine a confidence probability of a match between the two corresponding images or sets of images.
6. A method for tracking a pill in a user's hand comprising the steps of:
acquiring one or more images;
determining a hand-pill-hand portion of one or more of the acquired images;
storing an indication of one or more characteristics of the hand-pill-hand portion of the one or more of the acquired images; and
determining a location of the hand-pill-hand portion in a next one or more of the acquired images in accordance with the one or more stored indications.
7. The method of claim 6, further comprising the step of storing a color signature of the hand-pill-hand portion of the one or more of the acquired images as the indication of the one or more characteristics.
8. The method of claim 7, further comprising the step of determining the location of the hand-pill-hand portion in a next one or more of the acquired images in accordance with the stored color signature.
9. The method of claim 7, further comprising the step of distinguishing a pill colored similarly to a background in accordance with the stored color signature.
10. The method of claim 9, wherein the background is a user shirt.
11. The method of claim 9, wherein the background is an environmental surface.
12. The method of claim 6, wherein the step of determining a hand-pill-hand portion of one or more of the acquired images further comprises the steps of:
performing a foreground detection process of an image or series of images to determine a hand-pill-hand region;
performing image segmentation of the image or sequence of images to separate the determined hand portion of the hand-pill-hand region from the pill portion thereof; and
building one or more reference models including at least one of the hand region and the pill region.
13. The method of claim 12, wherein the method of claim 12 further comprises the step of determining a feature vector difference between at least one of the one or more reference models and a hand-pill-hand portion of one or more of the one or more acquired images.
14. The method of claim 13, wherein if the vector difference is less than a predetermined threshold, there is determined to be a match.
15. The method of claim 12, wherein the foreground detection process takes into account one or more environmental factors.
16. The method of claim 15, wherein the one or more environmental factors comprises ambient light.
17. The method of claim 6, wherein the one or more characteristics comprises shape.
18. The method of claim 6, wherein the one or more characteristics comprises color.
19. The method of claim 6, further comprising repeating the processing for a plurality of consecutive acquired images.
20. The method of claim 19, wherein processing is performed on a subset of the plurality of consecutive acquired images.
US15/894,073 2011-02-28 2018-02-12 Method and Apparatus for Pattern Tracking Abandoned US20180300539A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/894,073 US20180300539A1 (en) 2011-02-28 2018-02-12 Method and Apparatus for Pattern Tracking

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201161447243P 2011-02-28 2011-02-28
US13/110,500 US9665767B2 (en) 2011-02-28 2011-05-18 Method and apparatus for pattern tracking
US15/605,695 US9892316B2 (en) 2011-02-28 2017-05-25 Method and apparatus for pattern tracking
US15/894,073 US20180300539A1 (en) 2011-02-28 2018-02-12 Method and Apparatus for Pattern Tracking

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/605,695 Continuation US9892316B2 (en) 2011-02-28 2017-05-25 Method and apparatus for pattern tracking

Publications (1)

Publication Number Publication Date
US20180300539A1 (en) 2018-10-18

Family

ID=46719017

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/110,500 Active 2033-08-23 US9665767B2 (en) 2011-02-28 2011-05-18 Method and apparatus for pattern tracking
US15/605,695 Active US9892316B2 (en) 2011-02-28 2017-05-25 Method and apparatus for pattern tracking
US15/894,073 Abandoned US20180300539A1 (en) 2011-02-28 2018-02-12 Method and Apparatus for Pattern Tracking

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/110,500 Active 2033-08-23 US9665767B2 (en) 2011-02-28 2011-05-18 Method and apparatus for pattern tracking
US15/605,695 Active US9892316B2 (en) 2011-02-28 2017-05-25 Method and apparatus for pattern tracking

Country Status (1)

Country Link
US (3) US9665767B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020135792A1 (en) * 2018-12-28 2020-07-02 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for generating image metric

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10558845B2 (en) * 2011-08-21 2020-02-11 Aic Innovations Group, Inc. Apparatus and method for determination of medication location
JP6069826B2 (en) * 2011-11-25 2017-02-01 ソニー株式会社 Image processing apparatus, program, image processing method, and terminal device
US9747306B2 (en) * 2012-05-25 2017-08-29 Atheer, Inc. Method and apparatus for identifying input features for later recognition
US10412586B2 (en) * 2013-12-17 2019-09-10 Dropbox, Inc. Limited-functionality accounts
US9940726B2 (en) * 2014-12-19 2018-04-10 The Boeing Company System and method to improve object tracking using tracking fingerprints
US10395764B2 (en) * 2015-01-06 2019-08-27 Aic Innovations Group, Inc. Method and apparatus for recognition of patient activity
US10861605B2 (en) * 2016-08-22 2020-12-08 Aic Innovations Group, Inc. Method and apparatus for determining health status
US9958951B1 (en) * 2016-09-12 2018-05-01 Meta Company System and method for providing views of virtual content in an augmented reality environment
KR102624560B1 (en) * 2017-01-31 2024-01-15 엘지전자 주식회사 Cleaner
CN108733203A (en) * 2017-04-20 2018-11-02 上海耕岩智能科技有限公司 A kind of method and apparatus of eyeball tracking operation
US10229313B1 (en) 2017-10-23 2019-03-12 Meta Company System and method for identifying and tracking a human hand in an interactive space based on approximated center-lines of digits
US10701247B1 (en) 2017-10-23 2020-06-30 Meta View, Inc. Systems and methods to simulate physical objects occluding virtual objects in an interactive space
TWI695347B (en) * 2018-07-30 2020-06-01 台灣基督長老教會馬偕醫療財團法人馬偕紀念醫院 Method and system for sorting and identifying medication via its label and/or package
US11743424B1 (en) 2019-01-16 2023-08-29 Omcare Inc. Web enabled audiovisual medication dispensing with enhanced compliance verification
CN110298298B (en) * 2019-06-26 2022-03-08 北京市商汤科技开发有限公司 Target detection and target detection network training method, device and equipment
US11416770B2 (en) 2019-10-29 2022-08-16 International Business Machines Corporation Retraining individual-item models

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6236736B1 (en) * 1997-02-07 2001-05-22 Ncr Corporation Method and apparatus for detecting movement patterns at a self-service checkout terminal
US6269172B1 (en) * 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US6360003B1 (en) * 1997-08-12 2002-03-19 Kabushiki Kaisha Toshiba Image processing apparatus
US6597801B1 (en) * 1999-09-16 2003-07-22 Hewlett-Packard Development Company L.P. Method for object registration via selection of models with dynamically ordered features
US20030219146A1 (en) * 2002-05-23 2003-11-27 Jepson Allan D. Visual motion analysis method for detecting arbitrary numbers of moving objects in image sequences
US6999601B2 (en) * 1999-09-16 2006-02-14 Hewlett-Packard Development Company, Lp Method for visual tracking using switching linear dynamic systems models
US7152051B1 (en) * 2002-09-30 2006-12-19 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
US20080133058A1 (en) * 2006-12-01 2008-06-05 Honda Motor Co., Ltd. Robot, control method therefor and control program therefor
US7650011B2 (en) * 2004-07-09 2010-01-19 Honda Motor Co., Inc. Visual tracking using incremental fisher discriminant analysis
US20100134611A1 (en) * 2008-07-08 2010-06-03 Tomonobu Naruoka Article estimating apparatus and article position estimating apparatus, article estimating method as well as article estimating program
US20100135528A1 (en) * 2008-11-29 2010-06-03 International Business Machines Corporation Analyzing repetitive sequential events
US7747351B2 (en) * 2007-06-27 2010-06-29 Panasonic Corporation Apparatus and method for controlling robot arm, and robot and program
US20110071675A1 (en) * 2009-09-22 2011-03-24 Gm Global Technology Operations, Inc. Visual perception system and method for a humanoid robot
US7983448B1 (en) * 2006-06-02 2011-07-19 University Of Central Florida Research Foundation, Inc. Self correcting tracking of moving objects in video
US8094881B2 (en) * 2004-01-15 2012-01-10 Canon Kabushiki Kaisha Action recognition apparatus and method, moving-object recognition apparatus and method, device control apparatus and method, and program
US20120084091A1 (en) * 2010-10-05 2012-04-05 Ai Cure Technologies Llc Apparatus and method for object confirmation and tracking
US8194921B2 (en) * 2008-06-27 2012-06-05 Nokia Corporation Method, appartaus and computer program product for providing gesture analysis
US8253831B2 (en) * 2008-11-29 2012-08-28 International Business Machines Corporation Location-aware event detection
US8452051B1 (en) * 2010-04-26 2013-05-28 Microsoft Corporation Hand-location post-process refinement in a tracking system
US8494214B2 (en) * 2008-10-30 2013-07-23 Toshiba Global Commerce Solutions Holdings Corporation Dynamically learning attributes of a point of sale operator

Family Cites Families (116)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3814845A (en) 1973-03-01 1974-06-04 Bell Telephone Labor Inc Object positioning
US5065447A (en) 1989-07-05 1991-11-12 Iterated Systems, Inc. Method and apparatus for processing digital data
US5441047A (en) 1992-03-25 1995-08-15 David; Daniel Ambulatory patient health monitoring techniques utilizing interactive visual communication
US5544649A (en) 1992-03-25 1996-08-13 Cardiomedix, Inc. Ambulatory patient health monitoring techniques utilizing interactive visual communication
US6283761B1 (en) 1992-09-08 2001-09-04 Raymond Anthony Joao Apparatus and method for processing and/or for providing healthcare information and/or healthcare-related information
US20010011224A1 (en) 1995-06-07 2001-08-02 Stephen James Brown Modular microprocessor-based health monitoring system
KR970000683B1 (en) 1993-05-31 1997-01-16 삼성전자 주식회사 Resolution adaptive video compression/decompression method and apparatus
US5752621A (en) 1995-03-20 1998-05-19 Eigen Technology Inc. Smart automatic medication dispenser
US5911132A (en) 1995-04-26 1999-06-08 Lucent Technologies Inc. Method using central epidemiological database
US5619991A (en) 1995-04-26 1997-04-15 Lucent Technologies Inc. Delivery of medical services using electronic data communications
US5961446A (en) 1995-10-06 1999-10-05 Tevital Incorporated Patient terminal for home health care system
US8092224B2 (en) 1995-11-22 2012-01-10 James A. Jorasch Systems and methods for improved health care compliance
US5646912A (en) 1996-01-25 1997-07-08 Cousin; Damon S. Medication compliance, co-ordination and dispensing system
US7370797B1 (en) 1996-05-31 2008-05-13 Scott Lindsay Sullivan Pill printing and identification
US5810747A (en) 1996-08-21 1998-09-22 Interactive Remote Site Technology, Inc. Remote site medical intervention system
GB9623573D0 (en) 1996-11-13 1997-01-08 Philips Electronics Nv Image segmentation
GB9626825D0 (en) 1996-12-24 1997-02-12 Crampton Stephen J Avatar kiosk
GB9704843D0 (en) 1997-03-08 1997-04-23 Murphy Graham F Apparatus
US6535637B1 (en) 1997-04-04 2003-03-18 Esco Electronics, Inc. Pharmaceutical pill recognition and verification system
US6233428B1 (en) 1997-09-17 2001-05-15 Bruce Fryer System and method for distribution of child care training materials and remote monitoring of child care centers
JPH11155142A (en) 1997-11-19 1999-06-08 Mitsubishi Electric Corp Medical treatment support system
US6421650B1 (en) 1998-03-04 2002-07-16 Goetech Llc Medication monitoring system and apparatus
US6045501A (en) 1998-08-28 2000-04-04 Celgene Corporation Methods for delivering a drug to a patient while preventing the exposure of a foetus or other contraindicated individual to the drug
US6484144B2 (en) 1999-03-23 2002-11-19 Dental Medicine International L.L.C. Method and system for healthcare treatment planning and assessment
US6607485B2 (en) 1999-06-03 2003-08-19 Cardiac Intelligence Corporation Computer readable storage medium containing code for automated collection and analysis of patient information retrieved from an implantable medical device for remote patient care
US7256708B2 (en) 1999-06-23 2007-08-14 Visicu, Inc. Telecommunications network for remote patient monitoring
DE60045044D1 (en) 1999-09-14 2010-11-11 Topcon Corp Face photographing apparatus and method
US6294999B1 (en) 1999-12-29 2001-09-25 Becton, Dickinson And Company Systems and methods for monitoring patient compliance with medication regimens
US6988075B1 (en) 2000-03-15 2006-01-17 Hacker L Leonard Patient-controlled medical information system and method
US6542902B2 (en) 2000-03-24 2003-04-01 Bridge Medical, Inc. Method and apparatus for displaying medication information
US20010056358A1 (en) 2000-03-24 2001-12-27 Bridge Medical, Inc., Method and apparatus for providing medication administration warnings
GB0014059D0 (en) 2000-06-09 2000-08-02 Chumas Paul D Method and apparatus
ES2340945T3 (en) 2000-07-05 2010-06-11 Smart Technologies ULC Procedure for a camera-based touch system
US6795068B1 (en) 2000-07-21 2004-09-21 Sony Computer Entertainment Inc. Prop input device and method for mapping an object from a two-dimensional camera image to a three-dimensional space for controlling action in a game program
US20020026330A1 (en) 2000-08-23 2002-02-28 Klein Edward E. System and method for patient medication management and compliance using a portable computing device
US6763148B1 (en) 2000-11-13 2004-07-13 Visual Key, Inc. Image recognition methods
US7448544B1 (en) 2000-11-16 2008-11-11 Gsl Solutions, Inc. Tracking system for individual detection of prescription orders contained within a bulk container
EP1350385A4 (en) 2000-12-15 2004-12-15 Leonard Reiffel Imaged coded data source tracking product
US7221794B1 (en) 2000-12-18 2007-05-22 Sportvision, Inc. Foreground detection
JP3308972B1 (en) 2001-01-16 2002-07-29 Self Security Co., Ltd. Medication confirmation support device
US20070118389A1 (en) 2001-03-09 2007-05-24 Shipon Jacob A Integrated teleconferencing system
US6611206B2 (en) 2001-03-15 2003-08-26 Koninklijke Philips Electronics N.V. Automatic system for monitoring independent person requiring occasional assistance
US6879970B2 (en) 2001-04-02 2005-04-12 Invivodata, Inc. Apparatus and method for prediction and management of subject compliance in clinical research
US7415447B2 (en) 2001-04-02 2008-08-19 Invivodata, Inc. Apparatus and method for prediction and management of participant compliance in clinical research
US8065180B2 (en) 2001-04-02 2011-11-22 invivodata®, Inc. System for clinical trial subject compliance
US7395214B2 (en) 2001-05-11 2008-07-01 Craig P Shillingburg Apparatus, device and method for prescribing, administering and monitoring a treatment regimen for a patient
AU2002348242A1 (en) 2001-11-30 2003-06-17 Becton, Dickinson And Company Medication adherence system
US7317967B2 (en) 2001-12-31 2008-01-08 B. Braun Medical Inc. Apparatus and method for transferring data to a pharmaceutical compounding system
WO2003071410A2 (en) 2002-02-15 2003-08-28 Canesta, Inc. Gesture recognition system using depth perceptive sensors
DE10210050A1 (en) 2002-03-07 2003-12-04 Siemens Ag Method and device for repetitive relative positioning of a patient
US7369685B2 (en) 2002-04-05 2008-05-06 Identix Corporation Vision-based operating method and system
US7908155B2 (en) 2002-04-12 2011-03-15 Becton, Dickinson And Company System for collecting, storing, presenting and analyzing immunization data having remote stations in communication with a vaccine and disease database over a network
EP1521615A4 (en) 2002-06-11 2010-11-03 Jeffrey A Matos System for cardiac resuscitation
US7844361B2 (en) 2002-09-26 2010-11-30 Stratamed Labs, Inc. Prescription drug compliance monitoring system
US7774075B2 (en) 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
KR100503039B1 (en) 2002-11-25 2005-07-22 Samsung Techwin Co., Ltd. Method to control operation of digital camera for user to easily take an identification photograph
US7002476B2 (en) 2003-01-30 2006-02-21 Leap Of Faith Technologies, Inc. Medication compliance system
US7769465B2 (en) 2003-06-11 2010-08-03 Matos Jeffrey A System for cardiac resuscitation
US7304228B2 (en) 2003-11-10 2007-12-04 Iowa State University Research Foundation, Inc. Creating realtime data-driven music using context sensitive grammars and fractal algorithms
US7657443B2 (en) 2003-12-19 2010-02-02 Carefusion 303, Inc. Intravenous medication harm index system
US20050144150A1 (en) 2003-12-30 2005-06-30 Shankar Ramamurthy Remote process capture, identification, cataloging and modeling
US8095379B2 (en) 2003-12-30 2012-01-10 Cerner Innovation, Inc. System and method for preemptive determination of the potential for an atypical clinical event related to the administering of medication
US20050182664A1 (en) 2004-02-18 2005-08-18 Klaus Abraham-Fuchs Method of monitoring patient participation in a clinical study
JP4483334B2 (en) 2004-02-18 2010-06-16 富士ゼロックス株式会社 Image processing device
DE102004013814A1 (en) 2004-03-20 2005-10-13 B. Braun Medizintechnologie Gmbh A method of allowing operator input on a medical device
US7761311B2 (en) 2004-03-23 2010-07-20 Board Of Regents, The University Of Texas System Pharmaceutical treatment effectiveness analysis computer system and methods
GB2412831A (en) 2004-03-30 2005-10-05 Univ Newcastle Highlighting important information by blurring less important information
US7627142B2 (en) 2004-04-02 2009-12-01 K-Nfb Reading Technology, Inc. Gesture processing with low resolution images with high resolution processing for optical character recognition for a reading machine
WO2005117697A2 (en) 2004-05-28 2005-12-15 Narayanan Ramasubramanian Unified ingestion package and process for patient compliance with prescribed medication regimen
US8335694B2 (en) 2004-07-09 2012-12-18 Bruce Reiner Gesture-based communication and reporting system
US7562121B2 (en) 2004-08-04 2009-07-14 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US7355594B2 (en) 2004-09-30 2008-04-08 Symbol Technologies, Inc. Optical touch screen arrangement
EP1910956A2 (en) 2005-05-04 2008-04-16 Board of Regents, The University of Texas System System, method and program product for delivering medical services from a remote location
US7782189B2 (en) 2005-06-20 2010-08-24 Carestream Health, Inc. System to monitor the ingestion of medicines
US7616111B2 (en) 2005-06-20 2009-11-10 Carestream Health, Inc. System to monitor the ingestion of medicines
GB2428927A (en) 2005-08-05 2007-02-07 Hewlett Packard Development Co Accurate positioning of a time lapse camera
US8566121B2 (en) 2005-08-29 2013-10-22 Narayanan Ramasubramanian Personalized medical adherence management system
US7881537B2 (en) 2006-01-31 2011-02-01 Honeywell International Inc. Automated activity detection using supervised learning
US20070194034A1 (en) 2006-02-17 2007-08-23 Vasilios Vasiadis Device for printing pills, tablets or caplets in a precise manner
US20070233049A1 (en) 2006-03-28 2007-10-04 Hospira, Inc. Medication administration and management system and method
US20080138604A1 (en) 2006-05-02 2008-06-12 John Kenney Authenticating and identifying objects using markings formed with correlated random patterns
US9031853B2 (en) 2006-05-06 2015-05-12 Irody, Inc. Apparatus and method for obtaining an identification of drugs for enhanced safety
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US8727208B2 (en) 2006-06-30 2014-05-20 Intel-GE Care Innovations LLC Method for identifying pills via an optical device
US8044778B2 (en) 2007-07-12 2011-10-25 Henry Schein, Inc. Injection device and case with reporting ability
US7698002B2 (en) 2006-09-29 2010-04-13 Nellcor Puritan Bennett Llc Systems and methods for user interface and identification in a medical device
US7769345B2 (en) 2006-09-29 2010-08-03 Sony Ericsson Mobile Communications Ab Device and method for guiding a user to a communication position
US7983933B2 (en) 2006-12-06 2011-07-19 Microsoft Corporation Patient monitoring via image capture
US7770136B2 (en) 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US20100092093A1 (en) 2007-02-13 2010-04-15 Olympus Corporation Feature matching method
EP3669895A1 (en) 2007-02-22 2020-06-24 University of Florida Research Foundation, Incorporated Medication adherence monitoring system
DE102007009652A1 (en) 2007-02-26 2008-09-04 Körber Ag Dummy device for patient-based medication with medicinal, pharmaceutical, or dietary-supplement products; a true-to-original visual reproduction of a real packaging section containing the actual products
US7956727B2 (en) 2007-03-22 2011-06-07 Carespeak Communications, Inc. Methods and systems for medication management
US20080303638A1 (en) 2007-03-24 2008-12-11 Hap Nguyen Portable patient devices, systems, and methods for providing patient aid and preventing medical errors, for monitoring patient use of ingestible medications, and for preventing distribution of counterfeit drugs
ES2438976T3 (en) 2007-05-11 2014-01-21 Saab Ab Device and procedure for a vision apparatus
US8154583B2 (en) 2007-05-31 2012-04-10 Eastman Kodak Company Eye gazing imaging for video communications
US20100136509A1 (en) 2007-07-02 2010-06-03 Alden Mejer System and method for clinical trial investigator meeting delivery and training including dynamic media enrichment
US20090012818A1 (en) 2007-07-06 2009-01-08 Valence Broadband, Inc. Dispensing medication and verifying proper medication use
US20090043610A1 (en) 2007-08-07 2009-02-12 Walgreen Co. Comprehensive medication management system
US8538775B2 (en) 2007-08-16 2013-09-17 Qualcomm Incorporated Mobile wireless medication management system
US8002174B2 (en) 2007-12-21 2011-08-23 Becton, Dickinson And Company Medication administration tracking
US20100057646A1 (en) 2008-02-24 2010-03-04 Martin Neil A Intelligent Dashboards With Heuristic Learning
US20090217194A1 (en) 2008-02-24 2009-08-27 Neil Martin Intelligent Dashboards
JP2009237619A (en) 2008-03-25 2009-10-15 Seiko Epson Corp Detection of face area and organ area in image
JP4840413B2 (en) 2008-07-02 2011-12-21 ソニー株式会社 Information display method, information processing apparatus, and information display program
US8146020B2 (en) 2008-07-24 2012-03-27 Qualcomm Incorporated Enhanced detection of circular engagement gesture
US20100042430A1 (en) 2008-08-12 2010-02-18 Irody Inc System and method for collecting and authenticating medication consumption
US20100262436A1 (en) 2009-04-11 2010-10-14 Chen Ying-Yu Medical information system for cost-effective management of health care
US8292832B2 (en) 2009-07-27 2012-10-23 Anthony Vallone Event-based health activity tracking with icon-based user interface
US20110153360A1 (en) 2009-12-23 2011-06-23 Ai Cure Technologies LLC Method and Apparatus for Verification of Clinical Trial Adherence
US20110119073A1 (en) 2009-11-18 2011-05-19 Ai Cure Technologies LLC Method and Apparatus for Verification of Medication Administration Adherence
US9293060B2 (en) 2010-05-06 2016-03-22 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US8893169B2 (en) 2009-12-30 2014-11-18 United Video Properties, Inc. Systems and methods for selectively obscuring portions of media content using a widget
US20110161109A1 (en) 2009-12-31 2011-06-30 Mckesson Financial Holdings Limited Systems and methods for providing adherence-based messages and benefits
US20110195520A1 (en) 2010-02-11 2011-08-11 Ameritox, Ltd. Methods of normalizing measured drug concentrations and testing for non-compliance with a drug treatment regimen
WO2012040554A2 (en) 2010-09-23 2012-03-29 Stryker Corporation Video monitoring system

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6236736B1 (en) * 1997-02-07 2001-05-22 Ncr Corporation Method and apparatus for detecting movement patterns at a self-service checkout terminal
US6360003B1 (en) * 1997-08-12 2002-03-19 Kabushiki Kaisha Toshiba Image processing apparatus
US6269172B1 (en) * 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US6597801B1 (en) * 1999-09-16 2003-07-22 Hewlett-Packard Development Company L.P. Method for object registration via selection of models with dynamically ordered features
US6999601B2 (en) * 1999-09-16 2006-02-14 Hewlett-Packard Development Company, L.P. Method for visual tracking using switching linear dynamic systems models
US20030219146A1 (en) * 2002-05-23 2003-11-27 Jepson Allan D. Visual motion analysis method for detecting arbitrary numbers of moving objects in image sequences
US7152051B1 (en) * 2002-09-30 2006-12-19 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
US8094881B2 (en) * 2004-01-15 2012-01-10 Canon Kabushiki Kaisha Action recognition apparatus and method, moving-object recognition apparatus and method, device control apparatus and method, and program
US7650011B2 (en) * 2004-07-09 2010-01-19 Honda Motor Co., Ltd. Visual tracking using incremental Fisher discriminant analysis
US7983448B1 (en) * 2006-06-02 2011-07-19 University Of Central Florida Research Foundation, Inc. Self correcting tracking of moving objects in video
US20080133058A1 (en) * 2006-12-01 2008-06-05 Honda Motor Co., Ltd. Robot, control method therefor and control program therefor
US8041457B2 (en) * 2006-12-01 2011-10-18 Honda Motor Co., Ltd. Robot, control method therefor and control program therefor
US7747351B2 (en) * 2007-06-27 2010-06-29 Panasonic Corporation Apparatus and method for controlling robot arm, and robot and program
US8194921B2 (en) * 2008-06-27 2012-06-05 Nokia Corporation Method, apparatus and computer program product for providing gesture analysis
US20100134611A1 (en) * 2008-07-08 2010-06-03 Tomonobu Naruoka Article estimating apparatus and article position estimating apparatus, article estimating method as well as article estimating program
US8494214B2 (en) * 2008-10-30 2013-07-23 Toshiba Global Commerce Solutions Holdings Corporation Dynamically learning attributes of a point of sale operator
US20100135528A1 (en) * 2008-11-29 2010-06-03 International Business Machines Corporation Analyzing repetitive sequential events
US8253831B2 (en) * 2008-11-29 2012-08-28 International Business Machines Corporation Location-aware event detection
US20110071675A1 (en) * 2009-09-22 2011-03-24 Gm Global Technology Operations, Inc. Visual perception system and method for a humanoid robot
US8452051B1 (en) * 2010-04-26 2013-05-28 Microsoft Corporation Hand-location post-process refinement in a tracking system
US20120084091A1 (en) * 2010-10-05 2012-04-05 Ai Cure Technologies Llc Apparatus and method for object confirmation and tracking

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020135792A1 (en) * 2018-12-28 2020-07-02 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for generating image metric
US11436720B2 (en) 2018-12-28 2022-09-06 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for generating image metric

Also Published As

Publication number Publication date
US9892316B2 (en) 2018-02-13
US9665767B2 (en) 2017-05-30
US20170270355A1 (en) 2017-09-21
US20120219176A1 (en) 2012-08-30

Similar Documents

Publication Publication Date Title
US9892316B2 (en) Method and apparatus for pattern tracking
US10529071B2 (en) Facial skin mask generation for heart rate detection
US10083233B2 (en) Video processing for motor task analysis
US8401225B2 (en) Moving object segmentation using depth images
CN111274928B (en) Living body detection method and device, electronic device, and storage medium
EP2441383B1 (en) Gaze target determination device and gaze target determination method
EP2869264B1 (en) Information processing device and information processing method
CN113164098A (en) Human gait analysis system and method
WO2018015329A1 (en) Method and system for monitoring the status of the driver of a vehicle
US20110317871A1 (en) Skeletal joint recognition and tracking system
US9477887B2 (en) Apparatus and method for analyzing trajectory
CN110073363B (en) Tracking the head of an object
WO2016107638A1 (en) An image face processing method and apparatus
JP2016512765A (en) On-axis gaze tracking system and method
Tulyakov et al. Robust real-time extreme head pose estimation
KR102369989B1 (en) Color identification using infrared imaging
EP2339507A1 (en) Head detection and localisation method
CN108537144B (en) Multidimensional body gait recognition method and device
US20150199592A1 (en) Contour-based classification of objects
JP2010123019A (en) Device and method for recognizing motion
Marcos-Ramiro et al. Let your body speak: Communicative cue extraction on natural interaction using RGBD data
JP2021086322A (en) Image processing device, image processing method, and program
KR20120049605A (en) Apparatus and method for detecting center of pupil
CN110826495A (en) Method and system for consistently tracking and distinguishing left and right limbs based on face orientation
Führ et al. Robust patch-based pedestrian tracking using monocular calibrated cameras

Legal Events

Date Code Title Description
AS Assignment

Owner name: AI CURE TECHNOLOGIES, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUAN, LEI;HANINA, ADAM;REEL/FRAME:044904/0311

Effective date: 20110518

Owner name: AIC INNOVATIONS GROUP, INC., NEW YORK

Free format text: CHANGE OF NAME;ASSIGNOR:AI CURE TECHNOLOGIES, INC.;REEL/FRAME:045308/0793

Effective date: 20130410

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: WESTERN ALLIANCE BANK, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:AIC INNOVATIONS GROUP, INC.;REEL/FRAME:066128/0170

Effective date: 20231220