WO2002043352A2 - System and method for object identification and behavior characterization using video analysis - Google Patents
System and method for object identification and behavior characterization using video analysis
- Publication number
- WO2002043352A2 WO2002043352A2 PCT/US2001/043282 US0143282W WO0243352A2 WO 2002043352 A2 WO2002043352 A2 WO 2002043352A2 US 0143282 W US0143282 W US 0143282W WO 0243352 A2 WO0243352 A2 WO 0243352A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- behavior
- video
- image
- foreground
- mouse
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 92
- 238000004458 analytical method Methods 0.000 title claims description 63
- 238000012512 characterization method Methods 0.000 title description 18
- 230000033001 locomotion Effects 0.000 claims abstract description 28
- 230000036544 posture Effects 0.000 claims description 64
- 241001465754 Metazoa Species 0.000 claims description 53
- 230000000694 effects Effects 0.000 claims description 41
- 238000003860 storage Methods 0.000 claims description 19
- 238000004422 calculation algorithm Methods 0.000 claims description 18
- 238000001514 detection method Methods 0.000 claims description 15
- 238000012935 Averaging Methods 0.000 claims description 12
- 230000007937 eating Effects 0.000 claims description 10
- 238000012545 processing Methods 0.000 claims description 8
- 230000002123 temporal effect Effects 0.000 claims description 7
- 238000005204 segregation Methods 0.000 claims description 6
- 230000007704 transition Effects 0.000 claims description 6
- 230000008859 change Effects 0.000 claims description 5
- 239000003814 drug Substances 0.000 claims description 5
- 229940079593 drug Drugs 0.000 claims description 5
- 238000011410 subtraction method Methods 0.000 claims description 5
- 230000036961 partial effect Effects 0.000 claims description 4
- 238000010191 image analysis Methods 0.000 claims description 3
- 230000008014 freezing Effects 0.000 claims description 2
- 238000007710 freezing Methods 0.000 claims description 2
- 238000010353 genetic engineering Methods 0.000 claims description 2
- 230000003370 grooming effect Effects 0.000 claims description 2
- 230000004931 aggregating effect Effects 0.000 claims 2
- 238000012731 temporal analysis Methods 0.000 claims 2
- 230000009194 climbing Effects 0.000 claims 1
- 238000000513 principal component analysis Methods 0.000 claims 1
- 238000000700 time series analysis Methods 0.000 claims 1
- 238000012544 monitoring process Methods 0.000 abstract description 24
- 230000011218 segmentation Effects 0.000 abstract description 7
- 230000009471 action Effects 0.000 abstract description 6
- 230000006399 behavior Effects 0.000 description 181
- 241000699666 Mus <mouse, genus> Species 0.000 description 137
- 230000008569 process Effects 0.000 description 34
- 230000015654 memory Effects 0.000 description 24
- 241000699670 Mus sp. Species 0.000 description 22
- 230000003542 behavioural effect Effects 0.000 description 14
- 206010000117 Abnormal behaviour Diseases 0.000 description 13
- 230000006835 compression Effects 0.000 description 12
- 238000007906 compression Methods 0.000 description 12
- 238000013459 approach Methods 0.000 description 11
- 230000000384 rearing effect Effects 0.000 description 9
- 230000002159 abnormal effect Effects 0.000 description 8
- 238000005516 engineering process Methods 0.000 description 7
- 108090000623 proteins and genes Proteins 0.000 description 7
- 238000012360 testing method Methods 0.000 description 7
- 238000003066 decision tree Methods 0.000 description 6
- 230000006870 function Effects 0.000 description 6
- 230000033764 rhythmic process Effects 0.000 description 6
- 241000282412 Homo Species 0.000 description 5
- 230000035622 drinking Effects 0.000 description 5
- 238000003708 edge detection Methods 0.000 description 5
- 238000012549 training Methods 0.000 description 5
- 238000009826 distribution Methods 0.000 description 4
- 239000000203 mixture Substances 0.000 description 4
- 238000011002 quantification Methods 0.000 description 4
- 241000700159 Rattus Species 0.000 description 3
- 238000002790 cross-validation Methods 0.000 description 3
- 238000011161 development Methods 0.000 description 3
- 230000018109 developmental process Effects 0.000 description 3
- 230000009191 jumping Effects 0.000 description 3
- 238000002372 labelling Methods 0.000 description 3
- 238000004519 manufacturing process Methods 0.000 description 3
- 230000035772 mutation Effects 0.000 description 3
- 238000009987 spinning Methods 0.000 description 3
- 208000012239 Developmental disease Diseases 0.000 description 2
- 230000005856 abnormality Effects 0.000 description 2
- 230000031018 biological processes and functions Effects 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 230000003247 decreasing effect Effects 0.000 description 2
- 230000007547 defect Effects 0.000 description 2
- 230000007812 deficiency Effects 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 201000010099 disease Diseases 0.000 description 2
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 2
- 238000001647 drug administration Methods 0.000 description 2
- 238000002474 experimental method Methods 0.000 description 2
- 210000003194 forelimb Anatomy 0.000 description 2
- 238000005286 illumination Methods 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 238000013507 mapping Methods 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 239000002547 new drug Substances 0.000 description 2
- 238000003909 pattern recognition Methods 0.000 description 2
- 238000004445 quantitative analysis Methods 0.000 description 2
- 238000011160 research Methods 0.000 description 2
- 238000011896 sensitive detection Methods 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 230000009261 transgenic effect Effects 0.000 description 2
- 206010011878 Deafness Diseases 0.000 description 1
- 241000124008 Mammalia Species 0.000 description 1
- 241000283984 Rodentia Species 0.000 description 1
- 206010048232 Yawning Diseases 0.000 description 1
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000010171 animal model Methods 0.000 description 1
- 230000037007 arousal Effects 0.000 description 1
- 230000003190 augmentative effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000007635 classification algorithm Methods 0.000 description 1
- 230000036461 convulsion Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 210000003414 extremity Anatomy 0.000 description 1
- 239000004459 forage Substances 0.000 description 1
- 238000009432 framing Methods 0.000 description 1
- 238000007429 general method Methods 0.000 description 1
- 230000002068 genetic effect Effects 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 244000144993 groups of animals Species 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000002045 lasting effect Effects 0.000 description 1
- 230000000670 limiting effect Effects 0.000 description 1
- 238000003754 machining Methods 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 230000004630 mental health Effects 0.000 description 1
- 238000004377 microelectronic Methods 0.000 description 1
- 230000000877 morphologic effect Effects 0.000 description 1
- 238000010172 mouse model Methods 0.000 description 1
- 230000004297 night vision Effects 0.000 description 1
- 238000004806 packaging method and process Methods 0.000 description 1
- 230000008447 perception Effects 0.000 description 1
- 238000010223 real-time analysis Methods 0.000 description 1
- 230000002829 reductive effect Effects 0.000 description 1
- 230000000284 resting effect Effects 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 238000006748 scratching Methods 0.000 description 1
- 230000002393 scratching effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000011273 social behavior Effects 0.000 description 1
- 230000003997 social interaction Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K1/00—Housing animals; Equipment therefor
- A01K1/02—Pigsties; Dog-kennels; Rabbit-hutches or the like
- A01K1/03—Housing for domestic or laboratory animals
- A01K1/031—Cages for laboratory animals; Cages for measuring metabolism of animals
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4094—Diagnosing or monitoring seizure diseases, e.g. epilepsy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/40—Animals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/42—Evaluating a particular growth phase or type of persons or animals for laboratory research
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- the invention relates generally to object identification and recognition. More particularly, one aspect of the invention is directed to monitoring and characterization of an object in an image, for example an animal or a person, using video analysis.
- Video analysis has developed over the past few decades to become an integral part of automated machine operations in manufacturing.
- Video object recognition and pattern recognition have been used to orient and align various pieces of a product for machining and assembly in various manufacturing industries.
- One such use is in the manufacturing of semiconductor integrated circuits and microelectronic packaging. In this case, pattern recognition has made great inroads because the size of the work product is microscopic, and orientation and alignment of the work product are thus far too tedious for a human being to perform consistently and accurately over a large number of pieces.
- The military has carried out research on using video to track moving targets, such as tanks and vehicles, in a scene. Other positioning instruments, such as the Global Positioning System, may be used to assist such tracking.
- Another application for video analysis is monitoring animal activity in laboratory testing for the pharmaceutical and biological sciences.
- One particular area is monitoring animal behavior to determine the effects of various new drugs or gene changes on a particular type of animal.
- One such animal used in laboratory testing is the mouse.
- Model organisms are an important tool for understanding and dissecting human disease and biological process. Because mice and humans share many of the same fundamental biological and behavioral processes, this animal is one of the most significant laboratory models for human disease and studying biological processes in mammals.
- A timeline of the animal's location as a single point is all such systems can offer.
- Other animal location type systems used to monitor animal motion include those described in U.S. Pat. Nos. 3,100,473; 3,803,571; 3,974,798; 4,337,726; 4,574,734; and 5,816,256.
- The other systems in the field are those that identify individual behaviors using video.
- Existing video analysis systems include, e.g., Noldus Observer/Ethovision, Sterling, VA; HVS Image, Hampton, UK; AccuScan Instruments Inc.'s VideoScan2000 System; and San Diego Instruments' Poly-Track system, San Diego, CA.
- Digitized images from video are used to capture the body of the mouse and provide quantitative data about the position and movements of the animal and the pattern of these variables across time. These systems do not just treat the animal (e.g., mouse) as a point in space; instead, they handle it as a block of pixels, so more information is preserved. However, they can only make use of a few simple features. For example, the mass center of the animal (e.g., mouse) is calculated and used as a means for tracking the animal. As such, much of the information that is critical for identifying the animal's behaviors, such as different postures and the positions of portions of the animal's body (e.g., limbs), is lost. These systems can only distinguish basic behaviors such as locomotion, and cannot automatically identify simple behaviors such as eating, rearing, and jumping, not to mention complex behaviors such as skilled reaching. Such behavior identification requires human intervention and input.
- the Noldus Observer system has a video camera, TV monitor, a high end VCR, and a PC system, all hooked together.
- the camera takes video footage of the mouse in a cage. This video is recorded on videotape, digitized, input into the PC system, and displayed on the computer monitor.
- Although the human observer can control the recorded video that is displayed, the observer still needs to look at the animal on the screen, decide which behavior the animal is engaged in, and enter (by typing) that information into a mechanism provided by the system for storage and later analysis. While this system facilitates observation of behavior, it does not automate it, and it is thus prone to human error and extremely labor intensive.
- The tasks of coding behavior throughout the day and building a profile of behavior for different types of animals and different strains of the same animal (e.g., different strains of mouse) are therefore prohibitively labor intensive when performed manually.
- the present invention is directed to systems and methods for finding patterns of behaviors and/or activities of an object using video.
- the invention includes a system with a video camera connected to a computer in which the computer is configured to automatically provide object identification, object motion tracking (for moving objects), object shape and posture classification, and behavior identification.
- The present invention is capable of automatically monitoring a video image to identify, track and classify the actions of various objects and their movements.
- the video image may be provided in real time from a camera and/or from a storage location.
- the invention is particularly useful for monitoring and classifying animal behavior for testing drugs and genetic mutations, but may be used in any of a number of surveillance or other applications.
- the invention includes a system in which an analog video camera and a video record/playback device (e.g., VCR) are coupled to a video digitization/compression unit.
- the video camera may provide a video image containing an object to be identified.
- the video digitization/compression unit is coupled to a computer that is configured to automatically monitor the video image to identify, track and classify the actions of the object and its movements over time within a sequence of video session image frames.
- the digitization/compression unit may convert analog video and audio into, for example, MPEG or other formats.
- the computer may be, for example, a personal computer, using either a Windows platform or a Unix platform, or a Macintosh computer and compatible platform.
- the computer is loaded and configured with custom software programs (or equipped with firmware) using, for example, MATLAB or C/C++ programming language, so as to analyze the digitized video for object identification and segmentation, tracking, and/or behavior/activity characterization.
- This software may be stored in, for example, a program memory, which may include ROM, RAM, CD ROM and/or a hard drive, etc.
- The software or firmware includes a unique background subtraction method which is simpler, more efficient, and more accurate than those previously known.
- the system receives incoming video images from either the video camera in real time or pre-recorded from the video record/playback unit.
- the information is converted from analog to digital format and may be compressed by the video digitization/compression unit.
- The digital video images are then provided to the computer where various processes are undertaken to identify and segment a predetermined object from the image. In a preferred embodiment, the object is an object (e.g., a mouse) in motion, with some movement from frame to frame in the video, and is in the foreground of the video images. In any case, the digital images may be processed to identify and segregate a desired (predetermined) object from the various frames of incoming video. This process may be achieved using, for example, background subtraction, mixture modeling, robust estimation, and/or other processes. The shape and location of the desired object is then tracked from one frame or scene to another frame or scene of video images.
- The changes in the shapes, locations, and/or postures of the object of interest may be identified, their features extracted, and classified into meaningful categories, for example, vertical positioned side view, horizontal positioned side view, vertical positioned front view, horizontal positioned front view, moving left to right, etc.
- the shape, location, and posture categories may be used to characterize the object's activity into one of a number of pre-defined behaviors.
- some pre-defined normal behaviors may include sleeping, eating, drinking, walking, running, etc.
- pre-defined abnormal behavior may include spinning vertical, jumping in the same spot, etc.
- the pre-defined behaviors may be stored in a database in the data memory.
- The behavior may be characterized using, for example, approaches such as rule-based label analysis, token parsing procedure, and/or Hidden Markov Modeling (HMM).
- The system may be constructed to characterize the object behavior as new behavior and a particular temporal rhythm.
- The system operates as follows. As a preliminary matter, normal postures and behaviors of the animals are defined and may be entered into a Normal Postures and Behaviors database. In analyzing, in a first instant, incoming video images are received and input into a computer. The system determines if the video images are in analog or digital format. If the video images are in analog format, they are digitized and may be compressed using, for example, an MPEG digitizer/compression unit. Otherwise, the digital video images may be input directly to the computer. Next, a background may be generated or updated from the digital video images and foreground objects detected. Next, the foreground objects' features are extracted.
- the foreground object shape is classified into various categories, for example, standing, sitting, etc.
- the foreground object posture is compared to the various predefined postures stored in the database, and then identified as a particular posture or a new (unidentified) posture.
- various groups of postures are concatenated into a series to make up a foreground object behavior and then compared against the sequence of postures, stored in for example a database in memory, that make up known normal or abnormal behaviors of the animal.
- the abnormal behaviors are then identified in terms of known abnormal behavior, new behavior and/or daily rhythm.
- object detection is performed through a unique method of background subtraction. First, the incoming digital video signal is split into individual images (frames) in real-time.
- The system determines if the background image derived from prior incoming video needs to be updated due to changes in the background image, or whether a background image needs to be developed because no background image was previously developed. If the background image needs to be generated, then a number of frames of video image, for example 20, will be grouped into a sample of images. Then, the system creates a standard deviation map of the sample of images. Next, the process removes a bounding box area in each frame or image where the variation within the group of images is above a predetermined threshold (i.e., where the object of interest or moving objects are located). Then, the various images within the sample, less the bounding box area, are averaged. The final background is obtained by averaging 5-10 such samples. This completes the background generation process.
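- The following is a minimal C++ sketch of this background-generation scheme, offered for illustration only (the patent names MATLAB or C/C++ as possible implementation languages). It uses a per-pixel variance test rather than the bounding-box removal described above, and the function names and types are assumptions, not the patented implementation.

```cpp
// Hypothetical sketch of background generation: average frames where each
// pixel is stable, and merge several such sample backgrounds so that "holes"
// left by the moving object in one sample are filled by other samples.
#include <cmath>
#include <cstddef>
#include <vector>

using Image = std::vector<double>;   // one grayscale frame, row-major
using Mask  = std::vector<bool>;     // true where the pixel is considered valid

// One "sample": average a group of frames, excluding pixels whose standard
// deviation across the group exceeds `threshold` (roughly where the moving
// object passed), and report which pixels were kept via `valid`.
Image sampleBackground(const std::vector<Image>& group, double threshold, Mask& valid) {
    const std::size_t n = group.size(), pixels = group.front().size();
    Image avg(pixels, 0.0);
    valid.assign(pixels, false);
    for (std::size_t p = 0; p < pixels; ++p) {
        double mean = 0.0;
        for (const Image& f : group) mean += f[p];
        mean /= n;
        double var = 0.0;
        for (const Image& f : group) var += (f[p] - mean) * (f[p] - mean);
        if (std::sqrt(var / n) <= threshold) {       // stable pixel: keep it
            avg[p] = mean;
            valid[p] = true;
        }                                            // otherwise leave a "hole"
    }
    return avg;
}

// Final background: average several sample backgrounds (the text suggests
// 5-10); holes in one sample are filled by samples where the pixel was valid.
Image mergeSamples(const std::vector<Image>& samples, const std::vector<Mask>& masks) {
    const std::size_t pixels = samples.front().size();
    Image background(pixels, 0.0);
    for (std::size_t p = 0; p < pixels; ++p) {
        double sum = 0.0; std::size_t count = 0;
        for (std::size_t s = 0; s < samples.size(); ++s)
            if (masks[s][p]) { sum += samples[s][p]; ++count; }
        if (count > 0) background[p] = sum / count;
    }
    return background;
}
```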
- the background image does not remain constant for a great length of time due to various reasons.
- The background needs to be recalculated periodically as above, or it can be recalculated by keeping track of the difference image and noting any sudden changes.
- the newly generated background image is next subtracted from the current video image(s) to obtain foreground areas that may include the object of interest.
- Regions of interest (ROIs) are obtained by identifying areas where the intensity difference generated from the subtraction is greater than a predetermined threshold; these constitute the potential foreground object(s) being sought. Classification of these foreground regions of interest is performed using the sizes of the ROIs, the distances among them, an intensity threshold, and connectedness, to thereby identify the foreground objects.
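- As an illustration of the subtraction and thresholding step, the C++ fragment below computes a binary foreground mask from a frame and a background image; the names and the fixed per-pixel threshold are assumptions of this sketch, not the patent's implementation.

```cpp
// Hypothetical sketch: mark pixels whose absolute difference from the
// background exceeds a predetermined threshold as foreground.
#include <cmath>
#include <cstddef>
#include <vector>

using Image = std::vector<double>;  // grayscale frame, row-major
using Mask  = std::vector<bool>;

Mask subtractBackground(const Image& frame, const Image& background, double threshold) {
    Mask foreground(frame.size(), false);
    for (std::size_t p = 0; p < frame.size(); ++p)
        foreground[p] = std::fabs(frame[p] - background[p]) > threshold;
    return foreground;
}
```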
- the foreground object identification/detection process may be refined by adaptively learning histograms of foreground ROIs and using edge detection to more accurately identify the desired object(s).
- the information identifying the desired foreground object is output. The process may then continue with the tracking and/or behavior characterization step(s).
- the previous embodiments are particularly applicable to the study and analysis of mice used in genetic and drug experimentation.
- One variation of the present invention is directed particularly to automatically determining the behavioral characteristics of a mouse in a home cage.
- the need for sensitive detection of novel phenotypes of genetically manipulated or drug-administered mice demands automation of analyses. Behavioral phenotypes are often best detected when mice are unconstrained by experimenter manipulation.
- automation of analysis of behavior in a known environment, for example a home cage would be a powerful tool for detecting phenotypes resulting from gene manipulations or drug administrations. Automation of analysis would allow quantification of all behaviors as they vary across the daily cycle of activity.
- the automated system may also be able to detect behaviors that do not normally occur and present the investigator with video clips of such behavior without the investigator having to view an entire day or long period of mouse activity to manually identify the desired behavior.
- the systematically developed definition of mouse behavior that is detectable by the automated analysis according to the present invention makes precise and quantitative analysis of the entire mouse behavior repertoire possible for the first time.
- The various computer algorithms included in the invention for automating behavior analysis based on the behavior definitions ensure accurate and efficient identification of mouse behaviors. In addition, the digital video analysis techniques of the present invention improve analysis of behavior by leading to: (1) decreased variance due to non-disturbed observation of the animal; (2) increased experiment sensitivity due to the greater number of behaviors sampled over a much longer time span than ever before possible; and (3) the potential to be applied to all common normative behavior patterns, capability to assess subtle behavioral states, and detection of changes of behavior patterns in addition to individual behaviors.
- Classification criteria (based on features extracted from the foreground object such as shape, position, movement) were derived and fitted into a decision tree (DT) classification algorithm.
- The decision tree could classify almost 500 sample features into 5 different posture classes with an accuracy of over 93%.
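- The decision tree itself is learned from labeled samples; as a purely illustrative stand-in, the C++ sketch below hard-codes a tiny tree over two assumed features (aspect ratio and eccentricity). The feature names, thresholds, and posture labels are hypothetical and are not the trained tree reported above.

```cpp
// Hypothetical hand-written decision tree over extracted shape features.
// A real system would learn the split features and thresholds from the
// roughly 500 labeled samples mentioned in the text.
#include <string>

struct ShapeFeatures {
    double aspectRatio;   // height / width of the object's bounding box
    double eccentricity;  // elongation of the fitted ellipse (0..1)
};

std::string classifyPosture(const ShapeFeatures& f) {
    if (f.aspectRatio > 1.5)
        return "vertically reared";
    if (f.aspectRatio > 0.9)
        return f.eccentricity < 0.5 ? "cuddled up" : "partially reared";
    return f.eccentricity > 0.8 ? "horizontal" : "sitting";
}
```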
- A simple HMM system has been built using dynamic programming and has been used to classify the sequences of postures identified by the DT; it yields an almost perfect mapping from input postures to output behaviors in mouse behavior sequences.
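- A dynamic-programming (Viterbi-style) decoder for such an HMM might look like the following C++ sketch; the state set, probability matrices, and log-space formulation are illustrative assumptions rather than the system's actual model.

```cpp
// Hypothetical Viterbi decoder: given a sequence of observed posture indices,
// recover the most likely sequence of hidden behavior states.
#include <cmath>
#include <cstddef>
#include <vector>

// logTrans[i][j] = log P(behavior j at t+1 | behavior i at t)
// logEmit[i][k]  = log P(observed posture k | behavior i)
// logInit[i]     = log P(behavior i at t = 0)
std::vector<std::size_t> viterbi(const std::vector<std::size_t>& postures,
                                 const std::vector<std::vector<double>>& logTrans,
                                 const std::vector<std::vector<double>>& logEmit,
                                 const std::vector<double>& logInit) {
    const std::size_t S = logInit.size(), T = postures.size();
    std::vector<std::vector<double>> score(T, std::vector<double>(S));
    std::vector<std::vector<std::size_t>> back(T, std::vector<std::size_t>(S, 0));

    for (std::size_t s = 0; s < S; ++s)
        score[0][s] = logInit[s] + logEmit[s][postures[0]];

    for (std::size_t t = 1; t < T; ++t)
        for (std::size_t s = 0; s < S; ++s) {
            double best = -INFINITY; std::size_t arg = 0;
            for (std::size_t p = 0; p < S; ++p) {
                double v = score[t - 1][p] + logTrans[p][s];
                if (v > best) { best = v; arg = p; }
            }
            score[t][s] = best + logEmit[s][postures[t]];
            back[t][s] = arg;
        }

    // Trace back the highest-scoring path of behavior states.
    std::vector<std::size_t> path(T, 0);
    for (std::size_t s = 1; s < S; ++s)
        if (score[T - 1][s] > score[T - 1][path[T - 1]]) path[T - 1] = s;
    for (std::size_t t = T - 1; t > 0; --t) path[t - 1] = back[t][path[t]];
    return path;
}
```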
- the invention may identify some abnormal behavior by using video image information
- abnormalities may also result from an increase in any particular type of normal behavior. Detection of such new abnormal behaviors may be achieved by the present invention detecting, for example, segments of behavior that do not fit the standard profile.
- The standard profile may be developed for a particular strain of mouse, and abnormal amounts of a normal behavior can be detected by comparison to the statistical properties of the standard profile.
- the automated analysis of the present invention may be used to build profiles of the behaviors, their amount, duration, and daily cycle for each animal, for example each commonly used strain of mice.
- A plurality of such profiles may be stored in, for example, a database in a data memory of the computer. One or more of these profiles may then be compared to that of a mouse in question and the differences from the profile expressed quantitatively.
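- One simple quantitative comparison, sketched below in C++ under assumed data structures, expresses how far each measured behavior amount deviates from the strain's standard profile in units of the profile's standard deviation (a z-score); the structure and field names are illustrative, not the patent's schema.

```cpp
// Hypothetical comparison of a test animal's behavior profile against a
// stored strain profile, expressed as per-behavior z-scores.
#include <map>
#include <string>

struct ProfileEntry {
    double meanDuration;  // e.g., minutes per day spent in this behavior
    double stdDeviation;  // variability across the reference population
};

std::map<std::string, double> compareToProfile(
        const std::map<std::string, double>& observed,        // behavior -> duration
        const std::map<std::string, ProfileEntry>& profile) { // strain standard profile
    std::map<std::string, double> zScores;
    for (const auto& [behavior, duration] : observed) {
        auto it = profile.find(behavior);
        if (it == profile.end() || it->second.stdDeviation <= 0.0)
            continue;  // behavior absent from the standard profile: flag separately
        zScores[behavior] = (duration - it->second.meanDuration) / it->second.stdDeviation;
    }
    return zScores;
}
```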
- The techniques developed with the present invention for automation of the categorization and quantification of all home-cage mouse behaviors throughout the daily cycle are a powerful tool for detecting phenotypic effects of gene manipulations in mice. As previously discussed, this technology is extendable to other behavior studies of animals and humans, as well as to surveillance purposes. As will be described in detail below, the present invention provides automated systems and methods for accurate identification, tracking and behavior categorization of an object whose image is captured with video.
- Figure 1 is a block diagram of one exemplary system configurable to find the position, shape, and behavioral characteristics of an object using automated video analysis, according to one embodiment of the present invention.
- Figure 2 is a block diagram of various functional portions of a computer system, such as the computer system shown in Figure 1, when configured to find the position, shape, and behavioral characteristics of an object using automated video analysis, according to one embodiment of the present invention.
- Figure 3 is a flow chart of a method of automatic video analysis for object identification and characterization, according to one embodiment of the present invention.
- Figure 4 is a flow chart of a method of automatic video analysis for object identification and characterization, according to another embodiment of the present invention.
- Figure 5 is a flow chart of a method of automatic video analysis for object detection and identification, according to one variation of the present invention.
- Figure 6 illustrates a sample video image frame with a mouse in a rearing up posture as determined using one variation of the present invention to monitor and characterize mouse behavior.
- Figure 7A is a first video image frame in a sequence with a mouse in an eating posture for illustrating background generation for a background subtraction process according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 7B is a copy of the first video image frame of Figure 7A in which the process has extracted an area of the video image related to the mouse in the foreground resulting in a "hole" which will be filled up when other frames are averaged with it for a background subtraction process according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 7C is the resulting background image for a video clip including the first video image frame of Figure 7A converted as shown in Figure 7B and averaged with subsequent video images, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 8A is a difference image between foreground and background for the image shown in Figure 7A, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 8B is the image shown in Fig. 7A after completing a threshold process for identifying the foreground image of the mouse which is shown as correctly identified, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 8C is a video image frame showing the foreground mouse object correctly identified by the system as identified with a polygon outline, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 9A is a video image frame showing a mouse eating, to demonstrate a b-spline approach to object location and outline identification according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 9B is a computer generated image showing the outline of the foreground mouse shown in Figure 9A after edge segmentation, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 9C is a computer generated image of the outline of the foreground mouse shown in Figure 9A, as derived from the outline of the mouse shown in Figure 9B as generated from a b-spline process, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- Figure 10 is a chart illustrating one example of various mouse state transitions used in characterizing mouse behavior including: Horizontal (HS); Cuddled up (CU); Partially reared (PR); Vertically Reared (VR); and Forward Back (FB), along with an indication of duration of these states based on a sample, according to one variation of the present invention as applied for monitoring and characterizing mouse behavior.
- the present invention can automatically find the patterns of behaviors and/or activities of a predetermined object being monitored using video.
- the invention includes a system with a video camera connected to a computer in which the computer is configured to automatically provide object identification, object motion tracking (for moving objects), object shape and posture classification, and behavior identification.
- the system includes various video analysis algorithms.
- The computer processes analyze the digitized video with the various algorithms so as to automatically monitor a video image to identify, track, and classify the actions of one or more predetermined objects and their movements captured in the video image as they occur from one video frame or scene to another.
- the system may characterize behavior by accessing a database of object information of known behavior of the predetermined object.
- The image to be analyzed may be provided in real time from one or more cameras and/or from storage.
- the invention is configured to enable monitoring and classifying of animal behavior that results from testing drugs and genetic mutations on animals.
- the system may be similarly configured for use in any of a number of surveillance or other applications.
- the invention can be applied to various situations in which tracking moving objects is needed.
- One such situation is security surveillance in public areas like airports, military bases, or home security systems.
- The system may be useful in automatically identifying and notifying proper law enforcement officials if a crime is being committed and/or a particular behavior being monitored is identified.
- the system may be useful for monitoring of parking security or moving traffic at intersections so as to automatically identify and track vehicle activity.
- the system may be configured to automatically determine if a vehicle is speeding or has performed some other traffic violation.
- the system may be configured to automatically identify and characterize human behavior involving guns or human activity related to robberies or thefts.
- The invention may be capable of identifying and understanding subtle behaviors involving portions of the body, such as the forelimbs, and can be applied to identifying and understanding human gestures. This could help deaf individuals communicate.
- the invention may also be the basis for computer understanding of human gesture to enhance the present human-computer interface experience, where gestures will be used to interface with computers. The economic potential of applications in computer-human interface applications and in surveillance and monitoring applications is enormous.
- the invention includes a system in which an analog video camera 105 and a video storage/retrieval unit 110 may be coupled to each other and to a video digitization/compression unit 115.
- the video camera 105 may provide a real time video image containing an object to be identified.
- the video storage/retrieval unit 110 may be, for example, a VCR, DVD, CD or hard disk unit.
- the video digitization/compression unit 115 is coupled to a computer 150 that is configured to automatically monitor a video image to identify, track and classify the actions (or state) of the object and its movements (or stillness) over time within a sequence of images.
- the digitization/compression unit 115 may convert analog video and audio into, for example, MPEG format, Real Player format, etc.
- the computer may be, for example, a personal computer, using either a Windows platform or a Unix platform, or a Macintosh computer and compatible platform.
- the computer may include a number of components such as (1) a data memory 151, for example, a hard drive or other type of volatile or nonvolatile memory; (2) a program memory 152, for example, RAM, ROM, EEPROM, etc. that may be volatile or non- volatile memory; (3) a processor 153, for example, a microprocessor; and (4) a second processor to manage the computation intensive features of the system, for example, a math coprocessor 154.
- the computer may also include a video processor such as an MPEG encoder/decoder.
- Although the computer 150 has been shown in Figure 1 to include two memories (data memory 151 and program memory 152) and two processors (processor 153 and math co-processor 154), in one variation the computer may include only a single processor and a single memory device, or more than two processors and more than two memory devices.
- The computer 150 may be equipped with user interface components such as a keyboard 155, electronic mouse 156, and display unit 157.
- the system may be simplified by using all digital components such as a digital video camera and a digital video storage/retrieval unit 110, which may be one integral unit. In this case, the video digitization/compression unit 115 may not be needed.
- the computer is loaded and configured with custom software program(s) (or equipped with firmware) using, for example, MATLAB or C/C++ programming language, so as to analyze the digitized video for object identification and segmentation, tracking, and/or behavior/activity characterization.
- This software may be stored in, for example, a program memory 152 or data memory that may include ROM, RAM, CD ROM and/or a hard drive, etc.
- The software or firmware includes a unique background subtraction method, which is simpler, more efficient, and more accurate than those previously known and which will be discussed in detail below. In any case, the algorithms may be implemented in software and may be understood as unique functional modules, as shown in Figure 2 and now described.
- the system is preloaded with standard object information before analyzing an incoming video including a predetermined object, for example, a mouse.
- A stream of digital video including a known object with known characteristics may be fed into the system to a standard object classifier module 220.
- a user may then view the standard object on a screen and identify and classify various behaviors of the standard object, for example, standing, sitting, lying, normal, abnormal, etc.
- Data information representing such standard behavior may then be stored in the standard object behavior storage modules 225, for example a database in data memory 151.
- Standard object behavior information data sets may be loaded directly into the standard object behavior storage module 225 from another system or source, as long as the data is compatible with the present invention's protocols and data structures.
- the system may be used to analyze and classify the behavior of one or more predetermined objects, for example, a mouse.
- digital video (either real-time and or stored) of monitored objects to be identified and characterized is input to an object identification and segregation module 205.
- This module identifies and segregates a predetermined type of object from the digital video image and inputs it to an object tracking module 210.
- the object tracking module 210 facilitates tracking of the predetermined object from one frame or scene to another as feature information.
- This feature information is then extracted and input to the object shape and posture classifier 215.
- This module classifies the various observed states of the predetermined object of interest into various shape and posture categories and sends it to the behavior identification module 230.
- the behavior identification module 230 compares the object shape, motion, and posture information with shape, motion, and posture information for a standard object and classifies the behavior accordingly into the predefined categories exhibited by the standard object, including whether the behavior is normal, abnormal, new, etc. This information is output to the user as characterized behavior information on, for example, a display unit 157.
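- The module structure described above might be expressed, purely as an illustrative C++ skeleton (the interfaces and names are assumptions, not the patent's code), as a chain of stages passing frames, tracked features, and posture labels to a behavior identifier:

```cpp
// Illustrative skeleton of the functional modules of Figure 2. Each stage is
// an abstract interface; concrete implementations would hold the algorithms
// described in the text (background subtraction, tracking, classification).
#include <string>
#include <vector>

struct Frame { std::vector<double> pixels; int width = 0, height = 0; };
struct ObjectFeatures { double centroidX = 0, centroidY = 0, area = 0, aspectRatio = 0; };

class ObjectSegmenter {          // module 205: identification and segregation
public:
    virtual ~ObjectSegmenter() = default;
    virtual ObjectFeatures segment(const Frame& frame) = 0;
};

class ObjectTracker {            // module 210: frame-to-frame tracking
public:
    virtual ~ObjectTracker() = default;
    virtual ObjectFeatures track(const ObjectFeatures& current) = 0;
};

class PostureClassifier {        // module 215: shape and posture classification
public:
    virtual ~PostureClassifier() = default;
    virtual std::string classify(const ObjectFeatures& features) = 0;
};

class BehaviorIdentifier {       // module 230: behavior identification
public:
    virtual ~BehaviorIdentifier() = default;
    virtual std::string identify(const std::vector<std::string>& postureSequence) = 0;
};
```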
- the system may receive incoming video images at step 305, from the video camera 105 in real time, pre-recorded from the video storage/retrieval unit 110, and/or a memory integral to the computer 150. If the video is in analog format, then the information is converted from analog to digital format and may be compressed by the video digitization/compression unit 115.
- The digital video images are then provided to the computer 150 for various computationally intensive processing to identify and segment a predetermined object from the image. In a preferred embodiment, the object to be identified and whose activities are to be characterized is a moving object, for example a mouse, which has some movement from frame to frame or scene to scene in the video images and is generally in the foreground of the video images.
- The digital images may be processed to identify and segregate a desired (predetermined) object from the various frames of incoming video. This process may be achieved using, for example, background subtraction, mixture modeling, robust estimation, and/or other processes.
- various movements (or still shapes) of the desired object may then be tracked from one frame or scene to another frame or scene of video images.
- this tracking may be achieved by, for example, tracking the outline contour of the object from one frame or scene to another as it varies from shape to shape and/or location to location.
- The changes in the motion of the object, such as the shapes, locations, and postures of the object of interest, may be identified and their features extracted and classified into meaningful categories. These categories may include, for example, vertical positioned side view, horizontal positioned side view, vertical positioned front view, horizontal positioned front view, moving left to right, etc.
- The states of the object may be used to characterize the object's activity into one of a number of pre-defined behaviors.
- some pre-defined normal behaviors may include sleeping, eating, drinking, walking, running, etc.
- pre-defined abnormal behavior may include spinning vertical, jumping in the same spot, etc.
- the pre-defined behaviors may be stored in a database in the data memory 151.
- Types of behavior may also be characterized using, for example, approaches such as rule-based label analysis, token parsing procedure, and/or Hidden Markov Modeling (HMM).
- The HMM is particularly helpful in characterizing behavior that is determined by temporal relationships among the various motions of the object across a selection of frames. From these methods, the system may be capable of characterizing the object behavior as new behavior and a particular temporal rhythm.
- the system is directed toward video analysis of animated objects such as animals.
- video of the activities of a standard object and known behavior characteristics are input into the system.
- This information may be provided from a video storage/retrieval unit 110 in digitized video form into a standard object classifier module 220.
- This information may then be manually categorized at step 416 to define normal and abnormal activities or behaviors by a user viewing the video images on the display unit 157 and inputting their classifications. For example, experts in the field may sit together watching recorded scenes.
- An animal's (e.g., a mouse's) behaviors may constitute the important posture and behavior database and are entered into a storage, for example a memory, of known activity of the standard object at step 420.
- This information provides a point of reference for video analysis to characterize the behavior of non-standard objects whose behaviors/activities need to be characterized such as genetically altered or drug administered mice.
- normal postures and behaviors of the animals are defined and may be entered into a normal postures and behaviors database.
- the system may then be used to analyze incoming video images that may contain an object for which automated behavior characterization is desired.
- incoming video images are received.
- At decision step 406, the system determines if the video images are in analog or digital format. If the video images are in analog format, they are then digitized at step 407.
- the video may be digitized and may be compressed, using, for example, a digitizer/compression unit 115 into a convenient digital video format such as MPEG, RealPlayer, etc. Otherwise, the digital video image may be input directly to the computer 150. Now the object of interest is identified within the video images and segregated for analysis.
- A background may be generated or updated from the digital video images, and foreground objects, including a predetermined object for behavior characterization, may be detected. For example, a mouse in a cage is detected in the foreground and segregated from the background. Then, at step 409, features of the foreground object of interest (e.g., a mouse) are extracted, such as the centroid, the principal orientation angle, the area (number of pixels), the eccentricity (roundness), the aspect ratio, and/or the shape in terms of a convex hull or b-spline. Next, at step 410, the foreground object shape and postures are classified into various categories, for example, standing, sitting, etc.
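- A minimal sketch of such feature extraction from a binary foreground mask, using image moments, is shown below in C++; the exact feature set and formulas used by the patented system are not specified here, so this is an assumption-laden illustration.

```cpp
// Hypothetical feature extraction from a binary foreground mask using image
// moments: area, centroid, principal orientation angle, and eccentricity.
#include <cmath>
#include <cstddef>
#include <vector>

struct Features { double area, cx, cy, orientation, eccentricity; };

Features extractFeatures(const std::vector<bool>& mask, std::size_t width, std::size_t height) {
    double m00 = 0, m10 = 0, m01 = 0;
    for (std::size_t y = 0; y < height; ++y)
        for (std::size_t x = 0; x < width; ++x)
            if (mask[y * width + x]) { m00 += 1; m10 += x; m01 += y; }

    Features f{m00, 0, 0, 0, 0};
    if (m00 == 0) return f;
    f.cx = m10 / m00;
    f.cy = m01 / m00;

    // Central second-order moments give the orientation and eccentricity of
    // the equivalent ellipse.
    double mu20 = 0, mu02 = 0, mu11 = 0;
    for (std::size_t y = 0; y < height; ++y)
        for (std::size_t x = 0; x < width; ++x)
            if (mask[y * width + x]) {
                double dx = x - f.cx, dy = y - f.cy;
                mu20 += dx * dx; mu02 += dy * dy; mu11 += dx * dy;
            }
    mu20 /= m00; mu02 /= m00; mu11 /= m00;

    f.orientation = 0.5 * std::atan2(2.0 * mu11, mu20 - mu02);
    double common  = std::sqrt((mu20 - mu02) * (mu20 - mu02) + 4.0 * mu11 * mu11);
    double lambda1 = (mu20 + mu02 + common) / 2.0;   // major-axis variance
    double lambda2 = (mu20 + mu02 - common) / 2.0;   // minor-axis variance
    f.eccentricity = lambda1 > 0 ? std::sqrt(1.0 - lambda2 / lambda1) : 0.0;
    return f;
}
```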
- The foreground object posture may be compared to the various predefined postures in the set of known postures in the standard object storage of step 420, which may be included in a database.
- the observed postures of the object contained in the analyzed video image may be classified and identified as a particular posture known for the standard object or a new previously unidentified posture.
- Various groups of postures may be concatenated into a series to make up a foreground object behavior that is then compared against the sequence of postures, stored in, for example, a database in memory, that make up a known standard object behavior.
- This known standard behavior is, in a preferred embodiment, normal behavior for the type of animal being studied.
- the known activity of the standard object may be normal or abnormal behavior of the animal.
- the abnormal behaviors are then identified in terms of (1) known abnormal behavior; (2) new behavior likely to be abnormal; and/or (3) daily rhythm differences likely to be abnormal behavior.
- Known normal behavior may also be output as desired by the user. This information is automatically identified to the user for their review and disposition.
- the information output may include behavior information that is compatible with current statistical packages such as Systat and SPSS.
- object detection is performed through a unique method of background subtraction.
- incoming video is provided to the system for analysis. This video may be provided by digital equipment and input to the object identification and segregation module 205 of the computer 150.
- The incoming digital video signal may be split into individual images (frames) in real-time. This step may be included if it is desired to carry out real-time analysis.
- At decision step 506, the system determines if the background image needs to be developed, either because no background image was developed previously or because the background image has changed.
- a background image is generated by first grouping a number of frames or images into a sample of video images, for example 20 frames or images.
- the background may need to be updated periodically due to changes caused by, for example, lighting and displacement of moveable objects in the cage, such as the bedding.
- the system generates a standard deviation map of the group of images.
- an object(s) bounding box area is identified and removed from each frame or image to create a modified frame or image.
- the bounding box area is determined by sensing the area wherein the variation of a feature such as the standard deviation of intensity is above a predetermined threshold.
- an area in the digitized video image where the object of interest in motion is located is removed leaving only a partial image.
- The various modified images within the group, less the bounding box area, are combined, for example averaged, to create a background image at step 511.
- The background image does not remain constant for a great length of time due to various reasons. For example, the bedding in a mouse cage can shift due to the activity of the mouse. External factors such as changes in illumination conditions also require background image recalculation. If the camera moves, the background might also need to be changed. Thus, the background typically needs to be recalculated periodically as described above, or it can be recalculated by keeping track of the difference image and noting any sudden changes, such as an increase in the number of pixels of a particular color (e.g., white) in the difference image or the appearance of patches of that color in another area of the difference image. In any case, the newly generated background image may then be combined with any existing background image to create a new background image at step 511.
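- The "sudden change" test mentioned above can be illustrated with a small C++ check that counts how many pixels in the difference image exceed the foreground threshold and flags a background recalculation when that count grows too large; the threshold and fraction parameters are assumptions of this sketch.

```cpp
// Hypothetical trigger for background recalculation: if the fraction of
// changed pixels in the difference image suddenly grows well beyond the size
// expected for the tracked object, the background is probably stale.
#include <cmath>
#include <cstddef>
#include <vector>

bool backgroundNeedsUpdate(const std::vector<double>& frame,
                           const std::vector<double>& background,
                           double pixelThreshold,     // intensity difference per pixel
                           double changedFraction) {  // e.g., 0.25 of the image
    std::size_t changed = 0;
    for (std::size_t p = 0; p < frame.size(); ++p)
        if (std::fabs(frame[p] - background[p]) > pixelThreshold) ++changed;
    return static_cast<double>(changed) > changedFraction * frame.size();
}
```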
- the newly generated background image is next, at step 512, subtracted from the current video image(s) to obtain foreground areas that may include the object of interest.
- the process may proceed to step 512 and the background image is subtracted from the current image, leaving the foreground objects.
- the object identification/detection process is performed.
- Regions of interest (ROIs) are then obtained by identifying areas where the intensity difference exceeds a predetermined threshold, as described above.
- Classification of these foreground regions of interest (ROIs) is performed using the sizes of the ROIs, the distances among them, an intensity threshold, and connectedness, to identify the foreground objects.
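- Connectedness can be evaluated with a standard connected-component labeling pass over the thresholded foreground mask; the C++ sketch below uses 4-connectivity and a breadth-first flood fill, and is an illustrative implementation rather than the one embodied in the patented system.

```cpp
// Hypothetical 4-connected component labeling of a binary foreground mask.
// Each ROI receives a positive label; its size can then be used to reject
// noise blobs and to select the object of interest.
#include <cstddef>
#include <queue>
#include <vector>

std::vector<int> labelComponents(const std::vector<bool>& mask,
                                 std::size_t width, std::size_t height,
                                 std::vector<std::size_t>& componentSizes) {
    std::vector<int> labels(mask.size(), 0);
    componentSizes.clear();
    int nextLabel = 0;

    for (std::size_t start = 0; start < mask.size(); ++start) {
        if (!mask[start] || labels[start] != 0) continue;
        ++nextLabel;
        std::size_t size = 0;
        std::queue<std::size_t> frontier;
        frontier.push(start);
        labels[start] = nextLabel;

        while (!frontier.empty()) {
            std::size_t p = frontier.front();
            frontier.pop();
            ++size;
            std::size_t x = p % width, y = p / width;
            const std::size_t neighbors[4] = {p - 1, p + 1, p - width, p + width};
            const bool valid[4] = {x > 0, x + 1 < width, y > 0, y + 1 < height};
            for (int k = 0; k < 4; ++k)
                if (valid[k] && mask[neighbors[k]] && labels[neighbors[k]] == 0) {
                    labels[neighbors[k]] = nextLabel;
                    frontier.push(neighbors[k]);
                }
        }
        componentSizes.push_back(size);
    }
    return labels;
}
```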
- the foreground object identification/detection process may be refined by utilizing information about the actual distribution (histograms) of the intensity levels of the foreground object and using edge detection to more accurately identify the desired object(s).
- the system continuously maintains a distribution of the foreground object intensities as obtained.
- a lower threshold may be used to thereby permit a larger amount of noise to appear in the foreground image in the form of ROIs.
- a histogram is then updated with the pixels in the ROI.
- Plotting a histogram of all the intensities of pixels of a particular color over many images provides a bi-modal shape, with the larger peak corresponding to the foreground object's intensity range and the smaller peak corresponding to the noise pixels in the ROI images.
- At step 516, having "learned" the intensity range of the foreground object, only those pixels that conform to this intensity range are selected, thereby identifying the foreground object more clearly even against a background that is fairly similar.
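- The bi-modal histogram idea can be sketched as follows in C++: accumulate an intensity histogram over ROI pixels, locate the dominant peak, and keep an intensity band around it as the learned foreground range. The bin count, band-width rule, and keep fraction are assumptions of this illustration, not the patent's algorithm.

```cpp
// Hypothetical "learned" foreground intensity range: build a histogram of ROI
// pixel intensities (0..255), find the dominant peak, and return a band of
// bins around it whose counts stay above a fraction of the peak count.
#include <cstddef>
#include <utility>
#include <vector>

std::pair<int, int> learnIntensityRange(const std::vector<int>& roiIntensities,
                                        double keepFraction = 0.1) {
    std::vector<std::size_t> hist(256, 0);
    for (int v : roiIntensities)
        if (v >= 0 && v <= 255) ++hist[v];

    // Dominant peak: in a bi-modal histogram this is the foreground mode.
    int peak = 0;
    for (int b = 1; b < 256; ++b)
        if (hist[b] > hist[peak]) peak = b;

    // Expand outward from the peak while bins remain reasonably populated.
    std::size_t cutoff = static_cast<std::size_t>(keepFraction * hist[peak]);
    int lo = peak, hi = peak;
    while (lo > 0 && hist[lo - 1] > cutoff) --lo;
    while (hi < 255 && hist[hi + 1] > cutoff) ++hi;
    return {lo, hi};   // pixels within [lo, hi] are treated as foreground
}
```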
- the foreground object of interest may be refined using edge information to more accurately identify the desired object.
- An edge detection mechanism such as the Prewitt operator is applied to the original image. Adaptive thresholds for edge detection can be used.
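- For reference, the Prewitt operator convolves the image with two 3x3 kernels to estimate horizontal and vertical gradients, and a pixel is marked as an edge when the gradient magnitude exceeds a threshold. The C++ sketch below is an illustrative implementation with a fixed threshold, not the adaptive-threshold variant used in the patented system.

```cpp
// Prewitt edge detection: horizontal and vertical 3x3 gradient kernels,
// combined into a gradient magnitude and thresholded into a binary edge map.
#include <cmath>
#include <cstddef>
#include <vector>

std::vector<bool> prewittEdges(const std::vector<double>& img,
                               std::size_t width, std::size_t height,
                               double threshold) {
    static const int gx[3][3] = {{-1, 0, 1}, {-1, 0, 1}, {-1, 0, 1}};
    static const int gy[3][3] = {{-1, -1, -1}, {0, 0, 0}, {1, 1, 1}};

    std::vector<bool> edges(img.size(), false);
    for (std::size_t y = 1; y + 1 < height; ++y)
        for (std::size_t x = 1; x + 1 < width; ++x) {
            double sx = 0.0, sy = 0.0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    double v = img[(y + dy) * width + (x + dx)];
                    sx += gx[dy + 1][dx + 1] * v;
                    sy += gy[dy + 1][dx + 1] * v;
                }
            edges[y * width + x] = std::hypot(sx, sy) > threshold;
        }
    return edges;
}
```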
- the actual boundary of the foreground object is assumed to be made up of one or more segments in the edge map, i.e., the actual contour of the foreground objects comprises edges in the edge map.
- the closed contour of the "detected" foreground object is broken into smaller segments, if necessary. Segments in the edge map that are closest to these contour segments according to a distance metric are found to be the desired contour.
- One exemplary distance metric is the sum of absolute normal distance to the edge map segment from each point in the closed contour of the "detected" foreground object.
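- A minimal sketch of this edge-based refinement is given below (Python with SciPy, an implementation choice not named in the original; the adaptive threshold rule is an assumption, and the nearest-edge distance transform is used as a stand-in for the exact normal-distance metric described above):

```python
import numpy as np
from scipy import ndimage

def prewitt_edge_map(image, threshold=None):
    """Edge map from the Prewitt operator; the threshold may be chosen adaptively,
    here illustratively as a multiple of the mean gradient magnitude."""
    gx = ndimage.prewitt(image.astype(np.float64), axis=1)
    gy = ndimage.prewitt(image.astype(np.float64), axis=0)
    mag = np.hypot(gx, gy)
    if threshold is None:
        threshold = 2.0 * mag.mean()      # simple adaptive choice (an assumption)
    return mag > threshold

def contour_segment_distance(segment_points, edge_map):
    """Approximate the sum of absolute distances from each contour point of the
    'detected' object to the nearest edge pixel in the edge map."""
    dist_to_edges = ndimage.distance_transform_edt(~edge_map)
    rows, cols = segment_points[:, 0], segment_points[:, 1]
    return float(dist_to_edges[rows, cols].sum())
```

- In use, each contour segment of the "detected" foreground object would be compared against nearby edge-map segments, and the segment minimizing this distance would be taken as part of the desired contour.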
- the previous embodiments are generally applicable to identifying, tracking, and characterizing the activities of a particular object of interest present in a video image, e.g., an animal, a human, a vehicle, etc.
- the invention is also particularly applicable to the study and analysis of animals used for testing new drugs and/or genetic mutations.
- a number of variations of the invention related to determining changes in behavior of mice will be described in more detail below using examples of video images obtained.
- One variation of the present invention is designed particularly for the purpose of automatically determining the behavioral characteristics of a mouse.
- The need for sensitive detection of novel phenotypes of genetically manipulated or drug-administered mice demands automation of analyses. Behavioral phenotypes are often best detected when mice are unconstrained by experimenter manipulation.
- automation of analysis of behavior in a home cage would be a preferred means of detecting phenotypes resulting from gene manipulations or drug administrations.
- Automation of analysis as provided by the present invention will allow quantification of all behaviors and may provide analysis of the mouse's behaviors as they vary across the daily cycle of activity. Because gene defects causing developmental disorders in humans usually result in changes in the daily rhythm of behavior, analysis of organized patterns of behavior across the day may be effective in detecting phenotypes in transgenic and targeted mutant mice.
- the automated system of the present invention may also detect behaviors that do not normally occur and present the investigator with video clips of such behavior without the investigator having to view an entire day or long period of mouse activity to manually identify the desired behavior.
- the systematically developed definition of mouse behavior that is detectable by the automated analysis of the present invention makes precise and quantitative analysis of the entire mouse behavior repertoire possible for the first time.
- the various computer algorithms included in the invention for automating behavior analysis based on the behavior definitions ensure accurate and efficient identification of mouse behaviors.
- The digital video analysis techniques of the present invention improve analysis of behavior by providing: (1) decreased variance due to non-disturbed observation of the animal; (2) increased experiment sensitivity due to the greater number of behaviors sampled over a much longer time span than previously possible; and (3) the potential to be applied to all common normative behavior patterns, the capability to assess subtle behavioral states, and the detection of changes in behavior patterns in addition to individual behaviors.
- Development activities have been completed to validate various scientific definitions of mouse behaviors and to create novel digital video processing algorithms for mouse tracking and behavior recognition, which are embodied in a software and hardware system according to the present invention.
- the first step in the analysis of home cage behavior is an automated initialization step that involves analysis of video images to identify the location and outline of the mouse, as indicated by step 310.
- the location and outline of the mouse are tracked over time, as indicated by step 315.
- Performing the initialization step periodically may be used to reset any propagation errors that appear during the tracking step.
- The mouse is tracked over time, and its features, including shape, are extracted and used for training and classifying the posture of the mouse from frame to frame, as indicated by step 320.
- Posture labels are generated for each frame, which are analyzed over time to determine the actual behavior, as indicated by step 325.
- A typical video frame of a mouse in its home cage is shown in Figure 6.
- In this video frame the mouse is shown in a rearing-up posture. Many such frames make up the video of, for example, a 24-hour mouse behavior monitoring session.
- Background subtraction as used in the present invention generally involves generating a still background image from all or a subset of the frames in a video clip and subtracting this background image from any given image to obtain the foreground objects.
- the background is generated by averaging many frames, for example approximately 100 frames of the video, after compensating for any shifts caused by the motion of the camera. Even if foreground objects are present in the frames that are being averaged to generate the background image, their unwanted contribution is negligible when large numbers of frames are used for the background calculation, assuming that the foreground object does not remain at the same location throughout. Nevertheless, it may be helpful to not consider those pixels where the foreground object is present. In one implementation of the background averaging process, only the stationary pixels in an image are considered to avoid the unwanted contributions of the foreground moving objects.
- the stationary and non-stationary pixels are determined by analyzing the local variations of each pixel of a series of frames over a short time period as indicated in step 509 of figure 5.
- the standard deviation from the mean is first calculated for each pixel. If the standard deviation is greater than a chosen threshold, we tag those pixels as being non- stationary or varying pixels. Those pixels that are below the threshold may be tagged as stationary or constant pixels. Only those stationary pixels are used in the averaging process to calculate the background. Since the varying pixels are not used, there will be "holes" in each image that is being used in the averaging process. Over time, not all frames will have these holes at the same location and hence, a complete background image may be obtained with the averaging process. Once the background image has been obtained, subtraction of the background image from the given analyzed image yields the foreground objects.
- Let T be the number of frames that are being averaged to calculate the background, and let P(x, y, t) be the pixel value at position (x, y) in frame number t. The mean, $\mu_{(x,y)}$, and standard deviation, $\sigma_{(x,y)}$, for that location are then defined respectively as

$\mu_{(x,y)} = \frac{1}{T}\sum_{t=1}^{T} P(x,y,t), \qquad \sigma_{(x,y)} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(P(x,y,t) - \mu_{(x,y)}\right)^{2}}$
- If $\sigma_{(x,y)}$ for a particular pixel is greater than a threshold (for example, an intensity of 64 on a scale of 0 to 255 was used for a video clip of a mouse in a cage), then that pixel is omitted from the background image calculation.
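- A minimal sketch of this stationary-pixel averaging follows (Python with NumPy, which is an implementation choice not specified in the original disclosure; the default threshold of 64 follows the example above, and the function names are hypothetical):

```python
import numpy as np

def stationary_pixel_background(frames, std_threshold=64.0):
    """Estimate a background image from T grayscale frames by averaging only
    those pixels whose temporal standard deviation is below a threshold."""
    stack = np.stack(frames).astype(np.float64)        # shape (T, H, W)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    stationary = std <= std_threshold                  # True where the pixel is "constant"
    background = np.where(stationary, mean, np.nan)    # leave "holes" at varying pixels
    return background, stationary

def merge_backgrounds(samples):
    """Combine several partial background samples; holes (NaN) in one sample are
    filled by values from the others, approximating the averaging of sample sets."""
    stack = np.stack(samples)
    return np.nanmean(stack, axis=0)
```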
- the background image of a video session does not remain constant for a great length of time.
- the bedding in the mouse cage can shift due to the activity of the mouse.
- the background may need to be recalculated periodically.
- External factors such as change in illumination conditions may require background image recalculations. If the camera 105 moves, then the background image might need to be recalculated.
- Another method, rather than performing background recalculations periodically, is to keep track of the difference image and note any sudden changes, such as an increase in the number of white pixels in the difference image or the appearance of patches of white pixels in another area of the difference image.
- An example of screen shots of one exemplary background subtraction process used for monitoring a mouse with the present invention is shown in Figures 7A, 7B, and 7C.
- Figure 7A illustrates a first frame in a sequence with the mouse in an eating posture 705.
- Figure 7B illustrates the same frame of the video image, now with the area of the frame in which the pixels are changing identified as a blocked-out region 710.
- In effect, the background has a "hole" 710 (shown in black).
- This hole 710 will be filled with an image indicative of the true complete background image when other frames are averaged with it.
- Several samples should first be generated; for example, a 10-20 frame sample (at 30 frames per second) from a video clip is taken and then averaged to generate one sample.
- FIG. 7C illustrates the resulting background image for the video clip once the group of frames in a sample set and a number of sample sets are averaged together.
- this method is quite successful at generating a reasonably complete background image (less the foreground object of interest) to be used in the background subtraction process for identifying and segregating a desired object, in this case a mouse.
- One primary advantage of this technique is its low complexity, which enables background recalculations and foreground object detection to be performed with ease. This makes the background subtraction method of the present invention well suited for use in real-time processing applications.
- Various other algorithms may be used for object or mouse identification.
- For example, the system may use a mixture model and/or robust estimation algorithms in addition to, or in place of, background subtraction.
- These algorithms represent newly developed theory in image sequence processing and object segmentation. They may handle object segmentation better than background subtraction in certain circumstances.
- Preliminary analysis indicates that mixture model and/or robust estimation algorithms may have excellent results for mouse identification.
- the background is then used to determine the foreground objects by taking the intensity difference and applying a threshold determination procedure to remove noise.
- This step may involve threshold determination on both the intensity and the size of region.
- An 8-connection labeling procedure may be performed to screen out disconnected small noisy regions and improve the region that corresponds to the mouse.
- all pixels in a frame will be assigned a label as foreground pixel or background pixel.
- Thresholding has generated labels for certain pixels. Neighbors of those labeled pixels that have not been labeled may obtain the same label as the labeled pixel.
- Eight-connectedness defines 8 corner-adjacent pixels that are all neighbors. The remaining regions indicated to be foreground objects are much smaller compared to the region of the mouse; thus, a size criterion is used to select the larger mouse region. The outline or contour of this foreground object is thus determined.
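- One possible realization of this thresholding and 8-connected labeling step is sketched below (Python with NumPy/SciPy, implementation choices not named in the original; the difference threshold and minimum size are illustrative values only):

```python
import numpy as np
from scipy import ndimage

def extract_mouse_region(current, background, diff_threshold=30, min_size=200):
    """Background subtraction followed by 8-connected labeling; the largest
    connected foreground region passing a size criterion is taken to be the mouse."""
    diff = np.abs(current.astype(np.int16) - background.astype(np.int16))
    foreground = diff > diff_threshold                      # intensity threshold
    eight_connected = np.ones((3, 3), dtype=bool)           # 8-connectivity structure
    labels, count = ndimage.label(foreground, structure=eight_connected)
    if count == 0:
        return None
    sizes = ndimage.sum(foreground, labels, index=range(1, count + 1))
    best = int(np.argmax(sizes)) + 1
    if sizes[best - 1] < min_size:                          # size criterion screens noise
        return None
    return labels == best
```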
- The convex hull of the pixels in the foreground object is used for its representation.
- The convex hull H of an arbitrary set S, which in this case is a region in the frame, is the smallest convex set containing S.
- the set difference H-S is called the convex deficiency D of the set S.
- The boundary of the region S can be partitioned by following the contour of S and marking the points at which a transition is made into or out of a component of the convex deficiency. These marking points can be connected into a polygon that gives a description of the region.
- the centroid (or center of mass) of the foreground object is calculated and is used for representing the location of the object (e.g., mouse).
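- A brief sketch of the hull-polygon and centroid computation is given below (Python with SciPy, an implementation choice not named in the original). Note that this sketch yields the convex hull polygon and centroid only; the convex-deficiency partitioning described above would require additional steps:

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_outline_and_centroid(mask):
    """Polygon outline (convex hull vertices) and centroid of the foreground region."""
    ys, xs = np.nonzero(mask)
    points = np.column_stack([xs, ys])
    hull = ConvexHull(points)
    polygon = points[hull.vertices]          # hull vertices connected into a polygon outline
    centroid = (xs.mean(), ys.mean())        # center of mass used as the object location
    return polygon, centroid
```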
- Figures 8 A, 8B and 8C illustrate the results of the location and object outline identification for a mouse using the present invention.
- Figure 8A illustrates a difference image between foreground and background for the image in Figure 7A.
- Figure 8B illustrates the image after thresholding showing the foreground mouse 705 object correctly identified.
- Figure 8C illustrates a video image showing the foreground object, a mouse, correctly identified with a polygon outline 805 created using the convex hull approach described above.
- Another method of location and outline identification that may improve the representation of the shape of the mouse is the b-spline method.
- B-splines are piecewise polynomial functions that can provide local approximations of contours of shapes using a small number of parameters, and the piecewise smooth lines can be used to represent the outline of the object area. This is useful because human perception of shapes is deemed to be based on the curvatures of parts of contours (or object surfaces). This is especially true since the shape of a mouse is defined by curved contours at any time. This representation may thus result in compression of boundary data as well as smoothing of coarsely digitized contours.
- This set of points is to be approximated by a B-spline representation as follows:

$\mathbf{r}(t) = \sum_{i=0}^{N-1} c_i \, B_{i,k}(t)$

where the $B_{i,k}(t)$ are the B-spline basis functions of order $k$ and the $c_i$ are the control points to be estimated.
- A series of image processing procedures may be performed, first to detect edges using a Sobel edge detection algorithm and then to trim the edge points with morphological operations so that the edge points are singly chained.
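- A minimal sketch of fitting a smoothing B-spline to such a singly chained sequence of edge points follows (Python with SciPy, an implementation choice not named in the original; the smoothing factor and sample count are illustrative values):

```python
import numpy as np
from scipy.interpolate import splprep, splev

def bspline_contour(chained_edge_points, smoothing=5.0, samples=200):
    """Fit a smoothing B-spline to an ordered (singly chained) sequence of edge
    points and return a densely sampled smooth outline."""
    x = chained_edge_points[:, 0].astype(np.float64)
    y = chained_edge_points[:, 1].astype(np.float64)
    tck, _ = splprep([x, y], s=smoothing, per=True)   # periodic spline for a closed contour
    u = np.linspace(0.0, 1.0, samples)
    sx, sy = splev(u, tck)
    return np.column_stack([sx, sy])
```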
- One example of the use of the B-spline algorithm implemented in the present invention is illustrated in Figures 9A-9C.
- Figure 9A illustrates an exemplary video image frame of mouse eating 705.
- Figure 9B illustrates the segmented edge 905 of the mouse 705 found in Figure 9A.
- Figure 9C illustrates a B-spline representation of the mouse edge 910 extrapolated from the segmented edge of the mouse found in Figure 9A.
- The B-spline representation or the convex hull representation can be used as features of the foreground object, in addition to other features that include, but are not limited to: the centroid, the principal orientation angle of the object, the area (number of pixels), the eccentricity (roundness), and the aspect ratio of the object.
- Ideal tracking of foreground objects in the image domain requires a matching operation that identifies corresponding points from one frame to the next. This process may become too computationally expensive to perform in an efficient manner. Thus, one approach is to use approximations to the ideal case that can be computed in a short amount of time. For example, tracking the foreground object may be achieved by merely tracking the outline contour from one frame to the next in the feature space (i.e., the identified foreground object image).
- tracking is performed in the feature space, which provides a close approximation to tracking in the image domain.
- the features include the centroid, principal orientation angle of the object, area (number of pixels), eccentricity (roundness), and the aspect ratio of object with lengths measured along the secondary and primary axes of the object.
- Let S be the set of pixels in the foreground object, A denote the area in number of pixels, (C_x, C_y) denote the centroid, θ denote the orientation angle, E denote the eccentricity, and R denote the aspect ratio.
- The second order central moments are defined as

$m_{2,0} = \sum_{(x,y)\in S} (x - C_x)^2, \qquad m_{0,2} = \sum_{(x,y)\in S} (y - C_y)^2, \qquad m_{1,1} = \sum_{(x,y)\in S} (x - C_x)(y - C_y)$

from which the orientation angle θ and the eccentricity E are computed.
- R is equal to the ratio of the length of the range of the points projected along an axis perpendicular to θ to the length of the range of the points projected along an axis parallel to θ. This may also be defined as the aspect ratio (ratio of width to length) after rotating the foreground object by θ.
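- The following sketch (Python with NumPy, an implementation choice not specified in the original) computes these features from the foreground mask using first and second order central moments; the particular eccentricity formula based on the eigenvalues of the second-moment matrix is one common convention and is an assumption here:

```python
import numpy as np

def shape_features(mask):
    """Centroid, area, orientation angle, eccentricity and aspect ratio of a
    foreground region, computed from first and second order central moments."""
    ys, xs = np.nonzero(mask)
    area = xs.size
    cx, cy = xs.mean(), ys.mean()
    dx, dy = xs - cx, ys - cy
    m20, m02, m11 = (dx * dx).sum(), (dy * dy).sum(), (dx * dy).sum()
    theta = 0.5 * np.arctan2(2.0 * m11, m20 - m02)          # principal orientation angle
    # Eigenvalues of the second-moment matrix give the principal axis lengths.
    common = np.sqrt(((m20 - m02) / 2.0) ** 2 + m11 ** 2)
    lam1, lam2 = (m20 + m02) / 2.0 + common, (m20 + m02) / 2.0 - common
    eccentricity = np.sqrt(1.0 - lam2 / lam1) if lam1 > 0 else 0.0
    # Aspect ratio: extent perpendicular to theta over extent along theta.
    along = dx * np.cos(theta) + dy * np.sin(theta)
    perp = -dx * np.sin(theta) + dy * np.cos(theta)
    aspect_ratio = (perp.max() - perp.min()) / max(along.max() - along.min(), 1e-9)
    return dict(area=area, centroid=(cx, cy), angle=theta,
                eccentricity=eccentricity, aspect_ratio=aspect_ratio)
```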
- Tracking in the feature space involves following feature values from one frame to the next. For example, if the area steadily increases, it could mean that the mouse is coming out of a cuddled up position to a more elongated position, or that it could be moving from a front view to a side view, etc. If the position of the centroid of the mouse moves up, it means that the mouse may be rearing up on its hind legs. Similarly, if the angle of orientation changes from horizontal to vertical, it may be rearing up. These changes can be analyzed with combinations of features also.
- the foreground state of the mouse is classified into one of the given classes.
- This information may be stored in, for example, a database in a data memory. In one variation of the invention, a Decision Tree classifier (e.g., object shape and posture classifier 215) was implemented by training the classifier with 488 samples of digitized video of a standard (in this case, normal) mouse. Six attributes (or features) for each sample were identified. Five posture classes for classification were identified, as listed below.
- the system of the present invention was exercised using these classifications.
- the distribution of the samples amongst the five classes is shown in Table 1.
- Performing a 10-fold cross-validation on the 488 training samples, a combined accuracy of 93.65% was obtained, indicating that the classifier was performing well. This is in the range of the highest levels of agreement between human observers.
- the cross-validation procedure involves randomly dividing a training set into N approximately equal sets, and for each of the N folds or iterations, one set is set aside for testing while the remaining N -1 sets are used as training samples.
- Accuracy values for individual classes are indicated in the last column of Table 1.
- Table 2 shows the overall accuracy values for each fold. We assign appropriate labels to each frame depending on the class to which it was classified.
- Table 1 Distribution of samples in the five classes and the accuracy values for each class.
- Table 2 Accuracy results for each fold for a cross-validation test.
- the present system provides good accuracy for mouse shape and posture recognition and classification.
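- As an illustrative stand-in for the decision tree training and cross-validation described above, the sketch below uses scikit-learn (a library choice not named in the original disclosure); the feature matrix would hold the six per-frame attributes and the labels would be the five posture classes:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def train_posture_classifier(features, labels):
    """Train a decision tree on per-frame feature vectors (e.g. area, centroid,
    angle, eccentricity, aspect ratio) and report 10-fold cross-validated accuracy."""
    clf = DecisionTreeClassifier(random_state=0)
    scores = cross_val_score(clf, features, labels, cv=10)   # 10-fold cross-validation
    clf.fit(features, labels)
    return clf, scores.mean()
```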
- One approach is to use a rule-based label analysis procedure (or a token parsing procedure), by which the sequence of labels is analyzed to identify particular behaviors when the corresponding sequence of labels is derived from the video frames being analyzed. For example, if a long sequence (lasting, for example, several minutes) of the "Cuddled up position" label (Class 3) is observed, and if the centroid remains stationary, then it may be concluded that the mouse is sleeping. If the location of the waterspout is identified, and if we observe a series of "partially reared" (Class 5) labels, and if the position of the centroid and the mouse's angle of orientation fall within a small predetermined range, the system can determine and identify that the mouse is drinking. It may also be useful for certain extra conditions to be tested, such as "some part (the mouth) of the mouse must touch the spout if drinking is to be identified," in addition to the temporal characteristics of the behavior.
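- A minimal sketch of such a rule-based token parser is shown below (Python; the label name, motion tolerance, and minimum bout length are hypothetical values chosen only to illustrate the sleeping rule described above):

```python
def detect_sleeping(frame_records, min_frames=30 * 60 * 5, max_motion=2.0):
    """Flag a sleeping bout when a long run of 'cuddled up' labels is observed
    while the centroid stays essentially stationary (thresholds illustrative)."""
    run_start, last_centroid = None, None
    bouts = []
    for i, rec in enumerate(frame_records):          # rec: {'label': str, 'centroid': (x, y)}
        moved = (last_centroid is not None and
                 abs(rec['centroid'][0] - last_centroid[0]) +
                 abs(rec['centroid'][1] - last_centroid[1]) > max_motion)
        if rec['label'] == 'CUDDLED_UP' and not moved:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_frames:
                bouts.append((run_start, i))
            run_start = None
        last_centroid = rec['centroid']
    if run_start is not None and len(frame_records) - run_start >= min_frames:
        bouts.append((run_start, len(frame_records)))
    return bouts
```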
- Hidden Markov Models (HMMs) may also be used to analyze the sequence of posture labels and identify behaviors.
- The five exemplary mouse states in the transition diagram include: (1) Horizontal (HS) 1005, (2) Cuddled up (CU) 1010, (3) Partially reared (PR) 1015, (4) Vertically reared (VR) 1020, and (5) Forward/Back (FB) 1025 postures.
- Figure 10 shows the five posture states and the duration for which a mouse spent in each state in an exemplary sample video clip.
- One example of a pattern that is understandable and evident from the figure is that the mouse usually passes through the partially reared (PR) 1015 state to reach the vertically reared (VR) 1020 state from the other three ground-level states.
- the states are defined according to the five posture classes mentioned previously.
- a simple HMM system has been created using dynamic programming to find the best match between the input sequence and paths through the state machine. It has been used to classify events in one of the mouse behavior sequences.
- the HMM system was provided with a sequence of tokens representing recognized actions or views from a benchmark mouse-rear video; this file includes views from five different postures, which are:
- Each of these represents a posture of the mouse and all together they constitute five (5) tokens. These tokens cause the HMM to go from one (hidden) state to another.
- the HMM may classify behavior into one of, for example, four hidden states: horizontal, rearing, cuddled, or indecisive:
- The HMM defining mouse behaviors can be described by these hidden states together with the transition and emission probabilities among them.
- This approach to an HMM for mouse behavior characterization may result in a number of mismatched cases, which may be categorized into three (3) types: (a) one mismatch (the last token) because the start and stop states were forced to be 0; (b) the PARTIALLY_REARED token may be mapped to indecisive, but this may only be a difference in naming; and (c) the FRONT_OR_BACK token may be mapped to the same value as HORIZ_SIDE_VIEW (21 cases).
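- A dynamic-programming (Viterbi) decoder of the kind used to find the best match between the token sequence and paths through the state machine is sketched below (Python with NumPy; the transition, emission, and start probabilities are assumed to have been estimated from labeled sequences, and the hidden state set would correspond to the horizontal, rearing, cuddled, and indecisive states listed above):

```python
import numpy as np

def viterbi(observations, start_p, trans_p, emit_p):
    """Dynamic-programming search for the most likely hidden-state path given a
    sequence of posture tokens (log-probabilities avoid numerical underflow)."""
    n_states = len(start_p)
    T = len(observations)
    score = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)
    score[0] = np.log(start_p) + np.log(emit_p[:, observations[0]])
    for t in range(1, T):
        for s in range(n_states):
            cand = score[t - 1] + np.log(trans_p[:, s])
            back[t, s] = int(np.argmax(cand))
            score[t, s] = cand[back[t, s]] + np.log(emit_p[s, observations[t]])
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):           # trace the best path backwards
        path.append(back[t, path[-1]])
    return path[::-1]
```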
- the system could be configured to automatically detect and characterize an animal freezing and/or touching or sniffing a particular object. Also, the system could be configured to compare the object's behavior against a "norm" for a particular behavioral parameter. Other detailed activities such as skilled reaching and forelimb movements as well as social behavior among groups of animals can also be detected and characterized.
- the system of the present invention first obtains the video image background and uses it to identify the foreground objects. Then, features are extracted from the foreground objects, which are in turn passed to the decision tree classifier for classification and labeling. This labeled sequence is passed to a behavior identification system module that identifies the final set of behaviors for the video clip. The image resolution of the system that has been obtained and the accuracy of identification of the behaviors attempted so far have been very good and resulted in an effective automated video image object recognition and behavior characterization system.
- The invention may identify some abnormal behavior by using video image information (for example, stored in memory) of known abnormal animals to build a video profile for that behavior. For example, video images of vertical spinning while hanging from the cage top were stored to memory and used to automatically identify such activity in mice. Further, abnormalities may also result from an increase in any particular type of normal behavior.
- Detection of such new abnormal behaviors may be achieved by the present invention detecting, for example, segments of behavior that do not fit the standard profile.
- the standard profile may be developed for a particular strain of mouse whereas detection of abnormal amounts of a normal behavior can be detected by comparison to the statistical properties of the standard profile.
- the automated analysis of the present invention may be used to build a profile of the behaviors, their amount, duration, and daily cycle for each animal, for example each commonly used strain of mice.
- A plurality of such profiles may be stored in, for example, a database in a data memory of the computer. One or more of these profiles may then be compared to the mouse in question, and differences from the profile expressed quantitatively.
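- One simple way to express such differences quantitatively is sketched below (Python; the z-score threshold and the per-behavior daily-total representation are illustrative assumptions, not the specific statistical procedure of the original disclosure):

```python
import numpy as np

def compare_to_profile(observed, profile_mean, profile_std, z_threshold=3.0):
    """Compare per-behavior daily totals for one animal against a stored strain
    profile and flag behaviors whose z-score exceeds a threshold."""
    flags = {}
    for behavior, value in observed.items():
        mu, sigma = profile_mean[behavior], max(profile_std[behavior], 1e-9)
        z = (value - mu) / sigma
        if abs(z) > z_threshold:
            flags[behavior] = z            # e.g. abnormally large amount of a normal behavior
    return flags
```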
- The techniques developed with the present invention for automation of the categorization and quantification of all home-cage mouse behaviors throughout the daily cycle are a powerful tool for detecting phenotypic effects of gene manipulations in mice.
- This technology is extendable to other behavior studies of animals and humans, as well as to surveillance purposes. In any case, the present invention has proven to be a significant achievement in creating an automated system and methods for accurate identification, tracking, and behavior categorization of an object whose image is captured in a video image.
- the present invention may also include audio analysis and/or multiple camera analysis.
- The video image analysis may be augmented with audio analysis, since audio is typically included with most video systems today.
- Audio may be an additional variable used to determine and classify a particular object's behavior.
- The analysis may be expanded to video image analysis of multiple objects, for example mice, and their social interaction with one another. In a still further variation, the system may include multiple cameras providing one or more planes of view of an object to be analyzed.
- The cameras may be located in remote locations and the video images sent via the Internet for analysis by a server at another site. In fact, the standard object behavior data and/or database may be housed in a remote location, and the data files may be downloaded to a stand-alone analysis system via the Internet, in accordance with the present invention.
US10645346B2 (en) | 2013-01-18 | 2020-05-05 | Careview Communications, Inc. | Patient video monitoring systems and methods having detection algorithm recovery from changes in illumination |
US9959471B2 (en) | 2008-05-06 | 2018-05-01 | Careview Communications, Inc. | Patient video monitoring systems and methods for thermal detection of liquids |
US9794523B2 (en) | 2011-12-19 | 2017-10-17 | Careview Communications, Inc. | Electronic patient sitter management system and method for implementing |
US8952894B2 (en) * | 2008-05-12 | 2015-02-10 | Microsoft Technology Licensing, Llc | Computer vision-based multi-touch sensing using infrared lasers |
US8009863B1 (en) | 2008-06-30 | 2011-08-30 | Videomining Corporation | Method and system for analyzing shopping behavior using multiple sensor tracking |
EP2230629A3 (en) | 2008-07-16 | 2012-11-21 | Verint Systems Inc. | A system and method for capturing, storing, analyzing and displaying data relating to the movements of objects |
US8396247B2 (en) * | 2008-07-31 | 2013-03-12 | Microsoft Corporation | Recognizing actions of animate objects in video |
US20100031202A1 (en) * | 2008-08-04 | 2010-02-04 | Microsoft Corporation | User-defined gesture set for surface computing |
US8847739B2 (en) | 2008-08-04 | 2014-09-30 | Microsoft Corporation | Fusing RFID and vision for surface object tracking |
US7710830B2 (en) * | 2008-09-02 | 2010-05-04 | Accuwalk Llc | Outing record device |
US8121968B2 (en) * | 2008-09-11 | 2012-02-21 | Behavioral Recognition Systems, Inc. | Long-term memory in a video analysis system |
US8126833B2 (en) * | 2008-09-11 | 2012-02-28 | Behavioral Recognition Systems, Inc. | Detecting anomalous events using a long-term memory in a video analysis system |
US9633275B2 (en) | 2008-09-11 | 2017-04-25 | Wesley Kenneth Cobb | Pixel-level based micro-feature extraction |
JPWO2010035752A1 (en) * | 2008-09-24 | 2012-02-23 | 株式会社ニコン | Image generation apparatus, imaging apparatus, image reproduction apparatus, and image reproduction program |
US9141862B2 (en) * | 2008-09-26 | 2015-09-22 | Harris Corporation | Unattended surveillance device and associated methods |
US8694443B2 (en) | 2008-11-03 | 2014-04-08 | International Business Machines Corporation | System and method for automatically distinguishing between customers and in-store employees |
DE102008058020A1 (en) * | 2008-11-19 | 2010-05-20 | Zebris Medical Gmbh | Arrangement for gait training |
US8471899B2 (en) | 2008-12-02 | 2013-06-25 | Careview Communications, Inc. | System and method for documenting patient procedures |
US9373055B2 (en) * | 2008-12-16 | 2016-06-21 | Behavioral Recognition Systems, Inc. | Hierarchical sudden illumination change detection using radiance consistency within a spatial neighborhood |
US20100157051A1 (en) * | 2008-12-23 | 2010-06-24 | International Business Machines Corporation | System and method for detecting and deterring rfid tag related fraud |
US8218877B2 (en) * | 2008-12-23 | 2012-07-10 | National Chiao Tung University | Tracking vehicle method by using image processing |
EP2380143A4 (en) * | 2008-12-24 | 2012-06-13 | Vehicle Monitoring Systems Pty Ltd | Method and system for detecting vehicle offences |
US20100169169A1 (en) * | 2008-12-31 | 2010-07-01 | International Business Machines Corporation | System and method for using transaction statistics to facilitate checkout variance investigation |
US20100182445A1 (en) * | 2009-01-22 | 2010-07-22 | Upi Semiconductor Corporation | Processing Device, Method, And Electronic System Utilizing The Same |
US8295546B2 (en) * | 2009-01-30 | 2012-10-23 | Microsoft Corporation | Pose tracking pipeline |
US8285046B2 (en) * | 2009-02-18 | 2012-10-09 | Behavioral Recognition Systems, Inc. | Adaptive update of background pixel thresholds using sudden illumination change detection |
WO2010099575A1 (en) | 2009-03-04 | 2010-09-10 | Honeywell International Inc. | Systems and methods for managing video data |
US20100293194A1 (en) * | 2009-03-11 | 2010-11-18 | Andersen Timothy L | Discrimination between multi-dimensional models using difference distributions |
US8416296B2 (en) * | 2009-04-14 | 2013-04-09 | Behavioral Recognition Systems, Inc. | Mapper component for multiple art networks in a video analysis system |
WO2010122174A1 (en) * | 2009-04-24 | 2010-10-28 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | System and method for determining the posture of a person |
US9047742B2 (en) * | 2009-05-07 | 2015-06-02 | International Business Machines Corporation | Visual security for point of sale terminals |
US8608481B2 (en) * | 2009-05-13 | 2013-12-17 | Medtronic Navigation, Inc. | Method and apparatus for identifying an instrument location based on measuring a characteristic |
US9417700B2 (en) | 2009-05-21 | 2016-08-16 | Edge3 Technologies | Gesture recognition systems and related methods |
US20100299140A1 (en) * | 2009-05-22 | 2010-11-25 | Cycorp, Inc. | Identifying and routing of documents of potential interest to subscribers using interest determination rules |
US8320619B2 (en) * | 2009-05-29 | 2012-11-27 | Microsoft Corporation | Systems and methods for tracking a model |
US9740977B1 (en) * | 2009-05-29 | 2017-08-22 | Videomining Corporation | Method and system for recognizing the intentions of shoppers in retail aisles based on their trajectories |
US8649594B1 (en) | 2009-06-04 | 2014-02-11 | Agilence, Inc. | Active and adaptive intelligent video surveillance system |
US20100315506A1 (en) * | 2009-06-10 | 2010-12-16 | Microsoft Corporation | Action detection in video through sub-volume mutual information maximization |
US8571259B2 (en) * | 2009-06-17 | 2013-10-29 | Robert Allan Margolis | System and method for automatic identification of wildlife |
US8462987B2 (en) * | 2009-06-23 | 2013-06-11 | Ut-Battelle, Llc | Detecting multiple moving objects in crowded environments with coherent motion regions |
US11004093B1 (en) | 2009-06-29 | 2021-05-11 | Videomining Corporation | Method and system for detecting shopping groups based on trajectory dynamics |
US20100332140A1 (en) * | 2009-06-30 | 2010-12-30 | Jonathan Livingston Joyce | Method of assessing the eating experience of a companion animal |
TWI386239B (en) * | 2009-07-24 | 2013-02-21 | Univ Far East | Animal experiment gait recording method |
JP5350928B2 (en) * | 2009-07-30 | 2013-11-27 | オリンパスイメージング株式会社 | Camera and camera control method |
US8625884B2 (en) * | 2009-08-18 | 2014-01-07 | Behavioral Recognition Systems, Inc. | Visualizing and updating learned event maps in surveillance systems |
US8280153B2 (en) * | 2009-08-18 | 2012-10-02 | Behavioral Recognition Systems | Visualizing and updating learned trajectories in video surveillance systems |
US8493409B2 (en) * | 2009-08-18 | 2013-07-23 | Behavioral Recognition Systems, Inc. | Visualizing and updating sequences and segments in a video surveillance system |
US8340352B2 (en) * | 2009-08-18 | 2012-12-25 | Behavioral Recognition Systems, Inc. | Inter-trajectory anomaly detection using adaptive voting experts in a video surveillance system |
US20110043689A1 (en) * | 2009-08-18 | 2011-02-24 | Wesley Kenneth Cobb | Field-of-view change detection |
US9805271B2 (en) * | 2009-08-18 | 2017-10-31 | Omni Ai, Inc. | Scene preset identification using quadtree decomposition analysis |
US8295591B2 (en) * | 2009-08-18 | 2012-10-23 | Behavioral Recognition Systems, Inc. | Adaptive voting experts for incremental segmentation of sequences with prediction in a video surveillance system |
US8358834B2 (en) * | 2009-08-18 | 2013-01-22 | Behavioral Recognition Systems | Background model for complex and dynamic scenes |
US8379085B2 (en) * | 2009-08-18 | 2013-02-19 | Behavioral Recognition Systems, Inc. | Intra-trajectory anomaly detection using adaptive voting experts in a video surveillance system |
US8797405B2 (en) * | 2009-08-31 | 2014-08-05 | Behavioral Recognition Systems, Inc. | Visualizing and updating classifications in a video surveillance system |
US8167430B2 (en) * | 2009-08-31 | 2012-05-01 | Behavioral Recognition Systems, Inc. | Unsupervised learning of temporal anomalies for a video surveillance system |
US8270733B2 (en) * | 2009-08-31 | 2012-09-18 | Behavioral Recognition Systems, Inc. | Identifying anomalous object types during classification |
US8270732B2 (en) * | 2009-08-31 | 2012-09-18 | Behavioral Recognition Systems, Inc. | Clustering nodes in a self-organizing map using an adaptive resonance theory network |
US8786702B2 (en) | 2009-08-31 | 2014-07-22 | Behavioral Recognition Systems, Inc. | Visualizing and updating long-term memory percepts in a video surveillance system |
US8285060B2 (en) * | 2009-08-31 | 2012-10-09 | Behavioral Recognition Systems, Inc. | Detecting anomalous trajectories in a video surveillance system |
US8218818B2 (en) * | 2009-09-01 | 2012-07-10 | Behavioral Recognition Systems, Inc. | Foreground object tracking |
US8218819B2 (en) * | 2009-09-01 | 2012-07-10 | Behavioral Recognition Systems, Inc. | Foreground object detection in a video surveillance system |
US8180105B2 (en) * | 2009-09-17 | 2012-05-15 | Behavioral Recognition Systems, Inc. | Classifier anomalies for observed behaviors in a video surveillance system |
US8170283B2 (en) * | 2009-09-17 | 2012-05-01 | Behavioral Recognition Systems Inc. | Video surveillance system configured to analyze complex behaviors using alternating layers of clustering and sequencing |
FR2950989B1 (en) * | 2009-10-05 | 2011-10-28 | Alcatel Lucent | Device for interacting with an augmented object |
US8320621B2 (en) * | 2009-12-21 | 2012-11-27 | Microsoft Corporation | Depth projector system with integrated VCSEL array |
JP2011210139A (en) * | 2010-03-30 | 2011-10-20 | Sony Corp | Image processing apparatus and method, and program |
US20110246123A1 (en) * | 2010-03-30 | 2011-10-06 | Welch Allyn, Inc. | Personal status monitoring |
US8831732B2 (en) | 2010-04-29 | 2014-09-09 | Cyberonics, Inc. | Method, apparatus and system for validating and quantifying cardiac beat data quality |
US8562536B2 (en) | 2010-04-29 | 2013-10-22 | Flint Hills Scientific, Llc | Algorithm for detecting a seizure from cardiac data |
US8649871B2 (en) | 2010-04-29 | 2014-02-11 | Cyberonics, Inc. | Validity test adaptive constraint modification for cardiac data used for detection of state changes |
US8396252B2 (en) | 2010-05-20 | 2013-03-12 | Edge 3 Technologies | Systems and related methods for three dimensional gesture recognition in vehicles |
EP2395456A1 (en) * | 2010-06-12 | 2011-12-14 | Toyota Motor Europe NV/SA | Methods and systems for semantic label propagation |
US8670029B2 (en) | 2010-06-16 | 2014-03-11 | Microsoft Corporation | Depth camera illuminator with superluminescent light-emitting diode |
US8483481B2 (en) | 2010-07-27 | 2013-07-09 | International Business Machines Corporation | Foreground analysis based on tracking information |
US8641646B2 (en) | 2010-07-30 | 2014-02-04 | Cyberonics, Inc. | Seizure detection using coordinate data |
EP2580738A4 (en) * | 2010-08-10 | 2018-01-03 | LG Electronics Inc. | Region of interest based video synopsis |
US8582866B2 (en) | 2011-02-10 | 2013-11-12 | Edge 3 Technologies, Inc. | Method and apparatus for disparity computation in stereo images |
US8467599B2 (en) | 2010-09-02 | 2013-06-18 | Edge 3 Technologies, Inc. | Method and apparatus for confusion learning |
US8666144B2 (en) | 2010-09-02 | 2014-03-04 | Edge 3 Technologies, Inc. | Method and apparatus for determining disparity of texture |
US8655093B2 (en) | 2010-09-02 | 2014-02-18 | Edge 3 Technologies, Inc. | Method and apparatus for performing segmentation of an image |
WO2012040554A2 (en) | 2010-09-23 | 2012-03-29 | Stryker Corporation | Video monitoring system |
US8684921B2 (en) | 2010-10-01 | 2014-04-01 | Flint Hills Scientific Llc | Detecting, assessing and managing epilepsy using a multi-variate, metric-based classification analysis |
US20120094600A1 (en) | 2010-10-19 | 2012-04-19 | Welch Allyn, Inc. | Platform for patient monitoring |
KR20120052739A (en) * | 2010-11-16 | 2012-05-24 | 삼성전자주식회사 | Display driving device and method for compressing and decompressing image data in the same |
US9432639B2 (en) * | 2010-11-19 | 2016-08-30 | Honeywell International Inc. | Security video detection of personal distress and gesture commands |
JP5718632B2 (en) * | 2010-12-22 | 2015-05-13 | 綜合警備保障株式会社 | Part recognition device, part recognition method, and part recognition program |
AU2010257454B2 (en) * | 2010-12-24 | 2014-03-06 | Canon Kabushiki Kaisha | Summary view of video objects sharing common attributes |
GB2486913A (en) * | 2010-12-30 | 2012-07-04 | Delaval Holding Ab | Control and monitoring system for an animal installation |
US11080513B2 (en) * | 2011-01-12 | 2021-08-03 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
US8970589B2 (en) | 2011-02-10 | 2015-03-03 | Edge 3 Technologies, Inc. | Near-touch interaction with a stereo camera grid structured tessellations |
US10025388B2 (en) * | 2011-02-10 | 2018-07-17 | Continental Automotive Systems, Inc. | Touchless human machine interface |
US9504390B2 (en) | 2011-03-04 | 2016-11-29 | Globalfoundries Inc. | Detecting, assessing and managing a risk of death in epilepsy |
US9498162B2 (en) | 2011-04-25 | 2016-11-22 | Cyberonics, Inc. | Identifying seizures using heart data from two or more windows |
US9402550B2 (en) | 2011-04-29 | 2016-08-02 | Cyberonics, Inc. | Dynamic heart rate threshold for neurological event detection |
TWI454150B (en) * | 2011-05-06 | 2014-09-21 | Altek Corp | Processing method for image file |
US8810640B2 (en) | 2011-05-16 | 2014-08-19 | Ut-Battelle, Llc | Intrinsic feature-based pose measurement for imaging motion compensation |
DE102011101939A1 (en) * | 2011-05-18 | 2012-11-22 | Biobserve GmbH | A method of creating a behavioral analysis of a rodent in an arena and method of creating an image of the rodent |
CN102221996A (en) * | 2011-05-20 | 2011-10-19 | 广州市久邦数码科技有限公司 | Implementation method for performing interaction between dynamic wallpaper and desktop component |
CN103608854B (en) * | 2011-05-30 | 2016-12-28 | 皇家飞利浦有限公司 | Apparatus and method for detecting body posture during sleep |
US8526734B2 (en) | 2011-06-01 | 2013-09-03 | Microsoft Corporation | Three-dimensional background removal for vision system |
US9594430B2 (en) | 2011-06-01 | 2017-03-14 | Microsoft Technology Licensing, Llc | Three-dimensional foreground selection for vision system |
CN102831442A (en) * | 2011-06-13 | 2012-12-19 | 索尼公司 | Abnormal behavior detection method and device, and method and device for generating an abnormal behavior detection device |
RU2455676C2 (en) * | 2011-07-04 | 2012-07-10 | Общество с ограниченной ответственностью "ТРИДИВИ" | Method of controlling device using gestures and 3d sensor for realising said method |
WO2013018070A1 (en) | 2011-08-03 | 2013-02-07 | Yeda Research And Development Co. Ltd. | Method for automatic behavioral phenotyping |
EP2747641A4 (en) | 2011-08-26 | 2015-04-01 | Kineticor Inc | Methods, systems, and devices for intra-scan motion correction |
KR101903407B1 (en) * | 2011-09-08 | 2018-10-02 | 엘지전자 주식회사 | Health care system based on video in remote health care solution and method for providing health care service |
TWI590193B (en) * | 2011-09-29 | 2017-07-01 | 國立清華大學 | Image method for classifying insects and insect classifying process |
US9124783B2 (en) | 2011-09-30 | 2015-09-01 | Camiolog, Inc. | Method and system for automated labeling at scale of motion-detected events in video surveillance |
US9549677B2 (en) | 2011-10-14 | 2017-01-24 | Flint Hills Scientific, L.L.C. | Seizure detection methods, apparatus, and systems using a wavelet transform maximum modulus algorithm |
US8442265B1 (en) * | 2011-10-19 | 2013-05-14 | Facebook Inc. | Image selection from captured video sequence based on social components |
US8437500B1 (en) * | 2011-10-19 | 2013-05-07 | Facebook Inc. | Preferred images from captured video sequence |
US9177208B2 (en) * | 2011-11-04 | 2015-11-03 | Google Inc. | Determining feature vectors for video volumes |
US9672609B1 (en) | 2011-11-11 | 2017-06-06 | Edge 3 Technologies, Inc. | Method and apparatus for improved depth-map estimation |
US20130273969A1 (en) * | 2011-12-01 | 2013-10-17 | Finding Rover, Inc. | Mobile app that generates a dog sound to capture data for a lost pet identifying system |
US9342735B2 (en) * | 2011-12-01 | 2016-05-17 | Finding Rover, Inc. | Facial recognition lost pet identifying system |
JP5868426B2 (en) * | 2011-12-13 | 2016-02-24 | 株式会社日立製作所 | Method for estimating the orientation of a stationary person |
US20150030252A1 (en) * | 2011-12-16 | 2015-01-29 | The Research Foundation For The State University Of New York | Methods of recognizing activity in video |
US8811938B2 (en) | 2011-12-16 | 2014-08-19 | Microsoft Corporation | Providing a user interface experience based on inferred vehicle state |
AU2012355375A1 (en) * | 2011-12-19 | 2014-07-10 | Birds In The Hand, Llc | Method and system for sharing object information |
US8948449B2 (en) * | 2012-02-06 | 2015-02-03 | GM Global Technology Operations LLC | Selecting visible regions in nighttime images for performing clear path detection |
US10076109B2 (en) | 2012-02-14 | 2018-09-18 | Noble Research Institute, Llc | Systems and methods for trapping animals |
IN2014DN08349A (en) | 2012-03-15 | 2015-05-08 | Behavioral Recognition Sys Inc | |
CN103325124B (en) * | 2012-03-21 | 2015-11-04 | 东北大学 | FPGA-based background-subtraction target detection and tracking device and method |
US9317751B2 (en) * | 2012-04-18 | 2016-04-19 | Vixs Systems, Inc. | Video processing system with video to text description generation, search system and methods for use therewith |
US10448839B2 (en) | 2012-04-23 | 2019-10-22 | Livanova Usa, Inc. | Methods, systems and apparatuses for detecting increased risk of sudden death |
US9681836B2 (en) | 2012-04-23 | 2017-06-20 | Cyberonics, Inc. | Methods, systems and apparatuses for detecting seizure and non-seizure states |
WO2013170129A1 (en) * | 2012-05-10 | 2013-11-14 | President And Fellows Of Harvard College | A system and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals |
TWI484941B (en) * | 2012-05-10 | 2015-05-21 | | Animal gait detection system and method |
RU2531876C2 (en) * | 2012-05-15 | 2014-10-27 | Общество с ограниченной ответственностью "Синезис" | Method for indexing video data by means of a map |
US9911043B2 (en) | 2012-06-29 | 2018-03-06 | Omni Ai, Inc. | Anomalous object interaction detection and reporting |
US9111353B2 (en) | 2012-06-29 | 2015-08-18 | Behavioral Recognition Systems, Inc. | Adaptive illuminance filter in a video analysis system |
US9317908B2 (en) | 2012-06-29 | 2016-04-19 | Behavioral Recognition Systems, Inc. | Automatic gain control filter in a video analysis system |
US9723271B2 (en) | 2012-06-29 | 2017-08-01 | Omni Ai, Inc. | Anomalous stationary object detection and reporting |
US9113143B2 (en) | 2012-06-29 | 2015-08-18 | Behavioral Recognition Systems, Inc. | Detecting and responding to an out-of-focus camera in a video analytics system |
EP2867860A4 (en) | 2012-06-29 | 2016-07-27 | Behavioral Recognition Sys Inc | Unsupervised learning of feature anomalies for a video surveillance system |
WO2014031615A1 (en) | 2012-08-20 | 2014-02-27 | Behavioral Recognition Systems, Inc. | Method and system for detecting sea-surface oil |
CN104641248A (en) * | 2012-09-06 | 2015-05-20 | 三立方有限公司 | Position and behavioral tracking system and uses thereof |
US20140122488A1 (en) * | 2012-10-29 | 2014-05-01 | Elwha Llc | Food Supply Chain Automation Farm Testing System And Method |
US20140121807A1 (en) | 2012-10-29 | 2014-05-01 | Elwha Llc | Food Supply Chain Automation Farm Tracking System and Method |
CN104662585B (en) * | 2012-09-25 | 2017-06-13 | Sk电信有限公司 | Method for setting event rules and event monitoring device using the same |
WO2014053436A1 (en) * | 2012-10-01 | 2014-04-10 | Stephan Hammelbacher | Method and device for organising at least one object |
US10860683B2 (en) * | 2012-10-25 | 2020-12-08 | The Research Foundation For The State University Of New York | Pattern change discovery between high dimensional data sets |
US9232140B2 (en) | 2012-11-12 | 2016-01-05 | Behavioral Recognition Systems, Inc. | Image stabilization techniques for video surveillance systems |
DE102012111452B3 (en) * | 2012-11-27 | 2014-03-20 | Karlsruher Institut für Technologie | Optical arrangement for recording e.g. the behavior of a non-human biological object, having a planar matrix forming a pattern and an optical filter passing light of a given wavelength, where wavelengths lying within the interval are attenuated |
CA2892753A1 (en) * | 2012-12-02 | 2014-06-05 | Agricam Ab | Systems and methods for predicting the outcome of a state of a subject |
US10220211B2 (en) | 2013-01-22 | 2019-03-05 | Livanova Usa, Inc. | Methods and systems to diagnose depression |
US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
CN109008972A (en) | 2013-02-01 | 2018-12-18 | 凯内蒂科尔股份有限公司 | The motion tracking system of real-time adaptive motion compensation in biomedical imaging |
KR101993010B1 (en) * | 2013-02-28 | 2019-06-25 | 고려대학교 산학협력단 | Method and apparatus for analyzing video based on spatiotemporal patterns |
KR101930990B1 (en) | 2013-03-01 | 2018-12-19 | 클레버펫 엘엘씨 | Animal interaction device, system, and method |
EP2777490B1 (en) * | 2013-03-11 | 2021-12-08 | Biomedical International R + D GmbH | Non-invasive temperature and physical activity measurement of animals |
JP6057786B2 (en) * | 2013-03-13 | 2017-01-11 | ヤフー株式会社 | Time-series data analysis device, time-series data analysis method, and program |
US10721448B2 (en) | 2013-03-15 | 2020-07-21 | Edge 3 Technologies, Inc. | Method and apparatus for adaptive exposure bracketing, segmentation and scene organization |
KR102203884B1 (en) * | 2013-04-12 | 2021-01-15 | 삼성전자주식회사 | Imaging apparatus and controlling method thereof |
US10346680B2 (en) * | 2013-04-12 | 2019-07-09 | Samsung Electronics Co., Ltd. | Imaging apparatus and control method for determining a posture of an object |
AU2014302060B2 (en) * | 2013-06-28 | 2017-08-31 | The United States Of America, As Represented By The Secretary, Department Of Health And Human Services | Systems and methods of video monitoring for vivarium cages |
AU2014290148A1 (en) * | 2013-07-16 | 2016-02-11 | Pinterest, Inc. | Object based contextual menu controls |
US20150022329A1 (en) * | 2013-07-16 | 2015-01-22 | Forget You Not, LLC | Assisted Animal Communication |
EP3031004A4 (en) | 2013-08-09 | 2016-08-24 | Behavioral Recognition Sys Inc | Cognitive information security using a behavior recognition system |
US9117144B2 (en) | 2013-08-14 | 2015-08-25 | Qualcomm Incorporated | Performing vocabulary-based visual search using multi-resolution feature descriptors |
CN103488148B (en) * | 2013-09-24 | 2016-03-09 | 华北电力大学(保定) | Intelligent animal behavior monitoring system based on the Internet of Things and computer vision |
US9355306B2 (en) * | 2013-09-27 | 2016-05-31 | Konica Minolta Laboratory U.S.A., Inc. | Method and system for recognition of abnormal behavior |
WO2015066460A2 (en) | 2013-11-01 | 2015-05-07 | Children's Medical Center Corporation | Devices and methods for analyzing rodent behavior |
CN105745598B (en) * | 2013-11-27 | 2019-10-01 | 惠普发展公司,有限责任合伙企业 | Determining the shape of a representation of an object |
US9230159B1 (en) * | 2013-12-09 | 2016-01-05 | Google Inc. | Action recognition and detection on videos |
US9495601B2 (en) | 2013-12-09 | 2016-11-15 | Mirsani, LLC | Detecting and reporting improper activity involving a vehicle |
US9396256B2 (en) * | 2013-12-13 | 2016-07-19 | International Business Machines Corporation | Pattern based audio searching method and system |
CN103676886A (en) * | 2013-12-17 | 2014-03-26 | 山东大学 | Standardized henhouse environment and breeding information monitoring and managing system |
JP6411373B2 (en) * | 2013-12-17 | 2018-10-24 | シャープ株式会社 | Recognition data transmission device, recognition data recording device, and recognition data recording method |
AU2013273778A1 (en) * | 2013-12-20 | 2015-07-09 | Canon Kabushiki Kaisha | Text line fragments for text line analysis |
US10986223B1 (en) | 2013-12-23 | 2021-04-20 | Massachusetts Mutual Life Insurance | Systems and methods for presenting content based on user behavior |
US10178222B1 (en) | 2016-03-22 | 2019-01-08 | Massachusetts Mutual Life Insurance Company | Systems and methods for presenting content based on user behavior |
SG10201501052XA (en) * | 2014-02-11 | 2015-09-29 | Agency Science Tech & Res | Method And System For Monitoring Activity Of An Animal |
DE102014203749A1 (en) * | 2014-02-28 | 2015-09-17 | Robert Bosch Gmbh | Method and device for monitoring at least one interior of a building and assistance system for at least one interior of a building |
WO2015138384A1 (en) | 2014-03-10 | 2015-09-17 | Gojo Industries, Inc. | Hygiene tracking compliance |
CN106572810A (en) | 2014-03-24 | 2017-04-19 | 凯内蒂科尔股份有限公司 | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
US9237743B2 (en) | 2014-04-18 | 2016-01-19 | The Samuel Roberts Noble Foundation, Inc. | Systems and methods for trapping animals |
CN104079872A (en) * | 2014-05-16 | 2014-10-01 | 大连理工大学 | Video image processing and human-computer interaction method based on content |
US9843360B2 (en) | 2014-08-14 | 2017-12-12 | Sony Corporation | Method and system for use in configuring multiple near field antenna systems |
US10277280B2 (en) | 2014-05-29 | 2019-04-30 | Sony Interactive Entertainment LLC | Configuration of data and power transfer in near field communications |
US9577463B2 (en) | 2014-05-29 | 2017-02-21 | Sony Corporation | Portable device to portable device wireless power transfer methods and systems |
US10965159B2 (en) | 2014-05-29 | 2021-03-30 | Sony Corporation | Scalable antenna system |
TWI562636B (en) * | 2014-06-16 | 2016-12-11 | Altek Semiconductor Corp | Image capture apparatus and image compensating method thereof |
WO2015198767A1 (en) * | 2014-06-27 | 2015-12-30 | 日本電気株式会社 | Abnormality detection device and abnormality detection method |
US9361802B2 (en) | 2014-07-16 | 2016-06-07 | Sony Corporation | Vehicle ad hoc network (VANET) |
US9906897B2 (en) | 2014-07-16 | 2018-02-27 | Sony Corporation | Applying mesh network to pet carriers |
US9900748B2 (en) | 2014-07-16 | 2018-02-20 | Sony Corporation | Consumer electronics (CE) device and related method for providing stadium services |
US10127601B2 (en) | 2014-07-16 | 2018-11-13 | Sony Corporation | Mesh network applied to fixed establishment with movable items therein |
US9426610B2 (en) | 2014-07-16 | 2016-08-23 | Sony Corporation | Applying mesh network to luggage |
US9516461B2 (en) | 2014-07-16 | 2016-12-06 | Sony Corporation | Mesh network applied to arena events |
EP3188660A4 (en) | 2014-07-23 | 2018-05-16 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
WO2016029893A1 (en) * | 2014-08-29 | 2016-03-03 | Csb-System Ag | Apparatus and method for assessing compliance with animal welfare on an animal for slaughter |
CA2908992A1 (en) | 2014-10-22 | 2016-04-22 | Parsin Haji Reza | Photoacoustic remote sensing (pars) |
US9449230B2 (en) * | 2014-11-26 | 2016-09-20 | Zepp Labs, Inc. | Fast object tracking framework for sports video recognition |
CN104469299A (en) * | 2014-12-02 | 2015-03-25 | 柳州市瑞蚨电子科技有限公司 | Network camera shooting device |
CN104361724B (en) * | 2014-12-03 | 2017-01-18 | 京东方科技集团股份有限公司 | Device and method for detecting peeing of baby |
US10409910B2 (en) | 2014-12-12 | 2019-09-10 | Omni Ai, Inc. | Perceptual associative memory for a neuro-linguistic behavior recognition system |
US10409909B2 (en) | 2014-12-12 | 2019-09-10 | Omni Ai, Inc. | Lexical analyzer for a neuro-linguistic behavior recognition system |
US9305216B1 (en) * | 2014-12-15 | 2016-04-05 | Amazon Technologies, Inc. | Context-based detection and classification of actions |
US9710712B2 (en) | 2015-01-16 | 2017-07-18 | Avigilon Fortress Corporation | System and method for detecting, tracking, and classifying objects |
US10380486B2 (en) * | 2015-01-20 | 2019-08-13 | International Business Machines Corporation | Classifying entities by behavior |
US10121064B2 (en) | 2015-04-16 | 2018-11-06 | California Institute Of Technology | Systems and methods for behavior detection using 3D tracking and machine learning |
US9619701B2 (en) | 2015-05-20 | 2017-04-11 | Xerox Corporation | Using motion tracking and image categorization for document indexing and validation |
UA124378C2 (en) * | 2015-07-01 | 2021-09-08 | Вікінг Генетікс Фмба | System and method for identification of individual animals based on images of the back |
US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
US10650228B2 (en) | 2015-09-18 | 2020-05-12 | Children's Medical Center Corporation | Devices and methods for analyzing animal behavior |
CA3001063C (en) | 2015-10-14 | 2023-09-19 | President And Fellows Of Harvard College | A method for analyzing motion of a subject representative of behaviour, and classifying animal behaviour |
GB2544324A (en) * | 2015-11-13 | 2017-05-17 | Cathx Res Ltd | Method and system for processing image data |
WO2017091479A1 (en) | 2015-11-23 | 2017-06-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
KR101817583B1 (en) * | 2015-11-30 | 2018-01-12 | 한국생산기술연구원 | System and method for analyzing behavior pattern using depth image |
CN105574501B (en) * | 2015-12-15 | 2019-03-15 | 上海微桥电子科技有限公司 | Pedestrian-flow video detection and analysis system |
CN105512640B (en) * | 2015-12-30 | 2019-04-02 | 重庆邮电大学 | Pedestrian flow statistics method based on video sequences |
US10327646B2 (en) | 2016-02-02 | 2019-06-25 | Illumisonics Inc. | Non-interferometric photoacoustic remote sensing (NI-PARS) |
US9993182B2 (en) | 2016-02-19 | 2018-06-12 | Conduent Business Services, Llc | Computer vision system for ambient long-term gait assessment |
US11036219B2 (en) | 2016-02-22 | 2021-06-15 | Ketchup On, Inc. | Self-propelled device |
US9600717B1 (en) * | 2016-02-25 | 2017-03-21 | Zepp Labs, Inc. | Real-time single-view action recognition based on key pose analysis for sports videos |
CN105809711B (en) * | 2016-03-02 | 2019-03-12 | 华南农业大学 | Method and system for extracting pig-movement big data based on video tracking |
US10750717B2 (en) * | 2016-03-04 | 2020-08-25 | Indiana University Research And Technology Corporation | Method and apparatus for spatial cognitive assessment of a lab animal |
CN105828030A (en) * | 2016-03-14 | 2016-08-03 | 珠海经济特区远宏科技有限公司 | Video investigation mobile terminal system |
CA3017518A1 (en) * | 2016-03-18 | 2017-09-21 | President And Fellows Of Harvard College | Automatically classifying animal behavior |
CN107221133B (en) * | 2016-03-22 | 2018-12-11 | 杭州海康威视数字技术股份有限公司 | Area monitoring alarm system and alarm method |
US10306311B1 (en) | 2016-03-24 | 2019-05-28 | Massachusetts Mutual Life Insurance Company | Intelligent and context aware reading systems |
US10360254B1 (en) | 2016-03-24 | 2019-07-23 | Massachusetts Mutual Life Insurance Company | Intelligent and context aware reading systems |
CN105894536A (en) * | 2016-03-30 | 2016-08-24 | 中国农业大学 | Method and system for analyzing livestock behaviors on the basis of video tracking |
US9576205B1 (en) * | 2016-03-31 | 2017-02-21 | Pointgrab Ltd. | Method and system for determining location of an occupant |
CN108780576B (en) | 2016-04-06 | 2022-02-25 | 赫尔实验室有限公司 | System and method for ghost removal in video segments using object bounding boxes |
US10241514B2 (en) | 2016-05-11 | 2019-03-26 | Brain Corporation | Systems and methods for initializing a robot to autonomously travel a trained route |
US10282849B2 (en) | 2016-06-17 | 2019-05-07 | Brain Corporation | Systems and methods for predictive/reconstructive visual object tracker |
US10016896B2 (en) | 2016-06-30 | 2018-07-10 | Brain Corporation | Systems and methods for robotic behavior around moving bodies |
IL247101B (en) | 2016-08-03 | 2018-10-31 | Pointgrab Ltd | Method and system for detecting an occupant in an image |
CN106264569B (en) * | 2016-08-10 | 2020-03-06 | 深圳先进技术研究院 | Shared-emotion neural experiment system based on observational fear acquisition |
WO2018080547A1 (en) | 2016-10-31 | 2018-05-03 | Hewlett-Packard Development Company, L.P. | Video monitoring |
US10274325B2 (en) | 2016-11-01 | 2019-04-30 | Brain Corporation | Systems and methods for robotic mapping |
US10001780B2 (en) | 2016-11-02 | 2018-06-19 | Brain Corporation | Systems and methods for dynamic route planning in autonomous navigation |
CN106778537B (en) * | 2016-11-28 | 2021-02-02 | 中国科学院心理研究所 | Animal social network structure acquisition and analysis system and method based on image processing |
US10723018B2 (en) | 2016-11-28 | 2020-07-28 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
US20180150697A1 (en) * | 2017-01-09 | 2018-05-31 | Seematics Systems Ltd | System and method for using subsequent behavior to facilitate learning of visual event detectors |
US10713792B1 (en) * | 2017-01-13 | 2020-07-14 | Amazon Technologies, Inc. | System and apparatus for image processing |
US10852730B2 (en) | 2017-02-08 | 2020-12-01 | Brain Corporation | Systems and methods for robotic mobile platforms |
US10310471B2 (en) * | 2017-02-28 | 2019-06-04 | Accenture Global Solutions Limited | Content recognition and communication system |
US10627338B2 (en) | 2017-03-23 | 2020-04-21 | Illumisonics Inc. | Camera-based photoacoustic remote sensing (C-PARS) |
JP6909960B2 (en) * | 2017-03-31 | 2021-07-28 | パナソニックIpマネジメント株式会社 | Detection device, detection method and detection program |
WO2018185718A1 (en) * | 2017-04-07 | 2018-10-11 | Smaluet Solutions Private Limited | A device and a method of learning a behavior of a pet in response to instructions provided to the pet |
US10034645B1 (en) * | 2017-04-13 | 2018-07-31 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for detecting complex networks in MRI image data |
US10373316B2 (en) * | 2017-04-20 | 2019-08-06 | Ford Global Technologies, Llc | Images background subtraction for dynamic lighting scenarios |
WO2018208319A1 (en) | 2017-05-12 | 2018-11-15 | Children's Medical Center Corporation | Devices and methods for analyzing animal behavior |
CN107292889B (en) * | 2017-06-14 | 2020-09-25 | 上海联影医疗科技有限公司 | Tumor segmentation method, system and readable medium |
US10157476B1 (en) * | 2017-06-15 | 2018-12-18 | Satori Worldwide, Llc | Self-learning spatial recognition system |
US10482613B2 (en) | 2017-07-06 | 2019-11-19 | Wisconsin Alumni Research Foundation | Movement monitoring system |
US10810414B2 (en) | 2017-07-06 | 2020-10-20 | Wisconsin Alumni Research Foundation | Movement monitoring system |
US11450148B2 (en) | 2017-07-06 | 2022-09-20 | Wisconsin Alumni Research Foundation | Movement monitoring system |
WO2019028016A1 (en) * | 2017-07-31 | 2019-02-07 | Cubic Corporation | Automated scenario recognition and reporting using neural networks |
US10489654B1 (en) * | 2017-08-04 | 2019-11-26 | Amazon Technologies, Inc. | Video analysis method and system |
US10445694B2 (en) | 2017-08-07 | 2019-10-15 | Standard Cognition, Corp. | Realtime inventory tracking using deep learning |
CN109583452B (en) * | 2017-09-29 | 2021-02-19 | 大连恒锐科技股份有限公司 | Human identity identification method and system based on barefoot footprints |
CN107751011B (en) * | 2017-11-07 | 2020-12-04 | 山东天智信息科技有限公司 | Drinking-water device based on drinking-water adjunctive therapy |
CN108133737B (en) * | 2017-12-26 | 2021-08-31 | 深圳先进技术研究院 | Rodent fear experiment video analysis method and device |
BR102018002876A2 (en) * | 2018-02-14 | 2019-09-10 | Guimaraes Hummig Ednilson | object locating platform |
CN108401177B (en) * | 2018-02-27 | 2021-04-27 | 上海哔哩哔哩科技有限公司 | Video playing method, server and video playing system |
EP3769510A1 (en) | 2018-05-07 | 2021-01-27 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US11132893B2 (en) * | 2018-05-11 | 2021-09-28 | Seagate Technology, Llc | Multi-sensor edge computing system |
CN108664942B (en) * | 2018-05-17 | 2021-10-22 | 西安理工大学 | Extraction method of mouse video multi-dimensional characteristic values and video classification method |
CN110505412B (en) * | 2018-05-18 | 2021-01-29 | 杭州海康威视数字技术股份有限公司 | Method and device for calculating brightness value of region of interest |
US11576348B2 (en) | 2018-05-21 | 2023-02-14 | Companion Labs, Inc. | Method for autonomously training an animal to respond to oral commands |
US11700836B2 (en) | 2018-05-21 | 2023-07-18 | Companion Labs, Inc. | System and method for characterizing and monitoring health of an animal based on gait and postural movements |
US11205508B2 (en) | 2018-05-23 | 2021-12-21 | Verb Surgical Inc. | Machine-learning-oriented surgical video analysis system |
CN108846326A (en) * | 2018-05-23 | 2018-11-20 | 盐城工学院 | Pig posture recognition method, device and electronic equipment |
US10905105B2 (en) * | 2018-06-19 | 2021-02-02 | Farm Jenny LLC | Farm asset tracking, monitoring, and alerts |
WO2020018469A1 (en) * | 2018-07-16 | 2020-01-23 | The Board Of Trustees Of The Leland Stanford Junior University | System and method for automatic evaluation of gait using single or multi-camera recordings |
US11048973B1 (en) | 2018-07-31 | 2021-06-29 | Objectvideo Labs, Llc | Action classification using aggregated background subtraction images |
US10810432B2 (en) * | 2018-08-02 | 2020-10-20 | Motorola Solutions, Inc. | Methods and systems for differentiating one or more objects in a video |
CN109272518B (en) * | 2018-08-17 | 2020-05-05 | 东南大学 | Morris water maze experiment image analysis system and method |
US10769799B2 (en) * | 2018-08-24 | 2020-09-08 | Ford Global Technologies, Llc | Foreground detection |
US10679743B2 (en) | 2018-09-12 | 2020-06-09 | Verb Surgical Inc. | Method and system for automatically tracking and managing inventory of surgical tools in operating rooms |
US11803974B2 (en) | 2018-10-05 | 2023-10-31 | The Trustees Of Princeton University | Automated system to measure multi-animal body part dynamics |
US11715308B2 (en) * | 2018-10-10 | 2023-08-01 | Delaval Holding Ab | Animal identification using vision techniques |
CN113163733A (en) * | 2018-10-17 | 2021-07-23 | 集团罗-曼公司 | Livestock monitoring equipment |
US11312594B2 (en) | 2018-11-09 | 2022-04-26 | Otis Elevator Company | Conveyance system video analytics |
US11093749B2 (en) | 2018-12-20 | 2021-08-17 | L'oreal | Analysis and feedback system for personal care routines |
CN109784208B (en) * | 2018-12-26 | 2023-04-18 | 武汉工程大学 | Image-based pet behavior detection method |
JP7297455B2 (en) * | 2019-01-31 | 2023-06-26 | キヤノン株式会社 | Image processing device, image processing method, and program |
CN111614703A (en) * | 2019-02-25 | 2020-09-01 | 南京爱体智能科技有限公司 | Method for combining Internet of things sensor with video analysis |
CN109831634A (en) * | 2019-02-28 | 2019-05-31 | 北京明略软件系统有限公司 | Method and device for determining density information of a target object |
US11331006B2 (en) | 2019-03-05 | 2022-05-17 | Physmodo, Inc. | System and method for human motion detection and tracking |
WO2020181136A1 (en) | 2019-03-05 | 2020-09-10 | Physmodo, Inc. | System and method for human motion detection and tracking |
WO2020188386A1 (en) * | 2019-03-15 | 2020-09-24 | Illumisonics Inc. | Single source photoacoustic remote sensing (ss-pars) |
KR102228350B1 (en) * | 2019-05-03 | 2021-03-16 | 주식회사 엘지유플러스 | Apparatus and method for monitoring a pet |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
US10904029B2 (en) | 2019-05-31 | 2021-01-26 | Apple Inc. | User interfaces for managing controllable external devices |
US20200383299A1 (en) * | 2019-06-06 | 2020-12-10 | Edgar Josue Bermudez Contreras | Systems and methods of homecage monitoring |
CN110516535A (en) * | 2019-07-12 | 2019-11-29 | 杭州电子科技大学 | Deep-learning-based mouse activity detection method and system, and hygiene assessment method |
CN110427865B (en) * | 2019-07-29 | 2023-08-25 | 三峡大学 | Method for extracting and reconstructing human-behavior video feature images for high-voltage restricted areas |
CN110301364B (en) * | 2019-08-02 | 2021-09-24 | 中国人民解放军军事科学院军事医学研究院 | Experiment box for research on social behaviors of mice |
CN110456831B (en) * | 2019-08-16 | 2022-06-14 | 南开大学 | Mouse contact behavior tracking platform based on active vision |
CN110490161B (en) * | 2019-08-23 | 2022-01-07 | 安徽农业大学 | Captive animal behavior analysis method based on deep learning |
US11213015B2 (en) * | 2019-09-17 | 2022-01-04 | Eknauth Persaud | System and a method of lab animal observation |
US11321927B1 (en) * | 2019-09-23 | 2022-05-03 | Apple Inc. | Temporal segmentation |
AU2020104459A4 (en) * | 2019-10-14 | 2021-10-28 | TBIAS Pty Ltd | An automated behavioural monitoring unit |
TWI731442B (en) * | 2019-10-18 | 2021-06-21 | 宏碁股份有限公司 | Electronic apparatus and object information recognition method by using touch data thereof |
US11557151B2 (en) | 2019-10-24 | 2023-01-17 | Deere & Company | Object identification on a mobile work machine |
CN112764594B (en) * | 2019-11-01 | 2023-06-09 | 宏碁股份有限公司 | Electronic device and object information identification method using touch data thereof |
US11587361B2 (en) | 2019-11-08 | 2023-02-21 | Wisconsin Alumni Research Foundation | Movement monitoring system |
US11109586B2 (en) * | 2019-11-13 | 2021-09-07 | Bird Control Group, Bv | System and methods for automated wildlife detection, monitoring and control |
CN110866559A (en) * | 2019-11-14 | 2020-03-06 | 上海中信信息发展股份有限公司 | Poultry behavior analysis method and device |
US11284824B2 (en) * | 2019-12-02 | 2022-03-29 | Everseen Limited | Method and system for determining a human social behavior classification |
CN111062436B (en) * | 2019-12-15 | 2024-04-16 | 深圳市具安科技有限公司 | Analysis method and device for cockroach mating behavior, computer equipment and storage medium |
AU2019479570A1 (en) | 2019-12-19 | 2022-08-18 | Illumisonics Inc. | Photoacoustic remote sensing (PARS), and related methods of use |
US11755989B2 (en) | 2020-03-27 | 2023-09-12 | Toshiba Global Commerce Solutions Holdings Corporation | Preventing theft at retail stores |
CN113469180A (en) * | 2020-03-31 | 2021-10-01 | 阿里巴巴集团控股有限公司 | Medical image processing method and system and data processing method |
US11482049B1 (en) | 2020-04-14 | 2022-10-25 | Bank Of America Corporation | Media verification system |
US11238634B2 (en) * | 2020-04-28 | 2022-02-01 | Adobe Inc. | Motion model refinement based on contact analysis and optimization |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11786128B2 (en) | 2020-06-18 | 2023-10-17 | Illumisonics Inc. | PARS imaging methods |
US11122978B1 (en) | 2020-06-18 | 2021-09-21 | Illumisonics Inc. | PARS imaging methods |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
US11918370B2 (en) * | 2020-06-10 | 2024-03-05 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for estimation of Parkinson's Disease gait impairment severity from videos using MDS-UPDRS |
CN111967321B (en) * | 2020-07-15 | 2024-04-05 | 菜鸟智能物流控股有限公司 | Video data processing method, device, electronic equipment and storage medium |
CN116075870A (en) * | 2020-07-30 | 2023-05-05 | 杰克逊实验室 | Automated phenotypic analysis of behavior |
WO2022035424A1 (en) * | 2020-08-11 | 2022-02-17 | Hitachi America, Ltd. | Situation recognition method and system for manufacturing collaborative robots |
EP4189682A1 (en) | 2020-09-05 | 2023-06-07 | Apple Inc. | User interfaces for managing audio for media items |
CN112189588B (en) * | 2020-10-10 | 2022-04-05 | 东北农业大学 | Cow image information collecting and processing method and system |
CN112215160B (en) * | 2020-10-13 | 2023-11-24 | 厦门大学 | Video-based three-dimensional human posture estimation algorithm using long- and short-term information fusion |
AU2021359652A1 (en) | 2020-10-14 | 2023-06-22 | One Cup Productions Ltd. | Animal visual identification, tracking, monitoring and assessment systems and methods thereof |
AU2021414124A1 (en) * | 2020-12-29 | 2023-07-13 | The Jackson Laboratory | Gait and posture analysis |
US11928187B1 (en) | 2021-02-17 | 2024-03-12 | Bank Of America Corporation | Media hosting system employing a secured video stream |
US11594032B1 (en) | 2021-02-17 | 2023-02-28 | Bank Of America Corporation | Media player and video verification system |
US11527106B1 (en) | 2021-02-17 | 2022-12-13 | Bank Of America Corporation | Automated video verification |
US11790694B1 (en) | 2021-02-17 | 2023-10-17 | Bank Of America Corporation | Video player for secured video stream |
US20220335446A1 (en) * | 2021-04-14 | 2022-10-20 | Sunshine Energy Technology Co., Ltd. | Real Food Honesty Display System |
CN113205032B (en) * | 2021-04-27 | 2022-11-01 | 安徽正华生物仪器设备有限公司 | Automatic analysis system and method for mouse suspension experiment based on deep learning |
CA3222789A1 (en) | 2021-05-27 | 2022-12-01 | Ai Thinktank Llc | 3d avatar generation and robotic limbs using biomechanical analysis |
US11640725B2 (en) | 2021-05-28 | 2023-05-02 | Sportsbox.ai Inc. | Quantitative, biomechanical-based analysis with outcomes and context |
US12008839B2 (en) | 2021-05-28 | 2024-06-11 | Sportsbox.ai Inc. | Golf club and other object fitting using quantitative biomechanical-based analysis |
US11526548B1 (en) | 2021-06-24 | 2022-12-13 | Bank Of America Corporation | Image-based query language system for performing database operations on images and videos |
US11941051B1 (en) | 2021-06-24 | 2024-03-26 | Bank Of America Corporation | System for performing programmatic operations using an image-based query language |
US11784975B1 (en) * | 2021-07-06 | 2023-10-10 | Bank Of America Corporation | Image-based firewall system |
US12028319B1 (en) | 2021-07-06 | 2024-07-02 | Bank Of America Corporation | Image-based firewall for synthetic media prevention |
CN117980946A (en) | 2021-09-28 | 2024-05-03 | 富士通株式会社 | Image processing program, image processing apparatus, and image processing method |
KR102714247B1 (en) * | 2021-11-29 | 2024-10-08 | 주식회사 바딧 | Method, system and non-transitory computer-readable recording medium for supporting labeling of sensor data |
US20230206615A1 (en) * | 2021-12-29 | 2023-06-29 | Halliburton Energy Services, Inc. | Systems and methods to determine an activity associated with an object of interest |
CN114533040B (en) * | 2022-01-12 | 2024-04-09 | 北京京仪仪器仪表研究总院有限公司 | Method for monitoring specific activity of personnel in fixed space |
USD1035720S1 (en) | 2022-04-20 | 2024-07-16 | Sportsbox.ai Inc. | Display screen with transitional graphical user interface |
USD1035721S1 (en) | 2022-04-20 | 2024-07-16 | Sportsbox.ai Inc. | Display screen with transitional graphical user interface |
USD1036464S1 (en) | 2022-04-20 | 2024-07-23 | Sportsbox.ai Inc. | Display screen with transitional graphical user interface |
EP4276772A1 (en) * | 2022-05-12 | 2023-11-15 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method, computer program and system for analysing one or more moving objects in a video |
CN114916452A (en) * | 2022-05-26 | 2022-08-19 | 浙江理工大学 | Device for testing influence of IVC environmental parameters on thermal comfort of SPF experimental animals |
CN115281604B (en) * | 2022-08-24 | 2023-11-21 | 深圳市具安科技有限公司 | Animal eye movement-based vertical axis rotation analysis method, device and medium |
WO2024166039A1 (en) | 2023-02-08 | 2024-08-15 | Illumisonics Inc. | Photon absorption remote sensing system for histological assessment of tissues |
CN116778420A (en) * | 2023-06-26 | 2023-09-19 | 潍坊医学院 | Medicine use monitoring feedback method and system based on big data video image analysis |
Family Cites Families (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3100473A (en) | 1961-01-30 | 1963-08-13 | Mead Johnson & Co | Apparatus for measuring animal activity |
US3304911A (en) | 1964-08-24 | 1967-02-21 | Shionogi & Co | Apparatus for automatically measuring the movement of an animal |
US3485213A (en) * | 1967-10-23 | 1969-12-23 | Edward J Scanlon | Animal exercising,conditioning and therapy and apparatus therefor |
DE2152406C3 (en) | 1971-10-21 | 1974-09-26 | Institut Dr. Friedrich Foerster Pruefgeraetebau, 7410 Reutlingen | Arrangement for determining the activity of test animals |
US3974798A (en) * | 1975-04-21 | 1976-08-17 | Meetze Jr Murray O | Method and apparatus for studying laboratory animal behavior |
US4337726A (en) | 1980-07-07 | 1982-07-06 | Czekajewski Jan A | Animal activity monitor and behavior processor |
JPS58184810U (en) * | 1982-06-01 | 1983-12-08 | 鐘淵化学工業株式会社 | magnetic circuit device |
US4517593A (en) * | 1983-04-29 | 1985-05-14 | The United States Of America As Represented By The Secretary Of The Navy | Video multiplexer |
US4631676A (en) * | 1983-05-25 | 1986-12-23 | Hospital For Joint Diseases Or | Computerized video gait and motion analysis system and method |
US4574734A (en) | 1984-05-25 | 1986-03-11 | Omnitech Electronics, Inc. | Universal animal activity monitoring system |
US4600016A (en) * | 1985-08-26 | 1986-07-15 | Biomechanical Engineering Corporation | Method and apparatus for gait recording and analysis |
JPH0785080B2 (en) * | 1986-11-25 | 1995-09-13 | 株式会社日立製作所 | Fish condition monitor |
US4888703A (en) * | 1986-09-09 | 1989-12-19 | Hitachi Engineering Co., Ltd. | Apparatus for monitoring the toxicant contamination of water by using aquatic animals |
JPH07108285B2 (en) | 1988-10-14 | 1995-11-22 | 東洋産業株式会社 | Experimental animal behavior observer |
WO1993006779A1 (en) * | 1991-10-10 | 1993-04-15 | Neurocom International, Inc. | Apparatus and method for characterizing gait |
JP3244798B2 (en) | 1992-09-08 | 2002-01-07 | 株式会社東芝 | Moving image processing device |
US5428723A (en) * | 1992-09-09 | 1995-06-27 | International Business Machines Corporation | Method and apparatus for capturing the motion of an object in motion video |
US5299454A (en) * | 1992-12-10 | 1994-04-05 | K.K. Holding Ag | Continuous foot-strike measuring system and method |
US5377258A (en) * | 1993-08-30 | 1994-12-27 | National Medical Research Council | Method and apparatus for an automated and interactive behavioral guidance system |
US5414644A (en) * | 1993-11-24 | 1995-05-09 | Ethnographics, Inc. | Repetitive event analysis system |
US6165747A (en) * | 1993-12-30 | 2000-12-26 | President & Fellows Of Harvard College | Nucleic acids encoding hedgehog proteins |
US20030186357A1 (en) * | 1993-12-30 | 2003-10-02 | Philip W. Ingham | Vertebrate embryonic pattern-inducing proteins, and uses related thereto |
JP3123587B2 (en) * | 1994-03-09 | 2001-01-15 | 日本電信電話株式会社 | Moving object region extraction method using background subtraction |
JPH0863603A (en) * | 1994-06-15 | 1996-03-08 | Olympus Optical Co Ltd | Image analyzer |
US5708767A (en) | 1995-02-03 | 1998-01-13 | The Trustees Of Princeton University | Method and apparatus for video browsing based on content and structure |
US5821945A (en) | 1995-02-03 | 1998-10-13 | The Trustees Of Princeton University | Method and apparatus for video browsing based on content and structure |
US5872865A (en) * | 1995-02-08 | 1999-02-16 | Apple Computer, Inc. | Method and system for automatic classification of video images |
US6343188B1 (en) * | 1995-03-02 | 2002-01-29 | Canon Kabushiki Kaisha | Vibration correction apparatus and optical device |
JP3683929B2 (en) * | 1995-03-02 | 2005-08-17 | キヤノン株式会社 | Blur correction device and optical device |
GB9506324D0 (en) * | 1995-03-28 | 1995-05-17 | Vinten Group Plc | Improvements in or relating to linear force actuators |
US5870138A (en) * | 1995-03-31 | 1999-02-09 | Hitachi, Ltd. | Facial image processing |
JP3377659B2 (en) * | 1995-09-07 | 2003-02-17 | 株式会社日立国際電気 | Object detection device and object detection method |
US6088468A (en) * | 1995-05-17 | 2000-07-11 | Hitachi Denshi Kabushiki Kaisha | Method and apparatus for sensing object located within visual field of imaging device |
US6231527B1 (en) * | 1995-09-29 | 2001-05-15 | Nicholas Sol | Method and apparatus for biomechanical correction of gait and posture |
US5546439A (en) * | 1995-11-02 | 1996-08-13 | General Electric Company | Systems, methods and apparatus for incrementally reconstructing overlapped images in a CT system implementing a helical scan |
US5969755A (en) * | 1996-02-05 | 1999-10-19 | Texas Instruments Incorporated | Motion based event detection system and method |
US6310270B1 (en) * | 1996-03-15 | 2001-10-30 | The General Hospital Corporation | Endothelial NOS knockout mice and methods of use |
JP3540494B2 (en) * | 1996-03-15 | 2004-07-07 | 株式会社東芝 | Cooperative work adjusting device and cooperative work adjusting method |
EP0816986B1 (en) * | 1996-07-03 | 2006-09-06 | Hitachi, Ltd. | System for recognizing motions |
JP3679512B2 (en) * | 1996-07-05 | 2005-08-03 | キヤノン株式会社 | Image extraction apparatus and method |
JP3436293B2 (en) * | 1996-07-25 | 2003-08-11 | 沖電気工業株式会社 | Animal individual identification device and individual identification system |
EP0837418A3 (en) | 1996-10-18 | 2006-03-29 | Kabushiki Kaisha Toshiba | Method and apparatus for generating information input using reflected light image of target object |
JP3512992B2 (en) * | 1997-01-07 | 2004-03-31 | 株式会社東芝 | Image processing apparatus and image processing method |
US6215898B1 (en) * | 1997-04-15 | 2001-04-10 | Interval Research Corporation | Data processing system and method |
US5816256A (en) | 1997-04-17 | 1998-10-06 | Bioanalytical Systems, Inc. | Movement--responsive system for conducting tests on freely-moving animals |
US6263088B1 (en) * | 1997-06-19 | 2001-07-17 | Ncr Corporation | System and method for tracking movement of objects in a scene |
US6295367B1 (en) * | 1997-06-19 | 2001-09-25 | Emtera Corporation | System and method for tracking movement of objects in a scene using correspondence graphs |
US6334187B1 (en) * | 1997-07-03 | 2001-12-25 | Matsushita Electric Industrial Co., Ltd. | Information embedding method, information extracting method, information embedding apparatus, information extracting apparatus, and recording media |
JPH1152215A (en) * | 1997-07-29 | 1999-02-26 | Nikon Corp | Lens driving device |
US6061088A (en) | 1998-01-20 | 2000-05-09 | Ncr Corporation | System and method for multi-resolution background adaptation |
US6212510B1 (en) * | 1998-01-30 | 2001-04-03 | Mitsubishi Electric Research Laboratories, Inc. | Method for minimizing entropy in hidden Markov models of physical signals |
US6242456B1 (en) * | 1998-03-09 | 2001-06-05 | Trustees Of Tufts College | Treatment of stereotypic, self-injurious and compulsive behaviors in man and animals using antagonists of NMDA receptors |
JP3270005B2 (en) * | 1998-03-20 | 2002-04-02 | 勝義 川崎 | Automated method of observing behavior of experimental animals |
US6072496A (en) | 1998-06-08 | 2000-06-06 | Microsoft Corporation | Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects |
IL125940A (en) * | 1998-08-26 | 2002-05-23 | Bar Shalom Avshalom | Device, method and system for automatic identification of sound patterns made by animals |
US6721454B1 (en) * | 1998-10-09 | 2004-04-13 | Sharp Laboratories Of America, Inc. | Method for automatic extraction of semantically significant events from video |
CN1332726A (en) * | 1998-11-02 | 2002-01-23 | 卫福有限公司 | Pyrrolidine compounds and pharmaceutical use thereof |
JP4392886B2 (en) * | 1999-01-22 | 2010-01-06 | キヤノン株式会社 | Image extraction method and apparatus |
US7133537B1 (en) * | 1999-05-28 | 2006-11-07 | It Brokerage Services Pty Limited | Method and apparatus for tracking a moving object |
WO2001033953A1 (en) * | 1999-11-11 | 2001-05-17 | Kowa Co., Ltd. | Method and device for measuring frequency of specific behavior of animal |
GB2358098A (en) * | 2000-01-06 | 2001-07-11 | Sharp Kk | Method of segmenting a pixelled image |
US6311644B1 (en) * | 2000-02-03 | 2001-11-06 | Carl S. Pugh | Apparatus and method for animal behavior tracking, predicting and signaling |
ATE289161T1 (en) * | 2000-05-16 | 2005-03-15 | Max Planck Gesellschaft | NEW SCREENING DEVICE FOR ANALYZING THE BEHAVIOR OF LABORATORY ANIMALS |
US6601010B1 (en) * | 2000-06-05 | 2003-07-29 | The University Of Kansas | Force plate actometer |
US7643655B2 (en) * | 2000-11-24 | 2010-01-05 | Clever Sys, Inc. | System and method for animal seizure detection and classification using video analysis |
US6678413B1 (en) * | 2000-11-24 | 2004-01-13 | Yiqing Liang | System and method for object identification and behavior characterization using video analysis |
US7269516B2 (en) * | 2001-05-15 | 2007-09-11 | Psychogenics, Inc. | Systems and methods for monitoring behavior informatics |
JP2005529580A (en) * | 2001-08-06 | 2005-10-06 | サイコジェニクス インク | Programmable electronic maze for use in animal behavior assessment |
CA2460832A1 (en) * | 2001-09-17 | 2003-03-27 | The Curavita Corporation | Monitoring locomotion kinematics in ambulating animals |
US6929586B2 (en) * | 2002-07-15 | 2005-08-16 | Reginald A. Johnson | Balance and gait training board |
- 2000
  - 2000-11-24 US US09/718,374 patent/US6678413B1/en not_active Expired - Lifetime
- 2001
  - 2001-11-19 EP EP01987014A patent/EP1337962B9/en not_active Expired - Lifetime
  - 2001-11-19 JP JP2002544950A patent/JP2004514975A/en active Pending
  - 2001-11-19 AU AU2002239272A patent/AU2002239272A1/en not_active Abandoned
  - 2001-11-19 WO PCT/US2001/043282 patent/WO2002043352A2/en active Application Filing
- 2003
  - 2003-09-19 US US10/666,741 patent/US7068842B2/en not_active Expired - Lifetime
  - 2003-10-30 US US10/698,008 patent/US20040141635A1/en not_active Abandoned
  - 2003-10-30 US US10/698,044 patent/US7209588B2/en not_active Expired - Lifetime
- 2004
  - 2004-09-30 US US10/955,661 patent/US8514236B2/en active Active
- 2007
  - 2007-04-06 US US11/697,544 patent/US20070175406A1/en not_active Abandoned
- 2009
  - 2009-01-15 US US12/354,727 patent/US20090285452A1/en not_active Abandoned
  - 2009-02-03 US US12/365,149 patent/US7817824B2/en not_active Expired - Fee Related
- 2010
  - 2010-09-13 US US12/881,154 patent/US20110007946A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
None |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7643655B2 (en) | 2000-11-24 | 2010-01-05 | Clever Sys, Inc. | System and method for animal seizure detection and classification using video analysis |
US8514236B2 (en) | 2000-11-24 | 2013-08-20 | Cleversys, Inc. | System and method for animal gait characterization from bottom view using video analysis |
US7817824B2 (en) | 2000-11-24 | 2010-10-19 | Clever Sys, Inc. | Unified system and method for animal behavior characterization from top view using video analysis |
US7882135B2 (en) | 2001-05-15 | 2011-02-01 | Psychogenics, Inc. | Method for predicting treatment classes using behavior informatics |
US7269516B2 (en) | 2001-05-15 | 2007-09-11 | Psychogenics, Inc. | Systems and methods for monitoring behavior informatics |
US7580798B2 (en) | 2001-05-15 | 2009-08-25 | Psychogenics, Inc. | Method for predicting treatment classes using animal behavior informatics |
US9565398B2 (en) | 2001-06-11 | 2017-02-07 | Arrowsight, Inc. | Caching graphical interface for displaying video and ancillary data from a saved video |
JP2011081823A (en) * | 2002-06-28 | 2011-04-21 | Koninkl Philips Electronics Nv | Method and apparatus for modeling behavior using probability distribution function |
ES2242484A1 (en) * | 2003-01-24 | 2005-11-01 | Pedro Monagas Asensio | Mood analysing device for mammals |
GB2442673A (en) * | 2005-08-03 | 2008-04-09 | Honeywell Int Inc | Boolean complement methods and systems for video image processing a region of interest |
WO2007019140A3 (en) * | 2005-08-03 | 2007-07-26 | Honeywell Int Inc | Boolean complement methods and systems for video image processing a region of interest |
WO2007019140A2 (en) * | 2005-08-03 | 2007-02-15 | Honeywell International Inc. | Boolean complement methods and systems for video image processing a region of interest |
US7558404B2 (en) | 2005-11-28 | 2009-07-07 | Honeywell International Inc. | Detection of abnormal crowd behavior |
WO2007064559A1 (en) * | 2005-11-28 | 2007-06-07 | Honeywell International Inc. | Detection of abnormal crowd behavior |
WO2007110555A1 (en) * | 2006-03-28 | 2007-10-04 | The University Court Of The University Of Edinburgh | A method for automatically characterizing the behavior of one or more objects |
CN101410855B (en) * | 2006-03-28 | 2011-11-30 | 爱丁堡大学评议会 | Method for automatically attributing one or more object behaviors |
WO2009045578A3 (en) * | 2007-06-18 | 2009-05-22 | Boeing Co | Object detection incorporating background clutter removal |
WO2009045578A2 (en) * | 2007-06-18 | 2009-04-09 | The Boeing Company | Object detection incorporating background clutter removal |
WO2010032247A3 (en) * | 2008-09-17 | 2010-05-27 | Ramot At Tel-Aviv University Ltd. | System and method for analyzing exploratory behavior |
WO2010032247A2 (en) * | 2008-09-17 | 2010-03-25 | Ramot At Tel-Aviv University Ltd. | System and method for analyzing exploratory behavior |
US8634635B2 (en) | 2008-10-30 | 2014-01-21 | Clever Sys, Inc. | System and method for stereo-view multiple animal behavior characterization |
EP2521070A3 (en) * | 2011-05-06 | 2013-12-25 | Deutsche Telekom AG | Method and system for recording a static or dynamic scene, for determining raw events and detecting free areas in an area under observation |
US20130172154A1 (en) * | 2011-12-28 | 2013-07-04 | Samsung Electronics Co., Ltd. | Method for measuring quantity of exercise and display apparatus thereof |
EP2609858A1 (en) * | 2011-12-28 | 2013-07-03 | Samsung Electronics Co., Ltd | Method for measuring quantity of exercise and display apparatus thereof |
CN102970519A (en) * | 2012-11-29 | 2013-03-13 | 河海大学常州校区 | Non-rigid target behavior observation device and method based on visual perception network |
US11551079B2 (en) | 2017-03-01 | 2023-01-10 | Standard Cognition, Corp. | Generating labeled training images for use in training a computational neural network for object or action recognition |
US11790682B2 (en) | 2017-03-10 | 2023-10-17 | Standard Cognition, Corp. | Image analysis using neural networks for pose and action identification |
US11023850B2 (en) | 2017-08-07 | 2021-06-01 | Standard Cognition, Corp. | Realtime inventory location management using deep learning |
US11250376B2 (en) | 2017-08-07 | 2022-02-15 | Standard Cognition, Corp | Product correlation analysis using deep learning |
US12056660B2 (en) | 2017-08-07 | 2024-08-06 | Standard Cognition, Corp. | Tracking inventory items in a store for identification of inventory items to be re-stocked and for identification of misplaced items |
US11195146B2 (en) | 2017-08-07 | 2021-12-07 | Standard Cognition, Corp. | Systems and methods for deep learning-based shopper tracking |
US11200692B2 (en) | 2017-08-07 | 2021-12-14 | Standard Cognition, Corp | Systems and methods to check-in shoppers in a cashier-less store |
US11232687B2 (en) | 2017-08-07 | 2022-01-25 | Standard Cognition, Corp | Deep learning-based shopper statuses in a cashier-less store |
EP3665615A4 (en) * | 2017-08-07 | 2020-12-30 | Standard Cognition, Corp. | Predicting inventory events using semantic diffing |
US11544866B2 (en) | 2017-08-07 | 2023-01-03 | Standard Cognition, Corp | Directional impression analysis using deep learning |
US11270260B2 (en) | 2017-08-07 | 2022-03-08 | Standard Cognition Corp. | Systems and methods for deep learning-based shopper tracking |
US12026665B2 (en) | 2017-08-07 | 2024-07-02 | Standard Cognition, Corp. | Identifying inventory items using multiple confidence levels |
US11295270B2 (en) | 2017-08-07 | 2022-04-05 | Standard Cognition, Corp. | Deep learning-based store realograms |
US11810317B2 (en) | 2017-08-07 | 2023-11-07 | Standard Cognition, Corp. | Systems and methods to check-in shoppers in a cashier-less store |
WO2019032306A1 (en) | 2017-08-07 | 2019-02-14 | Standard Cognition, Corp. | Predicting inventory events using semantic diffing |
US11538186B2 (en) | 2017-08-07 | 2022-12-27 | Standard Cognition, Corp. | Systems and methods to check-in shoppers in a cashier-less store |
US10410371B2 (en) | 2017-12-21 | 2019-09-10 | The Boeing Company | Cluttered background removal from imagery for object detection |
US11232575B2 (en) | 2019-04-18 | 2022-01-25 | Standard Cognition, Corp | Systems and methods for deep learning-based subject persistence |
US11948313B2 (en) | 2019-04-18 | 2024-04-02 | Standard Cognition, Corp | Systems and methods of implementing multiple trained inference engines to identify and track subjects over multiple identification intervals |
EP4046066A4 (en) * | 2019-11-07 | 2023-11-15 | Google LLC | Monitoring animal pose dynamics from monocular images |
US20210315186A1 (en) * | 2020-04-14 | 2021-10-14 | The United States Of America, As Represented By Secretary Of Agriculture | Intelligent dual sensory species-specific recognition trigger system |
US11361468B2 (en) | 2020-06-26 | 2022-06-14 | Standard Cognition, Corp. | Systems and methods for automated recalibration of sensors for autonomous checkout |
US11303853B2 (en) | 2020-06-26 | 2022-04-12 | Standard Cognition, Corp. | Systems and methods for automated design of camera placement and cameras arrangements for autonomous checkout |
US11818508B2 (en) | 2020-06-26 | 2023-11-14 | Standard Cognition, Corp. | Systems and methods for automated design of camera placement and cameras arrangements for autonomous checkout |
US12079769B2 (en) | 2020-06-26 | 2024-09-03 | Standard Cognition, Corp. | Automated recalibration of sensors for autonomous checkout |
CN114241521A (en) * | 2021-12-13 | 2022-03-25 | 北京华夏电通科技股份有限公司 | Method, device and equipment for identifying court trial video picture normal area |
Also Published As
Publication number | Publication date |
---|---|
US8514236B2 (en) | 2013-08-20 |
US20040131254A1 (en) | 2004-07-08 |
US20040141635A1 (en) | 2004-07-22 |
JP2004514975A (en) | 2004-05-20 |
AU2002239272A1 (en) | 2002-06-03 |
US7068842B2 (en) | 2006-06-27 |
US20070229522A1 (en) | 2007-10-04 |
US20090296992A1 (en) | 2009-12-03 |
US20040141636A1 (en) | 2004-07-22 |
EP1337962A2 (en) | 2003-08-27 |
US7209588B2 (en) | 2007-04-24 |
EP1337962B1 (en) | 2012-09-26 |
US20090285452A1 (en) | 2009-11-19 |
EP1337962A4 (en) | 2007-02-28 |
WO2002043352A3 (en) | 2003-01-09 |
US20110007946A1 (en) | 2011-01-13 |
US7817824B2 (en) | 2010-10-19 |
US6678413B1 (en) | 2004-01-13 |
US20070175406A1 (en) | 2007-08-02 |
EP1337962B9 (en) | 2013-02-13 |
Similar Documents
Publication | Title |
---|---|
US6678413B1 (en) | System and method for object identification and behavior characterization using video analysis |
US8774532B2 (en) | Calibration of video object classification |
US8520899B2 (en) | Video object classification |
US7995843B2 (en) | Monitoring device which monitors moving objects |
EP2192549B1 (en) | Target tracking device and target tracking method |
Hong et al. | Fast multi-feature pedestrian detection algorithm based on histogram of oriented gradient using discrete wavelet transform |
Wang et al. | Towards a kinect-based behavior recognition and analysis system for small animals |
CN114549371B (en) | Image analysis method and device |
Twining et al. | Robust tracking and posture description for laboratory rodents using active shape models |
CN107886060A (en) | Pedestrian's automatic detection and tracking based on video |
Farah et al. | Catching a rat by its edglets |
Latecki et al. | Motion detection based on local variation of spatiotemporal texture |
JP6893812B2 (en) | Object detector |
CN115690554A (en) | Target identification method, system, electronic device and storage medium |
Leroy et al. | Computer vision based recognition of behavior phenotypes of laying hens |
JP2000125288A5 (en) | |
Zurn et al. | Video-based rodent activity measurement using near-infrared illumination |
CN117854114B (en) | Intelligent identification method, equipment and medium for coupling behavior of zebra fish |
CN118247581B (en) | Method and device for labeling and analyzing gestures of key points of animal images |
Sepúlveda et al. | Evaluation of background subtraction algorithms using MuHAVi, a multicamera human action video dataset |
US11257238B2 (en) | Unsupervised object sizing method for single camera viewing |
JP2021125048A (en) | Information processing apparatus, information processing method, image processing apparatus, and program |
Latecki et al. | Activity and motion detection based on measuring texture change |
French | Visual Tracking: From An Individual To Groups Of Animals |
Al-Raziqi et al. | Detection of object interactions in video sequences |
Legal Events
Code | Title | Description |
---|---|---|
AK | Designated states | Kind code of ref document: A2; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
AL | Designated countries for regional patents | Kind code of ref document: A2; Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
WWE | Wipo information: entry into national phase | Ref document number: 2001987014; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2002544950; Country of ref document: JP |
WWP | Wipo information: published in national office | Ref document number: 2001987014; Country of ref document: EP |
REG | Reference to national code | Ref country code: DE; Ref legal event code: 8642 |