JPH11296651A - Automation method for behavior observation of experimental animal - Google Patents

Automation method for behavior observation of experimental animal

Info

Publication number
JPH11296651A
JPH11296651A (application JP9252198A)
Authority
JP
Japan
Prior art keywords
behavior
image
predetermined value
action
experimental animal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP9252198A
Other languages
Japanese (ja)
Other versions
JP3270005B2 (en)
Inventor
Katsuyoshi Kawasaki
勝義 川崎
Akira Shimizu
明 清水
Yasuhiro Yoshikawa
泰弘 吉川
Junshiro Makino
順四郎 牧野
Keiji Terao
恵治 寺尾
Sunao Yamaumi
直 山海
Takamasa Koyama
高正 小山
Takeshi Hasegawa
毅 長谷川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chuo Electronics Co Ltd
Original Assignee
Chuo Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chuo Electronics Co Ltd filed Critical Chuo Electronics Co Ltd
Priority to JP09252198A priority Critical patent/JP3270005B2/en
Publication of JPH11296651A publication Critical patent/JPH11296651A/en
Application granted granted Critical
Publication of JP3270005B2 publication Critical patent/JP3270005B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Landscapes

  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

PROBLEM TO BE SOLVED: To automate the behavioral observation of experimental animals, shortening the time required and giving the data objectivity. SOLUTION: The behavior of a rat 7 in an open field 6 is photographed by a video camera 1, and an observer watching the recorded VTR images classifies the rat's behavior into six behavior items. The VTR images are also captured into a computer 5 as digital images, and image processing extracts image parameters consisting of seven items. The image parameters corresponding to each behavior item judged by the observer are compared, and an algorithm for judging the behavior of the rat 7 is prepared.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

TECHNICAL FIELD: The present invention relates to a method for automating the behavioral observation of experimental animals, useful for purposes such as drug-efficacy evaluation.

[0002]

2. Description of the Related Art: In conventional behavioral observation of experimental animals, results depend on the observer's subjective judgment. To obtain accurate results, the same observer therefore repeated each observation twice, yet even a single observer's results did not always agree. When several observers were used to make the results more objective, a training period was scheduled beforehand to align their judgment criteria, but even then the observation results sometimes disagreed.

[0003]

SUMMARY OF THE INVENTION: As described above, the conventional behavior observation method requires a long observation time, and because the results do not always agree, the objectivity of the observation data had to be examined carefully. The present invention was made to eliminate these drawbacks of the conventional method; it shortens the behavior observation time and gives the data objectivity.

[0004]

MEANS FOR SOLVING THE PROBLEM: The behavior of an experimental animal in an open field is recorded by a video camera for a predetermined time as VTR images. An observer watches the recorded images and classifies the animal's behavior into a plurality of behavior items. In parallel, the VTR images are input to a computer as digital images and subjected to digital image processing to extract the image parameters used for judging the animal's behavior. The parameter values corresponding to each behavior item judged by the observer are then compared to create a behavior-judgment algorithm, which is stored in the computer. Thereafter, images of an animal under observation are captured from the video camera into the computer, its image parameters are computed, and the stored behavior-judgment algorithm is applied so that the observation result is obtained automatically.

[0005]

DESCRIPTION OF THE PREFERRED EMBODIMENTS: An embodiment of the present invention will be described with reference to the drawings. FIG. 1 shows the configuration of a behavior observation system for experimental animals according to the present invention. In FIG. 1, a rat 7 to be observed is placed in a square open field 6 (115 × 115 cm), and its behavior is photographed by a video camera 1 mounted above the field. The recorded VTR images are input to a behavior observation system consisting of a video tape recorder 2, an image input/output device 3, a video monitor 4, and a computer 5, where they undergo image processing.

[0006] In this embodiment, the rat 7 is left in the open field 6 for 10 minutes while the video camera 1 records it. Using the VTR images, an observer classifies the behavior occurring in the open field 6 into six behavior items: Locomotion, Grooming, Stretching, Sniffing, Rearing, and Immobility. Judgments are made by one-second time sampling, with exactly one behavior assigned per second. The behavior items may be changed as appropriate for the species of experimental animal being observed.

[0007] The VTR images are converted to digital images and input to a Pyramid Vision System (PVS) made by SENSAR. Image processing consisting of an image difference method, binarization, ellipse approximation, and centroid extraction then extracts the image parameters used for behavior judgment. Seven parameters are used:
(1) Area: the area of the rat captured in the image.
(2) Major-axis length: the length of the major axis of the ellipse fitted to the rat.
(3) Centroid displacement: the distance moved by the rat's center of gravity.
(4) Straightness: the number of processed frames in which the rat's center of gravity moves straight ahead.
(5) Rotation: the magnitude of the angle through which the rat's major axis rotates.
(6) Total difference: the total change in luminance values between two consecutive frames.
(7) Partial difference: the change in luminance values near both ends of the major axis of the fitted ellipse.

[0008] FIG. 3 shows a flowchart of the image processing performed by the Pyramid Vision System (PVS). In FIG. 3, processing time within each frame runs along the vertical axis, and successive frames are arranged from left to right along the horizontal axis. For the first frame, the target image is captured (S101) and the background image is captured (S100); difference processing between the target image and the background image (S102) is followed by noise removal (S103) to create a background-erased image (S104). The second frame is processed in the same way: target image capture (S201), background image capture (S100), difference processing between the target image and the background image (S202), noise removal (S203), and creation of a background-erased image (S204). Difference processing between the background-erased images of the first and second frames (S104 and S204) is then performed (S105), the total difference is measured (S106), and the partial difference is measured (S107).
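One per-frame branch of this flowchart (background difference, binarization, noise removal) can be sketched as below. The threshold value and the neighbour-count noise filter are assumptions; the patent does not specify how the PVS binarizes or removes noise.

```python
import numpy as np

def remove_noise(mask: np.ndarray) -> np.ndarray:
    # S103/S203: keep a foreground pixel only if at least two of its
    # 4-neighbours are also foreground (a stand-in for the unspecified filter).
    padded = np.pad(mask, 1)
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1]
             + padded[1:-1, :-2] + padded[1:-1, 2:])
    return np.where((mask == 1) & (neigh >= 2), 1, 0).astype(np.uint8)

def background_erase(frame: np.ndarray, background: np.ndarray,
                     threshold: int = 30) -> np.ndarray:
    # S102/S202: difference against the stored background image, then
    # binarize and remove noise to obtain the background-erased image.
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = (diff > threshold).astype(np.uint8)
    return remove_noise(mask)
```

Running this on each captured frame yields the binary silhouettes (S104, S204) from which all subsequent measurements are taken.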

[0009] Next, from the background-erased image of the first frame (S104), the area of the target image is measured (S108), followed by ellipse approximation (S109) and centroid measurement (S110). From the ellipse approximation (S109), the major-axis length (S112) and minor-axis length (S113) of the target image are measured. The second frame is treated in the same way: area measurement (S208), ellipse approximation (S209), and centroid measurement (S210) are performed on its background-erased image (S204). The centroid displacement between frames (S111) is computed from the centroid measurements of the first frame (S110) and the second frame (S210). Likewise, rotation between frames (S114) and straightness (S115) are computed from the ellipse approximations of the first frame (S109) and the second frame (S209). The third and subsequent frames are processed in the same manner, yielding the seven image parameters used for behavior judgment.
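The area, centroid, and ellipse steps can be reproduced with second-order image moments, the standard way to fit an "equivalent ellipse" to a binary silhouette. This is a sketch under that assumption; the text does not disclose the PVS's actual fitting method, and the function name is illustrative.

```python
import numpy as np

def ellipse_and_centroid(mask: np.ndarray):
    """Area (S108), ellipse approximation (S109), and centroid (S110)
    for one background-erased frame, via second-order image moments."""
    ys, xs = np.nonzero(mask)
    area = len(xs)                       # S108: area of the silhouette
    cx, cy = xs.mean(), ys.mean()        # S110: center of gravity
    # Central second moments (covariance of the pixel distribution).
    mxx = ((xs - cx) ** 2).mean()
    myy = ((ys - cy) ** 2).mean()
    mxy = ((xs - cx) * (ys - cy)).mean()
    # Eigenvalues of the covariance matrix give the equivalent ellipse axes.
    common = np.sqrt(4 * mxy ** 2 + (mxx - myy) ** 2)
    major = 4 * np.sqrt((mxx + myy + common) / 2)            # S112
    minor = 4 * np.sqrt(max((mxx + myy - common) / 2, 0.0))  # S113
    angle = 0.5 * np.arctan2(2 * mxy, mxx - myy)  # orientation of major axis
    return area, (cx, cy), major, minor, angle

# Centroid displacement (S111) and rotation (S114) then follow by
# differencing the centroids and orientations of successive frames.
```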

[0010] Based on these seven image parameters, the parameter values for each of the six behavior items judged by the observer are compared, and a behavior-judgment algorithm is created following the flowchart shown in FIG. 2. In FIG. 2, straightness is examined first (S1): if it exceeds a predetermined value, the behavior is judged to be Locomotion. If straightness is below the predetermined value, the partial difference is examined (S2). If the partial difference exceeds its predetermined value, the major-axis length is examined (S3): the behavior is judged to be Stretching if the major axis is longer than the predetermined value, and Rearing if it is shorter. If the partial difference is below its predetermined value, the total difference, centroid displacement, and rotation are each examined (S4, S5, S6); if all three are below their predetermined values, the behavior is judged to be Immobility. If any of the total difference, centroid displacement, or rotation exceeds its predetermined value, the area is further examined (S7): the behavior is judged to be Sniffing if the area is larger than the predetermined value, and Grooming if it is smaller. The behavior-judgment algorithm created by this method is stored in the computer.
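The flowchart of FIG. 2 translates directly into a threshold cascade. The sketch below uses illustrative parameter and threshold names; the patent leaves the actual "predetermined values" to be tuned against the observer's judgments.

```python
def classify_behavior(p: dict, t: dict) -> str:
    """Threshold cascade of FIG. 2. `p` maps parameter names to the values
    measured for one second of video; `t` holds the predetermined values.
    Key names and threshold values are illustrative, not from the patent."""
    if p["straightness"] > t["straightness"]:              # S1
        return "Locomotion"
    if p["partial_diff"] > t["partial_diff"]:              # S2
        # S3: long major axis means stretching, short means rearing.
        return "Stretching" if p["major_axis"] > t["major_axis"] else "Rearing"
    if (p["total_diff"] <= t["total_diff"]                 # S4
            and p["centroid_move"] <= t["centroid_move"]   # S5
            and p["rotation"] <= t["rotation"]):           # S6
        return "Immobility"
    # S7: large silhouette means sniffing, small means grooming.
    return "Sniffing" if p["area"] > t["area"] else "Grooming"
```

Note the cascade's order matters: cheap, highly discriminative checks (straightness) come first, and the area test is only reached once all other motion measures have ruled out the other five items.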

[0011] When images of the rat 7 under observation are captured from the video camera 1 into the computer 5, the image parameters are computed, and the behavior-judgment algorithm stored in advance is applied, the observation result for the rat 7 is obtained automatically. If the experimental animal under observation is not a rat, a different behavior-judgment algorithm must be created, but this is easily done by applying the image parameters and the algorithm-construction method described above.

[0012]

EFFECTS OF THE INVENTION: As described above, the method of the present invention for automating the behavioral observation of experimental animals shortens the time required for observation and gives the data objectivity, making it particularly effective for applications such as drug-efficacy evaluation.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an explanatory diagram showing the configuration of the experimental-animal behavior observation system according to the present invention.

FIG. 2 shows the algorithm for behavior judgment according to the present invention.

FIG. 3 is a flowchart of the image processing.

EXPLANATION OF REFERENCE SIGNS

1 video camera
2 video tape recorder
3 image input/output device
4 video monitor
5 computer
6 open field
7 rat

(Continuation of front page)
(72) Inventor: Akira Shimizu, 3-5-302-202 Takezono, Tsukuba, Ibaraki
(72) Inventor: Yasuhiro Yoshikawa, 3-1-6 Shinmachi, Hoya, Tokyo
(72) Inventor: Junshiro Makino, 4-1 Inarimae, Tsukuba, Ibaraki
(72) Inventor: Keiji Terao, 5-718-1 Matsushiro, Tsukuba, Ibaraki
(72) Inventor: Sunao Yamaumi, 2-701-202 Azuma, Tsukuba, Ibaraki
(72) Inventor: Takamasa Koyama, Christie Mejiro 301, 3-20-4 Mejiro, Toshima-ku, Tokyo
(72) Inventor: Takeshi Hasegawa, c/o Chuo Electronics Co., Ltd., 1-9-9 Motohongo-cho, Hachioji, Tokyo

Claims (2)

[Claims]
1. A method for automating the behavioral observation of an experimental animal, characterized in that: the behavior of an experimental animal in an open field is photographed by a video camera, and an observer who watches the behavior of the experimental animal in the recorded VTR images classifies that behavior into a plurality of behavior items; the VTR images are input to a computer as digital images and subjected to image processing consisting of an image difference method, binarization, ellipse approximation, and centroid extraction to extract image parameters; the values of the image parameters for each behavior item judged by the observer are compared to create an algorithm serving as the criterion for judging the behavior of the experimental animal, and the algorithm is stored in the computer; and images of an experimental animal under observation are captured from the video camera into the computer, their image parameters are computed, and the stored behavior-judgment algorithm is applied so that the observation result is obtained automatically.
2. The method for automating the behavioral observation of an experimental animal according to claim 1, characterized in that the behavior-judgment algorithm using the image parameters: judges the behavior to be Locomotion when straightness exceeds a predetermined value; examines the partial difference when straightness is below the predetermined value; when the partial difference exceeds its predetermined value, further examines the major-axis length, judging the behavior to be Stretching if the major axis is longer than its predetermined value and Rearing if it is shorter; when the partial difference is below its predetermined value, examines the respective values of the total difference, centroid displacement, and rotation, judging the behavior to be Immobility if all of these are below their predetermined values; and when any of the total difference, centroid displacement, or rotation exceeds its predetermined value, further examines the area, judging the behavior to be Sniffing if the area is larger than its predetermined value and Grooming if it is smaller.
JP09252198A 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals Expired - Lifetime JP3270005B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP09252198A JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP09252198A JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Publications (2)

Publication Number Publication Date
JPH11296651A true JPH11296651A (en) 1999-10-29
JP3270005B2 JP3270005B2 (en) 2002-04-02

Family

ID=14056650

Family Applications (1)

Application Number Title Priority Date Filing Date
JP09252198A Expired - Lifetime JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Country Status (1)

Country Link
JP (1) JP3270005B2 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004021282A1 (en) * 2002-08-29 2004-03-11 Japan Science And Technology Agency Animal behavior analysis method, animal behavior analysis system, animal behavior analysis program, and computer-readable recorded medium on which the program is recorded
JP2004514975A (en) * 2000-11-24 2004-05-20 クレバー エスワイエス インコーポレイテッド System and method for object identification and behavior characterization using video analysis
US7269516B2 (en) 2001-05-15 2007-09-11 Psychogenics, Inc. Systems and methods for monitoring behavior informatics
JP2009531049A (en) * 2006-03-28 2009-09-03 ザ・ユニバーシティ・コート・オブ・ザ・ユニバーシティ・オブ・エディンバラ A method for automatically characterizing the behavior of one or more objects.
US7643655B2 (en) 2000-11-24 2010-01-05 Clever Sys, Inc. System and method for animal seizure detection and classification using video analysis
WO2013170129A1 (en) * 2012-05-10 2013-11-14 President And Fellows Of Harvard College A system and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US8634635B2 (en) 2008-10-30 2014-01-21 Clever Sys, Inc. System and method for stereo-view multiple animal behavior characterization
JP2020530626A (en) * 2017-08-07 2020-10-22 ザ ジャクソン ラボラトリーThe Jackson Laboratory Long-term continuous animal behavior monitoring
US10909691B2 (en) 2016-03-18 2021-02-02 President And Fellows Of Harvard College Automatically classifying animal behavior
US11020025B2 (en) 2015-10-14 2021-06-01 President And Fellows Of Harvard College Automatically classifying animal behavior

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4598405B2 (en) * 2004-01-06 2010-12-15 小原医科産業株式会社 Behavioral diagnosis method for small animals

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004514975A (en) * 2000-11-24 2004-05-20 クレバー エスワイエス インコーポレイテッド System and method for object identification and behavior characterization using video analysis
US7209588B2 (en) 2000-11-24 2007-04-24 Clever Sys, Inc. Unified system and method for animal behavior characterization in home cages using video analysis
US7643655B2 (en) 2000-11-24 2010-01-05 Clever Sys, Inc. System and method for animal seizure detection and classification using video analysis
US7817824B2 (en) 2000-11-24 2010-10-19 Clever Sys, Inc. Unified system and method for animal behavior characterization from top view using video analysis
US8514236B2 (en) 2000-11-24 2013-08-20 Cleversys, Inc. System and method for animal gait characterization from bottom view using video analysis
US7269516B2 (en) 2001-05-15 2007-09-11 Psychogenics, Inc. Systems and methods for monitoring behavior informatics
US7580798B2 (en) 2001-05-15 2009-08-25 Psychogenics, Inc. Method for predicting treatment classes using animal behavior informatics
US7882135B2 (en) 2001-05-15 2011-02-01 Psychogenics, Inc. Method for predicting treatment classes using behavior informatics
WO2004021282A1 (en) * 2002-08-29 2004-03-11 Japan Science And Technology Agency Animal behavior analysis method, animal behavior analysis system, animal behavior analysis program, and computer-readable recorded medium on which the program is recorded
JP2004089027A (en) * 2002-08-29 2004-03-25 Japan Science & Technology Corp Method for analyzing behavior of animal, system for analyzing behavior of animal, program for analyzing behavior of animal, and recording medium recording the program and readable with computer
JP2009531049A (en) * 2006-03-28 2009-09-03 ザ・ユニバーシティ・コート・オブ・ザ・ユニバーシティ・オブ・エディンバラ A method for automatically characterizing the behavior of one or more objects.
US8634635B2 (en) 2008-10-30 2014-01-21 Clever Sys, Inc. System and method for stereo-view multiple animal behavior characterization
WO2013170129A1 (en) * 2012-05-10 2013-11-14 President And Fellows Of Harvard College A system and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US9317743B2 (en) 2012-05-10 2016-04-19 President And Fellows Of Harvard College System and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US9826922B2 (en) 2012-05-10 2017-11-28 President And Fellows Of Harvard College System and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US10025973B2 (en) 2012-05-10 2018-07-17 President And Fellows Of Harvard College System and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US11263444B2 (en) 2012-05-10 2022-03-01 President And Fellows Of Harvard College System and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
US11020025B2 (en) 2015-10-14 2021-06-01 President And Fellows Of Harvard College Automatically classifying animal behavior
US11622702B2 (en) 2015-10-14 2023-04-11 President And Fellows Of Harvard College Automatically classifying animal behavior
US11944429B2 (en) 2015-10-14 2024-04-02 President And Fellows Of Harvard College Automatically classifying animal behavior
US10909691B2 (en) 2016-03-18 2021-02-02 President And Fellows Of Harvard College Automatically classifying animal behavior
US11669976B2 (en) 2016-03-18 2023-06-06 President And Fellows Of Harvard College Automatically classifying animal behavior
JP2020530626A (en) * 2017-08-07 2020-10-22 ザ ジャクソン ラボラトリーThe Jackson Laboratory Long-term continuous animal behavior monitoring
US11798167B2 (en) 2017-08-07 2023-10-24 The Jackson Laboratory Long-term and continuous animal behavioral monitoring

Also Published As

Publication number Publication date
JP3270005B2 (en) 2002-04-02

Similar Documents

Publication Publication Date Title
JP3270005B2 (en) Automated method of observing behavior of experimental animals
KR101167567B1 (en) Fish monitoring digital image processing apparatus and method
JP2002257679A (en) Method of obtaining luminance information, image quality evaluating method, device of obtaining luminance information of display apparatus and image quality evaluating method of the display apparatus
CN111899470B (en) Human body falling detection method, device, equipment and storage medium
CN110826522A (en) Method and system for monitoring abnormal human behavior, storage medium and monitoring equipment
JP6934118B2 (en) Image processing system, image processing method and image processing program
CN114531549B (en) Image acquisition method, electronic device, and computer-readable storage medium
CN112232107A (en) Image type smoke detection system and method
Arazo et al. Segmentation enhanced lameness detection in dairy cows from RGB and depth video
JP6715001B2 (en) Food inspection system and food inspection method
JP2007180709A (en) Method of grasping crowding state and staying state of people or the like at store or the like
CN106611417B (en) Method and device for classifying visual elements into foreground or background
JP2004030225A (en) White smoke detection method and apparatus
TWI780378B (en) System and method for detecting and classifying animal behavior
CN114067316B (en) Rapid identification method based on fine-granularity image classification
US20100202688A1 (en) Device for segmenting an object in an image, video surveillance system, method and computer program
Mazur-Milecka et al. The Analysis of Temperature Changes of the Saliva Traces Left on the Fur During Laboratory Rats Soial Contacts
JP2001249008A (en) Monitor
CN117122320B (en) Emotion data benchmarking method and device and computer readable storage medium
JPH0676047A (en) Picture processor
RU2006143529A (en) METHOD FOR SELECTING OBJECT AND BACKGROUND AREAS ON DIGITAL IMAGES
Cincan et al. Exudate Detection in Diabetic Retinopathy Using Deep Learning Techniques
CN111666786A (en) Image processing method, image processing device, electronic equipment and storage medium
CN115170973B (en) Intelligent paddy field weed identification method, device, equipment and medium
JPH0735699A (en) Method and apparatus for detecting surface defect

Legal Events

Date Code Title Description
R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090118

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100118

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110118

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120118

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130118

Year of fee payment: 11

EXPY Cancellation because of completion of term