JP3270005B2 - Automated method of observing behavior of experimental animals - Google Patents

Automated method of observing behavior of experimental animals

Info

Publication number
JP3270005B2
Authority
JP
Japan
Prior art keywords
image
action
behavior
determined
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP09252198A
Other languages
Japanese (ja)
Other versions
JPH11296651A (en)
Inventor
Katsuyoshi Kawasaki
Akira Shimizu
Yasuhiro Yoshikawa
Junshiro Makino
Keiji Terao
Naomi Sankai
Takamasa Koyama
Tsuyoshi Hasegawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chuo Electronics Co Ltd
Original Assignee
Chuo Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chuo Electronics Co Ltd filed Critical Chuo Electronics Co Ltd
Priority to JP09252198A priority Critical patent/JP3270005B2/en
Publication of JPH11296651A publication Critical patent/JPH11296651A/en
Application granted granted Critical
Publication of JP3270005B2 publication Critical patent/JP3270005B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

[Technical Field of the Invention] The present invention relates to an automated method for observing the behavior of laboratory animals, which is useful for applications such as drug-efficacy evaluation.

[0002]

[Description of the Related Art] Conventional behavioral observation of laboratory animals relies on the observer's subjective judgment. To obtain accurate results, the same observer repeated the observation twice, yet even then the two results sometimes disagreed. When multiple observers were used to make the results more objective, a training period was held beforehand so that their judgment criteria would agree, but the results could still fail to match.

[0003]

[Problems to Be Solved by the Invention] As described above, the conventional observation method requires a long observation time, and because the observation results do not always agree, the objectivity of the observation data had to be examined carefully. The present invention was made to eliminate these drawbacks of the conventional behavior observation method; its object is to shorten the observation time and to make the data objective.

[0004]

[Means for Solving the Problems] The behavior of a laboratory animal in an open field is recorded with a video camera for a predetermined time as a VTR image. An observer watches the VTR image and classifies the animal's behavior into several behavior items; at the same time, the VTR image is fed into a computer as a digital image and digitally processed to extract image parameters used for judging the animal's behavior. Next, the parameter values for each behavior item judged by the observer are compared, and an algorithm for behavior judgment is created and stored in the computer. The image of an animal to be observed is then captured from the video camera into the computer, its image parameters are computed, and the stored judgment algorithm is applied, so that the observation result is obtained automatically.

[0005]

[Embodiments of the Invention] An embodiment of the present invention will be described with reference to the drawings. FIG. 1 is an explanatory diagram showing the configuration of a behavior observation system for laboratory animals according to the present invention. In FIG. 1, a rat 7 to be observed is placed in a square open field 6 (115 × 115 cm), and its behavior is recorded by a video camera 1 mounted above the open field 6. The recorded VTR image is fed into a behavior observation system consisting of a video tape recorder 2, an image input/output device 3, a video monitor 4, and a computer 5, where it undergoes image processing.

[0006] In this embodiment, the rat 7 is left in the open field 6 for ten minutes and recorded by the video camera 1, and an observer scores the rat's behavior from the VTR image, classifying everything that occurs in the open field 6 into behavior items. The behavior items are six: Locomotion, Grooming, Stretching, Sniffing, Rearing, and Immobility. Scoring uses a 1-second time-sampling method, with one behavior assigned per second. The behavior items are changed as appropriate for the species of the animal being observed.
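As an illustration of the 1-second time-sampling bookkeeping, the short sketch below tallies a 10-minute session (600 one-second samples) into per-behavior counts. It is a minimal sketch; the `tally_time_samples` helper and the label strings are illustrative and not part of the patent.

```python
from collections import Counter

BEHAVIOR_ITEMS = ["Locomotion", "Grooming", "Stretching",
                  "Sniffing", "Rearing", "Immobility"]

def tally_time_samples(per_second_labels):
    """Tally 1-second time samples (one behavior label per second,
    600 samples for a 10-minute session) into per-behavior counts."""
    counts = Counter(per_second_labels)
    return {item: counts.get(item, 0) for item in BEHAVIOR_ITEMS}

# Example: a session scored as 5 minutes of Locomotion and 5 of Immobility
print(tally_time_samples(["Locomotion"] * 300 + ["Immobility"] * 300))
```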

[0007] The VTR image described above is input as a digital image to a Pyramid Vision System (PVS) made by SENSAR, and image processing consisting of an image difference method, binarization, ellipse approximation, and centroid extraction is applied to extract the image parameters used for behavior judgment. There are seven image parameters:
(1) Area: area of the rat captured in the image.
(2) Long-axis length: length of the long axis of the ellipse fitted to the rat.
(3) Centroid displacement: distance moved by the rat's center of gravity.
(4) Straightness: number of processed images in which the rat's center of gravity moved in a straight line.
(5) Rotation: angle through which the rat's long axis rotated.
(6) Total difference: total change in luminance values between two consecutive frames.
(7) Partial difference: change in luminance values near both ends of the long axis of the fitted ellipse.
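The patent performs these steps on SENSAR's Pyramid Vision System; as a rough illustration of the same per-frame processing (difference against the background, binarization, ellipse approximation, centroid extraction), the sketch below uses OpenCV in Python. The function name, threshold value, and dictionary keys are assumptions made for illustration only.

```python
import cv2
import numpy as np

def frame_parameters(frame_gray, background_gray, diff_threshold=30):
    """Per-frame shape parameters of the animal silhouette (sketch only).

    Approximates the patent's difference / binarization / ellipse
    approximation / centroid extraction steps with OpenCV primitives.
    """
    # Difference against the background image, then binarization
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)

    # Simple noise removal (morphological opening)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    # The largest connected region is taken to be the animal
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    animal = max(contours, key=cv2.contourArea)
    if len(animal) < 5:          # fitEllipse needs at least 5 points
        return None

    area = cv2.contourArea(animal)                         # (1) area
    (ex, ey), (ax1, ax2), angle = cv2.fitEllipse(animal)   # ellipse approximation
    m = cv2.moments(animal)
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])  # center of gravity

    # Background-erased grayscale image, kept for the difference measures
    foreground = cv2.bitwise_and(frame_gray, frame_gray, mask=mask)

    return {"area": area,
            "major_axis": max(ax1, ax2),                   # (2) long-axis length
            "minor_axis": min(ax1, ax2),
            "angle_deg": angle,
            "centroid": centroid,
            "foreground": foreground}
```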

[0008] FIG. 3 shows a flowchart of the image processing performed by the Pyramid Vision System (PVS). In FIG. 3, processing time within each frame of the VTR image runs along the vertical axis, and the frames are arranged horizontally from left to right. In the first frame, the target image is captured (S101) and the background image is captured (S100); the target and background images are differenced (S102), noise is removed (S103), and a background-erased image is created (S104). In the same way, in the second frame the target image is captured (S201), the background image is captured (S100), the target and background images are differenced (S202), noise is removed (S203), and a background-erased image is created (S204). The background-erased images of the first and second frames (S104 and S204) are then differenced (S105), the total difference is measured (S106), and the partial difference is measured (S107).

[0009] Next, from the background-erased image of the first frame (S104), the area of the target image is measured (S108), followed by ellipse approximation (S109) and centroid measurement (S110). Based on the ellipse approximation (S109), the long-axis length (S112) and short-axis length (S113) of the target image are measured. Likewise, for the second frame the area (S208), ellipse approximation (S209), and centroid (S210) are obtained from the background-erased image (S204). From the centroid measurements of the first frame (S110) and the second frame (S210), the centroid displacement between frames is measured (S111). From the ellipse approximations of the first frame (S109) and the second frame (S209), the rotation between frames is measured (S114) and the straightness is measured (S115). The third and subsequent frames are processed in the same way, yielding the seven image parameters used for behavior judgment.
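As a rough sketch of the between-frame measurements in FIG. 3, the function below compares two consecutive `frame_parameters()` results from the earlier sketch to obtain the total difference, partial difference, centroid displacement, and rotation. The window size around the long-axis ends and the treatment of the ellipse angle are assumptions; the patent does not specify them, and straightness (a count of frames in which the centroid keeps moving straight) would be accumulated across frames and is omitted here.

```python
import numpy as np

def interframe_parameters(prev, curr, end_window=10):
    """Between-frame measures from two consecutive frame_parameters()
    results (sketch only; window sizes are illustrative)."""
    # Difference of the two background-erased images
    d = np.abs(curr["foreground"].astype(np.int16) -
               prev["foreground"].astype(np.int16))

    # (6) Total difference: summed luminance change over the whole image
    total_diff = int(d.sum())

    # (7) Partial difference: luminance change near both ends of the long axis
    theta = np.deg2rad(curr["angle_deg"])
    cx, cy = curr["centroid"]
    half = curr["major_axis"] / 2.0
    partial_diff = 0
    for sign in (+1, -1):
        ex = int(round(cx + sign * half * np.sin(theta)))
        ey = int(round(cy - sign * half * np.cos(theta)))
        y0, x0 = max(ey - end_window, 0), max(ex - end_window, 0)
        partial_diff += int(d[y0:ey + end_window, x0:ex + end_window].sum())

    # (3) Centroid displacement between the two frames
    centroid_move = float(np.hypot(curr["centroid"][0] - prev["centroid"][0],
                                   curr["centroid"][1] - prev["centroid"][1]))

    # (5) Rotation: change in the long-axis orientation, folded into [0, 90]
    rotation = abs(curr["angle_deg"] - prev["angle_deg"]) % 180.0
    rotation = min(rotation, 180.0 - rotation)

    return {"total_diff": total_diff, "partial_diff": partial_diff,
            "centroid_move": centroid_move, "rotation": rotation}
```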

[0010] Based on the seven image parameters described above, the parameter values for each of the six behavior items judged by the observer are compared, and an algorithm for behavior judgment is created according to the flowchart shown in FIG. 2. In FIG. 2, straightness is examined first (S1): if the straightness value is greater than a predetermined value, the behavior is judged to be Locomotion; if it is smaller, the partial difference is examined (S2). If the partial difference exceeds a predetermined value, the long-axis length is examined (S3): if the long axis is longer than a predetermined value the behavior is judged to be Stretching, and if it is shorter, Rearing. If the partial difference is below the predetermined value, the total difference, centroid displacement, and rotation are each examined (S4, S5, S6): if all of these values are below their predetermined values, the behavior is judged to be Immobility. If any of the total difference, centroid displacement, or rotation exceeds its predetermined value, the area is examined (S7): if the area is larger than a predetermined value the behavior is judged to be Sniffing, and if it is smaller, Grooming. The behavior-judgment algorithm created in this way is stored in the computer.
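A compact rendering of the FIG. 2 decision tree is sketched below. The threshold values are placeholders: the patent derives them by comparing the extracted parameters against the observer-scored behavior items, so the `thr` dictionary and the parameter keys here are assumptions for illustration.

```python
def judge_behavior(p, thr):
    """Decision tree of FIG. 2 (S1-S7). `p` holds the image parameters for
    one time sample; `thr` holds the predetermined threshold values."""
    if p["straightness"] > thr["straightness"]:               # S1
        return "Locomotion"
    if p["partial_diff"] > thr["partial_diff"]:               # S2
        # S3: long-axis length separates Stretching from Rearing
        return "Stretching" if p["major_axis"] > thr["major_axis"] else "Rearing"
    if (p["total_diff"] <= thr["total_diff"]                  # S4
            and p["centroid_move"] <= thr["centroid_move"]    # S5
            and p["rotation"] <= thr["rotation"]):            # S6
        return "Immobility"
    # S7: area separates Sniffing from Grooming
    return "Sniffing" if p["area"] > thr["area"] else "Grooming"
```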

[0011] The image of the rat 7 to be observed is captured from the video camera 1 into the computer 5, its image parameters are computed, and the behavior-judgment algorithm stored in advance in the computer is applied, so that the observation results for the rat 7 are obtained automatically. If the animal to be observed is not a rat, a different judgment algorithm must be created, but this can be done easily by applying the image parameters and the algorithm-creation procedure described above.

[0012]

[Effects of the Invention] As described above, the automated method of observing laboratory-animal behavior according to the present invention shortens the time required for behavioral observation and makes the data objective, so it is particularly effective for applications such as drug-efficacy evaluation.

[Brief Description of the Drawings]

FIG. 1 is an explanatory diagram showing the configuration of the behavior observation system for laboratory animals according to the present invention.

FIG. 2 shows the algorithm for behavior judgment according to the present invention.

FIG. 3 is a flowchart of the image processing.

[Explanation of Reference Numerals]

1 Video camera
2 Video tape recorder
3 Image input/output device
4 Video monitor
5 Computer
6 Open field
7 Rat

Continued from the front page: (72) Inventor Yasuhiro Yoshikawa, 3-1-6 Shinmachi, Hoya-shi, Tokyo. (72) Inventor Junshiro Makino, 4-1 Inarimae, Tsukuba-shi, Ibaraki. (72) Inventor Keiji Terao, 5-718-1 Matsushiro, Tsukuba-shi, Ibaraki. (72) Inventor Naomi Sankai, 2-701-202 Azuma, Tsukuba-shi, Ibaraki. (72) Inventor Takamasa Koyama, Christie Mejiro 301, 3-20-4 Mejiro, Toshima-ku, Tokyo. (72) Inventor Tsuyoshi Hasegawa, Chuo Electronics Co., Ltd., 1-9-9 Motohongo-cho, Hachioji-shi, Tokyo. (56) References: JP-A-9-101300; JP-A-2-242154; JP-A-63-66461; JP-A-63-213080; JP-A-63-142259. (58) Fields searched (Int. Cl.7, DB name): G06T 1/00 280; A61B 5/00 101; H04N 7/18; G01N 33/15.

Claims (1)

(57) [Claims]
[Claim 1] A method for automating the observation of laboratory-animal behavior, in which a VTR image of a laboratory animal's behavior recorded by a video camera is input to a computer as a digital image; an algorithm serving as the criterion for judging the animal's behavior, created by comparing the values of image parameters extracted from the digital image by image processing consisting of an image difference method, binarization, ellipse approximation, and centroid extraction with the values of the image parameters for each behavior item judged by an observer using the VTR screen, is stored in the computer; and the image of the animal to be observed is captured from the video camera into the computer and the stored behavior-judgment algorithm is applied, the method being characterized in that:
image parameters consisting of the area of the observed object captured in the image, the long-axis length, the centroid displacement, the straightness and rotation, and the total difference and partial difference are extracted and used for behavior judgment;
when the straightness is greater than a predetermined value, the behavior is judged to be Locomotion, and when the straightness is smaller than the predetermined value, the partial difference is examined;
when the partial difference exceeds a predetermined value, the long-axis length is further examined, and the behavior is judged to be Stretching if the long axis is longer than a predetermined value and Rearing if it is shorter;
when the partial difference is below the predetermined value, the total difference, the centroid displacement, and the rotation are each examined, and the behavior is judged to be Immobility if all of these values are below their predetermined values;
when any of the total difference, the centroid displacement, or the rotation exceeds its predetermined value, the area is further examined, and the behavior is judged to be Sniffing if the area is larger than a predetermined value and Grooming if it is smaller;
whereby the observation result is obtained automatically.
JP09252198A 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals Expired - Lifetime JP3270005B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP09252198A JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP09252198A JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Publications (2)

Publication Number Publication Date
JPH11296651A JPH11296651A (en) 1999-10-29
JP3270005B2 true JP3270005B2 (en) 2002-04-02

Family

ID=14056650

Family Applications (1)

Application Number Title Priority Date Filing Date
JP09252198A Expired - Lifetime JP3270005B2 (en) 1998-03-20 1998-03-20 Automated method of observing behavior of experimental animals

Country Status (1)

Country Link
JP (1) JP3270005B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005196398A (en) * 2004-01-06 2005-07-21 Ohara Ika Sangyo Kk Small animal behavior analysis device, small animal behavior analysis method, and behavior diagnosis method
US7269516B2 (en) 2001-05-15 2007-09-11 Psychogenics, Inc. Systems and methods for monitoring behavior informatics

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6678413B1 (en) * 2000-11-24 2004-01-13 Yiqing Liang System and method for object identification and behavior characterization using video analysis
US7643655B2 (en) 2000-11-24 2010-01-05 Clever Sys, Inc. System and method for animal seizure detection and classification using video analysis
JP2004089027A (en) * 2002-08-29 2004-03-25 Japan Science & Technology Corp Method for analyzing behavior of animal, system for analyzing behavior of animal, program for analyzing behavior of animal, and recording medium recording the program and readable with computer
CN101410855B (en) * 2006-03-28 2011-11-30 爱丁堡大学评议会 Method for automatically attributing one or more object behaviors
US8634635B2 (en) 2008-10-30 2014-01-21 Clever Sys, Inc. System and method for stereo-view multiple animal behavior characterization
WO2013170129A1 (en) 2012-05-10 2013-11-14 President And Fellows Of Harvard College A system and method for automatically discovering, characterizing, classifying and semi-automatically labeling animal behavior and quantitative phenotyping of behaviors in animals
AU2016338856B2 (en) 2015-10-14 2021-01-28 President And Fellows Of Harvard College Automatically classifying animal behavior
JP6964596B2 (en) 2016-03-18 2021-11-10 プレジデント・アンド・フェロウズ・オブ・ハーバード・カレッジ Automatic classification method of animal behavior
WO2019032622A1 (en) * 2017-08-07 2019-02-14 The Jackson Laboratory Long-term and continuous animal behavioral monitoring

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7269516B2 (en) 2001-05-15 2007-09-11 Psychogenics, Inc. Systems and methods for monitoring behavior informatics
US7580798B2 (en) 2001-05-15 2009-08-25 Psychogenics, Inc. Method for predicting treatment classes using animal behavior informatics
US7882135B2 (en) 2001-05-15 2011-02-01 Psychogenics, Inc. Method for predicting treatment classes using behavior informatics
JP2005196398A (en) * 2004-01-06 2005-07-21 Ohara Ika Sangyo Kk Small animal behavior analysis device, small animal behavior analysis method, and behavior diagnosis method
JP4598405B2 (en) * 2004-01-06 2010-12-15 小原医科産業株式会社 Behavioral diagnosis method for small animals

Also Published As

Publication number Publication date
JPH11296651A (en) 1999-10-29

Similar Documents

Publication Publication Date Title
JP3270005B2 (en) Automated method of observing behavior of experimental animals
RU2008117397A (en) METHOD AND APPARATUS FOR DETERMINING THE CHARACTER OF SKIN DEFECTS AND METHOD FOR ESTIMATING EFFECT OF COSMETIC AGENT AGAINST AGING OF SKIN
JP2002257679A (en) Method of obtaining luminance information, image quality evaluating method, device of obtaining luminance information of display apparatus and image quality evaluating method of the display apparatus
CN112084851A (en) Hand hygiene effect detection method, device, equipment and medium
CN114531549B (en) Image acquisition method, electronic device, and computer-readable storage medium
JP2999392B2 (en) How to detect dust and smoke
JP6934118B2 (en) Image processing system, image processing method and image processing program
Restrepo Girón et al. A new algorithm for detecting and correcting bad pixels in infrared images
CN110060252B (en) Method and device for processing target prompt in picture and endoscope system
JPH07198714A (en) Method and device for discriminating activity of cell
JPH0955932A (en) Method for detecting abnormality of abnormality monitor device
CN114494176A (en) Lens dirt detection method and system
TWI780378B (en) System and method for detecting and classifying animal behavior
JP2001167273A5 (en)
Sergej et al. Perceptual evaluation of demosaicing artefacts
Liu et al. Perceptually relevant ringing region detection method
JPH0676047A (en) Picture processor
US9418404B2 (en) Infrared detector system and method
JPH0735699A (en) Method and apparatus for detecting surface defect
CN109948456B (en) Face recognition method and device applied to digital court
JP7196490B2 (en) Living body detection method
JP3439669B2 (en) Image processing device
JP3038176B2 (en) Inspection equipment by pattern matching
CN117122320A (en) Emotion data benchmarking method and device and computer readable storage medium
CN112989866A (en) Object identification method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090118

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100118

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100118

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110118

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120118

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130118

Year of fee payment: 11

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

EXPY Cancellation because of completion of term