WO2023010890A1 - Deep-learning-based behavior analysis method suitable for rodents - Google Patents

Deep-learning-based behavior analysis method suitable for rodents

Info

Publication number
WO2023010890A1
WO2023010890A1 · PCT/CN2022/087524
Authority
WO
WIPO (PCT)
Prior art keywords
frame
behavior
experimental
nose tip
frames
Prior art date
Application number
PCT/CN2022/087524
Other languages
English (en)
French (fr)
Inventor
李思迪
李懋
盛益华
罗祥
罗华伦
Original Assignee
安徽正华生物仪器设备有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 安徽正华生物仪器设备有限公司
Publication of WO2023010890A1 (zh)


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/40Animals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Definitions

  • The invention belongs to the field of animal behavior analysis and relates to deep learning technology, in particular to a deep-learning-based behavior analysis method suitable for rodents.
  • Depression is the most common mental disorder today. According to data released by the World Health Organization (WHO), more than 350 million people worldwide suffer from depression, and the number of patients has grown by about 18% over the past decade. In today's material civilization, most people no longer struggle for three meals a day and the pursuit of material goods is unprecedented, yet excess material wealth has brought increasingly common anxiety, confusion, pain, and depression. To date, the cause of depression remains unclear, and the development of symptoms varies among patients.
  • Researchers have attributed the causes of depression to some or all of stress, drugs and drug abuse, cognitive impairment, and genetics.
  • Immobility time: the time during which the mouse stops struggling in the water and merely floats with its head above the surface.
  • Swimming time: the time the mouse spends paddling, splashing, and diving with its limbs.
  • Climbing time: the time the mouse spends scratching and climbing the wall of the glass cylinder.
  • The present invention recognizes multiple body key points in the rodent forced-swimming experiment (10 body key points: nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root) and precisely captures ultra-fine behavioral indicators to aid the study of mental illness.
  • The purpose of the present invention is to provide a deep-learning-based behavior analysis method suitable for rodents.
  • A deep-learning-based behavioral analysis method suitable for rodents comprises the following steps:
  • Step 1: Set up the equipment and acquire video to obtain real-time video.
  • Step 2: Analyze the real-time video to obtain the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, and skeleton orientation angles.
  • Step 3: Output the data.
  • The output consists of the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, and skeleton orientation angles.
  • The body-key-point pixel values are obtained as the X- and Y-axis coordinates of each body key point of the rodent identified and tracked by the software.
  • The skeleton length is obtained as the length between connected key points, in pixels.
  • Step 4: Define the behavioral indicators of the experimental mice.
  • First convert between pixels and centimetres: 1 cm = A pixels, where A = (number of pixels spanned by the actual water-surface diameter of the forced-swim device) ÷ (actual water-surface diameter of the device); the actual water-surface diameter is 25 cm.
  • From the collected data the behavioral characteristics of the experimental mice are obtained, and from these are derived the total duration of climbing behavior, mild climbing behavior, moderate climbing behavior, strong climbing behavior, the total duration of struggling behavior, the total duration of immobility, and the total duration of swimming behavior.
  • Step 5: The definition of experimental-mouse behavior is complete.
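The pixel-to-centimetre conversion described in Step 4 can be sketched as follows. This is a minimal illustration, not code from the patent; the function names are invented, and the only facts taken from the source are the 25 cm water-surface diameter and the 1 cm = A pixels relationship.

```python
def px_per_cm(water_surface_diameter_px: float, water_surface_diameter_cm: float = 25.0) -> float:
    """Scale factor A (pixels per centimetre), derived from the known
    25 cm water-surface diameter of the forced-swim cylinder as imaged."""
    return water_surface_diameter_px / water_surface_diameter_cm

def px_to_cm(pixels: float, a: float) -> float:
    """Convert a pixel measurement to centimetres using the factor A."""
    return pixels / a
```

For example, if the water surface spans 500 pixels in the frame, A = 20 pixels/cm, and a 100-pixel displacement corresponds to 5 cm.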
  • The specific method of obtaining the real-time video in Step 1 is:
  • The forced-swim test environment comprises a forced-swim device and a camera; after filling the device with water to a depth of 20 cm, a camera is placed 45 cm to the side of the device at the height of the water surface, with a field of view covering the mouse's entire swimming area.
  • The forced-swim device is a plexiglass cylinder, 25 cm in diameter × 40 cm high × 0.5 cm thick.
  • The camera is a SONY HDR-CX680 with a frame rate of 60 frames per second and a resolution of 1920 × 1080.
  • The specific method of analyzing the real-time video in Step 2 is:
  • The body key points comprise the nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root.
  • The skeleton of the experimental mouse specifically comprises nose tip-tail root, nose tip-eyes, and hind-limb ankle-hind-limb sole.
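The skeleton segments above, together with the pixel-unit skeleton length defined later, can be sketched as a small helper. This is an illustrative sketch only: the keypoint names and the dictionary layout are assumptions, not a format the patent specifies; only the segment pairs and the Euclidean pixel length come from the source.

```python
import math

# Assumed skeleton segments, as named pairs of body key points (illustrative names).
SKELETONS = [("nose_tip", "tail_root"), ("nose_tip", "eye"), ("hind_ankle", "hind_sole")]

def skeleton_lengths(keypoints):
    """Length of each connected skeleton segment, in pixels, where
    `keypoints` maps a key-point name to its (x, y) pixel coordinates."""
    out = {}
    for a, b in SKELETONS:
        (xa, ya), (xb, yb) = keypoints[a], keypoints[b]
        out[(a, b)] = math.hypot(xb - xa, yb - ya)  # Euclidean distance in pixels
    return out
```

A frame whose nose tip is at (0, 0) and tail root at (3, 4) would yield a nose tip-tail root length of 5 pixels.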
  • The software analyzes the recorded real-time video frame by frame.
  • The size of each frame is determined by the camera resolution.
  • A two-dimensional coordinate system is established with the 1920-pixel dimension as the Y axis and the 1080-pixel dimension as the X axis.
  • The skeleton orientation angle in Step 3 is obtained as follows, taking the nose tip-tail root skeleton as an example: a two-dimensional coordinate system is established with the nose tip as the origin, and the angle is that formed by the nose tip-tail root vector and the positive X direction; the angle is 0° when the vector coincides with the positive X direction, increases continuously with clockwise rotation, and decreases continuously with counterclockwise rotation.
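The angle convention above can be sketched as follows. One assumption is made beyond the text: image coordinates with the Y axis pointing downward, under which a raw `atan2` on pixel deltas already grows clockwise; the function name and the 0-360 normalization are also illustrative choices, not from the patent.

```python
import math

def skeleton_angle(nose, tail):
    """Angle (degrees, normalized to [0, 360)) between the
    nose-tip -> tail-root vector and the +X axis, increasing clockwise
    under the assumed image convention (Y axis points down)."""
    dx = tail[0] - nose[0]
    dy = tail[1] - nose[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

With this convention, a vector along +X gives 0°, and a vector pointing straight "down" in the image gives 90°.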
  • Total duration of climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, and five or more consecutive frames after the first frame also lie in the interval 40 to 140. This segment of frames is output as climbing behavior, the number of climbing frames Q is recorded, and Q ÷ 60 gives the total duration.
  • Mild climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 40 < T < 55 or 125 < T < 140 are output as mild climbing behavior.
  • Moderate climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 55 < T < 75 or 105 < T < 125 are output as moderate climbing behavior.
  • Strong climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 75 < T < 105 are output as strong climbing behavior.
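The three intensity bands can be sketched as a small per-frame classifier. The function name is invented; the bands follow the strict inequalities in the definitions, so exact boundary values (55, 75, 105, 125) fall into no band and return None here.

```python
def climbing_intensity(t):
    """Grade the nose-tip -> tail-root angle T (degrees) into the
    patent's climbing-intensity bands; None when T falls in no band."""
    if 40 < t < 55 or 125 < t < 140:
        return "mild"
    if 55 < t < 75 or 105 < t < 125:
        return "moderate"
    if 75 < t < 105:
        return "strong"
    return None  # outside the climbing range, or exactly on a band boundary
```

For example, T = 90 (mouse near vertical against the wall) grades as strong, while T = 50 or T = 130 grades as mild.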
  • Total duration of struggling behavior: detect the nose tip-tail root angle value Tz of the first frame P of the data, with 140 < Tz or Tz < 40; record the first frame P: nose tip (X1, Y1), tail root (X2, Y2), and the next frame P+1: nose tip (X3, Y3), tail root (X4, Y4); |Y4 − Y2| or |Y3 − Y1| is greater than 5, and three or more consecutive frames after the first satisfy these conditions. This segment of frames is output as struggling behavior, the number of struggling frames I is recorded, and I ÷ 60 gives the total duration.
  • Total duration of immobility: detect the nose tip-tail root angle value Tb of the first frame U of the data, with 140 < Tb or Tb < 40; record the first frame U: nose tip (X5, Y5), tail root (X6, Y6), and the next frame U+1: nose tip (X7, Y7), tail root (X8, Y8), with |Y8 − Y6| and |Y7 − Y5| less than 5; record the left hind paw of U (X9, Y9) and of U+1 (X10, Y10), and the right hind paw of U (X11, Y11) and of U+1 (X12, Y12); compute the left-hind-paw speed √((X10 − X9)² + (Y10 − Y9)²) and the right-hind-paw speed √((X12 − X11)² + (Y12 − Y11)²). If the computed left- or right-paw speed is less than or equal to 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as immobility, the number of immobility frames R is recorded, and R ÷ 60 gives the total duration.
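The paw-speed test can be sketched as follows. The function names are illustrative; the 7 pixels/frame threshold is from the source, and the disjunction (left *or* right paw slow) follows the text of the definition literally, which is noted in the code.

```python
import math

def paw_speed(prev, curr):
    """Per-frame paw speed in pixels/frame: Euclidean displacement of the
    same paw between two consecutive frames, each given as (x, y)."""
    return math.hypot(curr[0] - prev[0], curr[1] - prev[1])

def is_immobile_pair(left_prev, left_curr, right_prev, right_curr, thresh=7.0):
    """Immobility test for one frame pair. The definition says the left OR
    right paw speed is at most 7 pixels/frame, so a disjunction is used here."""
    return (paw_speed(left_prev, left_curr) <= thresh
            or paw_speed(right_prev, right_curr) <= thresh)
```

A paw moving from (0, 0) to (3, 4) between frames has speed 5 pixels/frame, below the 7 pixels/frame cutoff.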
  • Total duration of swimming behavior: detect the nose tip-tail root angle value Ty of the first frame E of the data, with 140 < Ty or Ty < 40; record the first frame E: nose tip (X13, Y13), tail root (X14, Y14), and the next frame E+1: nose tip (X15, Y15), tail root (X16, Y16), with |Y16 − Y14| and |Y15 − Y13| less than 5; record the left hind paw of E (X17, Y17) and of E+1 (X18, Y18), and the right hind paw of E (X19, Y19) and of E+1 (X20, Y20); compute the left-hind-paw speed √((X18 − X17)² + (Y18 − Y17)²) and the right-hind-paw speed √((X20 − X19)² + (Y20 − Y19)²). If the computed left- or right-paw speed is greater than 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as swimming behavior, the number of swimming frames W is recorded, and W ÷ 60 gives the total duration.
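All four behavior definitions share one accumulation pattern: per-frame conditions, a minimum run of consecutive qualifying frames, and a frame count divided by 60 (the camera's frame rate) to give seconds. A minimal sketch of that step, assuming the per-frame conditions have already been evaluated into booleans (the function name and the sentinel trick are illustrative choices):

```python
def total_duration_seconds(flags, min_run, fps=60):
    """Sum the lengths of runs of consecutive True flags that last at least
    `min_run` frames, then convert the frame count to seconds at `fps`.
    Per the definitions, min_run would be 6 for climbing/immobility/swimming
    (first frame plus five) and 4 for struggling (first frame plus three)."""
    total = 0
    run = 0
    for ok in list(flags) + [False]:  # trailing False closes the final run
        if ok:
            run += 1
        else:
            if run >= min_run:
                total += run
            run = 0
    return total / fps
```

A run of 6 qualifying frames at 60 fps thus contributes 0.1 s, while a 3-frame run is discarded when `min_run=6`.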
  • The present invention is based on computer vision and deep learning and requires no special experimental hardware and no labeling of animals with special chemical reagents.
  • The animal is tracked fully automatically, automating the experimental process, avoiding the subjective error introduced by manual counting and the disturbance to the experimental animals, and increasing the objectivity and reliability of the results.
  • The analysis is flexible: it supports period-based analysis, timed and manual termination, and rich displays of the animal's movement, including trajectory maps, parameter indicators, curves, and histograms.
  • Analysis of the animal's behavior during forced swimming is thereby further realized.
  • The invention could have a transformative impact on the study of the neural mechanisms behind forced-swimming behavior, as well as on the development of new drug treatments for human psychiatric disorders.
  • Fig. 1 shows the deep-learning-based behavior analysis method suitable for rodents of the present invention.
  • A deep-learning-based automated analysis method suitable for rodent forced-swim experiments specifically comprises the following steps:
  • Step 1: Acquire video to obtain real-time video.
  • The forced-swim test environment comprises a forced-swim device and a camera; after filling the device with water to a depth of 20 cm, a camera is placed 45 cm to the side of the device at the height of the water surface, with a field of view covering the mouse's entire swimming area.
  • The forced-swim device is a plexiglass cylinder, 25 cm in diameter × 40 cm high × 0.5 cm thick; the camera is a SONY HDR-CX680 with a frame rate of 60 frames per second and a resolution of 1920 × 1080.
  • Step 2: Analyze the real-time video.
  • The body key points represent the joint points of the body, including the nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root; the skeleton of the experimental mouse specifically comprises nose tip-tail root, nose tip-eyes, hind-limb ankle-hind-limb sole, etc.
  • The software analyzes the recorded real-time video frame by frame.
  • The size of each frame is determined by the camera resolution.
  • A two-dimensional coordinate system is established with the 1920-pixel dimension as the Y axis and the 1080-pixel dimension as the X axis.
  • Step 3: Output the data.
  • After successful analysis, the output consists of the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, skeleton orientation angles, and occlusion data.
  • The body-key-point pixel values are obtained as the X- and Y-axis coordinates of each body key point of the rodent identified and tracked by the software.
  • The skeleton length is obtained as the length between connected key points, in pixels.
  • The skeleton orientation angle is obtained as follows, taking the nose tip-tail root skeleton as an example: a two-dimensional coordinate system is established with the nose tip as the origin, and the angle is that formed by the nose tip-tail root vector and the positive X direction; the angle is 0° when the vector coincides with the positive X direction, increases continuously with clockwise rotation, and decreases continuously with counterclockwise rotation.
  • The occlusion data are obtained as follows: since body parts are occluded during the animal's movement, the system automatically predicts the occluded positions and gives a probability value for each prediction.
  • Step 4: Define the behavioral indicators of the experimental mice.
  • First convert between pixels and centimetres: 1 cm = A pixels, where A = (number of pixels spanned by the actual water-surface diameter of the forced-swim device) ÷ (actual water-surface diameter of the device); the actual water-surface diameter is 25 cm.
  • Total duration of climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, and five or more consecutive frames after the first frame also lie in the interval 40 to 140.
  • This segment of frames is output as climbing behavior, and the number of climbing frames Q is recorded.
  • Q ÷ 60 gives the total duration.
  • Mild climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first.
  • Frames whose T satisfies 40 < T < 55 or 125 < T < 140 are output as mild climbing behavior.
  • Moderate climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first.
  • Frames whose T satisfies 55 < T < 75 or 105 < T < 125 are output as moderate climbing behavior.
  • Strong climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first.
  • Frames whose T satisfies 75 < T < 105 are output as strong climbing behavior.
  • Total duration of struggling behavior: detect the nose tip-tail root angle value Tz of the first frame P of the data, with 140 < Tz or Tz < 40; record the first frame P: nose tip (X1, Y1), tail root (X2, Y2), and the next frame P+1: nose tip (X3, Y3), tail root (X4, Y4); |Y4 − Y2| or |Y3 − Y1| is greater than 5, and three or more consecutive frames after the first satisfy these conditions. This segment of frames is output as struggling behavior, the number of struggling frames I is recorded, and I ÷ 60 gives the total duration.
  • Total duration of immobility: detect the nose tip-tail root angle value Tb of the first frame U of the data, with 140 < Tb or Tb < 40; record the first frame U: nose tip (X5, Y5), tail root (X6, Y6), and the next frame U+1: nose tip (X7, Y7), tail root (X8, Y8), with |Y8 − Y6| and |Y7 − Y5| less than 5; record the left hind paw of U (X9, Y9) and of U+1 (X10, Y10), and the right hind paw of U (X11, Y11) and of U+1 (X12, Y12); compute the left-hind-paw speed √((X10 − X9)² + (Y10 − Y9)²) and the right-hind-paw speed √((X12 − X11)² + (Y12 − Y11)²). If the computed left- or right-paw speed is less than or equal to 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as immobility, the number of immobility frames R is recorded, and R ÷ 60 gives the total duration.
  • Total duration of swimming behavior: detect the nose tip-tail root angle value Ty of the first frame E of the data, with 140 < Ty or Ty < 40; record the first frame E: nose tip (X13, Y13), tail root (X14, Y14), and the next frame E+1: nose tip (X15, Y15), tail root (X16, Y16), with |Y16 − Y14| and |Y15 − Y13| less than 5; record the left hind paw of E (X17, Y17) and of E+1 (X18, Y18), and the right hind paw of E (X19, Y19) and of E+1 (X20, Y20); compute the left-hind-paw speed √((X18 − X17)² + (Y18 − Y17)²) and the right-hind-paw speed √((X20 − X19)² + (Y20 − Y19)²). If the computed left- or right-paw speed is greater than 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as swimming behavior, the number of swimming frames W is recorded, and W ÷ 60 gives the total duration.
  • Step 5: The definition of experimental-mouse behavior is complete.
  • Current equipment for the mouse forced-swim experiment uses image-tracking algorithms and judges behavior only coarsely: behavioral accuracy is low, climbing and struggling behavior cannot be described, and specific behavioral indicators, such as climbing intensity, cannot be identified.
  • Existing equipment usually requires special experimental hardware and labeling of the animals with special chemical reagents, which greatly increases the complexity and cost of the experiment and introduces human interference into the results.
  • Compared with the traditional FST, the FST of the present invention has three main innovations:
  • First, with the increased water depth of the self-designed forced-swim device, the mice cannot develop adaptive behavior by touching the bottom of the pool; as the water depth increases, the immobility time of the mice decreases and the sensitivity of the FST increases.
  • Second, the traditional FST records only the immobility time of the rat/mouse in the pool, whereas the FST of the present invention records 4 behaviors and 3 intensity levels. This behavioral-evaluation technique increases the sensitivity of the FST and can confirm the antidepressant activity of a drug.
  • Third, using computer vision and deep learning, markerless 3D pose estimation of the mouse is achieved; with the behavioral indicators defined above, behavior is detected automatically by computer, describing the mouse's forced swimming in terms of space, time, and displacement.
  • The indicators are expected to be applied to the development of antidepressants, sedatives, and other drugs, and to exploring their efficacy and toxicological effects.
  • The forced-swim animal model of depression is a reliable experimental model for studying the pharmacology and pathogenesis of human depression and for screening and observing antidepressant drugs. Its main feature is the high specificity of drug action.
  • The test distinguishes antidepressants well from major tranquilizers and anxiolytics, and the effects of most antidepressants correlate significantly with clinical potency.
  • Today, however, neither manual double-blind scoring nor image-tracking algorithms suffice for behavior detection, so new-drug development is slow. Tracking and analyzing the mouse's trunk key points with computer vision and deep learning can judge mouse behavior accurately and thereby aid the development of new drugs.
  • The present invention is based on computer vision and deep learning and requires no special experimental hardware and no labeling of animals with special chemical reagents.
  • The animal is tracked fully automatically, automating the experimental process, avoiding the subjective error introduced by manual counting and the disturbance to the experimental animals, and increasing the objectivity and reliability of the results.
  • The analysis is flexible: it supports period-based analysis, timed and manual termination, and rich displays of the animal's movement, including trajectory maps, parameter indicators, curves, and histograms.
  • Analysis of the animal's behavior during forced swimming is thereby further realized.
  • The invention could have a transformative impact on the study of the neural mechanisms behind forced-swimming behavior, as well as on the development of new drug treatments for human psychiatric disorders.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Psychiatry (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Artificial Intelligence (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Child & Adolescent Psychology (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Radiology & Medical Imaging (AREA)
  • Data Mining & Analysis (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Fuzzy Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A deep-learning-based behavior analysis method suitable for rodents. Based on computer vision and deep learning, the method requires no special experimental hardware and no labeling of the animals with special chemical reagents. Multi-key-point body recognition enables fully automatic animal tracking, automating the experimental process, avoiding the subjective error introduced by manual counting and the disturbance to the experimental animals, and increasing the objectivity and reliability of the results. Newly designed indicators enable machine analysis of mouse behavior. The analysis is flexible: it supports period-based analysis, timed and manual termination, and rich displays of the animal's movement, including trajectory maps, parameter indicators, curves, and histograms. By capturing ultra-fine behavioral indicators, the method further enables analysis of the animal's actions during forced swimming.

Description

Deep-learning-based behavior analysis method suitable for rodents
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on August 3, 2021, with application number 202110886932.0 and the invention title "基于深度学习适用于啮齿动物的行为分析方法" (deep-learning-based behavior analysis method suitable for rodents), the entire contents of which are incorporated herein by reference.
Technical Field
The invention belongs to the field of animal behavior analysis and relates to deep learning technology, in particular to a deep-learning-based behavior analysis method suitable for rodents.
Background Art
Depression is the most common mental disorder today. According to data released by the World Health Organization (WHO), more than 350 million people worldwide suffer from depression, and the number of patients has grown by about 18% over the past decade. In today's material civilization, most people no longer struggle for three meals a day and the pursuit of material goods is unprecedented, yet excess material wealth has brought increasingly common anxiety, confusion, pain, and depression. To date, the cause of depression remains unclear, and the development of symptoms varies among patients. Researchers have attributed the causes of depression to some or all of stress, drugs and drug abuse, cognitive impairment, and genetics.
In 1977, Porsolt R.D. and colleagues first evaluated the effect of antidepressant drugs with a forced-swimming animal model; the paradigm is therefore also known as the Porsolt forced swim test. The test exploits the mouse's innate aversion to swimming: the mouse is placed in a confined, inescapable stressful environment in which, under normal conditions, it swims in search of an escape route. This behavioral-helplessness situation is used to measure the mouse's depression-like behavior. In the experiment, the mouse is forced to swim for 6 to 20 minutes, and the following are recorded:
1) Immobility time: the time during which the mouse stops struggling in the water and merely floats with its head above the surface;
2) Swimming time: the time the mouse spends paddling, splashing, and diving with its limbs;
3) Climbing time: the time the mouse spends scratching and climbing the wall of the glass cylinder.
After some time, depression-like mice stop swimming and float on the surface, showing "behavioral despair", that is, giving up the search for an escape route.
In 1995, Irwin Lucki and colleagues studied the effect of four antidepressant drugs, desipramine (DMI), fluoxetine (FLX), sertraline (SRT), and paroxetine (PRX), on the "behavioral despair" state of rats in the forced swim test. All four antidepressants significantly reduced the rats' immobility time in the water and increased their swimming time, shifting the animals from passive resignation to active escape. The forced swim test has since become one of the "gold standards" for studying antidepressants and related drugs.
At present, forced-swim experiments usually rely on manual observation and manual scoring. In practice, manual scoring has many limitations, such as poor reproducibility and lack of standardization, and therefore cannot support longer-term and/or larger-scale interpretive studies. Moreover, training a professional forced-swim scorer is time- and labor-intensive, and a uniform standard is hard to reach. Unless technical innovation is introduced to facilitate the analysis, translational progress in psychiatry will be limited.
Based on computer vision and deep learning, the present invention recognizes multiple body key points in the rodent forced-swimming experiment (10 body key points: nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root) and precisely captures ultra-fine behavioral indicators, assisting research into mental illness.
Summary of the Invention
The purpose of the present invention is to provide a deep-learning-based behavior analysis method suitable for rodents.
The purpose of the present invention can be achieved by the following technical solution:
A deep-learning-based behavior analysis method suitable for rodents comprises the following steps:
Step 1: Set up the equipment and acquire video to obtain real-time video;
Step 2: Analyze the real-time video to obtain the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, and skeleton orientation angles;
Step 3: Output the data;
After successful analysis, the output consists of the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, and skeleton orientation angles;
The body-key-point pixel values are obtained as the X- and Y-axis coordinates of each body key point of the rodent identified and tracked by the software;
The skeleton length is obtained as the length between connected key points, in pixels;
Step 4: Define the behavioral indicators of the experimental mice;
First convert between pixels and centimetres: 1 cm = A pixels, where A = (number of pixels spanned by the actual water-surface diameter of the forced-swim device) ÷ (actual water-surface diameter of the device); the actual water-surface diameter is 25 cm;
From the collected data the behavioral characteristics of the experimental mice are obtained, and from these are derived the total duration of climbing behavior, mild climbing behavior, moderate climbing behavior, strong climbing behavior, the total duration of struggling behavior, the total duration of immobility, and the total duration of swimming behavior;
Step 5: The definition of experimental-mouse behavior is complete.
Further, the specific method of obtaining the real-time video in Step 1 is:
S1: The forced-swim test environment comprises a forced-swim device and a camera; after filling the device with water to a depth of 20 cm, a camera is placed 45 cm to the side of the device at the height of the water surface, with a field of view covering the mouse's entire swimming area;
S2: After the experiment starts, the camera records the whole session, yielding the experimental video.
Further, the forced-swim device is a plexiglass cylinder, 25 cm in diameter × 40 cm high × 0.5 cm thick; the camera is a SONY HDR-CX680 with a frame rate of 60 frames per second and a resolution of 1920 × 1080.
Further, the specific method of analyzing the real-time video in Step 2 is:
The experimental video recorded by the camera is imported into a computer, which automatically tracks and identifies the mouse's body key points and skeleton;
The body key points comprise the nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root; the mouse skeleton specifically comprises nose tip-tail root, nose tip-eyes, and hind-limb ankle-hind-limb sole;
The software analyzes the recorded real-time video frame by frame; the size of each frame is determined by the camera resolution; for analysis, a two-dimensional coordinate system is established with the 1920-pixel dimension as the Y axis and the 1080-pixel dimension as the X axis;
This yields the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, skeleton orientation angles, and occlusion data.
Further, the skeleton orientation angle in Step 3 is obtained as follows, taking the nose tip-tail root skeleton as an example: a two-dimensional coordinate system is established with the nose tip as the origin, and the angle is that formed by the nose tip-tail root vector and the positive X direction; the angle is 0° when the vector coincides with the positive X direction, increases continuously with clockwise rotation, and decreases continuously with counterclockwise rotation.
Further, the behavioral characteristics of the experimental mice are obtained from the collected data as follows:
Total duration of climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, and five or more consecutive frames after the first frame also lie in the interval 40 to 140; this segment of frames is output as climbing behavior, the number of climbing frames Q is recorded, and Q ÷ 60 gives the total duration;
Mild climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first; frames whose T satisfies 40 < T < 55 or 125 < T < 140 are output as mild climbing behavior;
Moderate climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first; frames whose T satisfies 55 < T < 75 or 105 < T < 125 are output as moderate climbing behavior;
Strong climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first; frames whose T satisfies 75 < T < 105 are output as strong climbing behavior;
Total duration of struggling behavior: detect the nose tip-tail root angle value Tz of the first frame P of the data, with 140 < Tz or Tz < 40; record the first frame P: nose tip (X1, Y1), tail root (X2, Y2), and the next frame P+1: nose tip (X3, Y3), tail root (X4, Y4); |Y4 − Y2| or |Y3 − Y1| is greater than 5, and three or more consecutive frames after the first satisfy these conditions; this segment of frames is output as struggling behavior, the number of struggling frames I is recorded, and I ÷ 60 gives the total duration;
Total duration of immobility: detect the nose tip-tail root angle value Tb of the first frame U of the data, with 140 < Tb or Tb < 40; record the first frame U: nose tip (X5, Y5), tail root (X6, Y6), and the next frame U+1: nose tip (X7, Y7), tail root (X8, Y8), with |Y8 − Y6| and |Y7 − Y5| less than 5; record the left hind paw of U (X9, Y9) and of U+1 (X10, Y10), and the right hind paw of U (X11, Y11) and of U+1 (X12, Y12); compute the left-hind-paw speed
√((X10 − X9)² + (Y10 − Y9)²)
and the right-hind-paw speed
√((X12 − X11)² + (Y12 − Y11)²);
if the computed left- or right-paw speed is less than or equal to 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as immobility, the number of immobility frames R is recorded, and R ÷ 60 gives the total duration;
Total duration of swimming behavior: detect the nose tip-tail root angle value Ty of the first frame E of the data, with 140 < Ty or Ty < 40; record the first frame E: nose tip (X13, Y13), tail root (X14, Y14), and the next frame E+1: nose tip (X15, Y15), tail root (X16, Y16), with |Y16 − Y14| and |Y15 − Y13| less than 5; record the left hind paw of E (X17, Y17) and of E+1 (X18, Y18), and the right hind paw of E (X19, Y19) and of E+1 (X20, Y20); compute the left-hind-paw speed
√((X18 − X17)² + (Y18 − Y17)²)
and the right-hind-paw speed
√((X20 − X19)² + (Y20 − Y19)²);
if the computed left- or right-paw speed is greater than 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as swimming behavior, the number of swimming frames W is recorded, and W ÷ 60 gives the total duration.
Advantageous effects of the invention:
The present invention is based on computer vision and deep learning and requires no special experimental hardware and no labeling of animals with special chemical reagents. Multi-key-point body recognition enables fully automatic animal tracking, automating the experimental process, avoiding the subjective error introduced by manual counting and the disturbance to the experimental animals, and increasing the objectivity and reliability of the results. Newly designed indicators enable machine analysis of mouse behavior. The analysis is flexible: it supports period-based analysis, timed and manual termination, and rich displays of the animal's movement, including trajectory maps, parameter indicators, curves, and histograms. By capturing ultra-fine behavioral indicators, the method further enables analysis of the animal's actions during forced swimming. The invention could thus have a transformative impact on the study of the neural mechanisms behind forced-swimming behavior and on the development of new drug treatments for human psychiatric disorders.
Description of the Drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 shows the deep-learning-based behavior analysis method suitable for rodents of the present invention.
Detailed Description
A deep-learning-based automated analysis method suitable for rodent forced-swim experiments specifically comprises the following steps:
Step 1: Acquire video to obtain real-time video.
S1: The forced-swim test environment comprises a forced-swim device and a camera; after filling the device with water to a depth of 20 cm, a camera is placed 45 cm to the side of the device at the height of the water surface, with a field of view covering the mouse's entire swimming area.
S2: After the experiment starts, the camera records the whole session, yielding the experimental video.
The forced-swim device is a plexiglass cylinder, 25 cm in diameter × 40 cm high × 0.5 cm thick; the camera is a SONY HDR-CX680 with a frame rate of 60 frames per second and a resolution of 1920 × 1080.
Step 2: Analyze the real-time video.
The experimental video recorded by the camera is imported into a computer, which automatically tracks and identifies the mouse's body key points and skeleton; the body key points represent the joint points of the body, including the nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail root; the mouse skeleton specifically comprises nose tip-tail root, nose tip-eyes, hind-limb ankle-hind-limb sole, etc.
The software analyzes the recorded real-time video frame by frame; the size of each frame is determined by the camera resolution; for analysis, a two-dimensional coordinate system is established with the 1920-pixel dimension as the Y axis and the 1080-pixel dimension as the X axis.
This yields the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, skeleton orientation angles, and occlusion data.
Step 3: Output the data.
After successful analysis, the output consists of the tracking video, motion trajectory map, motion heat map, body-key-point pixel values, skeleton lengths, skeleton orientation angles, and occlusion data.
The body-key-point pixel values are obtained as the X- and Y-axis coordinates of each body key point of the rodent identified and tracked by the software.
The skeleton length is obtained as the length between connected key points, in pixels.
The skeleton orientation angle is obtained as follows, taking the nose tip-tail root skeleton as an example: a two-dimensional coordinate system is established with the nose tip as the origin, and the angle is that formed by the nose tip-tail root vector and the positive X direction; the angle is 0° when the vector coincides with the positive X direction, increases continuously with clockwise rotation, and decreases continuously with counterclockwise rotation.
The occlusion data are obtained as follows: since body parts are occluded during the animal's movement, the system automatically predicts the occluded positions and gives a probability value for each prediction.
Step 4: Define the behavioral indicators of the experimental mice.
First convert between pixels and centimetres: 1 cm = A pixels, where A = (number of pixels spanned by the actual water-surface diameter of the forced-swim device) ÷ (actual water-surface diameter of the device); the actual water-surface diameter is 25 cm.
Total duration of climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, and five or more consecutive frames after the first frame also lie in the interval 40 to 140. This segment of frames is output as climbing behavior, and the number of climbing frames Q is recorded. Q ÷ 60 gives the total duration.
Mild climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 40 < T < 55 or 125 < T < 140 are output as mild climbing behavior.
Moderate climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 55 < T < 75 or 105 < T < 125 are output as moderate climbing behavior.
Strong climbing behavior: detect the nose tip-tail root angle value T of the first frame M of the data; T lies between 40 and 140, with five or more consecutive qualifying frames after the first. Frames whose T satisfies 75 < T < 105 are output as strong climbing behavior.
Total duration of struggling behavior: detect the nose tip-tail root angle value Tz of the first frame P of the data, with 140 < Tz or Tz < 40; record the first frame P: nose tip (X1, Y1), tail root (X2, Y2), and the next frame P+1: nose tip (X3, Y3), tail root (X4, Y4); |Y4 − Y2| or |Y3 − Y1| is greater than 5, and three or more consecutive frames after the first satisfy these conditions. This segment of frames is output as struggling behavior, the number of struggling frames I is recorded, and I ÷ 60 gives the total duration.
Total duration of immobility: detect the nose tip-tail root angle value Tb of the first frame U of the data, with 140 < Tb or Tb < 40; record the first frame U: nose tip (X5, Y5), tail root (X6, Y6), and the next frame U+1: nose tip (X7, Y7), tail root (X8, Y8), with |Y8 − Y6| and |Y7 − Y5| less than 5; record the left hind paw of U (X9, Y9) and of U+1 (X10, Y10), and the right hind paw of U (X11, Y11) and of U+1 (X12, Y12); compute the left-hind-paw speed
√((X10 − X9)² + (Y10 − Y9)²)
and the right-hind-paw speed
√((X12 − X11)² + (Y12 − Y11)²).
If the computed left- or right-paw speed is less than or equal to 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as immobility, the number of immobility frames R is recorded, and R ÷ 60 gives the total duration.
Total duration of swimming behavior: detect the nose tip-tail root angle value Ty of the first frame E of the data, with 140 < Ty or Ty < 40; record the first frame E: nose tip (X13, Y13), tail root (X14, Y14), and the next frame E+1: nose tip (X15, Y15), tail root (X16, Y16), with |Y16 − Y14| and |Y15 − Y13| less than 5; record the left hind paw of E (X17, Y17) and of E+1 (X18, Y18), and the right hind paw of E (X19, Y19) and of E+1 (X20, Y20); compute the left-hind-paw speed
√((X18 − X17)² + (Y18 − Y17)²)
and the right-hind-paw speed
√((X20 − X19)² + (Y20 − Y19)²).
If the computed left- or right-paw speed is greater than 7 pixels/frame and five or more consecutive frames after the first satisfy these conditions, this segment of frames is output as swimming behavior, the number of swimming frames W is recorded, and W ÷ 60 gives the total duration.
Step 5: The definition of experimental-mouse behavior is complete.
Current equipment for the mouse forced swim test relies on image-tracking algorithms and can only judge behavior coarsely: its accuracy is low, it cannot describe climbing or struggling behavior, and it cannot identify specific behavioral indicators such as climbing intensity. In addition, existing equipment typically requires special experimental hardware and chemical-reagent marking of the animals, which greatly increases experimental complexity and cost and introduces human interference into the results. Compared with the traditional FST, the FST of the present invention has three main innovations:
First, the water depth of the self-designed forced swimming apparatus is increased, so the mouse cannot develop adaptive behavior by touching the bottom of the pool; as the water depth increases, the mouse's immobility time decreases and the sensitivity of the FST is enhanced.
Second, the traditional FST records only the immobility time of rats/mice in the pool, whereas the FST of the present invention records four behaviors and three intensity levels of the mouse. This behavioral evaluation technique increases the sensitivity of the FST and can confirm the antidepressant activity of a drug.
Third, computer vision and deep learning techniques enable 3D markerless pose estimation of the mouse. With the behavioral indicators defined above, behavior is detected automatically by computer, describing the mouse's forced swimming behavior in terms of space, time, and displacement.
These indicators are expected to be applicable to the development of antidepressants, sedatives, and other drugs, for investigating their efficacy and toxicology. The forced-swim animal model of depression is a reliable experimental model for studying the pharmacology and pathogenesis of human depression and for screening antidepressant drugs. Its main characteristic is the high specificity of drug action: the test distinguishes antidepressants well from major tranquilizers and anxiolytics, and the effects produced by most antidepressants correlate significantly with clinical potency. At present, however, neither manual double-blind scoring nor image-tracking algorithms is sufficient for behavior detection, which slows new-drug development. Using computer vision and deep learning to track and analyze the mouse's key points can accurately determine mouse behavior and thus accelerate new-drug development.
The present invention, based on computer vision and deep learning, requires neither special experimental hardware nor manual chemical-reagent marking of the animals. Multi-key-point recognition enables fully automated tracking of the animal, automating the experimental process, avoiding the subjective error of manual counting and disturbance of the animals, and increasing the objectivity and reliability of the results. Newly designed indicators allow machine analysis of mouse behavior. Analysis is flexible: it supports time-segment analysis, timed and manual termination, and rich displays of the animal's movement, including trajectory maps, parameter indicators, curves, and histograms. By capturing ultra-fine behavioral indicators, the behavior of the animal during forced swimming can be analyzed in further detail. The invention may therefore have a transformative impact on research into the neural mechanisms underlying forced-swim behavior and on the development of new drug therapies for human psychiatric disorders.
The above is merely an illustration and description of the structure of the present invention. Those skilled in the art may make various modifications or additions to the described embodiments or substitute them in similar ways; provided they do not depart from the structure of the invention or exceed the scope defined by the claims, such variants shall fall within the protection scope of the present invention.

Claims (6)

  1. A deep-learning-based behavior analysis method suitable for rodents, characterized in that the method comprises the following steps:
    Step 1: set up the equipment and capture video to obtain a real-time video;
    Step 2: analyze the real-time video to obtain a tracking video, a movement trajectory map, a movement heat map, body key-point pixel values, skeleton lengths, and skeleton direction angles;
    Step 3: output the data;
    after the video analysis succeeds, the output consists of the tracking video, the movement trajectory map, the movement heat map, the body key-point pixel values, the skeleton lengths, and the skeleton direction angles;
    the body key-point pixel values are obtained as the X- and Y-axis coordinate values of each rodent body key point identified and tracked by the software;
    the skeleton length is obtained as the length between connected skeleton points, measured in pixels;
    Step 4: define the behavioral indicators of the experimental mouse;
    first convert between pixels and centimeters: 1 cm = A pixels, where A is:
    the number of pixels corresponding to the actual water-surface diameter of the forced swimming apparatus divided by the actual water-surface diameter of the apparatus, the actual water-surface diameter being 25 cm;
    the behavioral characteristics of the experimental mouse are obtained by analyzing the collected data, and from them the total duration of climbing behavior, the mild climbing behavior, the moderate climbing behavior, the intense climbing behavior, the total duration of struggling behavior, the total duration of immobility behavior, and the total duration of swimming behavior of the experimental mouse are obtained;
    Step 5: complete the definition of the experimental mouse's behavior.
  2. The deep-learning-based behavior analysis method suitable for rodents according to claim 1, characterized in that the specific method of obtaining the real-time video in Step 1 is:
    S1: the forced swimming experimental environment comprises a forced swimming apparatus and a camera; after filling the forced swimming apparatus with 20 cm of water, a camera is placed 45 cm away, directly to the side and level with the water surface, so that its field of view covers the entire swimming area of the mouse;
    S2: after the experiment begins, the camera records the whole session, producing the experimental video.
  3. The deep-learning-based behavior analysis method suitable for rodents according to claim 2, characterized in that the forced swimming apparatus is an acrylic cylinder, 25 cm in diameter × 40 cm high × 0.5 cm thick; the camera is a SONY HDR-CX680 with the following settings: frame rate 60 fps; resolution 1920×1080.
  4. The deep-learning-based behavior analysis method suitable for rodents according to claim 1, characterized in that the specific method of analyzing the real-time video in Step 2 is:
    importing the experimental video recorded by the camera into a computer, and automatically tracking and identifying the body key points and the skeleton of the experimental mouse;
    the body key points include the nose tip, both eyes, front paws, hind-limb ankles, hind-limb soles, and tail base; the mouse skeleton segments are nose tip–tail base, nose tip–eye, and hind-limb ankle–hind-limb sole;
    analyzing the recorded real-time video frame by frame with software, the size of each frame being determined by the camera resolution, and a two-dimensional coordinate system being established during analysis with the 1920-pixel dimension as the Y-axis and the 1080-pixel dimension as the X-axis;
    obtaining the tracking video, the movement trajectory map, the movement heat map, the body key-point pixel values, the skeleton lengths, the skeleton direction angles, and the associated occlusion data.
  5. The deep-learning-based behavior analysis method suitable for rodents according to claim 1, characterized in that the skeleton direction angle in Step 3 is obtained as follows, taking the nose tip–tail base skeleton as an example: a two-dimensional coordinate system is established with the nose tip as the origin, and the direction angle is the angle between the nose tip–tail base vector and the positive X-axis; if the nose tip–tail base vector coincides with the positive X direction, the angle is 0°; the angle increases with clockwise rotation and decreases with counterclockwise rotation.
  6. The deep-learning-based behavior analysis method suitable for rodents according to claim 1, characterized in that the behavioral characteristics of the experimental mouse are obtained from the collected data as follows:
    total duration of climbing behavior of the experimental mouse: detect the nose tip–tail base angle value T of a first frame M of the data; T lies between 40 and 140, and at least five consecutive frames after the first frame also lie in the 40–140 interval; this segment of frames is output as climbing behavior, and the number of climbing frames Q is recorded; the total duration is Q ÷ 60;
    mild climbing behavior of the experimental mouse: detect the nose tip–tail base angle value T of a first frame M of the data; T lies between 40 and 140, with at least five consecutive frames following the first frame; frames with T in 40 < T < 55 or 125 < T < 140 are output as mild climbing behavior;
    moderate climbing behavior of the experimental mouse: detect the nose tip–tail base angle value T of a first frame M of the data; T lies between 40 and 140, with at least five consecutive frames following the first frame; frames with T in 55 < T < 75 or 105 < T < 125 are output as moderate climbing behavior;
    intense climbing behavior of the experimental mouse: detect the nose tip–tail base angle value T of a first frame M of the data; T lies between 40 and 140, with at least five consecutive frames following the first frame; frames with T in 75 < T < 105 are output as intense climbing behavior;
    total duration of struggling behavior of the experimental mouse: detect the nose tip–tail base angle value Tz of a first frame P of the data, with 140 < Tz or Tz < 40; record the first frame P: nose tip (X1, Y1), tail base (X2, Y2), and the next frame P+1: nose tip (X3, Y3), tail base (X4, Y4); |Y4 − Y2| or |Y3 − Y1| is greater than 5, and at least three consecutive frames after the first frame satisfy the above conditions; this segment of frames is output as struggling behavior, and the number of struggling frames I is recorded; the total duration is I ÷ 60;
    total duration of immobility behavior of the experimental mouse: detect the nose tip–tail base angle value Tb of a first frame U of the data, with 140 < Tb or Tb < 40; record the first frame U: nose tip (X5, Y5), tail base (X6, Y6), and the next frame U+1: nose tip (X7, Y7), tail base (X8, Y8), with |Y8 − Y6| and |Y7 − Y5| both less than 5; record the left hind paw of the first frame U (X9, Y9), the left hind paw of the next frame U+1 (X10, Y10), the right hind paw of the first frame U (X11, Y11), and the right hind paw of the next frame U+1 (X12, Y12); compute the left hind paw speed
    v_left = √((X10 − X9)² + (Y10 − Y9)²)
    and compute the right hind paw speed
    v_right = √((X12 − X11)² + (Y12 − Y11)²);
    if the computed left-paw or right-paw speed is less than or equal to 7 pixels/frame, and at least five consecutive frames after the first frame satisfy the above conditions, this segment of frames is output as immobility behavior, and the number of immobility frames R is recorded; the total duration is R ÷ 60;
    total duration of swimming behavior of the experimental mouse: detect the nose tip–tail base angle value Ty of a first frame E of the data, with 140 < Ty or Ty < 40; record the first frame E: nose tip (X13, Y13), tail base (X14, Y14), and the next frame E+1: nose tip (X15, Y15), tail base (X16, Y16), with |Y16 − Y14| and |Y15 − Y13| both less than 5; record the left hind paw of the first frame E (X17, Y17), the left hind paw of the next frame E+1 (X18, Y18), the right hind paw of the first frame E (X19, Y19), and the right hind paw of the next frame E+1 (X20, Y20); compute the left hind paw speed
    v_left = √((X18 − X17)² + (Y18 − Y17)²)
    and compute the right hind paw speed
    v_right = √((X20 − X19)² + (Y20 − Y19)²);
    if the computed left-paw or right-paw speed is greater than 7 pixels/frame, and at least five consecutive frames after the first frame satisfy the above conditions, this segment of frames is output as swimming behavior, and the number of swimming frames W is recorded; the total duration is W ÷ 60.
PCT/CN2022/087524 2021-08-03 2022-04-19 Deep-learning-based behavior analysis method suitable for rodents WO2023010890A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110886932.0 2021-08-03
CN202110886932.0A CN113576466A (zh) 2021-08-03 Deep-learning-based behavior analysis method suitable for rodents

Publications (1)

Publication Number Publication Date
WO2023010890A1 true WO2023010890A1 (zh) 2023-02-09

Family

ID=78254450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087524 WO2023010890A1 (zh) 2021-08-03 2022-04-19 Deep-learning-based behavior analysis method suitable for rodents

Country Status (3)

Country Link
CN (1) CN113576466A (zh)
NL (1) NL2032151A (zh)
WO (1) WO2023010890A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113576466A (zh) * 2021-08-03 2021-11-02 安徽正华生物仪器设备有限公司 Deep-learning-based behavior analysis method suitable for rodents

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110087136A1 (en) * 2009-10-12 2011-04-14 Vertex Pharmaceuticals, Inc. Wireless monitoring of laboratory animals
CN103027694A (zh) * 2012-11-30 2013-04-10 中国航天员科研训练中心 Device for testing the activity of animals under forced swimming
CN103478006A (zh) * 2013-09-27 2014-01-01 中国科学院深圳先进技术研究院 Forced swimming system
CN111832531A (zh) * 2020-07-24 2020-10-27 安徽正华生物仪器设备有限公司 Deep-learning-based analysis system and method suitable for rodent social experiments
CN112336313A (zh) * 2020-11-25 2021-02-09 山西医科大学 Device and method for detecting the activity of rats under forced swimming
CN112507961A (zh) * 2020-12-22 2021-03-16 上海科技大学 Mouse motion-state analysis method based on a deep-learning algorithm
CN112580552A (zh) * 2020-12-23 2021-03-30 中山大学 Murine behavior analysis method and device
CN113576466A (zh) * 2021-08-03 2021-11-02 安徽正华生物仪器设备有限公司 Deep-learning-based behavior analysis method suitable for rodents

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111727905A (zh) * 2020-06-22 2020-10-02 安徽正华生物仪器设备有限公司 Deep-learning-based automated analysis system and method for the murine open-field test
CN113080080B (zh) * 2021-04-07 2022-11-22 川北医学院 Rat forced swimming test device and test method


Also Published As

Publication number Publication date
CN113576466A (zh) 2021-11-02
NL2032151A (en) 2023-02-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22851613

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE