CN113569675A - Mouse open field experimental behavior analysis method based on ConvLSTM network - Google Patents

Mouse open field experimental behavior analysis method based on ConvLSTM network

Info

Publication number
CN113569675A
Authority
CN
China
Prior art keywords
behavior
mouse
classification
key point
open field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110805124.7A
Other languages
Chinese (zh)
Other versions
CN113569675B (en)
Inventor
朱俊才
王治忠
徐正阳
王松伟
牛晓可
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhengzhou University
Original Assignee
Zhengzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhengzhou University filed Critical Zhengzhou University
Priority to CN202110805124.7A priority Critical patent/CN113569675B/en
Publication of CN113569675A publication Critical patent/CN113569675A/en
Application granted granted Critical
Publication of CN113569675B publication Critical patent/CN113569675B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a ConvLSTM-network-based behavior analysis method for the mouse open field experiment. A key point detection model outputs a feature map containing semantic and positional information of the mouse key points; a feature map sequence formed from adjacent frames is fed into a recognition and classification model built on a ConvLSTM network to classify behaviors; finally, mode filtering corrects misclassified results, from which the behavior parameters of the mouse open field experiment are obtained. The method realizes automatic recognition of animal behaviors and automatic calculation of behavioral indexes, which reduces the workload of researchers, provides them with a quantitative behavior analysis method, and improves the objectivity of experiments; the refined behavior analysis also helps researchers capture behavior patterns that are difficult to perceive, improving the reliability of the analysis results.

Description

Mouse open field experimental behavior analysis method based on ConvLSTM network
Technical Field
The invention belongs to the technical field of ConvLSTM networks and behavior recognition, and particularly relates to a behavior analysis method for a mouse open field experiment based on a ConvLSTM network.
Background
Animal behavior science studies the function, mechanism, development, and evolution of animal behaviors, aiming to reveal their genetic basis, behavioral mechanisms in ecology, and ecological and evolutionary significance, and provides a basis for judging the psychological and physiological states of animals. In behavioral research, experimenters influence animal behavior directly or indirectly by changing the surrounding environment (light, sound, electricity, drug treatment, food induction, etc.) or related genes, and then evaluate the motor function, higher central nervous function, and mental state of the experimental animals from behavioral indexes that reflect their overall state. The mouse open field experiment is one of the most widely used zoological experiments: it detects the spontaneous activity and exploration behaviors of the experimental animal and is an effective method for evaluating autonomous behavior, exploratory behavior, and tension in a novel environment. Because the mouse, as a mammal, shares more than 98 percent gene similarity with humans, many diseases that are difficult to cure in humans can be modeled in mice, so the mouse open field experiment is also widely applied in drug research.
At present, the traditional measurement of behavioral indexes relies on statistical analysis of manual observations, which is time-consuming and labor-intensive, cannot meet the needs of experiments requiring long-term observation, and cannot evaluate behavior objectively, being strongly affected by subjective factors. Moreover, current animal behavior analysis methods focus mainly on measuring motion parameters of the experimental animal, i.e., indexes such as speed, distance, and motion trajectory; automatic analysis of the behaviors themselves is relatively rare, and researchers still distinguish animal behaviors mainly by active manual observation, which is laborious and leads to large deviations in the analysis results.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provide a mouse open field experimental behavior analysis method based on a ConvLSTM network.
The technical scheme adopted by the invention is as follows:
a behavior analysis method for a mouse open field experiment based on a ConvLSTM network comprises the following steps:
s01, collecting an open field experiment video of the mouse;
S02, detecting the key points of the mouse in the collected image by a key point detection method, and outputting a key point feature map containing the key point information; the key points comprise the nose tip, left ear, right ear and tail base;
s03, taking key point feature maps of adjacent m frames to form a feature map sequence;
s04, inputting the characteristic diagram sequence into a behavior recognition classification model based on a ConvLSTM network, and outputting a classification result of each frame;
the classification results comprise walking, turning, grooming, stillness and rearing;
s05, optimizing and correcting the classification result by adopting a mode filtering method to obtain a behavior sequence diagram;
s06, calculating the behavior parameters of the mouse in the open field experiment according to the behavior sequence diagram, and generating an experiment graph report;
the behavior parameters include the number of behavior occurrences, the duration of the behavior, and the transition pattern of the behavior.
Further, the specific process of outputting the classification result of each frame is as follows:
S102: a feature map sequence is formed from the feature maps corresponding to m adjacent frame images, including the current frame D_i;
S103: the feature map sequence is input into the recognition classification model, and its temporal and spatial features are extracted;
S104: the behavior classification is obtained from the temporal and spatial features, the classification probabilities of the five behavior classes are output, and the behavior with the maximum probability is taken as the behavior classification of the current frame.
Further, in step S102, the feature map sequence is formed from the feature maps of the current frame and the m-1 preceding frames, taken with an inter-frame interval of 2.
Further, the construction process of the behavior recognition classification model comprises the following steps:
s201: constructing a sample data set and a test data set which comprise all kinds of mouse behavior classification results;
S202: adopting the sample data set to construct a classification training model based on a ConvLSTM network; the first four layers of the classification training model are ConvLSTM layers that extract the temporal and spatial features of mouse behavior, the last four layers are convolutional layers, and the final layer is a global average pooling layer;
s203: and optimizing the classification training model by adopting the test data set to obtain a behavior recognition classification model.
Further, the key point detection method adopts the DeepLabCut algorithm; the specific process of detecting the mouse key points in the collected image and outputting the key point feature map comprises the following steps:
S301: establishing a key point detection model;
the input image size of the key point detection model is 640 × 480 × 3, the backbone network is ResNet50, and a feature map with dimensions 80 × 60 × 12 is output after deconvolution; of the 12 channels, the first 4 represent the class probability of each key point and the remaining 8 represent its coordinate offset relative to the grid-cell center;
S302: constructing a mouse key point detection data set, and training and optimizing the key point detection model;
the mouse key point data set covers five basic mouse behaviors: walking, stillness, grooming, rearing and turning.
Further, optimizing and correcting the classification result by the mode filtering method specifically comprises: sliding a convolution kernel of length n with stride 1 over the time sequence, the mode within each convolution window being the filtered result.
Further, the behavior parameters are defined as:
the number of occurrences of a behavior: the total number of times each behavior occurs in a single experiment, counted as the number of 0-to-1 transitions of that behavior in its time sequence;
the duration of a behavior: the total number of frames of each behavior in a single experiment, calculated as the number of frames whose value is 1 in the behavior timing diagram;
the behavior transition pattern: the number of times one behavior changes into another, calculated by constructing a 5 × 5 behavior transition matrix in which each element represents the number of transitions from one particular behavior to another.
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. A key point detection model outputs a feature map containing semantic and positional information of the mouse key points; a feature map sequence composed of adjacent frames is fed into a recognition and classification model built on a ConvLSTM network to classify behaviors; finally, mode filtering corrects misclassified results, yielding the behavior parameters of the mouse open field experiment. This realizes automatic recognition of animal behaviors and automatic calculation of behavioral indexes, relieves the workload of researchers, provides them with a quantitative behavior analysis method, and improves the objectivity of experiments;
2. the refined behavior analysis helps researchers capture behavior patterns that are difficult to perceive and improves the reliability of the analysis results; compared with existing mouse behavior recognition methods, the method has higher accuracy, is less affected by changes in environment and illumination, and is more robust.
3. Using the DeepLabCut algorithm for key point detection reduces computational complexity while meeting real-time requirements.
Drawings
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a diagram of a structure of a key point detection model in the present invention.
FIG. 3 is a diagram of a behavior recognition classification model based on ConvLSTM network according to the present invention.
FIG. 4 shows the sample data set containing all categories of mouse behavior classification results.
FIG. 5 is a diagram of the mode filtering implementation.
FIG. 6 is a timing diagram of mouse behavior after filtering.
FIG. 7 is a timing diagram of mouse behavior.
FIG. 8 is a graph of the number of behavior occurrences of the mice.
FIG. 9 is a graph of the behavior durations of the mice.
FIG. 10 is a graph of the behavior transition pattern of the mice.
Detailed Description
The invention discloses a method for analyzing the behavior of a mouse in an open field experiment based on a ConvLSTM network, which, as shown in FIG. 1, comprises the following steps:
s01, collecting an open field experiment video of the mouse;
S02, detecting the key points of the mouse in the collected image by a key point detection method, and outputting a key point feature map containing the key point information; the key points comprise the nose tip, left ear, right ear and tail base;
s03, taking key point feature maps of adjacent m frames to form a feature map sequence;
s04, inputting the characteristic diagram sequence into a behavior recognition classification model based on a ConvLSTM network, and outputting a classification result of each frame;
the classification results comprise walking, turning, grooming, stillness and rearing;
s05, optimizing and correcting the classification result by adopting a mode filtering method to obtain a behavior sequence diagram;
s06, calculating the behavior parameters of the mouse in the open field experiment according to the behavior sequence diagram, and generating an experiment graph report; the behavior parameters include the number of behavior occurrences, the duration of the behavior, and the transition pattern of the behavior.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments.
As shown in fig. 1, the present invention comprises the steps of:
and S01, collecting the open field experiment video of the mouse.
The open field experiment box measures 45 × 40 cm, and the background colors, white and light gray, contrast strongly with the mouse body color. During the experiment, the apparatus is placed in a dark, quiet environment, with a warm-color fill light providing illumination.
A USB zoom camera collects video of the open field experiment scene; the collected image size is 640 × 480 at a frame rate of 30 fps, and the camera is about 70 cm above the bottom of the open field box. During the experiment, the focal length and angle of the camera are adjusted so that the square frame in the image is aligned with the bottom of the box.
During the experiment, the camera automatically captures video images inside the experiment box according to a preset program and records and stores them as videos in avi format.
And S02, detecting the mouse key points in the acquired image by a key point detection method, and outputting a key point feature map containing the key point information. The key points are detected with the DeepLabCut algorithm, which localizes four key points: the nose tip, left ear, right ear, and tail base.
The concrete process of detecting the key points of the mouse in the collected image by adopting a key point detection method and outputting a key point feature map comprises the following steps:
s301: and establishing a key point detection model.
As shown in FIG. 2, the input image size of the key point detection model is 640 × 480 × 3, the backbone network is ResNet50, and a feature map with dimensions 80 × 60 × 12 is output after deconvolution; of the 12 channels, the first 4 represent the class probability of each key point and the remaining 8 represent its coordinate offset relative to the grid-cell center.
S302: constructing a mouse key point monitoring data set containing 3900 images, and training and optimizing a key point detection model, wherein the mouse key point monitoring data set comprises 5 types of basic mouse behaviors including straight walking, static, modification, erection and turning, 3 camera shooting heights including 60cm, 70cm and 80cm, and 4 gray backgrounds including white, light gray, dark gray and black.
And S03, taking the key point feature maps of the adjacent m frames to form a feature map sequence.
And S04, the feature map sequence is input into the ConvLSTM-based behavior recognition classification model, and the classification result of each frame is output. The classification results include walking, turning, grooming, stillness and rearing. Specifically:
S102: define the current frame as D_i. With an inter-frame interval of 2, the feature maps corresponding to the 5 frames D_i, D_{i-2}, D_{i-4}, D_{i-6} and D_{i-8} form a feature map sequence with dimensions 5 × 80 × 60 × 12.
S103: the feature map sequence is input into the behavior recognition classification model, and its temporal and spatial features are extracted.
S104: the behavior classification is obtained from the temporal and spatial features, the classification probabilities of the five behavior classes are output, and the behavior with the maximum probability is taken as the behavior classification of the current frame.
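The frame selection of S102 and the arg-max decision of S104 can be sketched in a few lines (the function names are illustrative, not from the patent):

```python
import numpy as np

M, INTERVAL = 5, 2  # m = 5 feature maps, inter-frame interval of 2

def sequence_indices(i):
    """Indices of frames D_{i-8}, D_{i-6}, ..., D_i used to classify frame i."""
    return [i - INTERVAL * k for k in range(M)][::-1]

def classify(probabilities, behaviors=("walking", "turning", "grooming",
                                       "stillness", "rearing")):
    """Step S104: take the behavior with the maximum class probability."""
    return behaviors[int(np.argmax(probabilities))]
```

For frame 8, for instance, the sequence is built from frames 0, 2, 4, 6 and 8.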
The construction process of the behavior recognition classification model comprises the following steps:
s201: and constructing a sample data set and a test data set which comprise all kinds of mouse behavior classification results.
As shown in FIG. 4, the sample data set contains the 5 behaviors, with 600 samples each, all shot at a unified height of 70 cm.
S202: and constructing a classification training model based on the ConvLSTM network by adopting the sample data set.
As shown in FIG. 3, the first 4 layers of the classification training model are ConvLSTM layers that extract the temporal and spatial features of mouse behavior, with 64, 128 and 256 convolution kernels of size 3 × 3; the last 4 layers are convolutional layers with 256, 128, 64 and 5 kernels of size 3 × 3. Behavior classification is finally realized with global average pooling, and the activation function is softmax.
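The layer stack described above can be sketched with `tensorflow.keras` (the framework choice is an assumption, as the patent names none). Since only three kernel counts are listed for four ConvLSTM layers, the first two layers are given 64 kernels here as a guess, and ReLU activations on the intermediate convolutions are likewise assumed:

```python
# Sketch of the described ConvLSTM classifier; framework, the 64-kernel
# first two layers, and the ReLU activations are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(5, 80, 60, 12)),          # 5-frame feature map sequence
    # Four ConvLSTM layers extract temporal + spatial features.
    layers.ConvLSTM2D(64, 3, padding="same", return_sequences=True),
    layers.ConvLSTM2D(64, 3, padding="same", return_sequences=True),
    layers.ConvLSTM2D(128, 3, padding="same", return_sequences=True),
    layers.ConvLSTM2D(256, 3, padding="same"),   # collapses the time axis
    # Four convolutional layers with 256, 128, 64 and 5 kernels.
    layers.Conv2D(256, 3, padding="same", activation="relu"),
    layers.Conv2D(128, 3, padding="same", activation="relu"),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.Conv2D(5, 3, padding="same"),
    # Global average pooling + softmax yield the 5-class probabilities.
    layers.GlobalAveragePooling2D(),
    layers.Softmax(),
])
```

The final output is one probability vector of length 5 per input sequence, matching the per-frame classification of step S04.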
S203: and optimizing the classification training model by adopting the test data set to obtain a behavior recognition classification model.
The recognition classification model outputs the class probabilities of the 5 behaviors, and the behavior with the maximum probability is taken as the behavior class of the current frame.
S05, as shown in FIG. 5, the classification result is optimized and corrected by mode filtering to obtain the behavior sequence diagram: a convolution kernel of length n slides over the time sequence with stride 1, and the mode within each convolution window is the filtered result.
The "rearing" behavior is taken as an example:
Assume a "rearing" behavior sequence of length n in which each element is 0 or 1, with 1 indicating that the frame is a "rearing" frame. The behavior is continuous in time and does not change abruptly (referring to FIG. 5, it does not jump from 1 to 0 and back to 1), so an isolated 0 in the illustrated sequence is a misclassification. The invention corrects each element using the mode of its neighborhood, implemented by convolution to reduce computational complexity. Specifically: a convolution kernel of length 11 slides over the behavior time sequence, returning a sequence of length n with values from 0 to 11, where each value is the number of elements equal to 1 within a radius of 5 centered on that position. When the value is greater than or equal to 6, the mode around the element is 1, and the element is updated to 1, realizing the mode filtering.
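The mode filter described above can be sketched as a single numpy convolution. How windows that extend past the ends of the sequence are handled is an assumption (the patent does not say); zero padding, as used here, simply leaves the border windows with smaller counts:

```python
import numpy as np

def mode_filter(seq, kernel_len=11):
    """Mode (majority-vote) filtering of a binary behavior sequence.

    Slides an all-ones kernel of length kernel_len (stride 1) over the
    sequence; each element becomes 1 when at least 6 of the 11 elements
    in the window centered on it (radius 5) are 1, i.e. when the mode of
    its neighborhood is 1. Boundary handling via zero padding is an
    assumption not specified by the patent."""
    seq = np.asarray(seq, dtype=int)
    counts = np.convolve(seq, np.ones(kernel_len, dtype=int), mode="same")
    return (counts >= (kernel_len + 1) // 2).astype(int)
```

An isolated 0 inside a run of 1s is restored to 1, and an isolated 1 inside a run of 0s is removed.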
And S06, calculating the behavior parameters of the mouse open field experiment from the behavior sequence diagram and generating a graphic experiment report. The behavior parameters include the number of behavior occurrences, the behavior durations, and the behavior transition pattern. FIG. 6 shows the timing diagram of the filtered mouse behavior; the behavior parameters are defined as follows:
the number of occurrences of a behavior: the total number of times each behavior occurs in a single experiment, counted as the number of 0-to-1 transitions of that behavior in its time sequence;
the duration of a behavior: the total number of frames of each behavior in a single experiment, calculated as the number of frames whose value is 1 in the behavior timing diagram;
the behavior transition pattern: the number of times one behavior changes into another, calculated by constructing a 5 × 5 behavior transition matrix in which each element represents the number of transitions from one particular behavior to another.
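The three parameter definitions above translate directly into code. The following is a sketch, assuming each behavior is given as a binary per-frame sequence and the behavior sequence diagram as integer labels 0-4; reading the transition matrix as row = source behavior, column = target behavior is likewise an assumption:

```python
import numpy as np

BEHAVIORS = ["walking", "turning", "grooming", "stillness", "rearing"]

def occurrence_count(seq):
    """Total occurrences of one behavior: the number of 0 -> 1 transitions
    in its binary time sequence."""
    seq = np.asarray(seq, dtype=int)
    return int(np.sum((seq[:-1] == 0) & (seq[1:] == 1)))

def duration_frames(seq):
    """Total duration of one behavior: the number of frames whose value
    is 1 (divide by the 30 fps frame rate to convert to seconds)."""
    return int(np.sum(np.asarray(seq, dtype=int)))

def transition_matrix(labels, n=len(BEHAVIORS)):
    """5 x 5 behavior transition matrix: entry [a, b] counts how many
    times behavior a changed into behavior b."""
    mat = np.zeros((n, n), dtype=int)
    for prev, cur in zip(labels[:-1], labels[1:]):
        if prev != cur:
            mat[prev, cur] += 1
    return mat
```

For example, the label sequence walking, walking, turning, grooming, grooming, walking contains one transition each for walking-to-turning, turning-to-grooming, and grooming-to-walking.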
Finally, the results are saved as a graphic report, as shown in FIG. 7 to FIG. 10, including a behavior timing chart, a behavior occurrence number chart, a behavior duration chart, and a behavior transition pattern chart.
The invention has the following advantages:
(1) Compared with existing mouse behavior recognition methods, the analysis results are more accurate, less affected by changes in environment and illumination, and more robust.
(2) The method can recognize the behavior of the open-field mouse in real time, automatically analyze the behavioral parameters of the open field experiment afterwards, and generate the corresponding graphic report.
(3) The ConvLSTM-based method for analyzing mouse open field experimental behavior requires only video captured by a camera, making it convenient to operate and easy for relevant personnel to use.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (7)

1. A behavior analysis method for a mouse open field experiment based on a ConvLSTM network, characterized by comprising the following steps:
s01, collecting an open field experiment video of the mouse;
S02, detecting the key points of the mouse in the collected image by a key point detection method, and outputting a key point feature map containing the key point information; the key points comprise the nose tip, left ear, right ear and tail base;
s03, taking key point feature maps of adjacent m frames to form a feature map sequence;
s04, inputting the characteristic diagram sequence into a behavior recognition classification model based on a ConvLSTM network, and outputting a classification result of each frame;
the classification results comprise walking, turning, grooming, stillness and rearing;
s05, optimizing and correcting the classification result by adopting a mode filtering method to obtain a behavior sequence diagram;
s06, calculating the behavior parameters of the mouse in the open field experiment according to the behavior sequence diagram, and generating an experiment graph report;
the behavior parameters include the number of behavior occurrences, the duration of the behavior, and the transition pattern of the behavior.
2. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 1, characterized in that: the specific process of outputting the classification result of each frame is as follows:
S102: a feature map sequence is formed from the feature maps corresponding to m adjacent frame images, including the current frame D_i;
S103: the feature map sequence is input into the recognition classification model, and its temporal and spatial features are extracted;
S104: the behavior classification is obtained from the temporal and spatial features, the classification probabilities of the five behavior classes are output, and the behavior with the maximum probability is taken as the behavior classification of the current frame.
3. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 2, characterized in that: in step S102, the feature map sequence is formed from the feature maps of the current frame and the m-1 preceding frames, taken with an inter-frame interval of 2.
4. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 1, characterized in that: the construction process of the behavior recognition classification model comprises the following steps:
s201: constructing a sample data set and a test data set which comprise all kinds of mouse behavior classification results;
S202: adopting the sample data set to construct a classification training model based on a ConvLSTM network; the first four layers of the classification training model are ConvLSTM layers that extract the temporal and spatial features of mouse behavior, the last four layers are convolutional layers, and the final layer is a global average pooling layer;
s203: and optimizing the classification training model by adopting the test data set to obtain a behavior recognition classification model.
5. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 1, characterized in that: the method for detecting the key points adopts a DeepLabCut algorithm, and the specific process of detecting the key points of the mouse in the collected image and outputting the key point characteristic diagram by adopting the key point detection method comprises the following steps:
s301: establishing a key point detection model;
the input image size of the key point detection model is 640 × 480 × 3, the backbone network is ResNet50, and a feature map with dimensions 80 × 60 × 12 is output after deconvolution; of the 12 channels, the first 4 represent the class probability of each key point and the remaining 8 represent its coordinate offset relative to the grid-cell center;
S302: constructing a mouse key point detection data set and training and optimizing the key point detection model;
the mouse key point data set comprises five basic mouse behaviors: walking, stillness, grooming, rearing and turning.
6. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 1, characterized in that: optimizing and correcting the classification result by the mode filtering method specifically comprises: sliding a convolution kernel of length n with stride 1 over the time sequence, the mode within each convolution window being the filtered result.
7. The ConvLSTM network-based mouse open field experimental behavior analysis method of claim 1, characterized in that: the behavior parameters are defined as:
number of occurrences of a behavior: the total number of times each behavior occurs in a single experiment, obtained by counting the 0-to-1 transitions of that behavior in the time sequence;
duration of a behavior: the total number of frames each behavior persists in a single experiment, obtained by counting the frames labeled 1 in the behavior time-sequence diagram;
behavior transition pattern: the number of times one behavior changes into another, obtained by constructing a 5 x 5 behavior transition matrix in which each element records the number of transitions from one behavior into another.
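A sketch of how these three parameters could be computed from a single per-frame label sequence (labels 0-4 standing for the five behaviors); the function name and label encoding are illustrative assumptions, not from the patent. A label changing to behavior b is exactly the moment b's binary indicator goes from 0 to 1.

```python
def behavior_statistics(seq, num_behaviors=5):
    """Compute the three behavior parameters from a per-frame label sequence:
    occurrence counts (0 -> 1 transitions per behavior), durations (total
    frames per behavior) and a num_behaviors x num_behaviors transition
    matrix counting how often behavior a is followed by behavior b."""
    counts = [0] * num_behaviors
    durations = [0] * num_behaviors
    transitions = [[0] * num_behaviors for _ in range(num_behaviors)]
    prev = None
    for label in seq:
        durations[label] += 1           # one more frame of this behavior
        if label != prev:
            counts[label] += 1          # this behavior's indicator goes 0 -> 1
            if prev is not None:
                transitions[prev][label] += 1
        prev = label
    return counts, durations, transitions
```

For the sequence `[0, 0, 1, 1, 1, 0, 2]`, behavior 0 occurs twice for 3 frames total, and the transition matrix records 0→1, 1→0 and 0→2 once each.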
CN202110805124.7A 2021-07-15 2021-07-15 ConvLSTM network-based mouse open field experimental behavior analysis method Active CN113569675B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110805124.7A CN113569675B (en) 2021-07-15 2021-07-15 ConvLSTM network-based mouse open field experimental behavior analysis method

Publications (2)

Publication Number Publication Date
CN113569675A true CN113569675A (en) 2021-10-29
CN113569675B CN113569675B (en) 2023-05-23

Family

ID=78165132


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019043406A1 (en) * 2017-08-31 2019-03-07 Calipsa Limited Anomaly detection from video data from surveillance cameras
CN109508375A (en) * 2018-11-19 2019-03-22 重庆邮电大学 A kind of social affective classification method based on multi-modal fusion
CN110674785A (en) * 2019-10-08 2020-01-10 中兴飞流信息科技有限公司 Multi-person posture analysis method based on human body key point tracking
CN111414876A (en) * 2020-03-26 2020-07-14 西安交通大学 Violent behavior identification method based on time sequence guide space attention
CN112507961A (en) * 2020-12-22 2021-03-16 上海科技大学 Mouse motion state analysis method based on deep learning algorithm
CN112800903A (en) * 2021-01-19 2021-05-14 南京邮电大学 Dynamic expression recognition method and system based on space-time diagram convolutional neural network
CN112906631A (en) * 2021-03-17 2021-06-04 南京邮电大学 Dangerous driving behavior detection method and detection system based on video

Non-Patent Citations (4)

Title
ALEXANDER MATHIS ET AL.: "DeepLabCut: markerless pose estimation of user-defined body parts with deep learning" *
ZHIHONG TIAN ET AL.: "User and Entity Behavior Analysis under Urban Big Data" *
CUI Liya: "Research on human action analysis algorithms in sports scenes" *
LIAN Jing et al.: "Video abnormal behavior detection based on 3D convolutional autoencoders" *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN113989728A (en) * 2021-12-06 2022-01-28 北京航空航天大学 Animal behavior analysis method and device and electronic equipment
CN114022960A (en) * 2022-01-05 2022-02-08 阿里巴巴达摩院(杭州)科技有限公司 Model training and behavior recognition method and device, electronic equipment and storage medium
CN114022960B (en) * 2022-01-05 2022-06-14 阿里巴巴达摩院(杭州)科技有限公司 Model training and behavior recognition method and device, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
CN111178197B (en) Mass R-CNN and Soft-NMS fusion based group-fed adherent pig example segmentation method
Wu et al. Detection and counting of banana bunches by integrating deep learning and classic image-processing algorithms
CN107392091A (en) A kind of agriculture artificial intelligence makees object detecting method, mobile terminal and computer-readable medium
CN113569675A (en) Mouse open field experimental behavior analysis method based on ConvLSTM network
Gaggion et al. ChronoRoot: High-throughput phenotyping by deep segmentation networks reveals novel temporal parameters of plant root system architecture
CN111582234A (en) UAV and deep learning-based large-range oil tea forest fruit intelligent detection and counting method
CN113298023B (en) Insect dynamic behavior identification method based on deep learning and image technology
CN113822185A (en) Method for detecting daily behavior of group health pigs
CN112069985A (en) High-resolution field image rice ear detection and counting method based on deep learning
CN113312999A (en) High-precision detection method and device for diaphorina citri in natural orchard scene
CN114898405B (en) Portable broiler chicken anomaly monitoring system based on edge calculation
CN108829762A (en) The Small object recognition methods of view-based access control model and device
CN114818909A (en) Weed detection method and device based on crop growth characteristics
CN111178172A (en) Laboratory mouse sniffing action recognition method, module and system
CN116597946B (en) Teenager mental health detection method based on house-tree-person
CN112597907A (en) Citrus red spider insect pest identification method based on deep learning
CN113627255B (en) Method, device and equipment for quantitatively analyzing mouse behaviors and readable storage medium
CN115439789A (en) Intelligent identification method and identification system for life state of silkworm
Miranda et al. Pest identification using image processing techniques in detecting image pattern through neural network
CN116189076A (en) Observation and identification system and method for bird observation station
CN113616209B (en) Method for screening schizophrenic patients based on space-time attention mechanism
CN114550918A (en) Mental disorder evaluation method and system based on drawing characteristic data
Wang et al. Apparatus and methods for mouse behavior recognition on foot contact features
Bell et al. Watching plants grow–a position paper on computer vision and Arabidopsis thaliana
CN114010155B (en) Automatic change painful test system of animal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant