CN117854114A - Intelligent identification method, equipment and medium for coupling behavior of zebra fish - Google Patents
- Publication number: CN117854114A (application CN202410251755.2A)
- Authority
- CN
- China
- Legal status: Granted
Classifications
- Y02A40/81 — Aquaculture, e.g. of fish
Abstract
The application provides an intelligent identification method for the coupling behavior of zebra fish, relating to the intersection of computer vision and biology, and comprising the following steps: training a joint point detection neural network with an initial infrared video; acquiring an infrared video to be identified; preprocessing the infrared video to be identified to generate a gray-white feature image; identifying the joint points and labeling frames of the gray-white feature image through the trained joint point detection neural network; filtering out the joint points and labeling frames missed or falsely detected during identification through an abnormal frame elimination method and the joint point detection neural network, then obtaining and combining second joint point predicted values to generate a time sequence signal, presented as a heat map; and classifying the heat map of the infrared video to be identified through a space-time detection neural network, finally realizing classification of the coupling behavior of the infrared video to be identified. By filtering out missed and false detections with the network model and the abnormal frame elimination method, detection accuracy is improved.
Description
Technical Field
The application relates to the intersection of computer vision and biology, and in particular to an intelligent identification method, equipment and medium for the coupling behavior of zebra fish.
Background
Zebra fish shares 70% of its genome with humans, and its high-quality genome assembly has deepened understanding of its key genomic features. Known for its contributions to genetics, zebra fish has become an important model for the study of human diseases. Because the characteristics, etiology and progression of many diseases, as well as their molecular mechanisms, are clinically relevant and highly conserved, zebra fish is rapidly gaining ground in neuroscience and translational behavioral studies. Zebra fish plays an important role in the toxicity assessment of micro- and nano-plastics, in drug discovery, and in high-throughput, high-content phenotypic drug screening, as well as in behavioral neuroscience, cancer biology, precision cancer treatment and toxicology; it is therefore an important model organism in developmental genetics, neurophysiology and biomedicine. In many biological experiments, however, the behavioral analysis of zebra fish is still judged manually by researchers. Zebra fish move quickly, and a coupling behavior may last only a few frames; manual judgments of such short events are subjective and inconsistent, leading to inaccurate discrimination results, and the process is tedious and time-consuming.
Research on the coupling behavior of zebra fish is significant for genetics, in particular for a deeper understanding of the role of the secretogranin-2 gene in zebra fish sexual behavior and reproduction. Using computer vision techniques, researchers can record and analyze large amounts of behavioral data more accurately, and secretogranin-2 mutants facilitate the screening of reproduction-promoting model systems, whether for spawning in cultured fish or for finding new treatments for human infertility.
The coupling behavior of zebra fish includes: nose collision, swimming side by side, turning around and chasing. So far, few studies have attempted to automatically identify the coupling behavior of zebra fish. Some related studies identify the feeding behavior of zebra fish by extracting information such as trajectory, centroid and speed, but since coupling behavior arises only from the interaction of male and female fish, the direct interaction between male and female fish must be considered; a method that jointly represents interaction information and time sequence information is therefore needed to identify coupling behavior.
Disclosure of Invention
The invention aims to solve the problems mentioned in the background by providing an intelligent identification method, equipment and medium for the coupling behavior of zebra fish that take into account the direct interaction between male and female zebra fish.
The above object of the present application is achieved by the following technical solutions:
s1: acquiring and processing an initial infrared video to obtain a zebra fish training image; labeling the zebra fish training image;
s2: inputting the labeled zebra fish training image into a joint point detection neural network for training, and adjusting parameters of the joint point detection neural network;
s3: labeling the initial infrared video to obtain a time period of the coupling behavior, and determining the behavior video with the coupling behavior; training parameters of the space-time detection neural network through behavior videos;
s4: acquiring an infrared video to be identified; preprocessing the infrared video to be identified to generate a gray-white feature image; identifying the joint points and labeling frames of the gray-white feature image through the trained joint point detection neural network;
s5: filtering out the joint points and labeling frames of gray-white feature images that were missed or falsely detected during identification, through an abnormal frame elimination method and the joint point detection neural network, to obtain a second joint point predicted value;
s6: combining the second joint point predicted values of each frame to generate a time sequence signal, presented as a heat map; classifying the heat map of the infrared video to be identified through the fully connected layer of the trained space-time detection neural network; and visualizing the zebra fish behavior classification result in the infrared video to be identified through video annotation software, finally realizing classification of the coupling behavior of the infrared video to be identified.
Optionally, step S1 includes:
placing the zebra fish in a fish tank, wherein an infrared backlight plate is placed below the fish tank, and the fish tank is completely placed in the plane of the infrared backlight plate;
filming the zebra fish from directly above with an infrared camera, and collecting an initial infrared video of the zebra fish;
performing frame-separated extraction on the initial infrared video to obtain a zebra fish training image;
and manually labeling the labeling frame and the joint points of each zebra fish in the zebra fish training image by using joint point labeling software.
Optionally, the joint point detection neural network adopts the YOLOv8-Pose pose estimation model.
Optionally, step S3 includes:
the space-time detection neural network is a PoseC3D skeleton-based behavior recognition model;
manually marking the time period of the initial infrared video by using video marking software, and splitting the initial infrared video into a plurality of behavior videos according to the marked time period;
extracting a first joint point predicted value of each frame of image of the behavior video by using a joint point detection neural network, combining the first joint point predicted values of each frame to generate a time sequence signal, and presenting the time sequence signal in a heat map mode;
and taking the class of the heat map and the behavior video as the input of the space-time detection neural network, and training the parameters of the space-time detection neural network.
Optionally, the specific steps of step S4 are as follows:
s41: selecting a target area to be identified from each frame of zebra fish to-be-identified image in the infrared video to be identified, and manually framing and cutting out an effective image area of zebra fish movement in the target area;
creating a background subtractor;
extracting the frames of the infrared video to be identified corresponding to the effective image area using the OpenCV library, and inputting the extracted frame images into the background subtractor to obtain a foreground mask;
performing a morphological opening on the foreground mask; convolving the foreground mask with a 2×2 kernel;
s42: creating a gray-white foreground image of the same size as the frame image;
obtaining a non-mask background image from the inverted mask and the gray-white foreground image;
extracting an original image containing only the zebra fish from the frame image using the convolved foreground mask;
combining the non-mask background image with the original image to obtain a gray-white feature image containing only the zebra fish;
performing feature enhancement on the mask region of the gray-white feature image; the feature enhancement includes brightness enhancement and contrast enhancement;
s43: identifying the joint points and labeling frames of the feature-enhanced gray-white feature image with the joint point detection neural network.
Optionally, step S5 includes:
s51: setting the number of zebra fish in the fish tank as N;
s52: during identification by the joint point detection neural network, when a missed detection occurs, the number of detected zebra fish is N-1, and the joint points of the previous moment are returned;
when two fish overlap or cross because they are trembling or touching abdomens, false detection occurs; when a false detection occurs, the number of detected zebra fish is N+1;
setting one joint point of the identified gray feature image as a centroid joint point;
s53: matching the centroid joint points of the zebra fish in the current frame image and the previous frame image using the Hungarian algorithm, and deleting falsely detected labeling frames and joint points;
s54: filtering the gray-white feature image through a Kalman filter, and taking the Kalman filtering prediction as the second joint point predicted value of the current frame's gray-white feature image.
Optionally, the step S53 includes:
obtaining the centroid joint points through the joint point detection neural network;
defining the set of zebra fish centroid joint points obtained from the previous frame's gray-white feature image as P = {p_1, p_2, …, p_N}, and the set obtained from the current frame's gray-white feature image as C = {c_1, c_2, …, c_N};
computing the Euclidean distance between each pair of joint points p_i and c_j, and constructing the cost matrix D with elements D_ij = ‖p_i − c_j‖_2;
the smaller the value of D_ij, the greater the inter-frame similarity;
obtaining the centroid joint points of the current frame matched to the previous frame by applying the Hungarian algorithm to the cost matrix;
and deleting the falsely detected non-centroid joint points and their corresponding labeling frames according to the matched centroid joint points of the current and previous frames.
An electronic device comprises a processor, a memory, a user interface and a network interface, wherein the memory is used for storing instructions, the user interface and the network interface are used for communicating with other devices, and the processor is used for executing the instructions stored in the memory so that the electronic device can execute the intelligent identification method for the coupling behavior of zebra fish described above.
A computer readable storage medium storing instructions that, when executed, perform a method of intelligent identification of zebra fish coupling behavior.
The beneficial effects that this application provided technical scheme brought are:
1. The joint points and labeling frames of each frame of the infrared video to be identified, and the second joint point predicted values, are identified through the joint point detection neural network. The second joint point predicted values of each frame are combined to generate a time sequence signal, presented as a heat map. The heat map is classified through the space-time detection neural network, realizing classification of the coupling behavior of the infrared video to be identified. When zebra fish tremble or touch abdomens, they often overlap or cross, and since this phenomenon occurs rarely when preparing the data set, false and multiple detections arise that affect the zebra fish behavior recognition network. The abnormal frame elimination method provided by the invention filters out the joint points and labeling frames missed or falsely detected during identification, effectively improving the recognition accuracy of trembling and abdomen-touching behaviors.
2. A target area to be identified is selected from each frame of the zebra fish image in the infrared video to be identified, and the effective image area of zebra fish motion within the target area is manually framed and cut out. Locating the effective image area starting from the background and enhancing its features effectively improves detection accuracy, and distinguishes and corrects cases that direct identification of the whole frame would mistake.
3. Illumination fluctuation is an important cause of missed and false detections by the joint point detector, and environmental changes in water quality, the tank body and the hardware increase the model's false detection rate. Infrared imaging uses a dedicated infrared band, and with the backlight plate added, the influence of visible light is eliminated. During image preprocessing, background regions that closely resemble the shape of a fish are removed by the background subtractor, overcoming external environmental interference, enhancing the detected features and improving the robustness of the detector.
4. Jitter noise generated during joint point detection also affects the behavior classification result. Kalman filtering eliminates this jitter, resolving in software most of the detector-jitter and object crossing/overlap problems common in computer vision, and improving behavior recognition accuracy.
Drawings
The application will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a step diagram of an intelligent identification method for coupling behavior of zebra fish in an embodiment of the application;
FIG. 2 is a diagram of the abnormal frame elimination method of the intelligent identification method for the coupling behavior of zebra fish in the embodiment of the application;
FIG. 3 is a schematic diagram of a device of the intelligent identifying method for the coupling behavior of the zebra fish in the embodiment of the application;
FIG. 4 is a heat map of the joint-point space-time features of the intelligent identification method for the coupling behavior of zebra fish in the embodiment of the application;
fig. 5 is a schematic diagram of zebra fish behavior according to the intelligent identifying method of zebra fish coupling behavior in the embodiment of the application;
fig. 6 is a schematic diagram of an electronic device structure of the intelligent identifying method for the coupling behavior of the zebra fish in the embodiment of the application.
Detailed Description
For a clearer understanding of technical features, objects, and effects of the present application, a detailed description of specific embodiments of the present application will be made with reference to the accompanying drawings.
The embodiment of the application provides an intelligent identification method for coupling behaviors of zebra fish.
Referring to fig. 1, fig. 1 is a step diagram of an intelligent identifying method for coupling behavior of zebra fish in an embodiment of the present application, including:
s1: acquiring and processing an initial infrared video to obtain a zebra fish training image; labeling the zebra fish training image;
s2: inputting the labeled zebra fish training image into a joint point detection neural network for training, and adjusting parameters of the joint point detection neural network;
s3: labeling the initial infrared video to obtain a time period of the coupling behavior, and determining the behavior video with the coupling behavior; training parameters of the space-time detection neural network through behavior videos;
s4: acquiring an infrared video to be identified; preprocessing the infrared video to be identified to generate a gray-white feature image; identifying the joint points and labeling frames of the gray-white feature image through the trained joint point detection neural network;
s5: filtering out the joint points and labeling frames of gray-white feature images that were missed or falsely detected during identification, through an abnormal frame elimination method and the joint point detection neural network, to obtain a second joint point predicted value;
s6: combining the second joint point predicted values of each frame to generate a time sequence signal, presented as a heat map; classifying the heat map of the infrared video to be identified through the fully connected layer of the trained space-time detection neural network; and visualizing the zebra fish behavior classification result in the infrared video to be identified through video annotation software, finally realizing classification of the coupling behavior of the infrared video to be identified.
The step S1 comprises the following steps:
placing the zebra fish in a fish tank, wherein an infrared backlight plate is placed below the fish tank, and the fish tank is completely placed in the plane of the infrared backlight plate;
filming the zebra fish from directly above with an infrared camera, and collecting an initial infrared video of the zebra fish;
performing frame-separated extraction on the initial infrared video to obtain a zebra fish training image;
and manually labeling the labeling frame and the joint points of each zebra fish in the zebra fish training image by using joint point labeling software.
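The frame-separated extraction step above can be sketched as follows. This is a minimal sketch: the sampling interval and the helper name are illustrative assumptions (the patent does not fix an interval), and decoding the video itself would typically use OpenCV's `cv2.VideoCapture`; only the index-selection logic is shown here so the snippet runs without OpenCV.

```python
# Frame-separated extraction: keep every `step`-th frame of the initial
# infrared video as a training image. The decoded frames at these indices
# would then be written out and labeled in Labelme.

def sample_frame_indices(total_frames: int, step: int) -> list[int]:
    """Indices of the frames kept for the zebra fish training set."""
    if step <= 0:
        raise ValueError("step must be positive")
    return list(range(0, total_frames, step))

# e.g. a 300-frame video, keeping one frame in every 15:
indices = sample_frame_indices(300, 15)
print(len(indices), indices[:3])  # 20 [0, 15, 30]
```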
Specifically, the chasing behavior of zebra fish begins when the head of the initiating fish touches the head of the other fish at close range, or begins to move parallel to the other fish's head at a certain distance without obvious body shaking; the initiating fish's head then begins to follow the other fish's tail, or one of the two parallel fish begins to drift away and the offset gradually increases, with the chasing behavior lasting more than 0.2 seconds. As shown in fig. 5, the trembling behavior of zebra fish begins with close-range contact between the heads, abdomens or tails of the female and male fish and is expressed as a trembling action; its end is determined by the end of the trembling of both fish and an increase in the joint distance offset, with the trembling behavior lasting more than 0.2 seconds. The abdomen-touching behavior of zebra fish is manifested by the male fish contacting the body or tail of the female fish with its nose and head, with the abdomen-touching behavior lasting more than 0.2 seconds. The turning-around behavior of zebra fish is manifested by the male fish circling completely around the female fish, or turning in front of the female fish, with the turning-around behavior lasting more than 0.2 seconds.
In the application, Labelme software is used to label the labeling frame and 5 joint points of each zebra fish in an image, so that the image data can be input into the joint point detection neural network for parameter training.
The joint point detection neural network adopts the YOLOv8-Pose pose estimation model.
The step S3 comprises the following steps:
the space-time detection neural network is a PoseC3D skeleton-based behavior recognition model;
manually marking the time period of the initial infrared video by using video marking software, and splitting the initial infrared video into a plurality of behavior videos according to the marked time period;
extracting a first joint point predicted value of each frame of image of the behavior video by using a joint point detection neural network, combining the first joint point predicted values of each frame to generate a time sequence signal, and presenting the time sequence signal in a heat map mode;
and taking the class of the heat map and the behavior video as the input of the space-time detection neural network, and training the parameters of the space-time detection neural network.
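The heat-map step above can be illustrated with Gaussian pseudo-heatmaps of the kind PoseC3D consumes: one small 2-D Gaussian per joint point per frame, stacked over time into a space-time volume. This is a sketch under assumptions (the function name, the sigma and the rendering details are illustrative, not the patent's exact formulation):

```python
import numpy as np

def keypoints_to_heatmaps(kpts, height, width, sigma=2.0):
    """Render one Gaussian pseudo-heatmap per joint point.

    kpts: array of shape (K, 2) holding (x, y) joint coordinates.
    Returns an array of shape (K, height, width); stacking these maps
    over the frames of a clip yields the space-time input volume.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    maps = np.empty((len(kpts), height, width), dtype=np.float32)
    for k, (x, y) in enumerate(kpts):
        maps[k] = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return maps

# Two joint points of one fish in one frame:
hm = keypoints_to_heatmaps(np.array([[10.0, 20.0], [40.0, 5.0]]), 64, 64)
```

Each map peaks at the joint location with value 1, so inter-frame position changes directly reshape the time sequence signal that the classifier sees.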
Specifically, the video annotation software is used to annotate time periods in the captured video stream, and the corresponding methods of the VideoFileClip class in Python's moviepy library are used to split the video into short behavior-classification clips according to the annotation information, so that the video data can be input into the space-time detection neural network for parameter training.
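The splitting step can be sketched as below. The annotation format (label, start, end) and the helper name are assumptions for illustration; the actual cutting would use moviepy's `VideoFileClip.subclip(start, end)` (renamed `subclipped` in moviepy 2.x), shown only in a comment so the sketch runs without moviepy installed:

```python
# Validate annotated time periods against the video duration and return
# the segments to cut into behavior clips.

def clip_segments(annotations, duration):
    segments = []
    for label, start, end in annotations:
        if not (0 <= start < end <= duration):
            raise ValueError(f"bad period {start}-{end} for {label!r}")
        segments.append((label, start, end))
    return segments

segs = clip_segments([("chase", 1.0, 2.5), ("tremble", 4.0, 4.6)], 10.0)
# With moviepy, each segment would then be cut roughly as:
#   from moviepy.editor import VideoFileClip
#   VideoFileClip("video.mp4").subclip(start, end).write_videofile(out_path)
```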
The specific steps of step S4 are as follows:
s41: selecting a target area to be identified from each frame of zebra fish to-be-identified image in the infrared video to be identified, and manually framing and cutting out an effective image area of zebra fish movement in the target area;
creating a background subtractor;
extracting the frames of the infrared video to be identified corresponding to the effective image area using the OpenCV library, and inputting the extracted frame images into the background subtractor to obtain a foreground mask;
performing a morphological opening on the foreground mask; convolving the foreground mask with a 2×2 kernel;
s42: creating a gray-white foreground image of the same size as the frame image;
obtaining a non-mask background image from the inverted mask and the gray-white foreground image;
extracting an original image containing only the zebra fish from the frame image using the convolved foreground mask;
combining the non-mask background image with the original image to obtain a gray-white feature image containing only the zebra fish;
performing feature enhancement on the mask region of the gray-white feature image; the feature enhancement includes brightness enhancement and contrast enhancement;
s43: identifying the joint points and labeling frames of the feature-enhanced gray-white feature image with the joint point detection neural network.
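The compositing in step S42 can be sketched in NumPy. This is a minimal sketch under assumptions: the gray level, enhancement gains and function name are illustrative, and the boolean mask would in practice come from an OpenCV background subtractor (e.g. `cv2.createBackgroundSubtractorMOG2`) followed by the morphological opening described above.

```python
import numpy as np

def gray_white_feature_image(frame, mask, gray=180, alpha=1.4, beta=10):
    """Compose a gray-white feature image: a uniform gray background
    everywhere, with the brightness/contrast-enhanced fish pixels kept
    inside the foreground mask.

    frame: uint8 grayscale image (H, W); mask: boolean foreground (H, W).
    """
    out = np.full_like(frame, gray)              # non-mask background image
    enhanced = np.clip(frame.astype(np.float32) * alpha + beta, 0, 255)
    out[mask] = enhanced[mask].astype(np.uint8)  # enhanced fish region
    return out

frame = np.full((4, 4), 100, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
img = gray_white_feature_image(frame, mask)
```

The uniform gray background removes background clutter that resembles a fish, which is exactly what the subsequent joint point detector benefits from.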
Specifically, the confidence threshold of the joint point detection neural network and the size of the input image are adjusted to tune prediction accuracy; the confidence threshold is positively related to the precision P and negatively related to the recall R. A larger input size greatly improves the prediction effect, but also increases the computation time.
The step S5 comprises the following steps:
s51: setting the number of zebra fish in the fish tank as N;
s52: during identification by the joint point detection neural network, when a missed detection occurs, the number of detected zebra fish is N-1, and the joint points of the previous moment are returned;
when two fish overlap or cross because they are trembling or touching abdomens, false detection occurs; when a false detection occurs, the number of detected zebra fish is N+1;
setting one joint point of the identified gray feature image as a centroid joint point;
s53: matching the centroid joint points of the zebra fish in the current frame image and the previous frame image using the Hungarian algorithm, and deleting falsely detected labeling frames and joint points;
s54: filtering the gray-white feature image through a Kalman filter, and taking the Kalman filtering prediction as the second joint point predicted value of the current frame's gray-white feature image.
Step S53 includes:
obtaining the centroid joint points through the joint point detection neural network;
defining the set of zebra fish joint points obtained from the gray-white feature image of the previous frame as P = {p_1, p_2, ..., p_N}, and the set of zebra fish joint points obtained from the gray-white feature image of the current frame as C = {c_1, c_2, ..., c_M};
computing the Euclidean distance between each pair of joint points (p_i, c_j) and forming the cost matrix D as follows:
D(i, j) = ||p_i - c_j||_2
where the smaller the value of D(i, j), the greater the inter-frame similarity;
performing Hungarian algorithm matching on the cost matrix to obtain the centroid joint points of the current-frame gray-white feature image matched with the previous-frame gray-white feature image;
and deleting the falsely detected non-centroid joint points and the corresponding annotation frames according to the matched centroid joint points.
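The matching step can be sketched as follows. The patent specifies the Hungarian algorithm; for the small number of fish in a tank, an exhaustive minimum-cost assignment gives the same result, so this illustrative, self-contained version (all names are assumptions) uses it instead:

```python
from itertools import permutations
import math

def match_centroids(prev_pts, curr_pts):
    """Assign current-frame centroid joint points to previous-frame
    ones by minimising the summed Euclidean distance D(i, j).

    Returns assign, where assign[i] is the index of the current-frame
    point matched to previous-frame point i; current-frame points left
    unmatched are candidate false detections to delete.
    """
    n = len(prev_pts)
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(curr_pts)), n):
        cost = sum(math.dist(prev_pts[i], curr_pts[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = list(perm), cost
    return best
```

With N = 2 fish and an N+1 false detection, the single unmatched current-frame index marks the annotation frame and joint points to delete.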
An example of the frame anomaly removal method of the present application is shown in fig. 2.
Specifically, because jitter noise is present in the identification process, a Kalman filter is used, with its process noise Q and observation noise R parameters tuned; R is kept small so that the predicted joint points do not deviate excessively from the observations.
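A minimal per-coordinate sketch of this filtering step, with illustrative Q, R and initial-state values (not taken from the patent); keeping R small yields a large Kalman gain, so the estimate stays close to the observed joint coordinate:

```python
class ScalarKalman:
    """1-D constant-position Kalman filter for smoothing one joint
    coordinate; apply one instance per coordinate of each joint."""

    def __init__(self, x0, p0=1.0, q=1e-3, r=1e-2):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process noise Q, observation noise R

    def update(self, z):
        self.p += self.q                  # predict
        k = self.p / (self.p + self.r)    # Kalman gain (large when R is small)
        self.x += k * (z - self.x)        # correct toward measurement z
        self.p *= (1.0 - k)
        return self.x                     # smoothed coordinate
```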
Specifically, fig. 3 is a schematic diagram of the apparatus for acquiring the infrared video. In classifying the coupling behavior of the infrared video to be identified, the following conditions need to be met: the whole infrared video to be identified is predicted frame by frame; with total frame number A and frame rate R, the predicted start frame is (M+1)/2 and the predicted end frame is A-(M+1)/2; the number of frames continuously predicted for each behavior is F, which must satisfy F ≥ R×0.2.
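The frame-window arithmetic above can be sketched as follows (the window length M = 31 and the reading of the duration condition as F ≥ 0.2·R are illustrative assumptions):

```python
def prediction_window(total_frames, m=31):
    """Start and end frames for frame-by-frame prediction with a
    sliding window of M frames (M assumed odd)."""
    half = (m + 1) // 2
    return half, total_frames - half

def is_valid_behavior(f, frame_rate):
    """A behavior must be predicted for F consecutive frames with
    F >= 0.2 * R, i.e. it must persist for at least 0.2 seconds."""
    return f >= 0.2 * frame_rate
```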
Specifically, FIG. 4 is a heat map of the temporal-spatial signature of the joint points. Zebra fish coupling behavior contains both temporal and spatial information. Spatially, the chasing behavior consists of the female and male fish approaching each other, while the turning behavior requires changes in the starting position, intermediate arc, and ending position of the turning circle. Position and posture changes of any zebra fish in adjacent frames alter the heat map of the time-series signal, and different time-series signals represent interactive behaviors of different categories. The PoseC3D model receives the heat map as input; after parameter training on sub-videos labeled with behavior categories, it can distinguish differences in temporal change, and its fully connected layer then classifies the sub-video heat maps, finally achieving classification of the zebra fish coupling behavior in each sub-video.
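The heat-map construction can be sketched as rendering each joint point of a frame as a Gaussian peak (sigma and resolution here are illustrative assumptions); stacking one such map per frame along time yields the kind of spatio-temporal input PoseC3D consumes:

```python
import numpy as np

def joints_to_heatmap(joints, h, w, sigma=2.0):
    """Render the (x, y) joint points of one frame as an h x w
    Gaussian heat map, combining joints by per-pixel maximum."""
    ys, xs = np.mgrid[0:h, 0:w]
    hm = np.zeros((h, w), dtype=np.float32)
    for x, y in joints:
        g = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma ** 2))
        hm = np.maximum(hm, g.astype(np.float32))
    return hm
```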
In summary: data are acquired with a camera in the top-view direction; the pose joint point neural network is trained; and the trained pose joint point neural network outputs joint point information used to train the spatio-temporal action neural network. The coupling behavior recognition stage comprises: image preprocessing; joint point network model inference; joint point filtering and denoising; superposition of joint point time-series information to generate a heat map; and classification with the zebra fish coupling behavior classification network to obtain the final time periods of the different behavior categories.
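The recognition-stage steps just listed can be tied together as a sketch in which every stage is an injected placeholder callable (assumptions for illustration, not the patent's actual models):

```python
def recognize_coupling_behavior(frames, preprocess, pose_net, denoise,
                                to_heatmap, behavior_net):
    """Run preprocessing, joint inference, denoising, heat-map
    generation and behavior classification over one sub-video."""
    joints_per_frame = [denoise(pose_net(preprocess(f))) for f in frames]
    heatmaps = [to_heatmap(j) for j in joints_per_frame]
    return behavior_net(heatmaps)  # behavior class for the sub-video
```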
Aiming at zebra fish behavioural experiments, the invention addresses the problem that, because the adult zebra fish to be observed are placed in the same fish tank and move freely, the detection of the two zebra fish is poor when they cross and overlap under single-view shooting. The invention uses a joint point detection network model and a PoseC3D behavior classification model for joint point detection, heat map generation, and behavior classification respectively; in the intermediate stages it performs effective feature enhancement, joint point false-detection handling, and joint point denoising and de-jittering, providing clear, accurate, and highly stable data as input to the spatio-temporal detection neural network, and ultimately improving the accuracy of zebra fish behavior classification.
Specifically, YOLOv8-Pose is a variant model based on YOLOv8, mainly used for human body pose estimation. It realizes pose estimation by adding a branch network for detecting body joint points on top of the YOLOv8 detection architecture.
Specifically, PoseC3D is a 3D-CNN-based skeleton behavior recognition framework that takes 2D human/animal poses as input to extract accurate and efficient skeleton sequences, achieving SOTA on multiple skeleton behavior datasets including FineGYM, NTU RGB+D, and Kinetics-Skeleton.
The video annotation software may be Vidat software.
The joint point labeling software may be labelme software.
It should be noted that the device provided in the above embodiment, when implementing its functions, uses the division into the above functional modules only as an example; in practical applications, the above functions may be allocated to different functional modules as needed, i.e. the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the device embodiments and the method embodiments provided above belong to the same concept; for their specific implementation processes, see the method embodiments, which are not repeated here.
The application also discloses electronic equipment. Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to the disclosure of the embodiment of the present application. The electronic device 500 may include: at least one processor 501, at least one network interface 504, a user interface 503, a memory 505, at least one communication bus 502.
Wherein a communication bus 502 is used to enable connected communications between these components.
The user interface 503 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 503 may further include a standard wired interface and a standard wireless interface.
The network interface 504 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
The processor 501 may include one or more processing cores. The processor 501 connects various parts of the server using various interfaces and lines, and performs the various functions of the server and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 505 and invoking the data stored in the memory 505. Optionally, the processor 501 may be implemented in hardware in at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 501 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing the content to be displayed by the display screen; and the modem handles wireless communication. It will be appreciated that the modem may also not be integrated into the processor 501 and may instead be implemented by a separate chip.
The Memory 505 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory).
Optionally, the memory 505 comprises a non-transitory computer-readable storage medium. The memory 505 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 505 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above method embodiments, etc.; the data storage area may store the data involved in the above method embodiments. Optionally, the memory 505 may also be at least one storage device located remotely from the processor 501. Referring to fig. 6, the memory 505, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an application program of the zebra fish coupling behavior intelligent identification method.
In the electronic device 500 shown in fig. 6, the user interface 503 is mainly used for providing an input interface for a user, and acquiring data input by the user; and the processor 501 may be configured to invoke an application program in the memory 505 that stores a zebra fish coupling behavior intelligent recognition method, which when executed by the one or more processors 501, causes the electronic device 500 to perform the method as in one or more of the embodiments described above. It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided herein, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is merely a division of logical functions, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via some service interfaces, devices, or units, which may be in electrical or other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or as software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as an independent product, may be stored in a computer-readable memory. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk.
The above are merely exemplary embodiments of the present disclosure and are not intended to limit the scope of the present disclosure. That is, equivalent changes and modifications are contemplated by the teachings of this disclosure, which fall within the scope of the present disclosure. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure.
This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles, including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered exemplary only, with the true scope and spirit of the disclosure indicated by the claims.
Claims (9)
1. An intelligent identification method for zebra fish coupling behavior, characterized by comprising the following steps:
s1: acquiring and processing an initial infrared video to obtain a zebra fish training image; labeling the zebra fish training image;
s2: inputting the labeled zebra fish training image into a joint point detection neural network for training, and adjusting parameters of the joint point detection neural network;
s3: labeling the initial infrared video to obtain a time period of the coupling behavior, and determining the behavior video with the coupling behavior; training parameters of the space-time detection neural network through behavior videos;
s4: acquiring an infrared video to be identified; preprocessing the infrared video to be identified to generate a gray-white feature image; detecting the joint points and annotation frames of the gray-white feature image through the trained joint point detection neural network;
s5: filtering out joint points and marking frames of gray characteristic images which are missed and false detected in the identification process through an abnormal frame elimination method and a joint point detection neural network to obtain a second joint point predicted value;
s6: combining the predicted values of the second joint points of each frame to generate a time sequence signal, and presenting the time sequence signal in a heat map mode; classifying the heat map of the infrared video to be identified through the full-connection layer of the trained space-time detection neural network; and visualizing the behavior classification result of the zebra fish in the infrared video to be identified through video annotation software, and finally realizing the classification of the coupling behaviors of the infrared video to be identified.
2. The method for intelligently identifying coupling behaviors of zebra fish according to claim 1, wherein the step S1 comprises the following steps:
placing the zebra fish in a fish tank, wherein an infrared backlight plate is placed below the fish tank, and the fish tank is completely placed in the plane of the infrared backlight plate;
shooting the zebra fish under the overlook plane through an infrared camera, and collecting an initial infrared video of the zebra fish;
performing frame-separated extraction on the initial infrared video to obtain a zebra fish training image;
and manually labeling the labeling frame and the joint points of each zebra fish in the zebra fish training image by using joint point labeling software.
3. The method for intelligently identifying the coupling behavior of the zebra fish according to claim 1, wherein the joint point detection neural network adopts a YOLOv8-Pose pose estimation model.
4. The method for intelligently identifying coupling behaviors of zebra fish according to claim 1, wherein the step S3 comprises:
the space-time detection neural network is a PoseC3D bone behavior recognition model;
manually marking the time period of the initial infrared video by using video marking software, and splitting the initial infrared video into a plurality of behavior videos according to the marked time period;
extracting a first joint point predicted value of each frame of image of the behavior video by using a joint point detection neural network, combining the first joint point predicted values of each frame to generate a time sequence signal, and presenting the time sequence signal in a heat map mode;
and taking the class of the heat map and the behavior video as the input of the space-time detection neural network, and training the parameters of the space-time detection neural network.
5. The intelligent identifying method for the coupling behavior of the zebra fish according to claim 1, wherein the specific steps of the step S4 are as follows:
s41: selecting a target area to be identified from each frame of zebra fish to-be-identified image in the infrared video to be identified, and manually framing and cutting out an effective image area of zebra fish movement in the target area;
creating a background remover;
extracting the frames of the infrared video to be identified corresponding to the effective image area using the OpenCV library, and inputting the extracted frame images into the background eliminator to obtain a foreground mask;
performing a morphological opening operation on the foreground mask; convolving the foreground mask using a 2x2 convolution kernel;
s42: creating an ash-white foreground image with the same size as the frame image;
acquiring a non-mask background image through the anti-mask and the gray foreground image;
extracting original image only containing zebra fish from the frame image by using the convolved foreground mask;
combining the non-mask background image with the original image to obtain an grey-white characteristic image only containing zebra fish;
performing feature enhancement on a mask region of the gray feature image; the feature enhancement includes: brightness enhancement and contrast enhancement;
s43: and detecting joint points and marking frames of the gray characteristic image with the enhanced identification characteristics of the neural network by using the joint points.
6. The method for intelligently identifying coupling behaviors of zebra fish according to claim 4, wherein the step S5 comprises:
S51: setting the number of zebra fish in the fish tank as N;
S52: during identification by the joint point detection neural network, when a missed detection occurs, the detected number of zebra fish is N-1, and the joint points from the previous moment are returned;
when two fish quiver or bump abdomens, their bodies overlap and cross, causing false detections; when a false detection occurs, the detected number of zebra fish is N+1;
setting one joint point of the identified gray-white feature image as the centroid joint point;
S53: matching the centroid joint points of the zebra fish in the current-frame and previous-frame images using the Hungarian algorithm, and deleting the falsely detected annotation frames and joint points;
S54: filtering the gray-white feature image through a Kalman filter, and taking the Kalman-filtered prediction as the second joint point predicted value of the current-frame gray-white feature image.
7. The intelligent identification method for zebra fish coupling behavior according to claim 6, wherein the step S53 comprises:
obtaining the centroid joint points through the joint point detection neural network;
defining the set of zebra fish joint points obtained from the gray-white feature image of the previous frame as P = {p_1, p_2, ..., p_N}, and the set of zebra fish joint points obtained from the gray-white feature image of the current frame as C = {c_1, c_2, ..., c_M};
computing the Euclidean distance between each pair of joint points (p_i, c_j) and forming the cost matrix D as follows:
D(i, j) = ||p_i - c_j||_2
wherein, the smaller the value of D(i, j), the greater the inter-frame similarity;
performing Hungarian algorithm matching on the cost matrix to obtain the centroid joint points of the current-frame gray-white feature image matched with the previous-frame gray-white feature image;
and deleting the falsely detected non-centroid joint points and the corresponding annotation frames according to the matched centroid joint points.
8. An electronic device comprising a processor (501), a memory (505), a user interface (503) and a network interface (504), the memory (505) being configured to store instructions, the user interface (503) and the network interface (504) being configured to communicate to other devices, the processor (501) being configured to execute the instructions stored in the memory (505) to cause the electronic device (500) to perform the method according to any of claims 1-7.
9. A computer readable storage medium storing instructions which, when executed, perform the method steps of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410251755.2A CN117854114B (en) | 2024-03-06 | 2024-03-06 | Intelligent identification method, equipment and medium for coupling behavior of zebra fish |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117854114A true CN117854114A (en) | 2024-04-09 |
CN117854114B CN117854114B (en) | 2024-06-04 |
Family
ID=90536381
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410251755.2A Active CN117854114B (en) | 2024-03-06 | 2024-03-06 | Intelligent identification method, equipment and medium for coupling behavior of zebra fish |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117854114B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102512146A (en) * | 2011-11-14 | 2012-06-27 | 沈阳大学 | Collection system of Internet of things for activity-of-daily-living information of human body |
CN105869181A (en) * | 2016-06-16 | 2016-08-17 | 山东大学 | Body joint distributed information consistency estimation method based on interacting multiple models |
CN108985259A (en) * | 2018-08-03 | 2018-12-11 | 百度在线网络技术(北京)有限公司 | Human motion recognition method and device |
US20190171912A1 (en) * | 2017-12-05 | 2019-06-06 | Uber Technologies, Inc. | Multiple Stage Image Based Object Detection and Recognition |
CN109934111A (en) * | 2019-02-12 | 2019-06-25 | 清华大学深圳研究生院 | A kind of body-building Attitude estimation method and system based on key point |
CN112580523A (en) * | 2020-12-22 | 2021-03-30 | 平安国际智慧城市科技股份有限公司 | Behavior recognition method, behavior recognition device, behavior recognition equipment and storage medium |
CN116524414A (en) * | 2023-06-26 | 2023-08-01 | 广州英码信息科技有限公司 | Method, system and computer readable storage medium for identifying racking behavior |
Non-Patent Citations (2)
Title |
---|
MARTA DE OLIVEIRA BARREIROS ET AL.: "Zebrafish tracking using YOLOv2 and Kalman filter", Scientific Reports, 5 February 2021 (2021-02-05), pages 1-14 *
ZHANG Jingjing et al.: "Infrared Dim and Small Target Detection Based on Improved Top-Hat Transformation", Journal of Electronics & Information Technology, vol. 46, no. 1, 31 January 2024 (2024-01-31), pages 267-276 *
Also Published As
Publication number | Publication date |
---|---|
CN117854114B (en) | 2024-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10055843B2 | 2018-08-21 | System and methods for automatic polyp detection using convolutional neural networks | |
CN112733744B (en) | Camouflage object detection model based on edge cooperative supervision and multi-level constraint | |
CN111052126A (en) | Pedestrian attribute identification and positioning method and convolutional neural network system | |
CN112446270A (en) | Training method of pedestrian re-identification network, and pedestrian re-identification method and device | |
CN109241956B (en) | Method, device, terminal and storage medium for synthesizing image | |
JP7419080B2 (en) | computer systems and programs | |
CN112487844A (en) | Gesture recognition method, electronic device, computer-readable storage medium, and chip | |
Li et al. | Multi-scale sparse network with cross-attention mechanism for image-based butterflies fine-grained classification | |
Bak et al. | Two-stream convolutional networks for dynamic saliency prediction | |
Sanaeifar et al. | Advancing precision agriculture: The potential of deep learning for cereal plant head detection | |
Gonzalo-Martín et al. | Improving deep learning sorghum head detection through test time augmentation | |
CN111310605A (en) | Image processing method and device, electronic equipment and storage medium | |
Li et al. | Biological eagle eye-based method for change detection in water scenes | |
Wilkowski et al. | Training data extraction and object detection in surveillance scenario | |
CN117854114B (en) | Intelligent identification method, equipment and medium for coupling behavior of zebra fish | |
Yu | Deep learning methods for human action recognition | |
CN114299610A (en) | Method and system for recognizing actions in infrared video | |
Chin et al. | Plant Disease Detection and Classification Using Deep Learning Methods: A Comparison Study | |
Wu et al. | An improved YOLOv7 network using RGB-D multi-modal feature fusion for tea shoots detection | |
Wang et al. | Learning to remove reflections from windshield images | |
Menesatti et al. | A new morphometric implemented video-image analysis protocol for the study of social modulation in activity rhythms of marine organisms | |
Vijayalakshmi et al. | Face Detection and Recognition using Machine Learning Techniques | |
CN114693986A (en) | Training method of active learning model, image processing method and device | |
Wu et al. | Super-resolution fusion optimization for poultry detection: a multi-object chicken detection method | |
Han et al. | Social Behavior Atlas: A computational framework for tracking and mapping 3D close interactions of free-moving animals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |