CN111144379B - Automatic identification method for the optokinetic response of mice based on image technology - Google Patents

Automatic identification method for the optokinetic response of mice based on image technology

Info

Publication number
CN111144379B
CN111144379B (application CN202010001628.9A)
Authority
CN
China
Prior art keywords
mouse
head
reaction
outline
optokinetic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010001628.9A
Other languages
Chinese (zh)
Other versions
CN111144379A (en)
Inventor
高会军
佟明斯
林伟阳
邵俊杰
原慧萍
邵正波
张诗琦
于兴虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202010001628.9A priority Critical patent/CN111144379B/en
Publication of CN111144379A publication Critical patent/CN111144379A/en
Application granted granted Critical
Publication of CN111144379B publication Critical patent/CN111144379B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic identification method for the optokinetic response of mice based on image technology, and relates to an image recognition method. The invention aims to solve the problems that the conventional method for identifying the mouse optokinetic response consumes much time and labor and has low identification accuracy and efficiency. The method comprises the following specific process: 1. extracting the contours of the mouse's body, ears and tail; 2. identifying the orientation of the mouse's head based on those contours, by 2.1 preliminarily locating the tip of the nose, 2.2 correcting the position of the preliminary result to obtain the corrected nose tip, and 2.3 identifying the head orientation from the corrected nose tip; 3. identifying the optokinetic response of the mouse based on the contours obtained in step 1 and the head orientation obtained in step 2. The invention is used in the field of biology.

Description

Automatic identification method for the optokinetic response of mice based on image technology
Technical Field
The present invention relates to an image recognition method.
Background
Mice (the so-called laboratory "white mice", which in practice are not necessarily white) are an animal model frequently used in biological experiments. In life-science and biological research it is often necessary to evaluate an animal's response to stimuli, but since experimental animals cannot express their subjective perceptions, accurate and effective indirect methods are needed to measure their response levels; moreover, behavioral patterns differ considerably between individual animals. Mice, for example, exhibit an optokinetic response to a moving object, that is, an autonomous cycle of following, resetting, and following again. A common form of mouse optokinetic experiment places the mouse inside a circular grating so that it views the rotating stripes. Different individuals produce optokinetic responses of different degrees to the rotating grating, manifested in the frequency and single-episode duration of head movements that follow the grating. FIG. 1 shows a typical simple optokinetic apparatus. During the experiment a camera is generally used to record the motion of the mouse, and after the experiment the video is analyzed and counted manually. For optokinetic experiments, manual counting has the following problems:
1. the mouse's response to the grating is not continuous but intermittent: an episode usually lasts a few seconds and then gives way to irregular motion, so the experimenter must watch the whole video (usually several minutes long) and replay any movements that are hard to classify.
2. Optokinetic experiments generally require a large number of samples, and each sample must be recorded for a certain period of time, so the workload of manual viewing and counting is enormous.
3. Because mice differ individually, manual counting is subject to subjective interference and drifting judgment criteria; in long statistical sessions in particular, fluctuations in the observer's psychological state can undermine the reliability and objectivity of the whole experimental result.
Establishing an automatic identification method for the mouse optokinetic response based on computer image technology therefore reduces labor cost effectively and improves the efficiency and accuracy of the experiment.
Disclosure of Invention
The invention aims to solve the problems that the existing method for identifying the mouse optokinetic response consumes much time and labor and has low identification accuracy and efficiency, and provides an automatic identification method for the mouse optokinetic response based on image technology.
The automatic identification method for the mouse optokinetic response based on image technology comprises the following specific process:
step one, extracting the contours of the mouse's body, ears and tail;
step two, identifying the orientation of the mouse's head based on the contours of the body, ears and tail;
step three, identifying the optokinetic response of the mouse based on the contours obtained in step one and the head orientation obtained in step two.
The beneficial effects of the invention are as follows:
1. Fully automatic real-time identification. After the experimental parameters are set, the mouse is placed in the apparatus, and the optokinetic data are available as soon as the specified experiment time has elapsed, with no manual intervention or monitoring, which greatly saves time and labor. Suppose each mouse requires a 2-minute experiment and each group contains 20 mice: the manual method typically needs 2 × 20 = 40 minutes of experiments plus about three times that in video review, 160 minutes in total, whereas with the automatic identification of the invention the same group takes only 40 minutes.
2. Objective criteria. Having the computer identify and evaluate the optokinetic response completely removes the influence of experimenter fatigue and inexperience on the results and yields objective, accurate data, which is of great significance for medical research.
3. High-throughput experiments become possible. Running several devices in parallel multiplies experimental efficiency: with 5 devices working simultaneously, an experiment on 100 mice yields results in only 40 minutes, whereas a manual experiment with such a sample size would demand enormous labor. Efficiency and accuracy are thus both improved.
4. Consistency of experimental conditions is easier to guarantee. Since the behavioral state of mice changes over time, all mice must complete the test within the same period for the experimental conditions to be consistent. Because the invention supports high-throughput experiments, this requirement can be met with little manpower.
Drawings
FIG. 1 is a schematic diagram of manual observation of the mouse optokinetic response;
FIG. 2 shows the apparatus used in the invention: 1 is the grating display module, composed of 4 liquid-crystal screens that display scrolling bright and dark stripes whose width, brightness, speed, color and other parameters can be preset and modified by computer; 2 is the platform on which the mouse is placed; 3 and 4 are the top cover and base plate, made of mirrored material to extend the apparent depth of the grating; 5 is the camera that records video during the experiment;
FIG. 3 shows the contour-extraction result of the invention: 6 is the contour of the mouse's body, 7 the contour of the ears, and 8 the contour of the tail;
FIG. 4 shows the result of the preliminary nose-tip positioning: 9 is the computed center of gravity of the body contour, 10 is the region around the ears excluded so that recognition avoids the outer pinnae, 11 is the bounding rectangle of the tail contour, and 12 is the resulting nose-tip preselection point;
FIG. 5 shows the result of the nose-tip correction: 13 is the center of gravity of the front half of the mouse, 14 is the head neighborhood used to screen for the nose tip, and 15 is the final nose tip P_n;
FIG. 6 shows the result of head-orientation recognition: 16 and 17 are the circles centered on P_n with radii r_1 and r_2, 18 and 19 are the centers of gravity G_r1 and G_r2, and 20 is the resulting head orientation;
FIG. 7a is a schematic diagram of the local integration of the first-order difference used in the invention to find preselected regions of the mouse optokinetic response;
FIG. 7b is a schematic diagram of the local integration of the second-order difference used in the invention to find preselected regions of the mouse optokinetic response;
FIG. 8 is a schematic diagram of the optokinetic reaction states of a mouse recognized by the method of the invention: the broken line is the random head-movement curve, the solid segments mark the optokinetic reaction states of the mouse, and all solid segments except the third are reaction states recognized by the method.
Detailed Description
The first embodiment: the automatic identification method for the mouse optokinetic response based on image technology of this embodiment comprises the following specific process:
the invention relates to an automatic identification method of a mouse optokinetic reaction based on a computer vision technology, which can be applied to an automatic experimental device of the mouse optokinetic reaction based on a rotating grating principle, wherein a camera is used for collecting video of a mouse in a grating, a computer is used for analyzing the video in real time, and information such as the reaction state, the frequency, the duration and the like of the mouse to the grating is identified and marked.
The structure of the experimental apparatus used in the invention is shown in FIG. 2: 1 is the grating display module, composed of 4 liquid-crystal screens that display scrolling bright and dark stripes whose width, brightness, speed, color and other parameters can be preset and modified by computer; 2 is the platform on which the mouse is placed; 3 and 4 are the top cover and base plate, made of mirrored material to extend the apparent depth of the grating; 5 is the camera that records video during the experiment.
During an experiment the mouse is placed on the platform while continuously scrolling grating stripes are displayed on the screens. The camera records video of the experiment and sends the images to the computer, where the recognition algorithm analyzes and judges the optokinetic response of the mouse. The detection flow of the algorithm is as follows:
step one, extracting the contours of the mouse's body, ears and tail;
step two, identifying the orientation of the mouse's head based on the contours of the body, ears and tail;
step three, identifying the optokinetic response of the mouse based on the contours obtained in step one and the head orientation obtained in step two. On the basis of the first two steps, the head orientation and its changes can be tracked continuously through the video, and analysis of the tracked data yields the optokinetic response data of the mouse.
The second embodiment: this embodiment differs from the first embodiment in how step one extracts the contours of the mouse's body, ears and tail; the specific process is:
the acquired grayscale image is binarized and the edge contour of the mouse is extracted;
the binarization formula is:
bi_{k,l} = 1 if s_1 ≤ i_{k,l} ≤ s_2, and bi_{k,l} = 0 otherwise (1)
where s_1 is the minimum and s_2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the acquired image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image; since the body, ears and tail of the mouse are not necessarily the same color, different s_1, s_2 should be set for each contour extraction.
On this basis the contours of the body, ears and tail can be extracted separately, exploiting their different sizes, shapes and structures. To improve the stability of contour extraction, the three RGB (red, green, blue) channels of the color image can each be binarized separately.
The extracted contours are shown in FIG. 3: 6 is the contour of the mouse's body, 7 the contour of the ears, and 8 the contour of the tail.
Other steps and parameters are the same as in the first embodiment.
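For illustration only, the following is a minimal Python/OpenCV sketch of the interval binarization of formula (1) applied to the three color channels. The thresholds s_1, s_2 are placeholders to be tuned per body part, and fusing the per-channel masks with a logical AND is an assumption of this sketch, not something prescribed by the invention.

    import cv2
    import numpy as np

    def binarize_interval(channel: np.ndarray, s1: int, s2: int) -> np.ndarray:
        """Formula (1), scaled to 8-bit: pixels inside [s1, s2] become 255, all others 0."""
        return np.where((channel >= s1) & (channel <= s2), 255, 0).astype(np.uint8)

    def extract_part_contours(img_bgr: np.ndarray, s1: int, s2: int):
        """Binarize each color channel separately for stability, AND the masks,
        and return the external contours of the combined mask."""
        masks = [binarize_interval(img_bgr[:, :, c], s1, s2) for c in range(3)]
        combined = masks[0] & masks[1] & masks[2]
        contours, _ = cv2.findContours(combined, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return contours

The same routine would be called three times with different thresholds to obtain the body, ear and tail contours separately.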
The third embodiment: this embodiment differs from the first embodiment in how step one extracts the contours of the mouse's body, ears and tail; the specific process is:
when the pictures acquired by the common camera are processed, the outline of the mouse is sometimes not completely extracted, and the positioning of the head of the subsequent mouse and the recognition of the head direction can be influenced to a certain extent. To better extract the contour of the mice, depth cameras may be used to collect data from the mice.
The depth camera comprises two infrared cameras, an infrared dot-pattern projector and an RGB camera, which together yield depth information for the objects perceived in the surroundings. In the optokinetic detection apparatus the range of distances between the mouse and the camera is fixed, so the set of pixels in the image whose depth lies within that range constitutes the body of the mouse, and the outer edge of that region forms the contour of the body. A body contour separated in this way fits the actual outline of the mouse better than a contour extracted algorithmically from an ordinary camera image.
When the depth camera collects the mouse data, the distance of the mouse from the camera satisfies:
f_min < f < f_max (2)
where f is the distance from the mouse to the depth camera (different parts of the mouse lie at different distances); f_min is the distance from the top of the mouse (the highest point of its posture) to the camera, and since mice differ in size and nothing lies above the mouse, f_min may be given a somewhat smaller value; f_max is the distance from the bottom of the mouse (the lowest point of its posture) to the camera;
the set of pixels whose depth lies within the range of formula (2) is selected to form the body region of the mouse; the outer edge of this region is extracted as the contour of the body. Extracting the body contour in this way separates it from the image more reliably.
The body region is binarized and the edge contour of the mouse is extracted;
the binarization formula is:
bi_{k,l} = 1 if s_1 ≤ i_{k,l} ≤ s_2, and bi_{k,l} = 0 otherwise (1)
where s_1 is the minimum and s_2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the acquired image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image;
the contours of the body, ears and tail are then extracted separately according to their different morphological structures.
Other steps and parameters are the same as in the first embodiment.
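As a sketch only, the depth-range segmentation of formula (2) might look as follows in Python/OpenCV, assuming a metric depth map aligned with the color image; keeping only the largest connected region is an extra noise-suppression step added here, not part of the method as claimed.

    import cv2
    import numpy as np

    def body_contour_from_depth(depth: np.ndarray, f_min: float, f_max: float) -> np.ndarray:
        """Formula (2): keep pixels with f_min < depth < f_max as the mouse body,
        then take the outer edge of the largest region as the body contour."""
        mask = ((depth > f_min) & (depth < f_max)).astype(np.uint8) * 255
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return max(contours, key=cv2.contourArea)  # discard small noise regions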
The fourth embodiment: this embodiment differs from the second or third embodiment in that in step two the orientation of the mouse's head is identified based on the contours of the body, ears and tail; the specific process is:
step 2.1, preliminarily locating the tip of the mouse's nose based on the contours of the body, ears and tail;
step 2.2, correcting the position of the preliminary nose-tip result from step 2.1 to obtain the corrected nose tip P_n;
step 2.3, identifying the orientation of the mouse's head based on the nose tip P_n obtained in step 2.2.
Other steps and parameters are the same as in the second or third embodiment.
The fifth embodiment: this embodiment differs from embodiments one to four in that in step 2.1 the tip of the mouse's nose is preliminarily located based on the contours of the body, ears and tail; the specific process is:
after the contour of the mouse's body is obtained, the center of gravity G_c of the body contour is calculated:
G_c = (1/|C_t|) · Σ_{(x,y)∈C_t} (x, y) (3)
where C_t is the set of coordinates of the points of the body contour;
in general, the center of gravity of the outline of the mouse body is at the rear part of the mouse, and the point on the outline furthest from the center of gravity of the mouse is the tip of the nose of the mouse. In some cases, however, the center of gravity of the mouse may be present in the anterior portion of the mouse, which may easily allow the tip of the mouse to be misidentified near the tail root. In addition, when the outline of the mouse body is identified, the periphery of the mouse outer auricle is also identified due to the hair color which is the same as that of other parts of the body, and the periphery of the outer auricle is particularly easy to form a small tip, so that the nose tip of the mouse can be easily identified to the periphery of the mouse outer auricle.
Because of these two misidentification cases, the method uses the contours of the tail and ears to correct the preliminary nose-tip positioning when the center of gravity is used to determine the nose tip. A region around each ear (for example, a circle centered on the centroid of the ear contour with a diameter slightly larger than the ear) is selected so that recognition avoids the outer pinnae. To handle the tail-root case, the point of the body contour outside the ear regions farthest from G_c, and the point of the body contour outside the ear regions farthest from that point, are taken as the two candidates for the preliminary nose-tip point; the distances from these two points to the tail and to the ears are then computed (using the centroids of the tail and ear contours), and the point far from the tail and near the ears is chosen as the nose-tip preselection point P_pn, the result of the preliminary positioning.
After the tail and ear contours are recognized, the regions around the ears are computed:
with the centroid of each ear contour as center and, as radius, a value slightly larger than l_ec, the largest distance from the ear contour to its centroid (1.1 l_ec was used in the experiments), two circles are drawn; these two circles are the regions around the mouse's ears;
the point of the body contour outside the ear regions farthest from the center of gravity G_c is taken as candidate one, P_pn1, of the nose-tip preselection point; the point of the body contour outside the ear regions farthest from P_pn1 is computed as candidate two, P_pn2;
the head of the mouse is far from the tail and near the ears, so the preselection point is determined from the candidates' distances to the tail and ears. In the experiments the ears were recognized more reliably than the tail, so the tail is used first for a soft judgment and the ears afterwards for a hard judgment: even if the tail is recognized wrongly, the ear judgment still yields the correct result. Since the ears cannot always be recognized, the tail-based judgment cannot be dropped either.
The distances from P_pn1 and P_pn2 to the tail and to the ears are computed (using the centroids of the tail and ear contours), and of P_pn1 and P_pn2 the point far from the tail and near the ears is taken as the nose-tip preselection point P_pn, the result of the preliminary positioning. This ensures that P_pn is misidentified neither at the tail root nor on an outer pinna.
FIG. 4 shows the preliminary positioning result: 9 is the computed center of gravity of the body contour, 10 the ear regions excluded so that recognition avoids the outer pinnae, 11 the bounding rectangle of the tail contour, and 12 the resulting preselection point P_pn.
Other steps and parameters are the same as in embodiments one to four.
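The preselection logic of this embodiment can be sketched as follows, assuming NumPy arrays of contour points. The 1.1 l_ec exclusion radius follows the text; collapsing the soft tail judgment and the hard ear judgment into a single score is a simplification made by this sketch.

    import numpy as np

    def centroid(pts):
        """Formula (3): the mean of the contour point coordinates."""
        return pts.mean(axis=0)

    def preselect_nose_tip(body, ears, tail):
        """Return P_pn, the preliminary nose-tip point; body/tail are (N, 2) arrays
        of contour points and ears is a list of such arrays, one per ear."""
        g_c = centroid(body)
        # Exclude body points inside the ear circles (radius 1.1 * l_ec).
        keep = np.ones(len(body), dtype=bool)
        for ear in ears:
            g_e = centroid(ear)
            l_ec = np.linalg.norm(ear - g_e, axis=1).max()
            keep &= np.linalg.norm(body - g_e, axis=1) > 1.1 * l_ec
        cand = body[keep]
        # P_pn1: farthest kept point from G_c; P_pn2: farthest kept point from P_pn1.
        p_pn1 = cand[np.argmax(np.linalg.norm(cand - g_c, axis=1))]
        p_pn2 = cand[np.argmax(np.linalg.norm(cand - p_pn1, axis=1))]
        # Keep the candidate that is far from the tail and close to the ears.
        g_tail, g_ears = centroid(tail), centroid(np.vstack(ears))
        def nose_score(p):
            return np.linalg.norm(p - g_tail) - np.linalg.norm(p - g_ears)
        return p_pn1 if nose_score(p_pn1) >= nose_score(p_pn2) else p_pn2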
The sixth embodiment: this embodiment differs from embodiments one to five in that in step 2.2 the position of the preliminary nose-tip result from step 2.1 is corrected to obtain the corrected nose tip P_n; the specific process is:
after the preliminary nose-tip result and the center of gravity G_c of the body contour are obtained, the distance l_pnc between them is calculated; the part of the body contour inside a circle centered on the preliminary nose tip with a radius slightly larger than l_pnc (for example 1.2 l_pnc) is taken as the front half of the contour, and the center of gravity of the front half is calculated;
the part of the body contour outside the ear regions and inside a circle centered on the preliminary nose tip with a radius slightly smaller than half of l_pnc (for example 0.4 l_pnc) is taken as the head neighborhood of the contour, and the point of the head neighborhood farthest from the center of gravity of the front half is selected as the corrected nose tip P_n.
The preselection point P_pn usually falls on the nose tip, but when the head of the mouse points in a direction very different from the body, the selected point deviates from the true position of the nose tip. A suitable correction is to select, while avoiding the ears, the point of the head neighborhood farthest from the center of gravity of the front half of the mouse as the new nose tip: replacing the whole-contour center of gravity by the front-half center of gravity shortens the distance to P_pn more than the distance to the actual nose tip, so the nose tip becomes the point farthest from the front-half center of gravity. In FIG. 5, 13 is the center of gravity of the front half of the mouse, 14 the head neighborhood used to screen for the nose tip, and 15 the final nose tip P_n.
Other steps and parameters are the same as in one of the first to fifth embodiments.
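A sketch of the correction step, using the example factors 1.2 l_pnc and 0.4 l_pnc given above; the boolean mask outside_ears (body points outside the ear circles) is assumed to be available from the previous step.

    import numpy as np

    def correct_nose_tip(body, p_pn, g_c, outside_ears):
        """Return the corrected nose tip P_n from the preselected tip P_pn,
        the body centroid G_c, and a boolean mask of non-ear body points."""
        l_pnc = np.linalg.norm(p_pn - g_c)
        d = np.linalg.norm(body - p_pn, axis=1)
        front_half = body[d < 1.2 * l_pnc]             # slightly larger than l_pnc
        g_front = front_half.mean(axis=0)              # centroid of the front half
        head = body[(d < 0.4 * l_pnc) & outside_ears]  # head neighborhood, ears excluded
        return head[np.argmax(np.linalg.norm(head - g_front, axis=1))]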
The seventh embodiment: this embodiment differs from embodiments one to six in that in step 2.3 the orientation of the mouse's head is identified based on the nose tip P_n obtained in step 2.2; the specific process is:
exploiting the symmetry of the mouse's head, circles of radii r_1 and r_2 (r_1 < r_2) can be drawn around the nose tip P_n; the centers of gravity of the parts of the mouse contour inside the two circles are G_r1 and G_r2 respectively, and the orientation of the head is calculated as:
v = G_r1 - G_r2 (4)
where v is the vector representing the orientation of the mouse's head; it points from G_r2 to G_r1.
FIG. 6 shows the head-orientation result: 16 and 17 are the circles of radii r_1 and r_2 centered on P_n, 18 and 19 are the centers of gravity G_r1 and G_r2, and 20 is the resulting head orientation.
Other steps and parameters are the same as in one of the first to sixth embodiments.
The eighth embodiment: this embodiment differs from embodiments one to seven in that r_1 and r_2 are scaled to the size of the mouse: r_1 is initially chosen as roughly the length of the mouse's head and r_2 as roughly twice the length from the tip of the nose to the neck, and r_1 and r_2 are then adjusted continuously until the resulting center of gravity G_r1 lies at the center of the head and G_r2 at the neck joining the head and body.
Other steps and parameters are the same as those of one of the first to seventh embodiments.
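A sketch of formula (4) under these choices of r_1 and r_2; using the contour points that fall inside each circle as a stand-in for the enclosed part of the contour is an approximation made by this sketch.

    import numpy as np

    def head_orientation(body, p_n, r1, r2):
        """Formula (4): the centroids of the contour points inside the two circles
        around P_n give the orientation vector, pointing from G_r2 to G_r1."""
        d = np.linalg.norm(body - p_n, axis=1)
        g_r1 = body[d < r1].mean(axis=0)   # small circle: roughly the head
        g_r2 = body[d < r2].mean(axis=0)   # large circle: head plus neck
        return g_r1 - g_r2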
The ninth embodiment: this embodiment differs from embodiments one to eight in that in step three the optokinetic response of the mouse is identified based on the contours obtained in step one and the head orientation obtained in step two. On the basis of the first two steps, the head orientation and its changes can be tracked continuously through the video, and analysis of the tracked data yields the optokinetic response data. The specific process is:
step 3.1, establishing the head-movement curve of the mouse;
step 3.2, extracting the optokinetic response; the specific process is:
step 3.2.1, preliminary extraction of the optokinetic response based on differences;
step 3.2.2, determining the optokinetic response with a least-squares fit, based on step 3.2.1.
Other steps and parameters are the same as in embodiments one to eight.
The tenth embodiment: this embodiment differs from embodiments one to nine in that in step 3.1 the head-movement curve of the mouse is established; the specific process is:
from step two every frame of the video yields a head orientation; a coordinate system is set up with G_r2 as the origin, the direction is converted into an angle, and an angle value of the head can be established for every frame, expressed by the formula:
θ = atan2(G_r1.y - G_r2.y, G_r1.x - G_r2.x) (5)
where θ is the angle calculated from the head orientation; G_r1.x and G_r1.y are the abscissa and ordinate of G_r1 in the pixel coordinate system, and G_r2.x and G_r2.y are the abscissa and ordinate of G_r2.
It is worth noting that when the angle is calculated from the direction of the G_r1-G_r2 line it can only range from -180° to 180°, and when the data cross -180° or 180° the jump breaks the continuity of small frame-to-frame changes and would corrupt the subsequent judgment of head swings. Therefore, whenever the data jump from near -180° to near 180° or vice versa, the current value is shifted down or up by 360° so that it differs only slightly from the previous sample and continuity is maintained.
The angles of consecutive video frames are connected to form the head-movement curve of the mouse.
Other steps and parameters are the same as in one of the first to ninth embodiments.
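A sketch of formula (5) together with the ±360° continuity adjustment described above; expressing the angle with atan2 is an assumption consistent with the stated -180° to 180° range.

    import math

    def head_angle(g_r1, g_r2):
        """Formula (5): angle of the G_r2 -> G_r1 vector in degrees, in (-180, 180]."""
        return math.degrees(math.atan2(g_r1[1] - g_r2[1], g_r1[0] - g_r2[0]))

    def unwrap_degrees(angles):
        """Shift each sample by multiples of 360 degrees so the curve stays
        continuous across the +/-180 degree boundary."""
        out = [angles[0]]
        for a in angles[1:]:
            while a - out[-1] > 180.0:
                a -= 360.0
            while a - out[-1] < -180.0:
                a += 360.0
            out.append(a)
        return out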
The eleventh embodiment: this embodiment differs from embodiments one to ten in that in step 3.2.1 the optokinetic response is preliminarily extracted based on differences; the specific process is:
the optokinetic response of a mouse following the grating is gentle rather than strongly varying, i.e. its velocity changes relatively little. The velocity change can be measured with the first- and second-derivative curves of the head azimuth; for a discrete signal these are computed as first- and second-order differences.
The first-order difference is:
θ'_i = θ_i - θ_{i-1} (6)
where θ_i and θ_{i-1} are the angle values of the i-th and (i-1)-th head orientations in the head-movement curve, and θ'_i is the i-th first-order difference of the curve;
the second-order difference is:
θ''_i = θ'_i - θ'_{i-1} (7)
where θ'_i and θ'_{i-1} are the i-th and (i-1)-th first-order differences of the head-movement curve, and θ''_i is the i-th second-order difference.
When the mouse shows an optokinetic response following the grating, the first- and second-order differences are relatively small; in other cases they are relatively large. The two differences can therefore distinguish whether the detected changes of head orientation are violent.
To make the distinction sharper, the differences are integrated locally, so that places with larger differences yield a larger detection value and violent head motion is easier to detect. The local integration of the first-order difference is:
θ'_isum = Σ_{j=i+1-z_1}^{i} |θ'_j| (8)
where θ'_isum is the local integral of the z_1 consecutive first-order differences up to and including θ'_i, z_1 is the length of the integration interval, and θ'_j is the j-th first-order difference of the head-movement curve, with j running from i+1-z_1 to i.
As z_1 is chosen larger, the differences become more pronounced and the detection more stable; but if z_1 is too large, the integral no longer represents the current change of the head angle and the detection result becomes erroneous. Empirically, z_1 = 5 is a workable value.
Likewise, the local integration of the second-order difference is:
θ''_isum = Σ_{j=i+1-z_2}^{i} |θ''_j| (9)
where θ''_j is the j-th second-order difference of the head-movement curve and θ''_isum is the local integral of the z_2 consecutive second-order differences up to and including θ''_i; empirically z_2 = 2 is used.
When the first-order local integral exceeds the first-order integration threshold, or the second-order local integral exceeds the second-order integration threshold, the motion is recorded as non-gentle;
when the number of non-gentle motions in a section exceeds the count threshold, the section is recorded as a non-gentle motion section; removing the non-gentle sections from the whole head-movement curve leaves the gentle sections, which contain the optokinetic responses of the mouse, and the preliminary extraction of the optokinetic regions is complete.
For example, the first-order integration threshold may be 14 degrees, the second-order integration threshold 7 degrees, and the count threshold 3.
The first-order difference excludes the steeper parts of the curve, the second-order difference excludes parts with particularly violent fluctuations, and the remaining curve is the portion containing the optokinetic response of the mouse.
FIGS. 7a and 7b are schematic diagrams of the first- and second-order differences of the head-angle curve. When the head of the mouse moves violently, its difference data show obvious peaks, and segmenting at these peaks extracts the intervals containing the optokinetic response.
Other steps and parameters are the same as in embodiments one to ten.
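A sketch of the preliminary extraction, with the example parameters above (z_1 = 5, z_2 = 2, thresholds 14° and 7°, count threshold 3). Summing absolute difference values in the local integrals, and tolerating rough bursts no longer than the count threshold inside a gentle section, are interpretations made by this sketch.

    import numpy as np

    def gentle_intervals(theta, z1=5, z2=2, t1=14.0, t2=7.0, n_max=3):
        """Return (start, end) index pairs of the gentle sections of the head curve."""
        theta = np.asarray(theta, dtype=float)
        d1 = np.diff(theta)                              # formula (6)
        d2 = np.diff(d1)                                 # formula (7)
        # Formulas (8)/(9): trailing moving sums of the difference magnitudes.
        s1 = np.convolve(np.abs(d1), np.ones(z1))[:len(d1)]
        s2 = np.convolve(np.abs(d2), np.ones(z2))[:len(d2)]
        rough = np.zeros(len(theta), dtype=bool)
        rough[1:] |= s1 > t1                             # non-gentle by first order
        rough[2:] |= s2 > t2                             # non-gentle by second order
        # Split the curve wherever more than n_max consecutive samples are rough.
        intervals, start, run = [], None, 0
        for i, r in enumerate(rough):
            if r:
                run += 1
                if run > n_max and start is not None:
                    intervals.append((start, i - run + 1))
                    start = None
            else:
                run = 0
                if start is None:
                    start = i
        if start is not None:
            intervals.append((start, len(theta)))
        return intervals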
The twelfth embodiment: this embodiment differs from embodiments one to eleven in that in step 3.2.2 the optokinetic response is determined with a least-squares fit based on step 3.2.1; the specific process is:
the speed at which a mouse follows the grating during an optokinetic response lies within a certain range, so analyzing the speed in the regions where the head orientation changes gently determines whether an optokinetic response occurred. The speed of the change of head orientation can be expressed as a slope.
The curve of each optokinetic candidate region extracted in step 3.2.1 is fitted by least squares; the slope and correlation coefficient of the fitted line evaluate the speed of the change of the head azimuth. The specific process is:
for n observed points (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) on the curve of a candidate region extracted in step 3.2.1, the least-squares slope is calculated as follows:
the mean of the independent variable is:
x̄ = (1/n) Σ_{k=1}^{n} x_k (10)
the mean of the dependent variable is:
ȳ = (1/n) Σ_{k=1}^{n} y_k (11)
the slope is calculated as:
slope = Σ_{k=1}^{n} (x_k - x̄)(y_k - ȳ) / Σ_{k=1}^{n} (x_k - x̄)² (12)
the correlation coefficient is calculated as:
r = Σ_{k=1}^{n} (x_k - x̄)(y_k - ȳ) / √( Σ_{k=1}^{n} (x_k - x̄)² · Σ_{k=1}^{n} (y_k - ȳ)² ) (13)
where x_k and y_k are the abscissa and ordinate of the k-th observation;
the gentle sections obtained above are divided in sequence into cells of fixed length (for example 30 points), with adjacent cells overlapping (for example by 20 points, i.e. two thirds of the cell length);
the slope of each cell is calculated with formulas (10), (11) and (12) and the correlation coefficient with formula (13); their magnitudes decide whether the speed of the change of the head azimuth within the cell lies within the optokinetic range;
if the slope and the correlation coefficient lie within the optokinetic range, an optokinetic reaction is judged to occur in the cell;
if they do not, no optokinetic reaction is judged to occur in the cell;
cells consecutively judged optokinetic are merged into larger regions, each of which is counted as one optokinetic reaction; the number of optokinetic reactions of the mouse and their durations are thereby determined.
The optokinetic range is: -0.65 ≤ slope ≤ -0.1 with correlation coefficient -1 ≤ r ≤ -0.8, or 0.1 ≤ slope ≤ 0.65 with correlation coefficient 0.8 ≤ r < 1.
The number and durations of the optokinetic reactions found in the video are recorded.
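A sketch of the least-squares screening, with the example cell length of 30 points and overlap of 20 points from the text; using the frame index as the abscissa is an assumption here, so the slope thresholds then carry units of degrees per frame.

    import numpy as np

    def fit_line(x, y):
        """Formulas (10)-(13): least-squares slope and correlation coefficient."""
        dx, dy = x - x.mean(), y - y.mean()
        slope = (dx * dy).sum() / (dx * dx).sum()
        r = (dx * dy).sum() / np.sqrt((dx * dx).sum() * (dy * dy).sum())
        return slope, r

    def detect_reactions(theta, cell=30, overlap=20):
        """Sweep overlapping cells over a gentle section, classify each by slope
        and correlation, and merge consecutive positive cells into reactions."""
        theta = np.asarray(theta, dtype=float)
        t = np.arange(len(theta), dtype=float)       # frame index as the abscissa
        reactions = []
        for i in range(0, len(theta) - cell + 1, cell - overlap):
            slope, r = fit_line(t[i:i + cell], theta[i:i + cell])
            ok = (-0.65 <= slope <= -0.1 and -1.0 <= r <= -0.8) or \
                 (0.1 <= slope <= 0.65 and 0.8 <= r < 1.0)
            if not ok:
                continue
            if reactions and i <= reactions[-1][1]:  # overlaps the previous cell
                reactions[-1][1] = i + cell          # extend that reaction
            else:
                reactions.append([i, i + cell])      # start a new reaction
        return reactions  # each [start, end) pair is one optokinetic reaction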
The curve in FIG. 8 shows the recognized optokinetic reaction states of a mouse: 7 optokinetic responses were recognized, with a total duration of 13.87 s. The number of optokinetic responses and their durations are judged from the recorded video.
The automatic identification method for the mouse optokinetic response established by the invention can provide identification data in real time during the experiment, including the number and duration of the mouse's optokinetic responses and their synchronization rate with the stripe speed.
Other steps and parameters are the same as in embodiments one to eleven.
The present invention is capable of other embodiments, and its details can be modified and varied by those skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (4)

1. An automatic identification method for the optokinetic response of mice based on image technology, characterized by comprising the following specific process:
step one, extracting the contours of the mouse's body, ears and tail;
step two, identifying the orientation of the mouse's head based on the contours of the body, ears and tail;
step three, identifying the optokinetic response of the mouse based on the contours obtained in step one and the head orientation obtained in step two;
in step two, the orientation of the mouse's head is identified based on the contours of the body, ears and tail; the specific process is:
step 2.1, preliminarily locating the tip of the mouse's nose based on the contours of the body, ears and tail;
step 2.2, correcting the position of the preliminary nose-tip result from step 2.1 to obtain the corrected nose tip P_n;
step 2.3, identifying the orientation of the mouse's head based on the nose tip P_n obtained in step 2.2;
in step 2.1, the nose tip of the mouse is preliminarily located based on the contours of the body, ears and tail; the specific process is:
after the contour of the mouse's body is obtained, the center of gravity G_c of the body contour is calculated:
G_c = (1/|C_t|) · Σ_{(x,y)∈C_t} (x, y) (3)
where C_t is the set of coordinates of the points of the body contour;
after the tail and ear contours are recognized, the regions around the ears are computed:
with the centroid of each ear contour as center and, as radius, a value slightly larger than l_ec, the largest distance from the ear contour to its centroid, two circles are drawn; these two circles are the regions around the mouse's ears;
the point of the body contour outside the ear regions farthest from the center of gravity G_c is taken as candidate one, P_pn1, of the nose-tip preselection point; the point of the body contour outside the ear regions farthest from P_pn1 is computed as candidate two, P_pn2;
the distances from P_pn1 and P_pn2 to the tail and to the ears are computed, and of P_pn1 and P_pn2 the point far from the tail and near the ears is taken as the nose-tip preselection point P_pn, the result of the preliminary positioning;
in step 2.2, the position of the preliminary nose-tip result from step 2.1 is corrected to obtain the corrected nose tip P_n; the specific process is:
after the preliminary nose-tip result and the center of gravity G_c of the body contour are obtained, the distance l_pnc between them is calculated; the part of the body contour inside a circle centered on the preliminary nose tip with a radius larger than l_pnc is taken as the front half of the contour, and the center of gravity of the front half is calculated;
the part of the body contour outside the ear regions and inside a circle centered on the preliminary nose tip with a radius smaller than half of l_pnc is taken as the head neighborhood of the contour, and the point of the head neighborhood farthest from the center of gravity of the front half is selected as the corrected nose tip P_n;
in step 2.3, the orientation of the mouse's head is identified based on the nose tip P_n obtained in step 2.2; the specific process is:
circles of radii r_1 and r_2 (r_1 < r_2) are drawn around the nose tip P_n; the centers of gravity of the parts of the mouse contour inside the two circles are G_r1 and G_r2 respectively, and the orientation of the head is calculated as:
v = G_r1 - G_r2 (4)
where v is the vector representing the orientation of the mouse's head; it points from G_r2 to G_r1;
In the third step, based on the contours of the body, the ears and the tails of the mouse obtained in the first step and the head orientation of the mouse obtained in the second step, the optokinetic reaction of the mouse is identified; the specific process is as follows:
step three, establishing a mouse head movement curve;
step three, extracting optokinetic reaction; the specific process is as follows:
step three, step two, namely, preliminary extraction based on differential optokinetic reaction;
step III, step II, based on step III, step II, adopting a least square method to determine optokinetic reaction;
in step 3.1, the head-movement curve of the mouse is established; the specific process is:
from step two every frame of the video yields a head orientation; a coordinate system is set up with G_r2 as the origin, the direction is converted into an angle, and an angle value of the head is established for every frame:
θ = atan2(G_r1.y - G_r2.y, G_r1.x - G_r2.x) (5)
where θ is the angle calculated from the head orientation; G_r1.x and G_r1.y are the abscissa and ordinate of G_r1 in the pixel coordinate system, and G_r2.x and G_r2.y are the abscissa and ordinate of G_r2;
the angles of consecutive video frames are connected to form the head-movement curve of the mouse;
in step 3.2.1, the optokinetic response is preliminarily extracted based on differences; the specific process is:
the first-order difference is:
θ'_i = θ_i - θ_{i-1} (6)
where θ_i and θ_{i-1} are the angle values of the i-th and (i-1)-th head orientations in the head-movement curve, and θ'_i is the i-th first-order difference of the curve;
the second-order difference is:
θ''_i = θ'_i - θ'_{i-1} (7)
where θ'_i and θ'_{i-1} are the i-th and (i-1)-th first-order differences of the head-movement curve, and θ''_i is the i-th second-order difference;
the local integration of the first-order difference is:
θ'_isum = Σ_{j=i+1-z_1}^{i} |θ'_j| (8)
where θ'_isum is the local integral of the z_1 consecutive first-order differences up to and including θ'_i, z_1 is the length of the integration interval, and θ'_j is the j-th first-order difference of the head-movement curve;
likewise, the local integration of the second-order difference is:
θ''_isum = Σ_{j=i+1-z_2}^{i} |θ''_j| (9)
where θ''_j is the j-th second-order difference of the head-movement curve and θ''_isum is the local integral of the z_2 consecutive second-order differences up to and including θ''_i;
when the first-order local integral exceeds the first-order integration threshold, or the second-order local integral exceeds the second-order integration threshold, the motion is recorded as non-gentle;
when the number of non-gentle motions in a section exceeds the count threshold, the section is recorded as a non-gentle motion section; removing the non-gentle sections from the whole head-movement curve leaves the gentle sections, which contain the optokinetic responses of the mouse, and the preliminary extraction of the optokinetic regions is complete;
in step 3.2.2, the optokinetic response is determined with a least-squares fit based on step 3.2.1; the specific process is:
the curve of each optokinetic candidate region extracted in step 3.2.1 is fitted by least squares, and the slope of the fitted line is used to evaluate the speed of the change of the head azimuth; the specific process is:
for n observed points on the curve of a candidate region extracted in step 3.2.1, the least-squares slope is calculated as follows:
the mean of the independent variable is:
x̄ = (1/n) Σ_{k=1}^{n} x_k (10)
the mean of the dependent variable is:
ȳ = (1/n) Σ_{k=1}^{n} y_k (11)
the slope is calculated as:
slope = Σ_{k=1}^{n} (x_k - x̄)(y_k - ȳ) / Σ_{k=1}^{n} (x_k - x̄)² (12)
where x_k and y_k are the abscissa and ordinate of the k-th observation;
the slope of each gentle section is calculated with formulas (10), (11) and (12), and its magnitude decides whether the speed of the change of the head azimuth lies within the optokinetic range;
if the slope is within the optokinetic range, an optokinetic reaction is judged to occur;
if the slope is not within the optokinetic range, no optokinetic reaction is judged to occur;
the number of optokinetic reactions of the mouse and their durations are thereby determined.
2. The automatic identification method for the optokinetic response of mice based on image technology according to claim 1, characterized in that: in step one the contours of the mouse's body, ears and tail are extracted; the specific process is:
the acquired grayscale image is binarized and the edge contour of the mouse is extracted;
the binarization formula is:
bi_{k,l} = 1 if s_1 ≤ i_{k,l} ≤ s_2, and bi_{k,l} = 0 otherwise (1)
where s_1 is the minimum and s_2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the acquired image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image;
the contours of the body, ears and tail are then extracted separately according to their different morphological structures.
3. The automatic identification method for the optokinetic response of mice based on image technology according to claim 1, characterized in that: in step one the contours of the mouse's body, ears and tail are extracted; the specific process is:
a depth camera is used to collect the mouse data, and the distance of the mouse from the depth camera satisfies:
f_min < f < f_max (2)
where f is the distance from the mouse to the depth camera, f_min is the distance from the top of the mouse to the depth camera, and f_max is the distance from the bottom of the mouse to the depth camera;
the set of pixels whose depth lies within the range of formula (2) is selected to form the body region of the mouse; the outer edge of the body region is extracted as the contour of the body;
the body region is binarized and the edge contour of the mouse is extracted;
the binarization formula is:
bi_{k,l} = 1 if s_1 ≤ i_{k,l} ≤ s_2, and bi_{k,l} = 0 otherwise (1)
where s_1 is the minimum and s_2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the acquired image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image;
the contours of the body, ears and tail are then extracted separately according to their different morphological structures.
4. The automatic identification method for the optokinetic response of mice according to claim 2 or 3, characterized in that: r_1 is initially selected as the length of the mouse's head, r_2 is initially selected as twice the length from the tip of the nose to the neck of the mouse, and r_1 and r_2 are continuously adjusted so that the resulting center of gravity G_r1 lies at the center of the head and G_r2 at the neck joining the head and body.
CN202010001628.9A 2020-01-02 2020-01-02 Automatic identification method for the optokinetic response of mice based on image technology Active CN111144379B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010001628.9A CN111144379B (en) 2020-01-02 2020-01-02 Automatic identification method for the optokinetic response of mice based on image technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010001628.9A CN111144379B (en) 2020-01-02 2020-01-02 Automatic identification method for the optokinetic response of mice based on image technology

Publications (2)

Publication Number Publication Date
CN111144379A CN111144379A (en) 2020-05-12
CN111144379B (grant) 2023-05-23

Family

ID=70523301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010001628.9A Active CN111144379B (en) 2020-01-02 2020-01-02 Automatic identification method for the optokinetic response of mice based on image technology

Country Status (1)

Country Link
CN (1) CN111144379B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832531B (en) * 2020-07-24 2024-02-23 安徽正华生物仪器设备有限公司 Analysis system and method suitable for rodent social experiments based on deep learning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104504404A (en) * 2015-01-23 2015-04-08 北京工业大学 Online user type identification method and system based on visual behavior
CN105447449A (en) * 2015-11-12 2016-03-30 启安动物行为学科技股份有限公司 Body angle index acquiring method, device and system in mouse gait analysis
WO2018095346A1 (en) * 2016-11-25 2018-05-31 平李⋅斯图尔特 Medical imaging system based on hmds

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8634635B2 (en) * 2008-10-30 2014-01-21 Clever Sys, Inc. System and method for stereo-view multiple animal behavior characterization
CN101526996A (en) * 2009-02-23 2009-09-09 华旭 Method of mouse spontaneous behavior motion monitoring and posture image recognition
EP2611401A4 (en) * 2010-08-31 2014-03-19 Univ Cornell Retina prosthesis
JP2012190280A (en) * 2011-03-10 2012-10-04 Hiroshima Univ Action recognition device, action recognition method, and program
CN103070671B (en) * 2013-01-31 2014-07-30 郑州大学 Automatic assessment system for visual cognition behavior function of rodent

Also Published As

Publication number Publication date
CN111144379A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
US10747999B2 (en) Methods and systems for pattern characteristic detection
CN109447945B (en) Quick counting method for basic wheat seedlings based on machine vision and graphic processing
AU2020103260A4 (en) Rice blast grading system and method
CN108596102B (en) RGB-D-based indoor scene object segmentation classifier construction method
CN102915446A (en) Plant disease and pest detection method based on SVM (support vector machine) learning
Ji et al. In-field automatic detection of maize tassels using computer vision
CN106952280B (en) A kind of spray gun paint amount uniformity detection method based on computer vision
CN101726498B (en) Intelligent detector and method of copper strip surface quality on basis of vision bionics
CN110646431B (en) Automatic teaching method of gluing sensor
CN114331986A (en) Dam crack identification and measurement method based on unmanned aerial vehicle vision
CN110766683B (en) Pearl finish grade detection method and system
Qiu et al. Field estimation of maize plant height at jointing stage using an RGB-D camera
CN102855485B (en) The automatic testing method of one grow wheat heading
CN115861721B (en) Livestock and poultry breeding spraying equipment state identification method based on image data
CN108133471A (en) Agriculture Mobile Robot guidance path extracting method and device based on artificial bee colony algorithm under the conditions of a kind of natural lighting
CN106650628B (en) Fingertip detection method based on three-dimensional K curvature
CN111932551B (en) Missing transplanting rate detection method of rice transplanter
CN111144379B (en) Automatic identification method for the optokinetic response of mice based on image technology
CN111369497B (en) Walking type tree fruit continuous counting method and device
Zeng et al. Rapid automated detection of roots in minirhizotron images
CN112381028A (en) Target feature detection method and device
CN115063375B (en) Image recognition method for automatically analyzing ovulation test paper detection result
CN118053154A (en) Oyster mushroom growth monitoring method and device, electronic equipment and storage medium
CN115601690B (en) Edible fungus environment detection method based on intelligent agriculture
CN110111317A (en) A kind of dispensing visual detection method for quality based on intelligent robot end

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant