CN107330919B - Method for acquiring pistil motion track - Google Patents

Method for acquiring pistil motion track

Info

Publication number
CN107330919B
CN107330919B CN201710501127.5A CN201710501127A
Authority
CN
China
Prior art keywords
pistil
region
stamen
pixel
petal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710501127.5A
Other languages
Chinese (zh)
Other versions
CN107330919A (en)
Inventor
蒋海波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Institute of Biology of CAS
Original Assignee
Chengdu Institute of Biology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Institute of Biology of CAS filed Critical Chengdu Institute of Biology of CAS
Priority to CN201710501127.5A priority Critical patent/CN107330919B/en
Publication of CN107330919A publication Critical patent/CN107330919A/en
Application granted granted Critical
Publication of CN107330919B publication Critical patent/CN107330919B/en

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G06T 7/13: Edge detection
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30181: Earth observation
    • G06T 2207/30188: Vegetation; Agriculture
    • G06T 2207/30241: Trajectory

Abstract

The invention discloses a method for acquiring the movement track of a pistil, comprising the following steps. A1: demarcate a pistil region in a video image of a blooming flower and determine the types and number of pistils. A2: acquire the values of the RGB components and HSI components of the selected pistil region. A3: compute statistics of the RGB and HSI components of the pistil region obtained in step A2. A4: segment the pistil pixels according to the gray-level statistical peak values. A5: perform cluster analysis on the segmented pistil pixels using the types and number of pistils determined in step A1, and calculate the geometric center of each pistil in the image coordinate system. A6: repeat steps A2-A5 for each subsequent frame. A7: obtain the motion track of each pistil in the image coordinate system from the geometric centers acquired in steps A5 and A6. The method tracks the movement of each pistil, supports study of the pollination process, and yields information of guiding significance for plant propagation and breeding.

Description

Method for acquiring pistil motion track
Technical Field
The invention relates to the field of image processing, and in particular to a method for acquiring the movement track of a pistil.
Background
Flowers are the reproductive organs of plants and play an important role in propagating the next generation. The pistils and stamens are directly related to fruiting and seed setting and are therefore the principal parts of a flower. Identifying their dynamic motion tracks is of great significance for understanding pollination between stamens and pistils, the morphological development of flowers, fruit formation, and similar processes. At present, there is no report on dynamically acquiring the movement track of the pistil.
Disclosure of Invention
The invention aims to provide a method for acquiring the movement track of pistils, addressing the fact that the movement track of pistils has not yet been studied and that pistil pollination, flower morphological change, fruit formation, and manual intervention therefore cannot be analyzed from such tracks.
In order to solve the technical problems, the invention adopts the following technical scheme:
a method for acquiring a movement track of a pistil comprises the following steps:
a1: selecting a first frame image from a video image of flower blooming, dividing a stamen region, and determining the type and number of stamens;
a2: acquiring values of RGB components and HIS components of the stamen region aiming at the selected stamen region;
a3: performing statistics according to the RGB components and HIS components of the pistil region obtained in the step A2 to obtain the gray statistics peak value of R, G, B, H, S, I component pixels of the selected pistil regionr m g m b m h m s m Andi m
a4: performing n-term polynomial fitting according to the gray statistic peak value, taking the first integer wave valley values on the left side and the right side of the peak value as the boundary of the segmentation pixels on the two sides of the peak value respectively, and then segmenting the stamen pixels in the selected stamen region; dividing pixel regions by R, G, B, H, S, I components, comparing the pixel regions divided by R, G, B, H, S, I components pairwise, wherein the overlapped part with the largest number of overlapping times of the pixel regions is the stamen pixel region of the frame image;
a5: performing cluster analysis with spatial distance as a criterion on the pistil pixel regions segmented in the step A4 and the types and numbers of pistils determined in the step A1 to obtain accurate regions of each pistil, respectively calculating the geometric center of each pistil under an image coordinate system, calculating the distance from the farthest point of the pistil pixel cluster to the geometric center, calculating the morphological radius r of the pistil, and fitting the pistil regions to circles with the geometric center of each pistil as the center of a circle and d r as the radius to serve as the pistil regions of the next frame of image;
a6: repeating the steps A2-A5 for each subsequent frame of image;
a7: drawing the geometric center point of each stamen according to the geometric center of each stamen under the image coordinate system, which is obtained in the steps A5 and A6, according to the time sequence, and finally obtaining the motion track of each stamen under the image coordinate system.
In this method the initial region is selected manually, the pistil pixels are segmented from the image according to their gray values, and the segmented pixels are clustered so that each resulting pixel region corresponds to one pistil, giving the precise region of each pistil. The geometric center of each pistil pixel region is then calculated in every frame, the change of the center position over time is plotted, and the motion law of each pistil in the image coordinate system is obtained.
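As a concrete illustration of step A2, the sketch below converts an RGB image to the HSI color space with NumPy. The patent does not specify a particular conversion formula, so the standard geometric HSI definition is assumed, and the function name rgb_to_hsi is ours.

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an (H, W, 3) uint8 RGB image to HSI components.

    Returns hue in degrees [0, 360), saturation in [0, 1] and intensity
    in [0, 1]; the standard geometric HSI definition is assumed.
    """
    rgb = rgb.astype(np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8

    intensity = (r + g + b) / 3.0
    saturation = 1.0 - np.minimum(np.minimum(r, g), b) / (intensity + eps)

    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    hue = np.where(b <= g, theta, 360.0 - theta)
    return hue, saturation, intensity
```

For the histogram statistics in step A3 the H, S and I components would typically be rescaled to the 0-255 range so that all six components can be treated uniformly.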
In step A4 of the above method, to obtain a more accurate pistil pixel region, the pixel regions segmented from the R, G, B, H, S and I components are compared pairwise, and the overlapping portion covered by the largest number of component regions is taken as the pistil pixel region; this makes the segmentation more accurate. If the whole pixel region segmented from the R, G, B, H, S and I components were used directly, the pistil pixel region would contain larger errors, the clustered pistil regions would inherit those errors, the resulting geometric center points would be biased, and the motion law finally obtained for each pistil in the image coordinate system would also have larger errors.
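The sketch below illustrates one possible realization of steps A3-A4: the histogram of each component over the selected region is fitted with an n-term polynomial, the first integer valleys to the left and right of the fitted peak bound that component's mask, and the six masks then vote so that only the pixels covered by the most masks survive. The polynomial degree, the voting threshold and the assumption that all six components are scaled to 0-255 are ours, not taken from the patent.

```python
import numpy as np

def channel_mask(channel, degree=9):
    """Segment one component (R, G, B, H, S or I, scaled to 0..255) of the
    selected region: fit a polynomial to its gray-level histogram, locate
    the peak, and take the first valley on each side of the peak as the
    segmentation boundary (step A4)."""
    hist, _ = np.histogram(channel.ravel(), bins=256, range=(0, 256))
    x = np.arange(256)
    fit = np.polyval(np.polyfit(x, hist, degree), x)

    peak = int(np.argmax(fit))
    lo, hi = 0, 255
    for v in range(peak - 1, 0, -1):          # first valley left of the peak
        if fit[v] <= fit[v - 1] and fit[v] <= fit[v + 1]:
            lo = v
            break
    for v in range(peak + 1, 255):            # first valley right of the peak
        if fit[v] <= fit[v - 1] and fit[v] <= fit[v + 1]:
            hi = v
            break
    return (channel >= lo) & (channel <= hi)

def fuse_masks(channels, min_votes=4):
    """Combine the six per-component masks: keep the pixels covered by the
    largest number of masks (at least `min_votes` of them)."""
    votes = sum(channel_mask(c).astype(np.uint8) for c in channels)
    return votes >= min(min_votes, int(votes.max()))
```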
Further, the method for acquiring the pistil motion track also comprises the following steps:
B1: at the same time as step A1, select a petal region;
B2: acquire the values of the RGB components and HSI components of the selected petal region;
B3: compute statistics of the RGB and HSI components of the petal region acquired in step B2, and obtain the gray-level statistical peak values r_f, g_f, b_f, h_f, s_f and i_f of the R, G, B, H, S and I component pixels of the selected petal region; perform an n-term polynomial fit to each gray-level histogram, take the first integer valley value on the left and on the right of the peak as the segmentation boundaries on the two sides of that peak, and segment the petal pixels within the selected petal region; the R, G, B, H, S and I components each yield a segmented pixel region, the six regions are compared pairwise, and the overlapping part covered by the largest number of regions is taken as the petal pixel region of the frame;
B4: perform corner detection on the petal pixel region obtained in step B3;
B5: for each subsequent frame, derive the petal region of the current frame from the petal region of the previous frame, and repeat steps B2-B4;
B6: calculate the translation distance of the flower between the two consecutive frames from the corner points acquired in step B5;
B7: subtract the translation of the flower between the two frames from the geometric center point of each pistil in the later frame acquired in step A6, obtain the absolute displacement of each pistil's geometric center in the image coordinate system, and draw the absolute motion track of each pistil in the image coordinate system.
Because the whole flower may move between consecutive frames, the positional offset of the flower in the image coordinate system must be detected to obtain a more accurate pistil track. The displacement of the petals is therefore estimated by corner detection; subtracting the flower's translation between the two frames finally yields the absolute displacement of each pistil's geometric center in the image coordinate system, from which the absolute motion track of each pistil is drawn.
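A minimal sketch of steps B4-B6 is given below, assuming OpenCV is available: corners are detected in the previous frame's petal region and tracked into the next frame with pyramidal Lucas-Kanade optical flow, and the median displacement of the tracked corners is taken as the whole-flower translation. The tracker, the parameter values and the use of the median are our choices; the patent only requires corner detection and matching.

```python
import cv2
import numpy as np

def flower_translation(prev_petal_gray, next_petal_gray):
    """Estimate the whole-flower translation between two frames from
    corners in the petal region (steps B4-B6)."""
    corners = cv2.goodFeaturesToTrack(prev_petal_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7)
    if corners is None:
        return np.zeros(2)
    tracked, status, _ = cv2.calcOpticalFlowPyrLK(prev_petal_gray,
                                                  next_petal_gray,
                                                  corners, None)
    ok = status.ravel() == 1
    if not ok.any():
        return np.zeros(2)
    shift = (tracked[ok] - corners[ok]).reshape(-1, 2)
    return np.median(shift, axis=0)   # (dx, dy), subtracted in step B7
```

The (dx, dy) returned here is what step B7 subtracts from each pistil's geometric center to obtain its absolute displacement.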
Further, the pistil pixel region segmented in step A4 is compared with the petal pixel region segmented in step B3; if a pistil pixel region lies within the petal pixel region it is accepted as a pistil pixel region, otherwise it is judged to be a noise region.
Some noise points that do not belong to the pistil may appear while acquiring the pistil pixel region. Since a true pistil pixel region necessarily lies within the petal pixel region, comparing the two allows any acquired pistil pixel region that falls outside the petal pixel region to be judged a noise region and removed.
Further, in steps A6 and B5 the RGB and HSI components of the pistil region and the petal region in each subsequent frame are counted in turn and the statistics are compared with those of the previous frame. If the pixel values whose pixel counts change by more than M% account for more than N% of the total number of pixels, steps A6 and B5 re-count the RGB and HSI components within the region segmented in the previous frame and re-segment the pistil pixels according to the new statistical peaks; otherwise step A6 segments according to the pixel values r_m, g_m, b_m, h_m, s_m and i_m of the pistil obtained in the previous frame, and step B5 segments according to the petal values r_f, g_f, b_f, h_f, s_f and i_f obtained in the previous frame, where M and N are specified values.
When two adjacent frames are processed, the position of the pistils does not change much, so to reduce the workload and improve efficiency the RGB and HSI components of the pistil region selected in the previous frame are counted first and compared with the previous frame's statistics. If the statistics change greatly, the pistil has moved further in the current frame than in the previous one, the pistil pixel values counted from the previous frame's region no longer apply, and the statistics must be re-computed; if the change is small, the pistil pixel values counted in the previous frame can be used directly. The criterion for the size of the change is whether the percentage of pixels whose RGB and HSI components change relatively by more than M% exceeds N% of the total number of pixels. The larger the values of M and N, the larger the error and the lower the accuracy of the resulting pistil geometric centers; conversely, the smaller M and N, the smaller the error and the higher the accuracy.
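The decision of whether the previous frame's peak values can be reused (steps A6 and B5) might be coded as below. Interpreting the M%/N% rule per histogram bin is our reading of the description; M and N are left as parameters.

```python
import numpy as np

def need_restatistics(prev_channels, curr_channels, M=10.0, N=33.0):
    """Return True if the component statistics of the region changed enough
    that the peak values must be re-computed for the current frame.

    For every component the 0..255 histogram is compared with the previous
    frame's histogram; pixels falling in bins whose count changed by more
    than M% are accumulated, and if they exceed N% of the total pixel
    count the statistics are re-computed."""
    changed, total = 0, 0
    for prev, curr in zip(prev_channels, curr_channels):
        h_prev, _ = np.histogram(prev, bins=256, range=(0, 256))
        h_curr, _ = np.histogram(curr, bins=256, range=(0, 256))
        rel = np.abs(h_curr - h_prev) / np.maximum(h_prev, 1)
        changed += int(h_curr[rel > M / 100.0].sum())
        total += int(h_curr.sum())
    return total > 0 and changed / total > N / 100.0
```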
Further, in step A2 the pistil region is obtained as follows: taking the geometric center of the pistil region selected in step A1 as the center, a new pistil region to be segmented is formed whose boundary is a times the boundary of the region selected in step A1; the pixel values in step A2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly;
in step B2 the petal region is obtained in the same way: taking the geometric center of the petal region selected in step B1 as the center, a new petal region to be segmented is formed whose boundary is b times the boundary of the region selected in step B1; the pixel values in step B2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly, where a and b are specified values not less than 1;
in step B5 the petal region of the current frame is obtained from the previous frame as follows: the new petal region is c times the boundary of the petal region in the previous frame, and if it exceeds the frame it is clipped to the frame boundary, where c is a specified value not less than 1.
In order to facilitate the processing of the subsequent images, the pistil region or the petal region of each subsequent frame image is expanded on the basis of the pistil region or the petal region of the previous frame image.
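A small helper for the region expansion described above is sketched here; representing each region as an axis-aligned box scaled about its center by the factor (a, b or c) and clipped to the frame is one straightforward reading of "taking a times the region boundary as the new boundary".

```python
def expand_region(box, factor, frame_w, frame_h):
    """Scale a region (x0, y0, x1, y1) about its geometric center by
    `factor` (>= 1) and clip it to the frame, as done when carrying a
    pistil or petal region into the next frame."""
    x0, y0, x1, y1 = box
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    half_w, half_h = factor * (x1 - x0) / 2.0, factor * (y1 - y0) / 2.0
    return (max(0.0, cx - half_w), max(0.0, cy - half_h),
            min(float(frame_w), cx + half_w), min(float(frame_h), cy + half_h))
```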
Further, in step A4, when segmenting the pistil pixels it is determined whether any pistil is occluded, which pistil is in front and which is behind, and the center point of the occluded pistil is determined.
Because occlusion may occur while the pistils move, the acquisition process checks whether the pistil regions are occluded; when occlusion occurs it judges which pistil is in front and which is behind, and then determines the center point of the occluded pistil.
In a further scheme, the rule for judging that pistil occlusion has occurred is as follows: in step A5, after the pistil pixels segmented in step A4 are clustered according to the types and number of pistils determined in step A1, each pixel belongs to one pistil; the geometric center of each pistil is calculated together with the distance from every pistil pixel to each geometric center, and when a pixel belongs to one pistil region while a spatially adjacent pixel belongs to another pistil region, the two pistils are judged to be occluding each other.
In a further scheme, the specific method for judging which pistil is in front and which is behind is as follows: when the distance between the center points of two pistil regions decreases over consecutive frames, corner information of the pistil regions is extracted and the occlusion relationship between the two pistils is judged from it.
In a further scheme, the center point of the occluded pistil is determined as follows: using the pistil region from the last frame before occlusion as a template, the occluded pistil region is matched to obtain the unoccluded part of the occluded pistil; the boundary of the occluded pistil region is then completed from the boundary of the overlap between the pre-occlusion region and the occluded region together with the boundary of the unoccluded part, and the center point of the occluded pistil region is calculated.
In a further scheme, when matching the occluded pistil region, corner detection and matching are performed on the pistil regions of the two consecutive frames to judge whether the occluded pistil region has rotated or been scaled; if rotation exists, the rotation angle is calculated, the pistil region in the pre-occlusion frame is rotated by that angle, and the occluded region is then matched; if scaling exists, the scaling ratio is calculated, the pistil region in the pre-occlusion frame is scaled by that ratio, and the occluded region is then matched.
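A sketch of the occlusion matching just described, assuming OpenCV: the pre-occlusion pistil template is optionally rotated and scaled (the angle and scale would come from the corner matching) and then located in the current frame by normalized cross-correlation. How the patent then combines the matched template with the visible boundary to recover the occluded center is not reproduced here.

```python
import cv2

def match_occluded_pistil(frame_gray, template_gray, angle_deg=0.0, scale=1.0):
    """Locate a (possibly rotated and scaled) pre-occlusion pistil template
    in the current frame and return the top-left corner of the best match.

    angle_deg and scale are assumed to have been estimated beforehand from
    corner matching between the two frames."""
    h, w = template_gray.shape[:2]
    if angle_deg != 0.0 or scale != 1.0:
        m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, scale)
        template_gray = cv2.warpAffine(template_gray, m, (w, h))
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    return max_loc   # (x, y) of the best-matching position
```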
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a method capable of accurately identifying the movement track of pistils during flowering of plants, which can track the movement track of each pistil, study the pollination process of the pistils and obtain information having guiding significance on plant propagation and breeding.
The method can identify the movement track of the pistil and measure the translation distance of the flower at the same time to obtain the absolute displacement of the pistil.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1:
A method for acquiring the movement track of a pistil comprises the following steps:
A1: select the first frame from a video of the blooming flower, delimit the pistil region, and determine the types and number of pistils;
A2: acquire the values of the RGB components and HSI components of the selected pistil region;
A3: compute statistics of the RGB and HSI components of the pistil region obtained in step A2, and obtain the gray-level statistical peak values r_m, g_m, b_m, h_m, s_m and i_m of the R, G, B, H, S and I component pixels of the selected pistil region;
A4: perform an n-term polynomial fit to each gray-level histogram, take the first integer valley value on the left and on the right of the peak as the segmentation boundaries on the two sides of that peak, and segment the pistil pixels within the selected pistil region; the R, G, B, H, S and I components each yield a segmented pixel region, the six regions are compared pairwise, and the overlapping part covered by the largest number of regions is taken as the pistil pixel region of the frame;
A5: perform cluster analysis, with spatial distance as the criterion, on the pistil pixel region segmented in step A4 according to the types and number of pistils determined in step A1 to obtain the precise region of each pistil; calculate the geometric center of each pistil in the image coordinate system, compute the distance from the farthest point of each pistil pixel cluster to its geometric center to obtain the morphological radius r of the pistil, and fit a circle centered on each geometric center with radius d·r to serve as that pistil's region in the next frame;
A6: repeat steps A2-A5 for each subsequent frame;
A7: plot the geometric center point of each pistil in time order, using the geometric centers in the image coordinate system obtained in steps A5 and A6, and finally obtain the motion track of each pistil in the image coordinate system.
In this method the initial region is selected manually, the pistil pixels are segmented from the image according to their gray values, and the segmented pixels are clustered so that each resulting pixel region corresponds to one pistil, giving the precise region of each pistil. The geometric center of each pistil pixel region is then calculated in every frame, the change of the center position over time is plotted, and the motion law of each pistil in the image coordinate system is obtained.
In step A4, to obtain a more accurate pistil pixel region, the pixel regions segmented from the R, G, B, H, S and I components are compared pairwise, and the overlapping portion covered by the largest number of component regions is taken as the pistil pixel region; this makes the segmentation more accurate. If the whole pixel region segmented from the R, G, B, H, S and I components were used directly, the pistil pixel region would contain larger errors, the clustered pistil regions would inherit those errors, the resulting geometric center points would be biased, and the motion law finally obtained for each pistil in the image coordinate system would also have larger errors.
Example 2:
On the basis of Example 1, the method for acquiring the pistil motion track further comprises the following steps:
B1: at the same time as step A1, select a petal region;
B2: acquire the values of the RGB components and HSI components of the selected petal region;
B3: compute statistics of the RGB and HSI components of the petal region acquired in step B2, and obtain the gray-level statistical peak values r_f, g_f, b_f, h_f, s_f and i_f of the R, G, B, H, S and I component pixels of the selected petal region; perform an n-term polynomial fit to each gray-level histogram, take the first integer valley value on the left and on the right of the peak as the segmentation boundaries on the two sides of that peak, and segment the petal pixels within the selected petal region; the R, G, B, H, S and I components each yield a segmented pixel region, the six regions are compared pairwise, and the overlapping part covered by the largest number of regions is taken as the petal pixel region of the frame;
B4: perform corner detection on the petal pixel region obtained in step B3;
B5: for each subsequent frame, derive the petal region of the current frame from the petal region of the previous frame, and repeat steps B2-B4;
B6: calculate the translation distance of the flower between the two consecutive frames from the corner points acquired in step B5;
B7: subtract the translation of the flower between the two frames from the geometric center point of each pistil in the later frame acquired in step A6, obtain the absolute displacement of each pistil's geometric center in the image coordinate system, and draw the absolute motion track of each pistil in the image coordinate system.
Because the whole flower may move between consecutive frames, the positional offset of the flower in the image coordinate system must be detected to obtain a more accurate pistil track. The displacement of the petals is therefore estimated by corner detection; subtracting the flower's translation between the two frames finally yields the absolute displacement of each pistil's geometric center in the image coordinate system, from which the absolute motion track of each pistil is drawn.
Example 3:
On the basis of Example 2, the pistil pixel region segmented in step A4 is compared with the petal pixel region segmented in step B3; if a pistil pixel region lies within the petal pixel region it is accepted as a pistil pixel region, otherwise it is judged to be a noise region.
In this embodiment, some noise points that do not belong to the pistil may appear while acquiring the pistil pixel region. Since a true pistil pixel region necessarily lies within the petal pixel region, comparing the two allows any acquired pistil pixel region that falls outside the petal pixel region to be judged a noise region and removed.
Example 4:
On the basis of Example 3, in steps A6 and B5 the RGB and HSI components of the pistil region and the petal region in each subsequent frame are counted in turn and the statistics are compared with those of the previous frame. If the pixel values whose pixel counts change by more than M% account for more than N% of the total number of pixels, steps A6 and B5 re-count the RGB and HSI components within the region segmented in the previous frame and re-segment the pistil pixels according to the new statistical peaks; otherwise step A6 segments according to the pixel values r_m, g_m, b_m, h_m, s_m and i_m of the pistil obtained in the previous frame, and step B5 segments according to the petal values r_f, g_f, b_f, h_f, s_f and i_f obtained in the previous frame, where M and N are specified values.
When two adjacent frames are processed, the position of the pistils does not change much, so to reduce the workload and improve efficiency the RGB and HSI components of the pistil region selected in the previous frame are counted first and compared with the previous frame's statistics. If the statistics change greatly, the pistil has moved further in the current frame than in the previous one, the pistil pixel values counted from the previous frame's region no longer apply, and the statistics must be re-computed; if the change is small, the pistil pixel values counted in the previous frame can be used directly. The criterion for the size of the change is whether the percentage of pixels whose RGB and HSI components change relatively by more than M% exceeds N% of the total number of pixels. The larger the values of M and N, the larger the error and the lower the accuracy of the resulting pistil geometric centers; conversely, the smaller M and N, the smaller the error and the higher the accuracy.
Example 5:
On the basis of Example 4, in step A2 the pistil region is obtained as follows: taking the geometric center of the pistil region selected in step A1 as the center, a new pistil region to be segmented is formed whose boundary is a times the boundary of the region selected in step A1; the pixel values in step A2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly;
in step B2 the petal region is obtained in the same way: taking the geometric center of the petal region selected in step B1 as the center, a new petal region to be segmented is formed whose boundary is b times the boundary of the region selected in step B1; the pixel values in step B2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly, where a and b are specified values not less than 1;
in step B5 the petal region of the current frame is obtained from the previous frame as follows: the new petal region is c times the boundary of the petal region in the previous frame, and if it exceeds the frame it is clipped to the frame boundary, where c is a specified value not less than 1.
In order to facilitate the processing of the subsequent image, in this embodiment, the pistil region or the petal region of each subsequent frame image is expanded based on the pistil region or the petal region of the previous frame image.
Example 6:
On the basis of Example 5, in step A4, when segmenting the pistil pixels it is determined whether any pistil is occluded, which pistil is in front and which is behind, and the center point of the occluded pistil is determined.
Because occlusion may occur while the pistils move, the acquisition process checks whether the pistil regions are occluded; when occlusion occurs it judges which pistil is in front and which is behind, and then determines the center point of the occluded pistil.
The rule for judging that pistil occlusion has occurred is as follows: in step A5, after the pistil pixels segmented in step A4 are clustered according to the types and number of pistils determined in step A1, each pixel belongs to one pistil; the geometric center of each pistil is calculated together with the distance from every pistil pixel to each geometric center, and when a pixel belongs to one pistil region while a spatially adjacent pixel belongs to another pistil region, the two pistils are judged to be occluding each other. Similarly, when n pistils overlap and occlude one another, the pistils are compared pairwise according to the same principle.
The specific method for judging which pistil is in front and which is behind is as follows: when the distance between the center points of two pistil regions decreases over consecutive frames, corner information of the pistil regions is extracted and the occlusion relationship between the two pistils is judged from it.
The center point of the occluded pistil is determined as follows: using the pistil region from the last frame before occlusion as a template, the occluded pistil region is matched to obtain the unoccluded part of the occluded pistil; the boundary of the occluded pistil region is then completed from the boundary of the overlap between the pre-occlusion region and the occluded region together with the boundary of the unoccluded part, and the center point of the occluded pistil region is calculated.
When matching the occluded pistil region, corner detection and matching are performed on the pistil regions of the two consecutive frames to judge whether the occluded pistil region has rotated or been scaled; if rotation exists, the rotation angle is calculated, the pistil region in the pre-occlusion frame is rotated by that angle, and the occluded region is then matched; if scaling exists, the scaling ratio is calculated, the pistil region in the pre-occlusion frame is scaled by that ratio, and the occluded region is then matched.
The specific embodiment is as follows:
In this example the stamens and pistil of a dayflower were tracked. A video of the dayflower's flowering process was taken and one frame was selected; the dayflower has 1 pistil and 6 stamens, and the image size is 3264 × 4928. A region covering the dayflower's stamens and pistil was selected manually and the values of its RGB components and HSI components were counted; at the same time a petal region was selected manually and its RGB and HSI components were acquired. The types and number of stamens and pistils are determined manually, mainly to provide a rough initial position for accurately determining the geometric centers.
From the RGB and HSI components of the selected stamen and pistil region, the statistical peak gray values are obtained: r_m is 195, g_m is 190, b_m is 25, h_m is 40, s_m is 225 and i_m is 93. Segmentation ranges are set from these peak values and the stamen and pistil pixels are segmented from the image.
From the RGB and HSI components of the petals, the statistical peak gray values of the petal components are obtained: r_f is 160, g_f is 125, b_f is 183, h_f is 200, s_f is 56 and i_f is 110. A segmentation range is set from these peak values and the petal pixels are segmented from the image.
For a single stamen, pistil or flower, a new region is obtained centered on the manually selected geometric center, with long and short sides 1.3 times those of the manually selected region. Points segmented outside this new region are judged to be noise points and removed directly.
The segmented pixels are clustered autonomously according to the manually specified categories and numbers so that each stamen and the pistil are obtained separately; the dayflower has 6 stamens and 1 pistil. After clustering, the geometric center of each cluster is calculated, the boundary points of each cluster are fitted with a circle, and the radius r of each circle is obtained; the radius of the pistil is 70 pixels.
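The clustering and center/radius computation in this embodiment could look like the sketch below, which uses scikit-learn's KMeans on the pixel coordinates with k = 7 clusters (6 stamens + 1 pistil). KMeans is our choice of spatial-distance clustering and the factor d applied to the radius is left as a parameter; the patent only requires clustering with spatial distance as the criterion.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_pistils(mask, n_clusters=7, d=1.3):
    """Cluster the segmented stamen/pistil pixels by spatial distance and
    return, per cluster, the geometric center, the morphological radius r
    (distance to the farthest cluster pixel) and the radius d*r of the
    circular search region used for the next frame."""
    ys, xs = np.nonzero(mask)
    coords = np.column_stack([xs, ys]).astype(np.float64)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(coords)

    results = []
    for k in range(n_clusters):
        pts = coords[labels == k]
        center = pts.mean(axis=0)                       # geometric center
        r = np.linalg.norm(pts - center, axis=1).max()  # morphological radius
        results.append((center, r, d * r))
    return results
```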
When the next frame is segmented, the distributions of the RGB and HSI component values between 0 and 255 are counted and compared with the previous frame. In this example the pixels whose RGB and HSI components change relatively by more than 10% exceed 1/3 of the total number of pixels in the corresponding pixel set S1, so the RGB and HSI component values are re-counted within the stamen/pistil region and petal region segmented in the previous frame. The statistical peak gray values of the stamen and pistil components are r_m is 195, g_m is 190, b_m is 25, h_m is 40, s_m is 225 and i_m is 93; those of the petal components are r_f is 160, g_f is 125, b_f is 183, h_f is 200, s_f is 56 and i_f is 110. Segmentation ranges are set from these peak values, and the petal pixels and stamen/pistil pixels are segmented from the image.
The segmented regions are clustered again to obtain the centers of the stamens and the pistil. The geometric center points of the 6 stamens are: [2963, 1486], [3414, 1812], [3209, 1400], [3122, 1569], [3302, 1753], [3254, 1487]; the geometric center of the pistil is [1741, 1840]. The geometric centers obtained in the same way for the next frame are, for the 6 stamens: [2971, 1446], [3424, 1851], [3199, 1420], [3101, 1573], [3281, 1733], [3232, 1466]; and for the pistil: [1722, 1863].
Because the whole flower moves between the two frames, to obtain a more accurate stamen track in the image coordinate system this example performs corner detection in an annular area around the flower no larger than 1.3 times the flower region; this mainly avoids the influence of the flower area changing as the petals open and of the flower rotating as a whole. The overall change is obtained by SIFT corner detection and corner matching: compared with the first frame, the whole flower in the second frame is translated by 1.3 pixels to the left in the image coordinate system and by 2.1 pixels in the other direction. The geometric center points of the stamens in the second frame after calibration are: [2969.7, 1443.9], [3422.7, 1848.9], [3197.7, 1417.9], [3190.7, 1570.9], [3279.7, 1730.9], [3230.7, 1463.9]; the geometric center of the pistil is [1720.7, 1860.9].
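The SIFT-based matching mentioned here might be realized with OpenCV as sketched below; the Lowe ratio threshold and taking the median of the keypoint displacements are assumptions, and the input images are assumed to be the area around the flower already cropped out. cv2.SIFT_create requires a reasonably recent OpenCV build.

```python
import cv2
import numpy as np

def sift_translation(prev_gray, curr_gray, ratio=0.75):
    """Estimate the whole-flower translation between two frames from
    matched SIFT keypoints (Lowe ratio test, median displacement)."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return np.zeros(2)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    shifts = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < ratio * n.distance:            # Lowe ratio test
            p1 = np.array(kp1[m.queryIdx].pt)
            p2 = np.array(kp2[m.trainIdx].pt)
            shifts.append(p2 - p1)
    if not shifts:
        return np.zeros(2)
    return np.median(np.array(shifts), axis=0)   # (dx, dy) of the whole flower
```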
According to the method, the positions of the pistils are sequentially obtained, and finally the position track of each pistil along with the change of time is drawn.
Although the invention has been described herein with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More specifically, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure and claims of this application. In addition to variations and modifications in the component parts and/or arrangements, other uses will also be apparent to those skilled in the art.

Claims (10)

1. A method for acquiring the movement track of a pistil, characterized by comprising the following steps:
A1: select the first frame from a video of the blooming flower, delimit the pistil region, and determine the types and number of pistils;
A2: acquire the values of the RGB components and HSI components of the selected pistil region;
A3: compute statistics of the RGB and HSI components of the pistil region obtained in step A2, and obtain the gray-level statistical peak values r_m, g_m, b_m, h_m, s_m and i_m of the R, G, B, H, S and I component pixels of the selected pistil region;
A4: perform an n-term polynomial fit to each gray-level histogram, take the first integer valley value on the left and on the right of the peak as the segmentation boundaries on the two sides of that peak, and segment the pistil pixels within the selected pistil region; the R, G, B, H, S and I components each yield a segmented pixel region, the six regions are compared pairwise, and the overlapping part covered by the largest number of regions is taken as the pistil pixel region of the frame;
A5: perform cluster analysis, with spatial distance as the criterion, on the pistil pixel region segmented in step A4 according to the types and number of pistils determined in step A1 to obtain the precise region of each pistil; calculate the geometric center of each pistil in the image coordinate system, compute the distance from the farthest point of each pistil pixel cluster to its geometric center to obtain the morphological radius r of the pistil, and fit a circle centered on each geometric center with radius d·r to serve as that pistil's region in the next frame;
A6: repeat steps A2-A5 for each subsequent frame;
A7: plot the geometric center point of each pistil in time order, using the geometric centers in the image coordinate system obtained in steps A5 and A6, and finally obtain the motion track of each pistil in the image coordinate system.
2. The method for acquiring the pistil motion track according to claim 1, characterized by further comprising the following steps:
B1: at the same time as step A1, select a petal region;
B2: acquire the values of the RGB components and HSI components of the selected petal region;
B3: compute statistics of the RGB and HSI components of the petal region acquired in step B2, and obtain the gray-level statistical peak values r_f, g_f, b_f, h_f, s_f and i_f of the R, G, B, H, S and I component pixels of the selected petal region; perform an n-term polynomial fit to each gray-level histogram, take the first integer valley value on the left and on the right of the peak as the segmentation boundaries on the two sides of that peak, and segment the petal pixels within the selected petal region; the R, G, B, H, S and I components each yield a segmented pixel region, the six regions are compared pairwise, and the overlapping part covered by the largest number of regions is taken as the petal pixel region of the frame;
B4: perform corner detection on the petal pixel region obtained in step B3;
B5: for each subsequent frame, derive the petal region of the current frame from the petal region of the previous frame, and repeat steps B2-B4;
B6: calculate the translation distance of the flower between the two consecutive frames from the corner points acquired in step B5;
B7: subtract the translation of the flower between the two frames from the geometric center point of each pistil in the later frame acquired in step A6, obtain the absolute displacement of each pistil's geometric center in the image coordinate system, and draw the absolute motion track of each pistil in the image coordinate system.
3. The method for acquiring the pistil motion track according to claim 2, characterized in that: the pistil pixel region segmented in step A4 is compared with the petal pixel region segmented in step B3; if a pistil pixel region lies within the petal pixel region it is accepted as a pistil pixel region, otherwise it is judged to be a noise region.
4. The method for acquiring the pistil motion track according to claim 2, characterized in that: in steps A6 and B5 the RGB and HSI components of the pistil region and the petal region in each subsequent frame are counted in turn and the statistics are compared with those of the previous frame;
if the pixel values whose pixel counts change by more than M% account for more than N% of the total number of pixels, steps A6 and B5 re-count the RGB and HSI components within the region segmented in the previous frame and re-segment the pistil pixels according to the new statistical peaks;
otherwise step A6 segments according to the pixel values r_m, g_m, b_m, h_m, s_m and i_m of the pistil obtained in the previous frame, and step B5 segments according to the petal values r_f, g_f, b_f, h_f, s_f and i_f obtained in the previous frame, where M and N are specified values.
5. The method for acquiring the pistil motion track according to claim 2, characterized in that:
in step A2 the pistil region is obtained as follows: taking the geometric center of the pistil region selected in step A1 as the center, a new pistil region to be segmented is formed whose boundary is a times the boundary of the region selected in step A1; the pixel values in step A2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly;
in step B2 the petal region is obtained in the same way: taking the geometric center of the petal region selected in step B1 as the center, a new petal region to be segmented is formed whose boundary is b times the boundary of the region selected in step B1; the pixel values in step B2 are acquired for this new region, and any pixel segmented outside it is judged to be a noise point and removed directly, where a and b are specified values not less than 1;
in step B5 the petal region of the current frame is obtained from the previous frame as follows: the new petal region is c times the boundary of the petal region in the previous frame, and if it exceeds the frame it is clipped to the frame boundary, where c is a specified value not less than 1.
6. The method for acquiring the pistil motion track according to claim 1, characterized in that: in step A4, when segmenting the pistil pixels it is determined whether any pistil is occluded, which pistil is in front and which is behind, and the center point of the occluded pistil is determined.
7. The method for acquiring the pistil motion track according to claim 6, characterized in that the rule for judging that pistil occlusion has occurred is as follows:
in step A5, after the pistil pixels segmented in step A4 are clustered according to the types and number of pistils determined in step A1, each pixel belongs to one pistil; the geometric center of each pistil is calculated together with the distance from every pistil pixel to each geometric center, and when a pixel belongs to one pistil region while a spatially adjacent pixel belongs to another pistil region, the two pistils are judged to be occluding each other.
8. The method for acquiring the pistil motion track according to claim 7, characterized in that the specific method for judging which pistil is in front and which is behind is as follows:
when the distance between the center points of two pistil regions decreases over consecutive frames, corner information of the pistil regions is extracted and the occlusion relationship between the two pistils is judged from it.
9. The method for acquiring the pistil motion track according to claim 8, characterized in that the center point of the occluded pistil is determined as follows:
using the pistil region from the last frame before occlusion as a template, the occluded pistil region is matched to obtain the unoccluded part of the occluded pistil;
the boundary of the occluded pistil region is then completed from the boundary of the overlap between the pre-occlusion region and the occluded region together with the boundary of the unoccluded part, and the center point of the occluded pistil region is calculated.
10. The method for acquiring the pistil motion track according to claim 9, characterized in that: when matching the occluded pistil region, corner detection and matching are performed on the pistil regions of the two consecutive frames to judge whether the occluded pistil region has rotated or been scaled;
if rotation exists, the rotation angle is calculated, the pistil region in the pre-occlusion frame is rotated by that angle, and the occluded region is then matched;
if scaling exists, the scaling ratio is calculated, the pistil region in the pre-occlusion frame is scaled by that ratio, and the occluded region is then matched.
CN201710501127.5A 2017-06-27 2017-06-27 Method for acquiring pistil motion track Active CN107330919B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710501127.5A CN107330919B (en) 2017-06-27 2017-06-27 Method for acquiring pistil motion track

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710501127.5A CN107330919B (en) 2017-06-27 2017-06-27 Method for acquiring pistil motion track

Publications (2)

Publication Number Publication Date
CN107330919A CN107330919A (en) 2017-11-07
CN107330919B 2020-07-10

Family

ID=60197712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710501127.5A Active CN107330919B (en) 2017-06-27 2017-06-27 Method for acquiring pistil motion track

Country Status (1)

Country Link
CN (1) CN107330919B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171718A (en) * 2017-11-23 2018-06-15 北京林业大学 A kind of small daisy_petal part number automatic testing method based on wavelet transformation
US20210216808A1 (en) * 2018-06-05 2021-07-15 Sony Corporation Information processing apparatus, information processing system, program, and information processing method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116899A (en) * 2013-02-01 2013-05-22 拓维信息系统股份有限公司 Mobile phone background image creating method based on firework effect simulation
CN103324913A (en) * 2013-05-29 2013-09-25 长安大学 Pedestrian event detection method based on shape features and trajectory analysis
CN103871079A (en) * 2014-03-18 2014-06-18 南京金智视讯技术有限公司 Vehicle tracking method based on machine learning and optical flow
CN106447771A (en) * 2016-10-18 2017-02-22 中国科学院深圳先进技术研究院 Flower opening process reconstruction method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9563945B2 (en) * 2012-07-05 2017-02-07 Bernard Fryshman Object image recognition and instant active response with enhanced application and utility

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116899A (en) * 2013-02-01 2013-05-22 拓维信息系统股份有限公司 Mobile phone background image creating method based on firework effect simulation
CN103324913A (en) * 2013-05-29 2013-09-25 长安大学 Pedestrian event detection method based on shape features and trajectory analysis
CN103871079A (en) * 2014-03-18 2014-06-18 南京金智视讯技术有限公司 Vehicle tracking method based on machine learning and optical flow
CN106447771A (en) * 2016-10-18 2017-02-22 中国科学院深圳先进技术研究院 Flower opening process reconstruction method and device

Also Published As

Publication number Publication date
CN107330919A (en) 2017-11-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant