CN103471523B - A kind of detection method of arabidopsis profile phenotype - Google Patents

A kind of detection method of arabidopsis profile phenotype

Info

Publication number
CN103471523B
CN103471523B (application CN201310456227.2A)
Authority
CN
China
Prior art keywords
image
arabidopsis
profile
plant
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310456227.2A
Other languages
Chinese (zh)
Other versions
CN103471523A (en)
Inventor
田野
张俊梅
柯秋红
聂凤梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Forestry University
Original Assignee
Beijing Forestry University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Forestry University filed Critical Beijing Forestry University
Priority to CN201310456227.2A priority Critical patent/CN103471523B/en
Publication of CN103471523A publication Critical patent/CN103471523A/en
Application granted granted Critical
Publication of CN103471523B publication Critical patent/CN103471523B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

A detection method for the profile phenotype of Arabidopsis. The method specifically comprises: placing a calibration board in the planting pot of the Arabidopsis and acquiring an RGB image of the plant with a camera; preprocessing the acquired image to achieve automatic image correction and calibration, where image correction removes image distortion and image calibration obtains the actual size represented by a single pixel; segmenting the preprocessed image so as to separate the Arabidopsis from the background and extract it from the image; and, after the Arabidopsis has been segmented, extracting its profile phenotypic parameters and describing its overall profile quantitatively with elliptic Fourier descriptors. By analyzing the overall profile phenotypic parameters, differences in overall profile shape and growth orientation between Arabidopsis plants carrying different genes can be described, from which the function of those genes and their influence on Arabidopsis growth can be inferred.

Description

A kind of detection method of arabidopsis profile phenotype
Technical field
The present invention relates to a method for detecting the profile phenotype of Arabidopsis. Images of the Arabidopsis are acquired with a camera, and image-processing methods are applied to extract the profile phenotypic parameters of the plant from the images, achieving non-destructive detection of the Arabidopsis growth process.
Background technology
Arabidopsis is an important model plant in botany and genetics. Studying the Arabidopsis phenotype makes it possible to characterize the physiological functions of Arabidopsis comprehensively and thoroughly, in particular the relationship between its phenotype and its genes and the influence of different environmental conditions on its growth. Methods for detecting plant phenotypic characteristics include destructive measurement, contact measurement and computer-vision detection. In destructive measurement, some plants are drawn at random from a batch and their parameters are measured by destructive means. In contact measurement, touch sensors are used to measure the plant parameters. In computer-vision measurement, relevant equipment, including a CCD camera and a light source, is used to obtain spectral images of the measured object, and software and algorithms process the images to obtain the required data and hence the phenotypic parameters of the plant. Existing research mainly uses computer-vision techniques to analyze individual leaves or other crops. Li Xinguo et al. used a scanner to obtain images of rape leaves, obtained the leaf pixel count with Photoshop, and computed the leaf area from the resolution (Li Xinguo, Cai Shengzhong, Li Shaopeng, et al. Measuring avocado leaf area with digital image techniques [J]. Tropical Agricultural Science, 2009, 29(2): 10-13). Han Dianyuan et al. proposed a color-based segmentation algorithm for leaves on a white background and then used a reference rectangular plate in the background to compute the leaf area (Han Dianyuan et al. Plant leaf area measurement based on image segmentation by color-channel similarity [J]. Transactions of the Chinese Society of Agricultural Engineering, 2012, 28(6): 179-183). Li Shaokun et al. acquired images of maize and wheat with imaging techniques and extracted the relevant parameters (Li Shaokun, Zhang Xian. Multimedia image-processing techniques for crop plant-type information [J]. Acta Agronomica Sinica, 1998, 24(3): 265-271). Li Changying et al. used computer vision for non-destructive monitoring of greenhouse plant growth and obtained the external morphological features of the plants, including the top projected leaf area and the plant height (Li Changying, Teng Guanghui, Zhao Chunjiang, et al. Non-destructive monitoring of greenhouse plant growth by computer vision [J]. Transactions of the Chinese Society of Agricultural Engineering, 2003, 19(3): 140-143).
In summary, existing research has the following shortcomings:
1. Destructive measurement damages the plant, and the continuity of its growth cannot be followed.
2. Sensor-based measurement requires direct contact with the plant, which affects its growth to some extent; the cost is high and the development effort is considerable.
3. Existing computer-vision work concentrates mainly on phenotype detection of individual leaves or of other crops, whereas phenotype detection of Arabidopsis still relies mainly on manual work, which is labor-intensive and inefficient.
At present there are few reports, at home or abroad, on computer-vision detection of the Arabidopsis phenotype.
Summary of the invention
The technical problem to be solved by the present invention is how to use computer vision, adapted to the characteristics of Arabidopsis, to detect the plant non-destructively and extract the profile phenotypic parameters during its growth. The overall profile of a plant reflects its configuration and growth orientation; analyzing the overall profile gives a better grasp of the plant's morphological characteristics. However, the number of profile points is very large, so the profile is analyzed with elliptic Fourier analysis, which describes the contour features of the plant with a small number of elliptic Fourier descriptors and thereby saves storage space for the profile phenotypic data. The profile phenotypic parameters expressed as elliptic Fourier descriptors both describe the growth state of Arabidopsis quantitatively and can be used in research on Arabidopsis gene function: the parameters describe the differences in overall profile shape between Arabidopsis plants carrying different genes, from which the function of those genes and their influence on Arabidopsis growth can be inferred.
(1) Technical scheme
To achieve these goals, the invention provides a computer-vision-based method for detecting the Arabidopsis profile phenotype, comprising the following steps:
S1. Place a calibration board in the planting pot of the Arabidopsis and acquire an RGB image of the plant with a camera;
S2. Preprocess the acquired image to achieve automatic image correction and calibration, where image correction removes image distortion and image calibration obtains the actual size represented by a single pixel;
S3. Segment the preprocessed image, separating the Arabidopsis from the background and extracting it from the image;
S4. Extract the profile phenotypic parameters of the Arabidopsis and describe its overall profile quantitatively with elliptic Fourier descriptors.
The calibration board used during image acquisition is a black-and-white checkerboard with a blue border, consisting of 3 × 3 squares with a side length of 4 mm. The calibration board is placed beside the Arabidopsis plant and imaged together with it.
Step S2 specifically includes the following steps:
S2.1 Locating the black-and-white checkerboard
S2.1.1 Extract the blue border according to the RGB characteristics of blue;
S2.1.2 Fill the holes inside the border image, then subtract the original blue-border image to obtain a new image;
S2.1.3 Apply an opening operation to the new image to remove noise, then a closing operation to connect breakpoints, obtaining the black-and-white checkerboard region;
S2.2 Corner detection
S2.2.1 Compute the horizontal and vertical first derivatives at each point of the checkerboard region, obtaining three new images: the square of the horizontal derivative, the square of the vertical derivative, and the product of the two derivatives;
S2.2.2 Filter the three images with a Gaussian filter to remove noise;
S2.2.3 Form the correlation matrix from the three images, evaluate the criterion function, and judge whether each pixel is a corner point;
S2.3 Image correction and image calibration
S2.3.1 Obtain the image coordinates of the vertices of each square of the checkerboard by corner detection and, according to their spatial relationship in the real world, compute the transformation matrix between the two;
S2.3.2 Compute the inverse of the transformation matrix and apply it to the image, achieving image correction;
S2.3.3 Obtain the total number of pixels in the black-and-white checkerboard from the corner coordinates and, from its actual size, obtain the actual size of a single pixel;
Step S3 specifically includes the following steps:
S3.1 Foreground segmentation of the image
S3.1.1 Normalize the RGB values of each pixel to obtain rgb, extract the color-difference map 3g-2.4r-b, and binarize the image with 0 as the threshold;
S3.1.2 Subtract the previously detected blue-border region from the result; for each remaining foreground pixel, judge whether its G gray value is greater than 50: if so, keep it, otherwise remove it;
S3.1.3 Extract the foreground connected region with the largest number of pixels, which is the plant region;
S3.2 Noise removal
S3.2.1 Apply an opening operation to the resulting image to obtain a new image that keeps only the large leaf areas and the central part of the plant and removes the fine detail, including the plant stems and the noise around the leaves;
S3.2.2 For each connected region of the removed detail, count its pixels; regions with fewer than 12 pixels are removed directly;
S3.2.3 For each connected region with 12 or more pixels, superimpose it alone on the new image obtained by the opening operation and count the number of regions in the superimposed image again: if the number of regions decreases, the connected region is a stem and must be kept; otherwise, if the number of regions increases or stays the same, the connected region is noise around a leaf and must be removed;
S3.2.4 After all detail regions have been judged, superimpose all the retained connected regions on the new image obtained by the opening operation, giving the final plant region;
Step S4 specifically includes the following steps:
S4.1 Fill the holes of the plant region;
S4.2 Extract the outer contour points of the filled plant region with a contour-tracing algorithm;
S4.3 Store the x and y coordinates of the contour points separately, thereby converting the 2-dimensional (2D) image into two 1-dimensional (1D) signals;
S4.4 Pad each of the two 1D signals of S4.3 with mirrored data points so that its length is extended to an integer power of 2, obtaining two new 1D signals;
S4.5 Apply a 4-level Haar wavelet decomposition to each of the two 1D signals obtained in S4.4 to remove noise, and take the level-4 approximation coefficients as the new signals;
S4.6 Apply a Fourier transform to the two new 1D signals obtained in S4.5, obtaining 10 orders of Fourier coefficients, each order comprising 4 descriptors;
S4.7 Compute the normalized modulus of the four coefficients of each order, thereby describing the overall profile of the plant with 10 Fourier descriptors.
(2) Beneficial effects
The method of the invention uses computer vision to acquire images of Arabidopsis and uses image processing to achieve non-destructive detection of the Arabidopsis phenotype. By detecting the blue border of the black-and-white checkerboard and detecting the interior corner points with a corner-detection algorithm, automatic image correction and calibration are achieved. The Arabidopsis is segmented by thresholding a linear combination of the normalized rgb values, and unnecessary noise is removed by re-examining the fine detail of the plant region, so that the Arabidopsis is segmented from its complex natural growing environment without affecting its normal growth. After the plant has been segmented, its profile phenotypic parameters are extracted. Compared with traditional manual observation and measurement, the method is more efficient.
Description of the drawings
Other features, objects and advantages of the present invention will become more apparent from the detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 is a flow chart of the Arabidopsis profile phenotype detection method according to one embodiment of the invention;
Fig. 2 shows the acquisition device used to collect images in the method provided by the invention;
Fig. 3 shows the images generated during the preprocessing provided by the invention, where sub-figure (1) is the RGB image of the Arabidopsis acquired according to the method of the invention, (2) is the black-and-white checkerboard region generated by preprocessing, (3) is the corner-detection image generated by preprocessing, and (4) is the corrected image generated by preprocessing;
Fig. 4 shows the images generated during the image segmentation provided by the invention, where sub-figure (1) is the binary image generated by the foreground-segmentation step and (2) is the final Arabidopsis image generated by the noise-removal step;
Fig. 5 shows the images generated during the extraction of the Arabidopsis profile phenotype provided by the invention, where sub-figure (1) is the result of filling the holes in the plant region, (2) is the outer contour of the plant region extracted by the contour-tracing algorithm, (3) is the 1-dimensional signal formed by the x coordinates of the contour points, (4) is the 1-dimensional signal formed by the y coordinates of the contour points, (5) is the result of extending the length of the x-coordinate signal to an integer power of 2, (6) is the result of extending the length of the y-coordinate signal to an integer power of 2, (7) is the x-coordinate signal after wavelet denoising, and (8) is the y-coordinate signal after wavelet denoising.
Detailed description of the invention
The detection method for the Arabidopsis profile phenotype proposed by the present invention is described in detail below with reference to the drawings and embodiments. The following embodiments will help those skilled in the art to further understand the present invention but do not limit it in any form. It should be pointed out that those skilled in the art can also make several variations and changes without departing from the inventive concept; these all fall within the protection scope of the present invention.
To extract the profile phenotypic parameters of Arabidopsis rapidly, the present invention proposes a detection method for the Arabidopsis profile phenotype. The method preprocesses the acquired image, segments the Arabidopsis from the complex background environment, and on this basis extracts the profile phenotypic parameters of the Arabidopsis, improving the efficiency with which the phenotypic parameters of Arabidopsis are obtained.
As shown in Fig. 1, the Arabidopsis profile phenotype detection method according to one embodiment of the invention includes the following steps:
S1. Place a calibration board in the planting pot of the Arabidopsis and acquire an RGB image of the Arabidopsis with a CCD camera.
In this example, the acquisition device of Fig. 2 can be used to collect the image. The device includes a CCD camera 1, a support 2, an illuminator 3 and the Arabidopsis 4; the calibration board 5 is placed beside the Arabidopsis. The calibration board is a black-and-white checkerboard with a blue border, consisting of 3 × 3 squares with a side length of 4 mm.
S2. Preprocess the acquired image. This specifically includes the following sub-steps:
S2.1 Locating the black-and-white checkerboard
S2.1.1 Segment the blue border from the image according to the RGB characteristics of blue: the gray values of both R and G are less than that of B, and the gray value of B is greater than 150.
S2.1.2 Fill the holes inside the border image and subtract the original blue-border image from the result to obtain a new image.
S2.1.3 Apply an opening operation to the new image to remove noise, then a closing operation to connect breakpoints, obtaining the black-and-white checkerboard region.
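As an illustration, a minimal Python sketch of step S2.1 is given below, assuming an OpenCV/NumPy/SciPy environment. The B > 150 rule and the open/close sequence follow the text; the 5 × 5 elliptical structuring element and the function name locate_checkerboard are assumptions made for the sketch, not part of the patented method.

```python
import cv2
import numpy as np
from scipy.ndimage import binary_fill_holes

def locate_checkerboard(bgr):
    b = bgr[..., 0].astype(np.int32)
    g = bgr[..., 1].astype(np.int32)
    r = bgr[..., 2].astype(np.int32)
    # S2.1.1: blue border = pixels where R and G are both below B and B exceeds 150
    border = (r < b) & (g < b) & (b > 150)
    # S2.1.2: fill the interior enclosed by the border, then subtract the border itself
    interior = binary_fill_holes(border) & ~border
    # S2.1.3: opening removes small noise, closing reconnects breakpoints
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(interior.astype(np.uint8), cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask  # binary mask of the black-and-white checkerboard region
```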
S2.2 Corner Detection
S2.2.1 Compute the horizontal and vertical first derivatives at each point of the black-and-white checkerboard region. The following Prewitt templates are used: the horizontal derivative is approximated by the difference between the third column and the first column of a 3 × 3 neighborhood, and the vertical derivative by the difference between the third row and the first row. Convolving these two sub-templates with the image gives two matrices of the same size as the image, denoted $I_x$ and $I_y$, from which three new images are computed: $I_x^2$, $I_y^2$ and $I_x I_y$.
S2.2.2 Since the image may be disturbed by noise, the three images are filtered with a 101 × 101 Gaussian window to remove noise.
S2.2.3 The correlation matrix M is formed from the three new images:
$$M = \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix}$$
The criterion function R is computed from this matrix:
$$R = \det(M) - k\,(\mathrm{trace}(M))^2$$
where k is usually taken as 0.04.
An R value is obtained for each pixel; if the R value of a point is greater than $0.01\,R_{\max}$ and it is the local maximum of its 3 × 3 neighborhood, the point is judged to be a corner.
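The sketch below illustrates this corner detection, assuming OpenCV/NumPy. The Prewitt templates, the 101 × 101 Gaussian window, k = 0.04, the 0.01·Rmax threshold and the 3 × 3 local-maximum test follow the text; the function name and the masking detail are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_corners(gray, checkerboard_mask):
    gray = gray.astype(np.float64)
    prewitt_x = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float64)
    prewitt_y = prewitt_x.T                       # vertical template: third row minus first row
    Ix = cv2.filter2D(gray, -1, prewitt_x)
    Iy = cv2.filter2D(gray, -1, prewitt_y)
    # the three derivative images, smoothed with a large Gaussian window
    Ixx = cv2.GaussianBlur(Ix * Ix, (101, 101), 0)
    Iyy = cv2.GaussianBlur(Iy * Iy, (101, 101), 0)
    Ixy = cv2.GaussianBlur(Ix * Iy, (101, 101), 0)
    k = 0.04
    R = (Ixx * Iyy - Ixy ** 2) - k * (Ixx + Iyy) ** 2   # det(M) - k * trace(M)^2
    R[checkerboard_mask == 0] = 0                        # restrict to the checkerboard region
    local_max = cv2.dilate(R, np.ones((3, 3), np.uint8)) == R
    corners = (R > 0.01 * R.max()) & local_max
    return np.argwhere(corners)                          # (row, col) corner coordinates
```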
S2.3 Image correction and image calibration
S2.3.1 The image coordinates of the vertices of each square of the checkerboard are obtained by corner detection. In the real world these vertices form squares, so their world coordinates can be specified according to their spatial relationship. Let the image coordinates of a point be $(x', y')^T$ and its world coordinates be $(x, y)^T$, with homogeneous coordinates $(x_1', x_2', x_3')^T$ and $(x, y, 1)^T$ respectively. A projective distortion relates the two; the transformation matrix is a linear transformation H of the homogeneous 3-vectors, expressed as
$$\begin{pmatrix} x_1' \\ x_2' \\ x_3' \end{pmatrix} = \begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$
Since the inhomogeneous image coordinates can be expressed in terms of the homogeneous coordinates as
$$x' = \frac{x_1'}{x_3'} = \frac{h_{11}x + h_{12}y + h_{13}}{h_{31}x + h_{32}y + h_{33}}$$
$$y' = \frac{x_2'}{x_3'} = \frac{h_{21}x + h_{22}y + h_{23}}{h_{31}x + h_{32}y + h_{33}}$$
each pair of matched points yields the two equations
$$(h_{31}x + h_{32}y + h_{33})\,x' = h_{11}x + h_{12}y + h_{13}$$
$$(h_{31}x + h_{32}y + h_{33})\,y' = h_{21}x + h_{22}y + h_{23}$$
so the transformation matrix H can be obtained from at least four pairs of matched points.
S2.3.2 The inverse of the transformation matrix H is computed and applied to the image, achieving image correction.
S2.3.3 The total number of pixels $N_b$ in the black-and-white checkerboard region is computed from the corner coordinates; given its actual size, $9 \times (4\,\mathrm{mm})^2 = 144\,\mathrm{mm}^2$, the actual area of a single pixel is
$$A_{dz} = \frac{144}{N_b}$$
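The following sketch, assuming OpenCV/NumPy, illustrates step S2.3. cv2.findHomography is used here as a stand-in for solving the over-determined linear system formed from the matched corner points; the function and variable names are illustrative.

```python
import cv2
import numpy as np

def correct_and_calibrate(img, image_pts, world_pts, n_board_pixels):
    # S2.3.1: homography H mapping world coordinates to image coordinates
    H, _ = cv2.findHomography(np.asarray(world_pts, np.float32),
                              np.asarray(image_pts, np.float32))
    # S2.3.2: applying the inverse of H removes the projective distortion
    H_inv = np.linalg.inv(H)
    corrected = cv2.warpPerspective(img, H_inv, (img.shape[1], img.shape[0]))
    # S2.3.3: the checkerboard covers 9 squares of 4 mm x 4 mm = 144 mm^2,
    # so the actual area represented by one pixel is
    area_per_pixel = 144.0 / n_board_pixels   # mm^2 per pixel
    return corrected, area_per_pixel
```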
S3. Segment the preprocessed image and extract the Arabidopsis from it. This specifically includes the following sub-steps:
S3.1 Foreground segmentation of the image
S3.1.1 Normalize the RGB values of each pixel to obtain rgb as follows:
$$r = \frac{R}{R+G+B},\quad g = \frac{G}{R+G+B},\quad b = \frac{B}{R+G+B}$$
Extract the color-difference map 3g-2.4r-b and binarize the image with 0 as the threshold, that is, if
$$I = 3g - 2.4r - b \ge 0$$
the pixel is judged to be a foreground (plant) pixel; otherwise it is a background pixel.
S3.1.2 Subtract the previously detected blue-border region from the result. For each pixel of the resulting foreground region, judge whether its G gray value is greater than 50: if so, keep the pixel; otherwise judge it as a background pixel and remove it from the region.
S3.1.3 Extract the foreground connected region with the largest number of pixels, which is the plant region.
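A minimal sketch of this foreground segmentation is given below, assuming OpenCV/NumPy. The 0 threshold on 3g - 2.4r - b, the G > 50 rule and the largest-connected-component selection follow the text; the function name and the small epsilon guard are illustrative assumptions.

```python
import cv2
import numpy as np

def segment_foreground(bgr, blue_border_mask):
    bgr_f = bgr.astype(np.float64)
    s = bgr_f.sum(axis=2) + 1e-9                   # avoid division by zero
    b, g, r = bgr_f[..., 0] / s, bgr_f[..., 1] / s, bgr_f[..., 2] / s
    # S3.1.1: binarize the color-difference map 3g - 2.4r - b at 0
    fg = (3 * g - 2.4 * r - b >= 0)
    # S3.1.2: remove the blue border, then keep only pixels whose G gray value exceeds 50
    fg &= (blue_border_mask == 0)
    fg &= (bgr[..., 1] > 50)
    # S3.1.3: the plant is the foreground connected component with the most pixels
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fg.astype(np.uint8), connectivity=8)
    if n <= 1:
        return np.zeros(fg.shape, dtype=np.uint8)
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return (labels == largest).astype(np.uint8)
```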
S3.2 Noise removal
S3.2.1 Apply an opening operation with a 7 × 7 disk template to the resulting image to obtain a new image that keeps only the large leaf areas and the central part of the plant and removes the fine detail, including the plant stems and the noise around the leaves.
S3.2.2 For each connected region of the removed detail, count its pixels; regions with fewer than 12 pixels are removed directly.
S3.2.3 For each connected region with 12 or more pixels, superimpose it alone on the new image obtained by the opening operation and count the number of regions in the superimposed image again: if the number of regions decreases, the connected region is a stem and must be kept; otherwise, if the number of regions increases or stays the same, the connected region is noise around a leaf and must be removed.
S3.2.4 After all detail regions have been judged, superimpose all the retained connected regions on the new image obtained by the opening operation, giving the final plant region.
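The sketch below illustrates this noise-removal step, assuming SciPy/NumPy/OpenCV. The 7 × 7 disk, the 12-pixel limit and the "region count decreases, therefore stem" rule follow the text; the 4-connectivity of scipy.ndimage.label and the function name are illustrative assumptions.

```python
import cv2
import numpy as np
from scipy.ndimage import label

def remove_noise(plant_mask):
    # S3.2.1: opening keeps only large leaves and the plant center
    disk = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    core = cv2.morphologyEx(plant_mask, cv2.MORPH_OPEN, disk)
    detail = plant_mask.astype(bool) & ~core.astype(bool)   # what the opening removed
    n_core = label(core)[1]
    keep = core.astype(bool)
    detail_labels, n_detail = label(detail)
    for i in range(1, n_detail + 1):
        region = detail_labels == i
        # S3.2.2: tiny regions (< 12 pixels) are discarded outright
        if region.sum() < 12:
            continue
        # S3.2.3: if adding the region merges core regions (their count drops), it is a stem
        n_merged = label(core.astype(bool) | region)[1]
        if n_merged < n_core:
            keep |= region
    # S3.2.4: the retained regions plus the opened image form the final plant region
    return keep.astype(np.uint8)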
S4. Extract the profile phenotypic parameters of the Arabidopsis and describe its overall profile quantitatively with elliptic Fourier descriptors. This specifically includes the following sub-steps:
S4.1 Hole filling
S4.1.1 Take a point inside a hole as the starting point and set it to 1, with all other points set to 0, forming a new region $X_0$.
S4.1.2 Dilate $X_0$ with a symmetric structuring element and intersect the result with the complement of the original image region, obtaining a new region $X_1$.
S4.1.3 Judge whether $X_1$ equals $X_0$; if so, hole filling is finished, otherwise set $X_0 = X_1$ and repeat the second step.
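A minimal sketch of this morphological hole filling is given below, assuming SciPy/NumPy. It follows the iterative dilate-and-intersect scheme of the text; the seed point argument and the 3 × 3 symmetric structuring element are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def fill_hole(region, seed):
    """region: binary plant mask; seed: (row, col) of a point inside a hole."""
    struct = np.ones((3, 3), dtype=bool)      # symmetric structuring element
    complement = ~region.astype(bool)
    X = np.zeros_like(complement)
    X[seed] = True                             # S4.1.1: marker image X0
    while True:
        # S4.1.2: dilate the marker, then intersect with the complement of the region
        X_next = binary_dilation(X, structure=struct) & complement
        # S4.1.3: stop when the marker no longer grows
        if np.array_equal(X_next, X):
            break
        X = X_next
    return region.astype(bool) | X             # region with this hole filled

# In practice scipy.ndimage.binary_fill_holes(region) fills all holes at once.
```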
S4.2 Contour tracing
S4.2.1 Select the upper-left point of the hole-filled plant region as the starting point $b_0$, and let $c_0$ be the background point to the west of $b_0$.
S4.2.2 Starting from $c_0$, examine the 8 neighbors of $b_0$ in clockwise order; let $b_1$ be the first non-zero point found, and let $c_0$ be the point examined immediately before $b_1$; store $b_0$ and $b_1$.
S4.2.3 Judge whether $b_1$ equals $b_0$; if so, contour tracing is finished, otherwise set $b_0 = b_1$ and repeat the second step.
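The sketch below, assuming OpenCV 4 and NumPy, uses cv2.findContours as a practical stand-in for the boundary-following procedure described above; it likewise returns the ordered outer contour points of the filled plant region.

```python
import cv2
import numpy as np

def outer_contour(filled_mask):
    contours, _ = cv2.findContours(filled_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    largest = max(contours, key=cv2.contourArea)   # keep the plant's outer boundary
    pts = largest[:, 0, :]                         # (N, 2) array of (x, y) contour points
    return pts[:, 0], pts[:, 1]                    # x coordinates, y coordinates
```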
S4.3 Extracting the 1-dimensional signals
S4.3.1 After contour tracing, a contour sequence is obtained; extract the x coordinate and the y coordinate of each contour point in this sequence.
S4.3.2 Store these two groups of coordinates separately, thereby converting the 2-dimensional (2D) image into two 1-dimensional (1D) signals.
S4.4 Extending the signals
S4.4.1 Let the length of the two 1D signals of S4.3.2 be $N_0$; compute the base-2 logarithm of $N_0$, take the integer part of the result and add 1, obtaining the value k.
S4.4.2 If $2^k$ is less than $2N_0$, mirror each 1D signal of S4.3.2, append the mirrored copy to it, and take the first $2^k$ data points, obtaining the new signal.
S4.4.3 If $2^k$ is greater than $2N_0$, append the mirrored copy to the 1D signal of S4.3.2, then append the original 1D signal of S4.3.2 again, and take the first $2^k$ data points, obtaining the new signal.
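A minimal NumPy sketch of this mirror padding is given below. It extends a 1D signal to the next power of two by appending a mirrored copy (and, if needed, the original again), as described in the text; the function name is an illustrative assumption.

```python
import numpy as np

def extend_to_power_of_two(sig):
    sig = np.asarray(sig, dtype=np.float64)
    n0 = len(sig)
    k = int(np.log2(n0)) + 1                  # S4.4.1: next power-of-two exponent
    target = 2 ** k
    mirrored = sig[::-1]
    if target < 2 * n0:                       # S4.4.2: one mirrored copy is enough
        ext = np.concatenate([sig, mirrored])
    else:                                     # S4.4.3: mirrored copy plus the original again
        ext = np.concatenate([sig, mirrored, sig])
    return ext[:target]

# usage: x_ext = extend_to_power_of_two(x); y_ext = extend_to_power_of_two(y)
```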
S4.5 Wavelet transform
S4.5.1 Apply a 4-level Haar wavelet decomposition to the extended signals obtained in S4.4 to remove noise, obtaining the level-4 approximation coefficients.
S4.5.2 From the approximation layer obtained in S4.5.1, extract the first $N_0$ data points so that the length matches the 1D signals of S4.3.2, obtaining 2 new 1D signals.
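The sketch below, assuming the PyWavelets package, is one possible reading of step S4.5: the detail coefficients of a 4-level Haar decomposition are discarded, the signal is reconstructed from the level-4 approximation, and the first $N_0$ samples are kept so that the denoised signal has the original contour length. The reconstruction step is an assumption made so that the output length matches $N_0$; the text itself speaks only of taking the approximation coefficients.

```python
import numpy as np
import pywt

def haar_denoise(extended_sig, n0, level=4):
    coeffs = pywt.wavedec(extended_sig, "haar", level=level)
    # keep the level-4 approximation coefficients, zero all detail coefficients
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "haar")
    return denoised[:n0]          # first N0 points, matching the original signal length
```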
S4.6 Fourier transform
S4.6.1 Let the 2 1D signals obtained in S4.5.2 be x and y respectively, with length $N_0$ and period T; their sampling interval τ is
$$\tau = \frac{T}{N_0}$$
S4.6.2 Apply a Fourier transform to the 2 1D signals obtained in S4.5.2, obtaining 10 orders of Fourier coefficients, each order comprising 4 descriptors:
$$a_{xk} = \frac{2}{N_0}\sum_{i=1}^{N_0} x_i \cos(k\omega i\tau),\quad b_{xk} = \frac{2}{N_0}\sum_{i=1}^{N_0} x_i \sin(k\omega i\tau)$$
$$a_{yk} = \frac{2}{N_0}\sum_{i=1}^{N_0} y_i \cos(k\omega i\tau),\quad b_{yk} = \frac{2}{N_0}\sum_{i=1}^{N_0} y_i \sin(k\omega i\tau),\quad k = 1, 2, \ldots, 10$$
where ω is the fundamental frequency, computed as
$$\omega = \frac{2\pi}{T}$$
Substituting τ and ω into the above gives
$$a_{xk} = \frac{2}{N_0}\sum_{i=1}^{N_0} x_i \cos\!\left(\frac{2\pi k i}{N_0}\right),\quad b_{xk} = \frac{2}{N_0}\sum_{i=1}^{N_0} x_i \sin\!\left(\frac{2\pi k i}{N_0}\right)$$
$$a_{yk} = \frac{2}{N_0}\sum_{i=1}^{N_0} y_i \cos\!\left(\frac{2\pi k i}{N_0}\right),\quad b_{yk} = \frac{2}{N_0}\sum_{i=1}^{N_0} y_i \sin\!\left(\frac{2\pi k i}{N_0}\right),\quad k = 1, 2, \ldots, 10$$
S4.7 Normalization
The normalized modulus of the four coefficients of each order is computed as follows, thereby describing the overall profile of the plant with 10 Fourier descriptors:
$$D(k) = \sqrt{\frac{a_{xk}^2 + a_{yk}^2}{a_{x1}^2 + a_{y1}^2}} + \sqrt{\frac{b_{xk}^2 + b_{yk}^2}{b_{x1}^2 + b_{y1}^2}},\quad k = 1, 2, \ldots, 10$$
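A minimal NumPy sketch of steps S4.6 and S4.7 is given below: the four coefficients of each of the 10 orders are computed by the direct sums above and then combined into the normalized descriptors D(k). Function and variable names are illustrative assumptions.

```python
import numpy as np

def fourier_descriptors(x, y, n_orders=10):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n0 = len(x)
    i = np.arange(1, n0 + 1)
    ax, bx, ay, by = (np.zeros(n_orders) for _ in range(4))
    for k in range(1, n_orders + 1):
        phase = 2 * np.pi * k * i / n0
        ax[k - 1] = 2.0 / n0 * np.sum(x * np.cos(phase))
        bx[k - 1] = 2.0 / n0 * np.sum(x * np.sin(phase))
        ay[k - 1] = 2.0 / n0 * np.sum(y * np.cos(phase))
        by[k - 1] = 2.0 / n0 * np.sum(y * np.sin(phase))
    # S4.7: normalize against the first-order coefficients
    D = (np.sqrt((ax**2 + ay**2) / (ax[0]**2 + ay[0]**2))
         + np.sqrt((bx**2 + by**2) / (bx[0]**2 + by[0]**2)))
    return D   # 10 descriptors; D[0] is 2 by construction
```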
The method of the present invention is further illustrated below with a worked example, which comprises the following steps:
S1. Place a calibration board in the planting pot of the Arabidopsis and acquire an RGB image of the Arabidopsis with a camera. The calibration board is a black-and-white checkerboard with a blue border, consisting of 3 × 3 squares with a side length of 4 mm. It is placed beside the Arabidopsis and imaged together with it; the acquired picture is shown in Fig. 3 (1).
S2. Preprocess the acquired image to achieve automatic image correction and calibration. This specifically includes the following sub-steps:
S2.1 Locating the black-and-white checkerboard
S2.1.1 Segment the blue border from the image according to the RGB characteristics of blue: the gray values of both R and G are less than that of B, and the gray value of B is greater than 150.
S2.1.2 Fill the holes inside the border image and subtract the original blue-border image from the result to obtain a new image.
S2.1.3 Apply an opening operation to the new image to remove noise, then a closing operation to connect breakpoints, obtaining the black-and-white checkerboard region shown in Fig. 3 (2).
S2.2 Corner Detection
S2.2.1 Convolve the Prewitt templates with the image to compute the horizontal and vertical first derivatives at each point of the black-and-white checkerboard region, obtaining three new images: the square of the horizontal derivative, the square of the vertical derivative, and the product of the two derivatives.
S2.2.2 Filter the three images with a 101 × 101 Gaussian window to remove noise.
S2.2.3 Form the correlation matrix from the three images, evaluate the criterion function, and find the corner points. As shown in Fig. 3 (3), each corner is marked with a red dot.
S2.3 Image correction and image calibration
S2.3.1 Corner detection gives the 16 square vertices of the checkerboard; ordered from left to right and from top to bottom, their image coordinates are as follows:
(163,370)(224,362)(286,354)(347,347)
(171,430)(233,423)(294,415)(356,407)
(179,491)(241,484)(302,476)(346,468)
(187,553)(249,545)(311,537)(372,529)
Since in the real world these points form exact squares, their coordinates are specified as:
(164,347)(225,347)(286,347)(347,347)
(164,408)(225,408)(286,408)(347,408)
(164,469)(225,469)(286,469)(347,469)
(164,530)(225,530)(286,530)(347,530)
The transformation matrix computed between these two groups of coordinates is:
$$H = \begin{pmatrix} 0.9820 & -0.1232 & 51.0888 \\ 0.1202 & 1.0126 & -43.5533 \\ -1.2519\times10^{-5} & 3.8417\times10^{-5} & 0.9977 \end{pmatrix}$$
S2.3.2 The inverse of this transformation matrix is:
$$H^{-1} = \begin{pmatrix} 1.0026 & 0.1237 & -45.9428 \\ -0.1183 & 0.9713 & 48.4665 \\ 0.0000 & 0.0000 & 1.0000 \end{pmatrix}$$
Applying it to the image achieves image correction and restores the metric properties of the image; the result is shown in Fig. 3 (4).
S2.3.3 The total number of pixels in the black-and-white checkerboard obtained from the corner coordinates is 35721; given its actual size, $9 \times (4\,\mathrm{mm})^2 = 144\,\mathrm{mm}^2$, the actual area of a single pixel is
$$A_{dz} = \frac{144}{35721}\ \mathrm{mm}^2 \approx 0.004\ \mathrm{mm}^2$$
S3. Segment the preprocessed image and extract the Arabidopsis from it. This specifically includes the following sub-steps:
S3.1 Foreground segmentation of the image
S3.1.1 Normalize the RGB values of each pixel to obtain rgb, extract the color-difference map 3g-2.4r-b, and binarize the image with 0 as the threshold.
S3.1.2 Subtract the previously detected blue-border region from the result. For each pixel of the resulting foreground region, judge whether its G gray value is greater than 50: if so, keep the pixel; otherwise judge it as a background pixel and remove it from the region.
S3.1.3 Extract the foreground connected region with the largest number of pixels, which is the plant region; the result is shown in Fig. 4 (1).
S3.2 Noise removal
S3.2.1 Apply an opening operation with a 7 × 7 disk template to the resulting image to obtain a new image that keeps only the large leaf areas and the central part of the plant and removes the fine detail, including the plant stems and the noise around the leaves.
S3.2.2 For each connected region of the removed detail, count its pixels; regions with fewer than 12 pixels are removed directly.
S3.2.3 For each connected region with 12 or more pixels, superimpose it alone on the new image obtained by the opening operation and count the number of regions in the superimposed image again: if the number of regions decreases, the connected region is a stem and must be kept; otherwise, if the number of regions increases or stays the same, the connected region is noise around a leaf and must be removed.
S3.2.4 After all detail regions have been judged, superimpose all the retained connected regions on the new image obtained by the opening operation, giving the final plant region; the result is shown in Fig. 4 (2).
S4. Extract the profile phenotypic parameters of the Arabidopsis and describe its overall profile quantitatively with elliptic Fourier descriptors. This specifically includes the following sub-steps:
S4.1 Fill the holes of the plant region; the result is shown in Fig. 5 (1).
S4.2 Extract the outer contour points of the filled plant region with the contour-tracing algorithm, as shown in Fig. 5 (2).
S4.3 Extracting the 1-dimensional signals
S4.3.1 After contour tracing, a contour sequence is obtained; extract the x coordinate and the y coordinate of each contour point in this sequence.
S4.3.2 Store these two groups of coordinates separately, so that the 2D image becomes two 1D signals, as shown in Fig. 5 (3) and 5 (4).
S4.4 Extending the signals
S4.4.1 The length of the two 1D signals of S4.3.2 is 1163; the integer part of the base-2 logarithm of 1163, plus 1, gives the value 11.
S4.4.2 Since $2^{11} = 2048$ is less than $2 \times 1163 = 2326$, each of the two 1D signals of S4.3.2 is mirrored, the mirrored copy is appended to it, and the first 2048 data points are taken, giving the new signals shown in Fig. 5 (5) and 5 (6).
S4.5 Wavelet transform
S4.5.1 Apply a 4-level Haar wavelet decomposition to the extended signals obtained in S4.4 to remove noise, obtaining the level-4 approximation coefficients.
S4.5.2 From the approximation layer obtained in S4.5.1, extract the first $N_0$ data points so that the length matches the 1D signals of S4.3.2; the result is shown in Fig. 5 (7) and 5 (8).
S4.6 Fourier transform
Apply a Fourier transform to the 2 1D signals obtained in S4.5.2, obtaining 10 orders of Fourier coefficients, each order comprising 4 descriptors, as follows:
a_xk = [69.53, 16.01, 14.28, 15.99, -8.19, -8.98, -21.18, -19.59, 16.12, -9.82]
b_xk = [-392.92, 123.96, 44.41, -42.04, -49.10, -28.08, 8.70, -27.93, -9.44, -0.31]
a_yk = [-341.71, 78.22, -59.98, -99.15, -39.57, 1.582, 19.87, -14.58, -8.60, -14.93]
b_yk = [-59.35, 53.76, -5.86, 24.09, -11.46, 23.22, -13.59, 17.36, -10.07, 0.49]
S4.7 Normalization
Compute the normalized modulus of the four coefficients of each order with the formula above, describing the overall profile of the plant with 10 Fourier descriptors; the result is:
D = [2, 0.57, 0.29, 0.41, 0.24, 0.12, 0.12, 0.15, 0.09, 0.05]

Claims (2)

1. A detection method for the Arabidopsis profile phenotype, comprising the following steps:
S1. Place a calibration board in the planting pot of the Arabidopsis and acquire an RGB image of the Arabidopsis with a camera;
S2. Preprocess the acquired image to achieve automatic image correction and calibration, where image correction removes image distortion and image calibration obtains the actual size represented by a single pixel;
Step S2 includes locating the black-and-white checkerboard, corner detection, image correction and image calibration;
Said locating of the black-and-white checkerboard comprises the following steps: extract the blue border according to the RGB characteristics of blue; fill the holes inside the border image, then subtract the original blue-border image to obtain a new image; apply an opening operation to the new image to remove noise, then a closing operation to connect breakpoints, obtaining the black-and-white checkerboard region;
Said corner detection comprises the following steps: compute the horizontal and vertical first derivatives at each point of the checkerboard region, obtaining three new images: the square of the horizontal derivative, the square of the vertical derivative, and the product of the two derivatives; filter the three images with a Gaussian filter to remove noise; form the correlation matrix from the three images, evaluate the criterion function, and judge whether each pixel is a corner point;
Said image correction and image calibration comprise the following steps: obtain the image coordinates of the vertices of each square of the checkerboard by corner detection and, according to their spatial relationship in the real world, compute the transformation matrix between the two; compute the inverse of the transformation matrix and apply it to the image, achieving image correction; obtain the total number of pixels in the black-and-white checkerboard from the corner coordinates and, from its actual size, obtain the actual size of a single pixel;
S3. Segment the preprocessed image, separating the Arabidopsis from the background and extracting it from the image;
Step S3 includes foreground segmentation of the image and noise removal;
Said foreground segmentation comprises the following steps: normalize the RGB values of each pixel to obtain rgb, extract the color-difference map 3g-2.4r-b, and binarize the image with 0 as the threshold; subtract the previously detected blue-border region from the result and, for each remaining foreground pixel, judge whether its G gray value is greater than 50: if so, keep it, otherwise remove it; extract the foreground connected region with the largest number of pixels, which is the plant region;
Said noise removal comprises the following steps: apply an opening operation to the resulting image to obtain a new image that keeps only the large leaf areas and the central part of the plant and removes the fine detail, including the plant stems and the noise around the leaves; for each connected region of the removed detail, count its pixels, removing directly those with fewer than 12 pixels; for each connected region with 12 or more pixels, superimpose it alone on the new image obtained by the opening operation and count the number of regions in the superimposed image again: if the number of regions decreases, the connected region is a stem and must be kept, otherwise, if the number of regions increases or stays the same, the connected region is noise around a leaf and must be removed; after all detail regions have been judged, superimpose all the retained connected regions on the new image obtained by the opening operation, giving the final plant region;
S4. After the Arabidopsis image has been segmented, extract the profile phenotypic parameters of the Arabidopsis and describe its overall profile quantitatively with elliptic Fourier descriptors; wherein step S4 comprises the following steps:
S4.1 Fill the holes of the plant region;
S4.2 Extract the outer contour points of the filled plant region with a contour-tracing algorithm;
S4.3 Store the x and y coordinates of the contour points separately, thereby converting the 2-dimensional image into two 1-dimensional signals;
S4.4 Pad each of the two 1-dimensional signals of S4.3 with mirrored data points so that its length is extended to an integer power of 2, obtaining two new 1-dimensional signals;
S4.5 Apply a 4-level Haar wavelet decomposition to each of the two 1-dimensional signals obtained in S4.4 to remove noise, and take the level-4 approximation coefficients as the new signals;
S4.6 Apply a Fourier transform to the two new 1-dimensional signals obtained in S4.5, obtaining 10 orders of Fourier coefficients, each order comprising 4 descriptors;
S4.7 Compute the normalized modulus of the four coefficients of each order, thereby describing the overall profile of the plant with 10 Fourier descriptors.
2. The detection method for the Arabidopsis profile phenotype according to claim 1, wherein said calibration board is a black-and-white checkerboard with a blue border, consisting of 3 × 3 squares with a side length of 4 mm.
CN201310456227.2A 2013-09-30 2013-09-30 A kind of detection method of arabidopsis profile phenotype Expired - Fee Related CN103471523B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310456227.2A CN103471523B (en) 2013-09-30 2013-09-30 A kind of detection method of arabidopsis profile phenotype

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310456227.2A CN103471523B (en) 2013-09-30 2013-09-30 A kind of detection method of arabidopsis profile phenotype

Publications (2)

Publication Number Publication Date
CN103471523A CN103471523A (en) 2013-12-25
CN103471523B true CN103471523B (en) 2016-07-06

Family

ID=49796477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310456227.2A Expired - Fee Related CN103471523B (en) 2013-09-30 2013-09-30 A kind of detection method of arabidopsis profile phenotype

Country Status (1)

Country Link
CN (1) CN103471523B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103913124A (en) * 2014-03-18 2014-07-09 北京农业信息技术研究中心 Automatic monitoring system and method for fruit appearance quality
CN108171718A (en) * 2017-11-23 2018-06-15 北京林业大学 A kind of small daisy_petal part number automatic testing method based on wavelet transformation
CN108813812A (en) * 2018-04-24 2018-11-16 广州奥玄信息科技有限公司 Utilize the method for mobile phone photograph measurement foot type data
CN109084708B (en) * 2018-07-25 2020-04-21 深圳大学 Method for calculating integral roughness of two-dimensional contour surface of particle
CN112964202B (en) * 2019-06-14 2023-05-09 南京林业大学 Plant phenotype information acquisition system and extraction method
CN110849262A (en) * 2019-10-17 2020-02-28 中国科学院遥感与数字地球研究所 Vegetation phenotype structure parameter measuring method, device and system
CN111422587B (en) * 2020-03-30 2021-09-10 华南理工大学 Method for accurately positioning materials in feeding process of conveyor belt
CN111667429B (en) * 2020-06-06 2023-05-23 南京聚特机器人技术有限公司 Target positioning correction method for inspection robot
CN113470034A (en) * 2021-06-10 2021-10-01 六盘水市农业科学研究院 Device for automatically measuring area of soft rot disease spots of in-vitro plants
CN114792343B (en) * 2022-06-21 2022-09-30 阿里巴巴达摩院(杭州)科技有限公司 Calibration method of image acquisition equipment, method and device for acquiring image data
CN116704497B (en) * 2023-05-24 2024-03-26 东北农业大学 Rape phenotype parameter extraction method and system based on three-dimensional point cloud

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN200996939Y (en) * 2006-11-24 2007-12-26 浙江大学 Early diagnostic device for plant leaf and canopy botrytis
CN102074012A (en) * 2011-01-22 2011-05-25 四川农业大学 Method for three-dimensionally reconstructing tender shoot state of tea by combining image and computation model
CN102200433A (en) * 2011-02-25 2011-09-28 北京农业信息技术研究中心 Device and method for measuring leaf area based on computer vision
CN102737367A (en) * 2012-06-17 2012-10-17 四川农业大学 Tea image enhancement and division method based on color characteristic
US8363905B2 (en) * 2010-07-21 2013-01-29 Cognisense Labs, Inc. Automated image analysis of an organic polarized object

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004191243A (en) * 2002-12-12 2004-07-08 Institute Of Physical & Chemical Research Automatic photographing system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN200996939Y (en) * 2006-11-24 2007-12-26 浙江大学 Early diagnostic device for plant leaf and canopy botrytis
US8363905B2 (en) * 2010-07-21 2013-01-29 Cognisense Labs, Inc. Automated image analysis of an organic polarized object
CN102074012A (en) * 2011-01-22 2011-05-25 四川农业大学 Method for three-dimensionally reconstructing tender shoot state of tea by combining image and computation model
CN102200433A (en) * 2011-02-25 2011-09-28 北京农业信息技术研究中心 Device and method for measuring leaf area based on computer vision
CN102737367A (en) * 2012-06-17 2012-10-17 四川农业大学 Tea image enhancement and division method based on color characteristic

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cucumber shape recognition based on normalized elliptic Fourier descriptors; Qi Liyong et al.; Transactions of the Chinese Society for Agricultural Machinery; 2011-08-31; Vol. 42, No. 8; pp. 142, 164-167 *

Also Published As

Publication number Publication date
CN103471523A (en) 2013-12-25

Similar Documents

Publication Publication Date Title
CN103471523B (en) A kind of detection method of arabidopsis profile phenotype
Chen et al. Accurate and robust crack detection using steerable evidence filtering in electroluminescence images of solar cells
CN107451590B (en) Gas detection identification and concentration representation method based on hyperspectral infrared image
CN105844228B (en) A kind of remote sensing images cloud detection method of optic based on convolutional neural networks
CN105389814B (en) A kind of bubble detecting method for air-tight test
Yang et al. Automated extraction of building outlines from airborne laser scanning point clouds
Chen et al. Extracting the navigation path of a tomato-cucumber greenhouse robot based on a median point Hough transform
CN103489192B (en) Method for detecting number of Arabidopsis leaves and distance between cusp and center of mass of each leaf
CN103593840B (en) Method for detecting phenotype of Arabidopsis
CN103455797A (en) Detection and tracking method of moving small target in aerial shot video
Li et al. A leaf segmentation and phenotypic feature extraction framework for multiview stereo plant point clouds
CN107808141A (en) A kind of electric transmission line isolator explosion recognition methods based on deep learning
CN103591887B (en) A kind of detection method of arabidopsis region phenotype
CN106897995A (en) A kind of parts automatic identifying method towards Automatic manual transmission process
CN104318559A (en) Quick feature point detecting method for video image matching
CN109800713A (en) The remote sensing images cloud detection method of optic increased based on region
CN107862687A (en) A kind of early warning system for being used to monitor agricultural pest
CN104424383A (en) Infrared image based hardware processing algorithm effectiveness performance evaluation device and method
CN107590821A (en) A kind of method for tracking target and system based on track optimizing
CN103438834A (en) Hierarchy-type rapid three-dimensional measuring device and method based on structured light projection
CN105718964A (en) Transmission line vibration damper visual detection method
Yang et al. Capturing the spatiotemporal variations in the gross primary productivity in coastal wetlands by integrating eddy covariance, Landsat, and MODIS satellite data: A case study in the Yangtze Estuary, China
CN111369494A (en) Winter wheat ear density detection method and device
CN104299005A (en) Head detection method and system
Wang et al. Measuring driving behaviors from live video

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160706

Termination date: 20160930