CN113607659A - Conveyor belt type crop phenotype acquisition method, system and device - Google Patents
- Publication number
- CN113607659A CN113607659A CN202110985641.7A CN202110985641A CN113607659A CN 113607659 A CN113607659 A CN 113607659A CN 202110985641 A CN202110985641 A CN 202110985641A CN 113607659 A CN113607659 A CN 113607659A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/01—Arrangements or apparatus for facilitating the optical investigation
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/84—Systems specially adapted for particular applications
- G01N2021/8466—Investigation of vegetal material, e.g. leaves, plants, fruits
Abstract
The invention discloses a conveyor-belt crop phenotype acquisition method, system and device. The method comprises the following steps: conveying the crop on a conveyor belt into a pre-built darkroom environment; controlling a mechanical arm to move a color camera, a structured light camera and a hyperspectral camera through a plurality of preset site combinations and acquiring images of the crop to obtain crop color information, three-dimensional information and spectral information at the corresponding sites; and fusing the color, three-dimensional and spectral information from the corresponding sites to obtain a complete crop three-dimensional point cloud and build a crop growth situation tracking database. The system comprises a conveyor belt module, a darkroom light environment module, a multi-source information acquisition module, a data processing module and a control module. The device comprises a memory and a processor for executing the conveyor-belt crop phenotype acquisition method. The method achieves high-precision batch extraction of crop phenotype parameters and can be widely applied in the field of phenotype collection.
Description
Technical Field
The invention relates to the field of phenotype acquisition, and in particular to a conveyor-belt crop phenotype acquisition method, system and device.
Background
In agriculture, rapidly acquiring plant phenotype data to reconstruct the three-dimensional structure of a plant is an important technical means for plant phenotype research, variety identification, plant 3D simulation and related applications. Existing plant phenotype acquisition methods suffer from a low degree of automation and poor data accuracy, and cannot establish a growth database for tracking and comparison; they are therefore insufficient to provide basic data support for research and application in digital crop breeding, digital cultivation and intelligent management of agricultural production.
Disclosure of Invention
In order to solve the above technical problems, the invention aims to provide a conveyor-belt crop phenotype acquisition method, system and device that achieve high-precision batch extraction of crop phenotype parameters.
The first technical scheme adopted by the invention is as follows: a conveyorized crop phenotype acquisition method, comprising the steps of:
s101, conveying crops to a pre-built darkroom environment through a conveyor belt;
s201, controlling a mechanical arm to move a color camera, a structured light camera and a hyperspectral camera to a plurality of preset locus combinations, and carrying out image acquisition on crops to obtain crop color information, three-dimensional information and spectrum information under corresponding loci;
s301, fusing the crop color information, the three-dimensional information and the spectrum information under the corresponding sites to obtain a complete crop three-dimensional point cloud, and constructing a crop growth situation tracking database.
Further, still include:
s401, dividing complete crop three-dimensional point cloud and key phenotype information based on a preset label, and training a deep learning model to obtain a discrimination model;
s501, dividing phenotype information of the crop to be detected based on the discrimination model and judging whether the phenotype information is completely collected or not to obtain a discrimination result.
Further, an LED light source and a halogen bulb light source are arranged in the darkroom environment, and before the step of conveying the crops to the pre-built darkroom environment through the conveyor belt, the method further comprises a calibration step, specifically comprising:
s001, cooperatively calibrating with a color camera and a structured light camera to determine the number, the position and the luminous intensity of the LED light sources;
s002, cooperatively calibrating with a hyperspectral camera, and determining the number, the position and the luminous intensity of halogen bulb light sources;
and S003, calibrating the spatial relationship among the color camera, the structured light camera and the hyperspectral camera.
Further, a black light-absorbing cloth roller shutter is arranged on the darkroom environment, and the step of conveying the crops to the pre-built darkroom environment through the conveyor belt specifically comprises:
s1011, conveying a plurality of crops sequentially and in order into the darkroom on the conveyor belt, at fixed intervals and for a fixed time;
s1012, upon judging that the conveyor belt is running, retracting the black light-absorbing cloth roller shutter;
and S1013, upon judging that the conveyor belt has stopped, lowering the black light-absorbing cloth roller shutter to form the darkroom environment.
Further, the step of determining the number, the position and the luminous intensity of the LED light sources by calibrating in cooperation with the color camera and the structured light camera specifically includes:
s0011, setting a plurality of groups of LED light source arrangements according to the number, positions and forms of different LED light sources;
s0012, placing object crops in a darkroom and photographing the object crops under the arrangement of the LED light sources through a structured light camera to obtain a plurality of groups of crop point clouds under the arrangement of different LED light sources;
s0013, determining optimal LED light source arrangement according to the effect of crop point clouds under the arrangement of a plurality of groups of different LED light sources;
s0014, placing a standard color card in a darkroom, and taking a picture of the standard color card under the optimal LED light source arrangement through a color camera to obtain a plurality of groups of standard color card pictures under different LED light source arrangements;
s0015, processing the standard color card photos under the different LED light source arrangements with color calibration software, analyzing the corresponding exposure histograms and color temperatures, and determining the luminous intensity of each LED light source under the optimal LED light source arrangement.
Further, the step of calibrating the spatial relationship between the color camera, the structured light camera, and the hyperspectral camera specifically includes:
s0031, installing a color camera, a structured light camera and a hyperspectral camera on a clamp of a mechanical arm, and placing a checkerboard calibration plate in a darkroom;
s0032, simultaneously shooting a plurality of chessboard pattern calibration plate images from multiple angles based on a color camera, a structured light camera and a hyperspectral camera, and recording mechanical arm position information of each angle during shooting;
and S0033, processing the collected checkerboard calibration board images, and combining with the mechanical arm site information to obtain a rotational translation matrix among the color camera, the structured light camera and the hyperspectral camera, and obtain a rotational translation relation among the color camera, the structured light camera and the hyperspectral camera relative to the tail end of the mechanical arm.
Further, the step of controlling the mechanical arm to move the color camera, the structured light camera and the hyperspectral camera to a plurality of preset locus combinations and perform image acquisition on the crops to obtain the color information, the three-dimensional information and the spectral information of the crops at corresponding loci specifically comprises:
s2011, moving the color camera, the structured light camera and the hyperspectral camera to a preset locus;
s2012, turning on an LED light source, and controlling a color camera and a structured light camera to acquire data to obtain crop color information and three-dimensional information;
s2013, after the collection is finished, the LED light source is turned off, the halogen bulb light source is turned on, the hyperspectral camera is controlled to collect data, crop spectrum information is obtained, and the halogen bulb light source is turned off;
and S2014, controlling the mechanical arm to move the color camera, the structured light camera and the hyperspectral camera to the next site, and repeating steps S2011-S2013 to obtain the crop color information, three-dimensional information and spectrum information collected at each mechanical arm site.
Further, the step of fusing the crop color information, the three-dimensional information and the spectrum information under the corresponding position points to obtain a complete crop three-dimensional point cloud specifically comprises:
s3011, according to a rotation translation matrix between cameras, normalizing and fusing crop color information and three-dimensional information under each mechanical arm position point to crop spectrum information to obtain fused data;
s3012, converting the fused data into a coordinate system of the tail end of the mechanical arm according to the rotation and translation relation between the camera and the tail end of the mechanical arm;
s3013, uniformly converting the fusion data under each point into a mechanical arm base coordinate system according to the mechanical arm point information to obtain a plurality of preliminary registration point cloud data;
s3014, performing pairwise ICP iterative registration on the multiple preliminary registration point clouds to obtain an accurate splicing relation;
s3015, the accurate splicing relation is applied to the fusion data for splicing, and a complete crop three-dimensional point cloud is obtained.
The second technical scheme adopted by the invention is as follows: a conveyorized crop phenotype acquisition system comprising:
the conveyer belt module is used for conveying the crops to the darkroom light environment module;
the darkroom light environment module is used for isolating the influence of external environment light to form a stable light environment;
the multi-source information acquisition module is used for acquiring crop color information, three-dimensional information and spectral information;
the data processing module is used for carrying out fusion processing on the collected crop color information, the collected three-dimensional information and the collected spectrum information to obtain a complete crop three-dimensional point cloud;
and the control module is used for controlling the start and stop of the conveyor belt module, the darkroom light environment module, the multi-source information acquisition module and the data processing module.
The third technical scheme adopted by the invention is as follows: a conveyorized crop phenotype acquisition device comprising:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the conveyorized crop phenotype acquisition method described above.
The method, system and device have the following beneficial effects: as a conveyor-belt crop phenotype acquisition method based on the fusion of a mechanical arm and multiple sensors, the invention extracts crop phenotype parameters in batches; by building a darkroom light environment, it controls the conveying of the conveyor belt and the running track of the mechanical arm, drives the color camera, structured light camera and hyperspectral camera to perform scanning, shooting and data fusion, and integrally reconstructs a crop three-dimensional point cloud model.
Drawings
FIG. 1 is a flow chart of the steps of a method of acquiring a phenotype of a conveyorized crop according to the invention;
FIG. 2 is a schematic diagram of a conveyorized crop phenotype acquisition system according to the present invention;
FIG. 3 is a schematic diagram of a multi-source information collection module according to an embodiment of the invention;
reference numerals: 1. a conveyor belt; 2. a six-axis mechanical arm; 3. a front roller shutter; 4. a rear roller shutter; 5. an LED light source; 6. a halogen lamp light source; 7. a color camera; 8. a structured light camera; 9. a hyperspectral camera; 10. a sensor fixture.
Detailed Description
The invention is described in further detail below with reference to the figures and the specific embodiments. The step numbers in the following embodiments are provided only for convenience of illustration, the order between the steps is not limited at all, and the execution order of each step in the embodiments can be adapted according to the understanding of those skilled in the art.
Referring to fig. 1, the present invention provides a method for acquiring phenotype of a conveyorized crop, comprising the steps of:
s101, conveying crops to a pre-built darkroom environment through a conveyor belt;
s201, controlling a mechanical arm to move a color camera, a structured light camera and a hyperspectral camera to a plurality of preset locus combinations, and carrying out image acquisition on crops to obtain crop color information, three-dimensional information and spectrum information under corresponding loci;
Specifically, the mechanical arm is a six-axis arm. In an early manual-control stage, the arm is moved to several positions in the darkroom, chosen according to the appearance characteristics of the measured crop, at which the color camera, the structured light camera and the hyperspectral camera all lie within a suitable working range during data collection; these approximate positions are distributed evenly through 360 degrees around the crop. Each time a crop is conveyed into the darkroom by the conveyor belt, the arm first drives the three cameras through the predetermined positions to perform an omnidirectional information acquisition of the crop.
Several groups of site combinations are provided for different crops and for different length-width grades of the crop bounding box. Two initial discrimination sites are first set, located respectively at the top and at the side of the darkroom. At these two sites the mechanical arm controls the structured light camera to sample the crop at the maximum viewing angle within the acceptable precision range; the crop point cloud bounding boxes under the two viewing angles are computed, the length-width grade of the crop is judged from them, and the group of preset sites to be used in the subsequent collection is determined accordingly, so that the collection precision is optimal.
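As an illustration of the bounding-box grading above, the following sketch (an assumed implementation, not taken from the patent; the grade thresholds and array layout are hypothetical) computes an axis-aligned bounding box from a point cloud with numpy and maps its largest extent to a site-group grade:

```python
import numpy as np

def bounding_box_grade(points, grade_edges=(0.3, 0.6, 1.0)):
    """Classify a crop point cloud into a size grade from its
    axis-aligned bounding box (grade thresholds in metres are
    illustrative, not values from the patent)."""
    points = np.asarray(points, dtype=float)
    extent = points.max(axis=0) - points.min(axis=0)   # (dx, dy, dz)
    largest = float(extent.max())                      # widest dimension
    # Grade 0 selects the tightest preset site group, rising with crop size.
    return int(np.searchsorted(grade_edges, largest))

# Example: a synthetic crop roughly 0.5 m tall, 0.2 m wide.
cloud = np.random.rand(1000, 3) * np.array([0.2, 0.2, 0.5])
grade = bounding_box_grade(cloud)
```

In the described workflow this grade would index into the pre-stored groups of mechanical arm sites before the full acquisition pass.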
S301, fusing the crop color information, the three-dimensional information and the spectrum information under the corresponding sites to obtain a complete crop three-dimensional point cloud, and constructing a crop growth situation tracking database.
The crops are sent into the darkroom environment for information acquisition several times over their life cycle, at certain intervals, via the conveyor belt. The crops are cultivated in pots of uniform specification, to which labels and checkerboard images are attached; the labels record the variety and cultivation conditions of the crop. In subsequent data processing, the spatial relationship between the checkerboard image and the camera is identified and used for coordinate conversion, rotating the reconstructed three-dimensional point cloud so that the main observation viewing angle of the crop point cloud is unified, which facilitates comparative tracking of the growth situation of the same crop.
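The viewpoint-unification step above amounts to applying a rigid rotation-translation, recovered from the checkerboard pose, to the reconstructed cloud. A minimal sketch, assuming the cloud is an N x 3 numpy array and the recovered pose a 4 x 4 homogeneous matrix (the helper names are illustrative):

```python
import numpy as np

def apply_rigid_transform(points, T):
    """Apply a 4x4 homogeneous rotation-translation matrix T to an
    N x 3 point cloud, returning the transformed N x 3 cloud."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # to homogeneous
    return (T @ pts_h.T).T[:, :3]

def rotz(theta):
    """Rotation about the z axis by theta radians, as a 4x4 matrix."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

# If the checkerboard pose says the pot is rotated 90 degrees away from
# the reference viewing angle, rotating by -90 degrees restores it.
cloud = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.5]])
unified = apply_rigid_transform(cloud, rotz(-np.pi / 2))
```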
Further as a preferred embodiment of the method, the method further comprises:
s401, dividing complete crop three-dimensional point cloud and key phenotype information based on a preset label, and training a deep learning model to obtain a discrimination model;
Specifically, a point cloud labeling tool is used to label the point clouds obtained at multiple viewing angles and the reconstructed crop three-dimensional point clouds, distinguishing the growth periods corresponding to different crop point clouds and the crop stems, leaves, flowers, fruits and so on. Semantic information is added by point cloud index, in particular complete organ boundary semantics versus incomplete boundary semantics caused by occlusion. Deep learning is then used for training to establish a discrimination model that performs automatic organ segmentation of both the single-view crop point clouds and the reconstructed crop point clouds, and discriminates whether a crop organ is occluded and therefore cannot be completely collected.
S501, dividing phenotype information of the crop to be detected based on the discrimination model and judging whether the phenotype information is completely collected or not to obtain a discrimination result.
Specifically, the key phenotype information of the three types of data is labeled manually in an early stage, for example whether the crop stems, leaves, flowers and fruits are occluded, so that the data processing system can automatically learn to identify this key phenotype information and obtain an accurate discrimination model. In application, the key organ parts of the crop can then be distinguished automatically, and the system judges whether the data can be collected completely or whether occlusion occurs, and further decides whether to resample for that phenotype: when occlusion is identified, a non-occluded angle is reselected for data collection.
Further as a preferred embodiment of the method, an LED light source and a halogen bulb light source are arranged in the darkroom environment, and before the step of conveying the crops to the pre-built darkroom environment by the conveyor belt, the method further comprises a calibration step, specifically comprising:
s001, cooperatively calibrating with a color camera and a structured light camera to determine the number, the position and the luminous intensity of the LED light sources;
s002, cooperatively calibrating with a hyperspectral camera, and determining the number, the position and the luminous intensity of halogen bulb light sources;
Specifically, through debugging, the number, form, arrangement and luminous intensity of the LED light sources are found that allow the color camera to obtain the best image quality and color and the structured light camera to obtain the most accurate point cloud data. The halogen bulb light sources are then calibrated in the same way: their number, form, arrangement and luminous intensity are calibrated cooperatively with the hyperspectral camera to find the configuration that lets the hyperspectral camera obtain the most accurate spectral information of the crop surface.
And S003, calibrating the spatial relationship among the color camera, the structured light camera and the hyperspectral camera.
Further as a preferred embodiment of the method, a black light-absorbing cloth roller shutter is arranged on the darkroom environment, and the step of conveying the crops to the pre-built darkroom environment through a conveyor belt specifically comprises the following steps:
s1011, conveying a plurality of crops sequentially and in order into the darkroom on the conveyor belt, at fixed intervals and for a fixed time;
s1012, upon judging that the conveyor belt is running, retracting the black light-absorbing cloth roller shutter;
and S1013, upon judging that the conveyor belt has stopped, lowering the black light-absorbing cloth roller shutter to form the darkroom environment.
Specifically, the conveyor belt connects to the external crop cultivation equipment and conveys multiple crops into the darkroom stably, in order and at a fixed interval. In addition, an infrared sensor can be installed: when a crop is about to be conveyed into or out of the darkroom, the sensor detects the crop and the black light-absorbing cloth roller shutter is retracted; after the conveyor belt stops and before any other operation, the shutter is lowered to form the required dark environment. This process is managed by the control system.
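The shutter logic described above can be sketched as a small state machine; the function and field names here are illustrative, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class DarkroomState:
    belt_running: bool = False
    shutter_down: bool = True   # shutter lowered = darkroom sealed

def on_infrared_triggered(state: DarkroomState) -> DarkroomState:
    """A crop is about to enter or leave: raise the shutter, run the belt."""
    state.shutter_down = False
    state.belt_running = True
    return state

def on_crop_in_position(state: DarkroomState) -> DarkroomState:
    """Belt stops under the arm; lower the shutter before any acquisition."""
    state.belt_running = False
    state.shutter_down = True
    return state
```

A real controller would add timing for the fixed conveying interval and interlock the light sources on `shutter_down`.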
Further, as a preferred embodiment of the method, the step of determining the number, the position, and the light emitting intensity of the LED light sources in cooperation with the color camera and the structured light camera includes:
s0011, setting a plurality of groups of LED light source arrangements according to the number, positions and forms of different LED light sources;
s0012, placing object crops in a darkroom and photographing the object crops under the arrangement of the LED light sources through a structured light camera to obtain a plurality of groups of crop point clouds under the arrangement of different LED light sources;
s0013, determining optimal LED light source arrangement according to the effect of crop point clouds under the arrangement of a plurality of groups of different LED light sources;
s0014, placing a standard color card in a darkroom, and taking a picture of the standard color card under the optimal LED light source arrangement through a color camera to obtain a plurality of groups of standard color card pictures under different LED light source arrangements;
s0015, processing the standard color card photos under different LED light source arrangements based on color calibration software, analyzing corresponding exposure histograms and color temperatures, and determining the luminous intensity of each LED light source under the optimal LED light source arrangement.
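Step S0015's exposure analysis might look like the following sketch, assuming an 8-bit grayscale capture of the color card; the clipping thresholds are illustrative values, not from the patent:

```python
import numpy as np

def exposure_report(gray, low=10, high=245, clip_limit=0.02):
    """Judge exposure of an 8-bit grayscale image of the colour card.
    Flags under/over-exposure when more than `clip_limit` of the pixels
    sit at the dark or bright extremes of the histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    frac = hist / hist.sum()
    return {
        "mean_level": float(gray.mean()),
        "underexposed": float(frac[:low].sum()) > clip_limit,
        "overexposed": float(frac[high:].sum()) > clip_limit,
    }
```

Repeating such a report per LED intensity setting gives a crude way to pick the intensity whose histogram is neither clipped dark nor clipped bright; color-temperature analysis would need the raw RGB channels as well.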
Further, as a preferred embodiment of the method, the step of calibrating the spatial relationship between the color camera, the structured light camera, and the hyperspectral camera specifically includes:
s0031, installing a color camera, a structured light camera and a hyperspectral camera on a clamp of a mechanical arm, and placing a checkerboard calibration plate in a darkroom;
s0032, simultaneously shooting a plurality of chessboard pattern calibration plate images from multiple angles based on a color camera, a structured light camera and a hyperspectral camera, and recording mechanical arm position information of each angle during shooting;
Specifically, the calibration board coordinate system is denoted board, the mechanical arm base coordinate system base, the mechanical arm end coordinate system end, the color camera coordinate system camera1, the structured light camera coordinate system camera2 and the hyperspectral camera coordinate system camera3:
and S0033, processing the collected checkerboard calibration board images, and combining with the mechanical arm site information to obtain a rotational translation matrix among the color camera, the structured light camera and the hyperspectral camera, and obtain a rotational translation relation among the color camera, the structured light camera and the hyperspectral camera relative to the tail end of the mechanical arm.
Based on the captured images, the pose of each camera in the calibration board coordinate system is computed with the camera calibration toolbox in Matlab, each pose being expressed as a rotation-translation transformation matrix. The transformation matrix from the coordinate system of camera i to the calibration board coordinate system can be written as T_camera_i^board, and conversely its inverse as T_board^camera_i. Hand-eye calibration then yields the transformation matrix from the mechanical arm end to each camera coordinate system, T_end^camera_i, and conversely T_camera_i^end, and the transformation matrix from the arm end to the base is computed as T_end^base. In summary, the transformation relationship between the camera coordinate systems is:
T_camera_j^camera_i = T_board^camera_i · T_camera_j^board, for i, j ∈ {1, 2, 3},
and the transformation relation from the structured light camera coordinate system to the mechanical arm base coordinate system is:
T_camera2^base = T_end^base · T_camera2^end.
Specifically, the site information is the rotation and translation information of the mechanical arm end relative to the base, such as Euler angles and Cartesian coordinates, fed back automatically by the mechanical arm.
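The matrix chains above can be sketched with homogeneous 4 x 4 transforms in numpy; `make_T`, `inv_T` and `camera_to_camera` are hypothetical helper names, and the board-pose inputs stand in for the Matlab toolbox output:

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid transform without a general matrix inverse:
    inv([R t; 0 1]) = [R^T  -R^T t; 0 1]."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

def camera_to_camera(T_board_cam1, T_board_cam2):
    """Chain camera1 -> board -> camera2, where T_board_camX is the pose of
    camera X in the calibration board frame (points camX -> board)."""
    return inv_T(T_board_cam2) @ T_board_cam1
```

The same composition pattern gives the structured-light-to-base transform: `T_base_cam2 = T_base_end @ T_end_cam2`.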
Further, as a preferred embodiment of the method, the step of moving the color camera, the structured light camera and the hyperspectral camera to a plurality of preset site combinations by the control mechanical arm and acquiring images of the crops to obtain the color information, the three-dimensional information and the spectral information of the crops at corresponding sites specifically includes:
s2011, moving the color camera, the structured light camera and the hyperspectral camera to a preset locus;
s2012, turning on an LED light source, and controlling a color camera and a structured light camera to acquire data to obtain crop color information and three-dimensional information;
specifically, the structured light camera and the hyperspectral camera can simultaneously acquire two-dimensional grayscale images.
S2013, after the collection is finished, the LED light source is turned off, the halogen bulb light source is turned on, the hyperspectral camera is controlled to collect data, crop spectrum information is obtained, and the halogen bulb light source is turned off;
and S2014, moving the mechanical arm control color camera, the structured light camera and the hyperspectral camera to the next position, and repeating the steps S2011-S2013 to obtain the crop color information, the three-dimensional information and the spectrum information collected under each mechanical arm position.
Specifically, the color camera, the structured light camera and the hyperspectral camera are connected to the mechanical arm end through the sensor clamp, so that their working positions and angles are controlled by the mechanical arm.
Further, as a preferred embodiment of the method, the step of fusing the crop color information, three-dimensional information and spectral information at the corresponding position points to obtain a complete crop three-dimensional point cloud specifically includes:
S3011, according to the rotation-translation matrices between the cameras, normalizing and fusing the crop color information and three-dimensional information at each mechanical arm position point with the crop spectral information to obtain fused data;
Specifically, according to the calibrated rotation-translation relationships between the cameras, the data at each mechanical arm position point are normalized and fused under the structured light camera; that is, the crop color information and spectral information are fused onto and attached to the crop three-dimensional point cloud, yielding crop point cloud fusion data carrying both color information and spectral information.
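The fusion in S3011 amounts to re-expressing the structured-light points in another camera's frame and sampling the corresponding pixels. A minimal sketch, assuming a pinhole camera model; the function name `attach_spectra` and all numbers are illustrative, not from the patent:

```python
import numpy as np

def attach_spectra(points, R, t, K, spectral_image):
    """points: (N,3) in the structured-light frame; R, t: calibrated
    rotation-translation into the hyperspectral camera frame; K: intrinsics.
    Returns one spectral pixel (all bands) per point."""
    cam = points @ R.T + t              # into the hyperspectral camera frame
    uv = cam @ K.T                      # pinhole projection (homogeneous)
    uv = uv[:, :2] / uv[:, 2:3]         # perspective divide -> pixel coords
    u = np.clip(uv[:, 0].round().astype(int), 0, spectral_image.shape[1] - 1)
    v = np.clip(uv[:, 1].round().astype(int), 0, spectral_image.shape[0] - 1)
    return spectral_image[v, u]         # (N, bands)

# Synthetic example: identity extrinsics, toy 4x4 image with 3 bands.
K = np.array([[2.0, 0, 2.0], [0, 2.0, 2.0], [0, 0, 1.0]])
img = np.arange(4 * 4 * 3, dtype=float).reshape(4, 4, 3)
pts = np.array([[0.0, 0.0, 1.0]])       # projects to pixel (2, 2)
spectra = attach_spectra(pts, np.eye(3), np.zeros(3), K, img)
```

The same sampling with the color camera's extrinsics and intrinsics attaches RGB values, giving a point cloud carrying both color and spectra.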
S3012, converting the fused data into the coordinate system of the mechanical arm end according to the rotation-translation relationship between the cameras and the mechanical arm end;
S3013, uniformly converting the fused data at each position point into the mechanical arm base coordinate system according to the mechanical arm position point information, to obtain a plurality of preliminarily registered point cloud data;
Specifically, the position point information is the rotation and translation information of the mechanical arm end relative to the base, such as Euler angles and Cartesian coordinates, fed back automatically by the mechanical arm. According to the transformation relationship between the cameras and the mechanical arm base, the fused data acquired at each position point can be uniformly moved into the mechanical arm base coordinate system, giving preliminarily registered omnidirectional crop data.
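Steps S3012-S3013 are compositions of rigid transforms. A sketch with synthetic numbers, assuming the hand-eye transform (camera to arm end) and the arm's feedback pose (end to base) are available as 4x4 homogeneous matrices:

```python
import numpy as np

def homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_base_frame(points_cam, T_end_cam, T_base_end):
    """points_cam: (N,3) in the camera frame. Compose camera->end->base
    (S3012 then S3013) and apply the result to the points."""
    T_base_cam = T_base_end @ T_end_cam
    homog = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (homog @ T_base_cam.T)[:, :3]

# Illustrative numbers: camera offset 0.1 m along the end-effector z axis;
# the arm reports its end rotated 90 degrees about base z and lifted 0.5 m.
Rz90 = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
T_end_cam = homogeneous(np.eye(3), [0, 0, 0.1])
T_base_end = homogeneous(Rz90, [0, 0, 0.5])
pts_base = to_base_frame(np.array([[1.0, 0, 0]]), T_end_cam, T_base_end)
```

Repeating this with each position point's feedback pose expresses every fused scan in the one shared base frame, which is exactly the preliminary registration.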
S3014, performing pairwise ICP iterative registration on the plurality of preliminarily registered point clouds to obtain an accurate stitching relationship;
Specifically, the point cloud data acquired at the position point where the camera view angle is a depression angle (looking down at the crop) is selected as the base point cloud for fine registration. Crop organs with semantic labels are segmented from it according to the discrimination model, and the other preliminarily registered point clouds are segmented in the same way. For each semantic label, the numbers of points in the base point cloud and in the corresponding organ point clouds of the other preliminarily registered point clouds are then counted, and the pair of organs with the smallest difference in point count is selected. The organ point cloud belonging to the base point cloud serves as the source point cloud in ICP (Iterative Closest Point) fine registration, the other organ point cloud serves as the target point cloud, and local iterative registration yields the coordinate transformation matrix for fine registration and stitching; applying this matrix to the whole view-angle point cloud gives the accurate stitching relationship.
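For reference, one generic point-to-point ICP iteration (nearest-neighbour matching plus a Kabsch/SVD fit) can be sketched as below. This plain variant omits the semantic, organ-guided source/target selection the patent describes; it only illustrates the local iterative registration step itself.

```python
import numpy as np

def kabsch(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst (SVD)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=30):
    """Iteratively align source to target; returns the moved source cloud."""
    src = source.copy()
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force; fine for small clouds)
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        matched = target[d2.argmin(1)]
        R, t = kabsch(src, matched)
        src = src @ R.T + t
    return src

# Synthetic check: a slightly rotated and shifted copy should snap back.
rng = np.random.default_rng(0)
target = rng.uniform(0, 1, (300, 3))
angle = np.radians(2.0)                       # small initial misalignment
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle), np.cos(angle), 0], [0, 0, 1]])
source = target @ Rz.T + 0.02
aligned = icp(source, target)
rms = np.sqrt(((aligned - target) ** 2).sum(1).mean())
```

Like the patent's variant, ICP only refines an already-close initialization, which is why the preliminary base-frame registration of S3013 comes first.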
S3015, applying the accurate stitching relationship to the fused data for stitching, obtaining the complete crop three-dimensional point cloud.
Referring to Fig. 2 and Fig. 3, a conveyor belt type crop phenotype acquisition system comprises:
The conveyor belt module is used for conveying the crop to the darkroom light environment module;
the darkroom light environment module is used for isolating the influence of external environment light to form a stable light environment;
the darkroom light environment module comprises an external frame, wherein the external frame is built by adopting an aluminum profile and is square, the periphery and the top of the external frame are surrounded by black light absorption cloth, and a darkroom is formed inside the external frame; the black light absorption cloth is divided into a front side and a rear side according to the conveying direction of the conveying belt conveying module, the light absorption cloth on the front side and the rear side is arranged in a rolling curtain mode, and the light absorption cloth is controlled by a motor to be folded and put down.
The multi-source information acquisition module is used for acquiring crop color information, three-dimensional information and spectral information;
specifically, the multi-source information acquisition module comprises a mechanical arm, a color camera, a structured light camera, a hyperspectral camera and a sensor clamp.
The data processing module is used for fusing the collected crop color information, three-dimensional information and spectral information to obtain a complete crop three-dimensional point cloud;
and the control module is used for controlling the start and stop of the conveyor belt module, the darkroom light environment module, the multi-source information acquisition module and the data processing module.
The contents in the above method embodiments are all applicable to the present system embodiment, the functions specifically implemented by the present system embodiment are the same as those in the above method embodiment, and the beneficial effects achieved by the present system embodiment are also the same as those achieved by the above method embodiment.
A conveyor belt type crop phenotype acquisition device comprises:
at least one processor; and
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the conveyor belt type crop phenotype acquisition method described above.
The contents in the above method embodiments are all applicable to the present apparatus embodiment, the functions specifically implemented by the present apparatus embodiment are the same as those in the above method embodiments, and the advantageous effects achieved by the present apparatus embodiment are also the same as those achieved by the above method embodiments.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. A conveyor belt type crop phenotype acquisition method, comprising the following steps:
S101, conveying a crop to a pre-built darkroom environment through a conveyor belt;
S201, controlling a mechanical arm to move a color camera, a structured light camera and a hyperspectral camera to a plurality of preset position points, and acquiring images of the crop to obtain crop color information, three-dimensional information and spectral information at the corresponding position points;
S301, fusing the crop color information, three-dimensional information and spectral information at the corresponding position points to obtain a complete crop three-dimensional point cloud, and constructing a crop growth situation tracking database.
2. The method according to claim 1, further comprising:
S401, segmenting the complete crop three-dimensional point cloud and key phenotype information based on preset labels, and training a deep learning model to obtain a discrimination model;
S501, segmenting the phenotype information of a crop to be measured based on the discrimination model and judging whether the phenotype information has been completely acquired, to obtain a discrimination result.
3. The conveyor belt type crop phenotype acquisition method according to claim 1, wherein an LED light source and a halogen bulb light source are arranged in the darkroom environment, and before the step of conveying the crop to the pre-built darkroom environment through the conveyor belt, the method further comprises a calibration step, specifically comprising:
S001, calibrating cooperatively with the color camera and the structured light camera to determine the number, positions and luminous intensity of the LED light sources;
S002, calibrating cooperatively with the hyperspectral camera to determine the number, positions and luminous intensity of the halogen bulb light sources;
S003, calibrating the spatial relationships among the color camera, the structured light camera and the hyperspectral camera.
4. The conveyor belt type crop phenotype acquisition method according to claim 2, wherein a black light-absorbing cloth roller shutter is arranged in the darkroom environment, and the step of conveying the crop to the pre-built darkroom environment through the conveyor belt specifically comprises:
S1011, conveying a plurality of crops sequentially and orderly into the darkroom by the conveyor belt at fixed intervals and fixed times;
S1012, when it is determined that the conveyor belt is running, retracting the black light-absorbing cloth roller shutter;
S1013, when it is determined that the conveyor belt has stopped running, lowering the black light-absorbing cloth roller shutter to construct the darkroom environment.
5. The conveyor belt type crop phenotype acquisition method according to claim 4, wherein the step of calibrating cooperatively with the color camera and the structured light camera to determine the number, positions and luminous intensity of the LED light sources specifically comprises:
S0011, setting a plurality of groups of LED light source arrangements according to different numbers, positions and forms of LED light sources;
S0012, placing a target crop in the darkroom and photographing it under each LED light source arrangement through the structured light camera, to obtain a plurality of groups of crop point clouds under the different LED light source arrangements;
S0013, determining the optimal LED light source arrangement according to the quality of the crop point clouds under the plurality of groups of different LED light source arrangements;
S0014, placing a standard color card in the darkroom and photographing it through the color camera under the optimal LED light source arrangement, to obtain a plurality of groups of standard color card photographs under different LED light source settings;
S0015, processing the standard color card photographs under the different LED light source settings based on color calibration software, analyzing the corresponding exposure histograms and color temperatures, and determining the luminous intensity of each LED light source under the optimal LED light source arrangement.
6. The method according to claim 5, wherein the step of calibrating the spatial relationships among the color camera, the structured light camera and the hyperspectral camera specifically comprises:
S0031, mounting the color camera, the structured light camera and the hyperspectral camera on a clamp of the mechanical arm, and placing a checkerboard calibration board in the darkroom;
S0032, simultaneously shooting a plurality of checkerboard calibration board images from multiple angles with the color camera, the structured light camera and the hyperspectral camera, and recording the mechanical arm position point information at each shooting angle;
S0033, processing the acquired checkerboard calibration board images and combining them with the mechanical arm position point information, to obtain the rotation-translation matrices among the color camera, the structured light camera and the hyperspectral camera, as well as the rotation-translation relationships of the three cameras relative to the mechanical arm end.
7. The conveyor belt type crop phenotype acquisition method according to claim 6, wherein the step of controlling the mechanical arm to move the color camera, the structured light camera and the hyperspectral camera to a plurality of preset position points and acquiring images of the crop to obtain crop color information, three-dimensional information and spectral information at the corresponding position points specifically comprises:
S2011, moving the color camera, the structured light camera and the hyperspectral camera to a preset position point;
S2012, turning on the LED light source, and controlling the color camera and the structured light camera to acquire data, obtaining the crop color information and three-dimensional information;
S2013, after the acquisition finishes, turning off the LED light source, turning on the halogen bulb light source, controlling the hyperspectral camera to acquire data to obtain the crop spectral information, and then turning off the halogen bulb light source;
S2014, moving the color camera, the structured light camera and the hyperspectral camera to the next position point with the mechanical arm, and repeating steps S2011-S2013 to obtain the crop color information, three-dimensional information and spectral information acquired at each mechanical arm position point.
8. The conveyor belt type crop phenotype acquisition method according to claim 7, wherein the step of fusing the crop color information, three-dimensional information and spectral information at the corresponding position points to obtain a complete crop three-dimensional point cloud specifically comprises:
S3011, according to the rotation-translation matrices between the cameras, normalizing and fusing the crop color information and three-dimensional information at each mechanical arm position point with the crop spectral information to obtain fused data;
S3012, converting the fused data into the coordinate system of the mechanical arm end according to the rotation-translation relationship between the cameras and the mechanical arm end;
S3013, uniformly converting the fused data at each position point into the mechanical arm base coordinate system according to the mechanical arm position point information, to obtain a plurality of preliminarily registered point cloud data;
S3014, performing pairwise ICP iterative registration on the plurality of preliminarily registered point clouds to obtain an accurate stitching relationship;
S3015, applying the accurate stitching relationship to the fused data for stitching, obtaining the complete crop three-dimensional point cloud.
9. A conveyor belt type crop phenotype acquisition system, comprising:
a conveyor belt module for conveying a crop to a darkroom light environment module;
the darkroom light environment module, for isolating the influence of external ambient light to form a stable light environment;
a multi-source information acquisition module for acquiring crop color information, three-dimensional information and spectral information;
a data processing module for fusing the acquired crop color information, three-dimensional information and spectral information to obtain a complete crop three-dimensional point cloud; and
a control module for controlling the start and stop of the conveyor belt module, the darkroom light environment module, the multi-source information acquisition module and the data processing module.
10. A conveyor belt type crop phenotype acquisition device, comprising:
at least one processor; and
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the conveyor belt type crop phenotype acquisition method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110985641.7A CN113607659B (en) | 2021-08-26 | 2021-08-26 | Conveyor belt type crop phenotype acquisition method, system and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113607659A true CN113607659A (en) | 2021-11-05 |
CN113607659B CN113607659B (en) | 2022-06-28 |
Family
ID=78342057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110985641.7A Active CN113607659B (en) | 2021-08-26 | 2021-08-26 | Conveyor belt type crop phenotype acquisition method, system and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113607659B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108895964A (en) * | 2018-07-09 | 2018-11-27 | 南京农业大学 | A kind of high-throughput hothouse plants phenotype measuring system based on Kinect Auto-calibration |
CN108981569A (en) * | 2018-07-09 | 2018-12-11 | 南京农业大学 | A kind of high-throughput hothouse plants phenotype measuring system based on the fusion of multispectral cloud |
CN113112504A (en) * | 2021-04-08 | 2021-07-13 | 浙江大学 | Plant point cloud data segmentation method and system |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116342539A (en) * | 2023-03-22 | 2023-06-27 | 深圳市康士达科技有限公司 | Quick construction method, device and medium for machine vision environment |
CN116342539B (en) * | 2023-03-22 | 2023-12-12 | 深圳市康士达科技有限公司 | Quick construction method, device and medium for machine vision environment |
CN116772731A (en) * | 2023-06-21 | 2023-09-19 | 华中农业大学 | Mushroom fruiting body high-throughput phenotype and quality detection device |
CN116907576A (en) * | 2023-07-13 | 2023-10-20 | 广东省农业科学院设施农业研究所 | Automatic seedling detection system, method and electronic equipment |
CN116972973A (en) * | 2023-07-17 | 2023-10-31 | 中国科学院上海光学精密机械研究所 | Full-automatic space spectrum measuring device and method for luminous object |
Also Published As
Publication number | Publication date |
---|---|
CN113607659B (en) | 2022-06-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113607659B (en) | Conveyor belt type crop phenotype acquisition method, system and device | |
Onishi et al. | An automated fruit harvesting robot by using deep learning | |
CN105717115B (en) | High-throughput Plant phenotypic analysis device and method based on optical image technology | |
JP5396484B2 (en) | Method and apparatus suitable for measuring the growth of plant leaf pieces | |
EP3361403B1 (en) | Automated identification of shoe parts | |
CN110274556B (en) | Plant phenotype information extraction method | |
CN103502422A (en) | Image capture and lighting apparatus | |
CN112595367A (en) | Rice root system property nondestructive measurement device based on intelligent robot | |
CN114255334B (en) | Shape feature acquisition device, database and identification system for traditional Chinese medicine | |
Huang et al. | An automatic machine vision-guided grasping system for Phalaenopsis tissue culture plantlets | |
CN204622060U (en) | fruit and vegetable picking robot mechanical arm and robot | |
CN106846462B (en) | insect recognition device and method based on three-dimensional simulation | |
TWM567355U (en) | Multi-spectral image analysis system architecture | |
EP3172954B1 (en) | A system for automatic scarification and assessment of vitality of seeds and a method for automatic scarification and assessment of vitality of seeds | |
Jin et al. | Edge recognition and reduced transplantation loss of leafy vegetable seedlings with Intel RealsSense D415 depth camera | |
CN215525565U (en) | Automatic nondestructive testing system for fruit and vegetable seedling phenotype | |
CN113554691B (en) | Plant height measuring method | |
Kochi et al. | All-around 3D plant modeling system using multiple images and its composition | |
Jung et al. | Depth image conversion model based on CycleGAN for growing tomato truss identification | |
CN110064601B (en) | Seedling detection and classification system and classification method for vegetable grafting | |
CN110741790A (en) | Multi-claw transplanting-sorting processing method for plug seedlings based on depth camera | |
EP2719273A1 (en) | Method and apparatus for cutting plants | |
CN108489418B (en) | Seedling leaf three-dimensional form measuring method and device | |
CN113228957A (en) | Plug seedling taking terminal and seedling searching method | |
CN107976444A (en) | A kind of big chrysanthemum chrysanthemum flower information automatic detection device based on Visible Light Camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||