CN108573504A - 3D image generation method and system for analyzing the phenotype of a plant - Google Patents

3D image generation method and system for analyzing the phenotype of a plant

Info

Publication number
CN108573504A
CN108573504A CN201810202788.2A
Authority
CN
China
Prior art keywords
sensors
robot
image
posture
plant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810202788.2A
Other languages
Chinese (zh)
Other versions
CN108573504B (en)
Inventor
金俊植
金亨锡
李芸奭
吴宗右
卢周嫄
吴尙录
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Advanced Institute of Science and Technology KAIST
Original Assignee
Korea Advanced Institute of Science and Technology KAIST
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Advanced Institute of Science and Technology KAIST filed Critical Korea Advanced Institute of Science and Technology KAIST
Publication of CN108573504A
Application granted
Publication of CN108573504B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/0098Plants or trees
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Abstract

The present invention relates to a 3D image generation method and system for analyzing the phenotype of a plant. The 3D image generation system of the invention includes: a plurality of markers arranged on the inner wall of an imaging chamber to form a pattern; a 3D sensor that captures images of the plant; a robot that includes a robot base mounted on a mounting portion of the imaging chamber and an end-effector on which the 3D sensor is mounted, and that is driven so that the 3D sensor can capture images while rotating around the plant; and a sensor pose estimation unit that unifies the coordinate system of the imaging chamber, the coordinate system of the robot, and the coordinate system of the 3D sensor with one another using the position information of the markers to estimate the pose of the 3D sensor. Precise 3D image structures can therefore be recovered in real time even for small, thin plants, with higher 3D structure recovery accuracy and speed than existing 3D calibration.

Description

3D image generation method and system for analyzing the phenotype of a plant
Technical field
The present invention relates to a 3D image generation method and system for analyzing the phenotype of a plant, and in particular to a technique for accurately estimating the position of a 3D sensor that captures images of the plant.
Background art
The phenotypic analysis of plants refers to techniques for measuring and analyzing growth and development information such as the height of a plant, the number of its leaves, its color, and its appearance.
The leaves, stems, branches, and fruit of a plant largely have thin and irregular structures, and leaves are usually layered, so the plant structure needs to be observed from a variety of viewpoints. Moreover, because the growth state of a plant differs over time and across growth stages, observation in real time is also required.
In the past, measurements were taken directly by hand; with the recent development of imaging and communication technology, techniques for automatic measurement and analysis using 3D sensors have been attempted.
Plant measurement techniques using 3D sensors can be divided into two types: fixing the 3D sensor and moving the plant, or fixing the plant and moving the 3D sensor.
Fixing the 3D sensor and rotating the plant makes the sensor position easy to obtain, but moving the plant causes problems such as leaf shaking; to capture an accurate image, one must wait until the leaves come to rest after each move, which is time-consuming and unsuitable for high-speed sensing.
Fixing the plant and moving the 3D sensor solves those problems, but the exact position of the moving 3D sensor cannot be obtained, so it is limited by the time required for integration and 3D modeling.
Summary of the invention
Technical problem
To solve the above problems, an object of the present invention is to provide a method and system for estimating the precise pose of a 3D sensor by arranging a plurality of pattern-forming markers on the inner wall of an imaging chamber and performing a two-step coordinate system unification based on them.
Technical solution
A 3D image generation system for analyzing the phenotype of a plant for achieving the above object is characterized in that it may include: a plurality of markers arranged on the inner wall of an imaging chamber to form a pattern; a 3D sensor that captures images of the plant; a robot that includes a robot base mounted on a mounting portion of the imaging chamber and an end-effector on which the 3D sensor is mounted, and that is driven so that the 3D sensor can capture images while rotating around the plant; and a sensor pose estimation unit that unifies the coordinate system of the imaging chamber, the coordinate system of the robot, and the coordinate system of the 3D sensor with one another using the position information of the markers to estimate the pose of the 3D sensor.
The sensor pose estimation unit may include: a coordinate unification unit that estimates the coordinate transformations between the 3D sensor and the robot and between the robot and the imaging chamber using the position information of the markers and unifies the coordinate systems; and a sensor pose refinement unit that estimates the pose of the 3D sensor from the pose information of the robot according to the unification by the coordinate unification unit, and optimizes the pose of the 3D sensor according to the detected position information of the markers.
The coordinate unification unit may include: a first coordinate transformation unit that estimates the coordinate transformation between the robot base and the end-effector using the kinematic information of the robot; and a second coordinate transformation unit that estimates the coordinate transformation between the imaging chamber and the 3D sensor using the position information of the markers.
The coordinate unification unit may also include: a third coordinate transformation unit that, using the constrained motion information of the 3D sensor based on the pose information of the robot and the position information of the markers detected while the robot is moved through a variety of poses, estimates the coordinate transformation between the end-effector of the robot and the 3D sensor and the coordinate transformation between the imaging chamber and the robot base.
Further, the sensor pose refinement unit may re-search the positions of the markers within the captured image to optimize the pose of the 3D sensor, thereby minimizing the pose inconsistency of the 3D sensor that occurs because of a difference between the moment the capture signal of the 3D sensor occurs and the capture moment.
The markers may include retroreflective stickers (Retroreflector Sticker).
In addition, a 3D image generation method for analyzing the phenotype of a plant for achieving the above object is characterized in that it may include: a step of driving a robot that includes an end-effector on which a 3D sensor is mounted so that the 3D sensor captures, from a variety of poses, a pattern formed on the inner wall of an imaging chamber; and a step of unifying the coordinate system of the imaging chamber, the coordinate system of the robot, and the coordinate system of the 3D sensor with one another using the position information of the pattern to estimate the pose of the 3D sensor.
The step of estimating the pose of the 3D sensor may include: a step of estimating the coordinate transformations between the 3D sensor and the robot and between the robot and the imaging chamber using the position information of the pattern to unify the coordinate systems; and a step of estimating the pose of the 3D sensor based on the pose of the robot and optimizing the pose of the 3D sensor according to the detected position information of the pattern.
The step of unifying the coordinate systems may include: a step of estimating the coordinate transformation between the robot base of the robot and the end-effector using the kinematic information of the robot; a step of estimating the coordinate transformation between the imaging chamber and the 3D sensor using the position information of the pattern; and a step of estimating, using the position information of the pattern detected while the robot is moved through a variety of poses, the coordinate transformation between the end-effector of the robot and the 3D sensor and the coordinate transformation between the imaging chamber and the robot base. The step of optimizing the pose of the 3D sensor may re-search the positions of the pattern within the captured image to optimize the pose of the 3D sensor, thereby minimizing the pose inconsistency of the 3D sensor that occurs because of a difference between the moment the capture signal of the 3D sensor occurs and the capture moment.
Advantageous effects
As described above, the 3D image generation method and system for analyzing the phenotype of a plant according to the present invention can estimate the precise pose of a 3D sensor by arranging a plurality of pattern-forming markers on the inner wall of an imaging chamber and performing a two-step coordinate system unification based on them.
Precise 3D image structures can thus be recovered in real time even for small, thin plants, with higher 3D structure recovery accuracy and speed than existing 3D calibration.
Furthermore, the present invention improves the accuracy of 3D image structure recovery especially for plants that are thin and polymorphic, and improves the accuracy of subsequent 3D image processing such as 3D segmentation (3D segmentation) and texture mapping (texture mapping), making 3D analysis more accurate.
Description of the drawings
Fig. 1 is a schematic diagram of a 3D image generation system for analyzing the phenotype of a plant according to an embodiment of the present invention;
Fig. 2 shows the coordinate systems of the configuration of Fig. 1;
Fig. 3 is a detailed block diagram of the sensor pose estimation unit of Fig. 1;
Fig. 4 is a flowchart of a 3D image generation method for analyzing the phenotype of a plant using the 3D image generation system of Fig. 1;
Fig. 5 is a flowchart showing the coordinate system unification and 3D sensor pose refinement of Fig. 4 in more detail;
Fig. 6 shows an implementation example of the plant image generation system and method of the above embodiment.
Reference signs
1: imaging chamber    10: marker
20: robot    21: robot base
23: end-effector    30: 3D sensor
40: sensor pose estimation unit    41: coordinate unification unit
43: sensor pose refinement unit
Detailed description
Specific embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a 3D image generation system for analyzing the phenotype of a plant according to an embodiment of the present invention, and Fig. 2 shows the coordinate systems of its configuration. Referring to Fig. 1, the 3D image generation system for analyzing the phenotype of a plant of an embodiment of the invention includes a plurality of markers 10 arranged on the imaging chamber 1 so as to form a predetermined pattern, a robot (robot) 20, a 3D sensor 30, and a sensor pose estimation unit 40.
The markers 10 may be arranged on the inner wall of the imaging chamber 1 so as to form a predetermined pattern. The markers 10 serve as references when estimating the pose of the 3D sensor and may be retroreflective stickers (Retroreflector Sticker). The pattern is chosen to be easy to detect, and easy to match, around the positions that the 3D sensor 30 mainly senses.
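Retroreflective stickers return light toward its source, so under illumination from near the sensor they appear as near-saturated spots, and a simple bright-blob detector is usually enough to locate them. The following is a minimal sketch of such a detector in Python with OpenCV; the threshold and blob-area limits are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def detect_markers(gray):
    """Return pixel centroids of bright retroreflector blobs (illustrative)."""
    # Retroreflectors appear close to saturation; threshold near the top of the range.
    _, mask = cv2.threshold(gray, 240, 255, cv2.THRESH_BINARY)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Keep small, compact blobs; label 0 is the background.
    keep = [i for i in range(1, n) if 3 <= stats[i, cv2.CC_STAT_AREA] <= 200]
    return centroids[keep]
```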
The robot 20 carries the 3D sensor 30 and is driven so that the 3D sensor 30 captures images while rotating around the plant; it includes a robot base 21 mounted on a mounting portion of the imaging chamber 1 and an end-effector (End-effector) 23 on which the 3D sensor 30 is mounted.
The robot base 21 is fixed at the mounting portion and its position does not change; the 3D sensor 30 is mounted on the end-effector 23, which is driven so that the 3D sensor 30 rotates 360 degrees around the plant. Although not shown in Fig. 1, the robot 20 further includes, besides the robot base 21 and the end-effector 23, a traverse frame (not shown), a traverse-frame rotation mechanism (not shown), a horizontal drive unit (not shown), and the like; these components are combined and driven so that the end-effector 23 rotates around the plant with the robot base 21 as the center of rotation. The end-effector 23 can also be moved up and down, so the capture position can be changed appropriately according to the type, growth stage, and size of the plant.
The 3D sensor 30 captures images of the plant and is mounted on the end-effector 23 of the robot 20. As the end-effector 23 of the robot 20 moves, the 3D sensor 30 takes poses with six degrees of freedom (X, Y, Z, pitch (Pitch), yaw (Yaw), roll (Roll)).
The sensor pose estimation unit 40 uses the position information of the markers 10 in the images obtained by the 3D sensor to unify the coordinate system of the imaging chamber 1, the coordinate system of the robot 20, and the coordinate system of the 3D sensor 30 with one another, and thereby estimates the pose of the 3D sensor 30; it may consist of software for the coordinate unification and estimation and a processor that runs the software.
Fig. 3 is a detailed block diagram of the sensor pose estimation unit 40 of Fig. 1. Referring to Fig. 3, the sensor pose estimation unit 40 of an embodiment of the invention includes a coordinate unification unit 41 and a sensor pose refinement unit 43.
The coordinate unification unit 41 estimates the coordinate transformations between the 3D sensor 30 and the robot 20 and between the robot 20 and the imaging chamber 1 using the position information of the markers 10 in the images and unifies the coordinate systems; observing the positions of the markers 10 while the robot 20 is moved through a variety of poses, it uses the constrained motion information of the 3D sensor 30 based on the pose of the robot 20 to unify the coordinate systems precisely.
Referring to Fig. 2, four coordinate systems come into play when capturing images of the plant: the imaging chamber 1 coordinate system, the robot base 21 coordinate system, the end-effector 23 coordinate system, and the 3D sensor 30 coordinate system. The spatial coordinate system of the imaging chamber 1 is a fixed value that can be set arbitrarily; the coordinate system in which the positions of the markers 10 on the inner wall of the imaging chamber 1 are measured is taken as the coordinate system of the imaging chamber 1. The coordinate system of the robot 20 differs from that of the imaging chamber 1; it is fixed when the robot is installed, but its relation to the imaging chamber 1 is unknown. The 3D sensor 30 is mounted on the end-effector 23 of the robot 20; its coordinate system is set at the optical origin or along the orientation of the sensor plane, and its transformation relative to the end-effector 23 coordinate system is a fixed relation whose value is likewise unknown.
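In code terms, each of the four coordinate systems can be represented as a 4x4 homogeneous transform, and composing the chain of transforms from chamber to sensor and back closes into a loop. A tiny sketch of this bookkeeping follows; the names are illustrative, and the transforms themselves come from the calibration described below.

```python
import numpy as np

# Chain of frames: chamber -> base (T_rc), base -> end-effector (T_er_i),
# end-effector -> sensor (T_se), sensor -> chamber (T_cs_i).
# Composing the whole loop must give the identity (see mathematical expression 1 below).
def loop_closure_error(T_cs_i, T_se, T_er_i, T_rc):
    """Deviation of the chamber -> ... -> chamber loop from the identity."""
    loop = T_cs_i @ T_se @ T_er_i @ T_rc
    return np.linalg.norm(loop - np.eye(4))
```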
Referring to Fig. 3, the coordinate unification unit 41 includes a first coordinate transformation unit 41a, a second coordinate transformation unit 41b, and a third coordinate transformation unit 41c.
The first coordinate transformation unit 41a estimates the coordinate transformation between the robot base 21 and the end-effector 23 from the kinematic information of the robot 20; since the kinematics of the robot 20 are known, the transformation between the robot base 21 and the end-effector 23 can be estimated from them.
The second coordinate transformation unit 41b estimates the coordinate transformation between the imaging chamber 1 and the 3D sensor 30 using the position information of the markers 10.
The third coordinate transformation unit 41c uses the constrained motion information of the 3D sensor 30 based on the pose information of the robot 20, together with the position information of the markers 10 detected while the robot 20 is moved through a variety of poses, to estimate the coordinate transformation between the end-effector 23 of the robot 20 and the 3D sensor 30 and the coordinate transformation between the imaging chamber 1 and the robot base 21, thereby unifying the coordinate systems precisely.
The sensor pose refinement unit 43 first roughly estimates the pose of the 3D sensor 30 from the pose information of the moving robot 20, then, after detecting the positions of the markers 10 with that information, optimizes the pose of the 3D sensor 30 while correcting the pose information of the robot 20.
The pose of the 3D sensor 30 can be obtained roughly by observing the markers 10 arranged in the imaging chamber 1, but the viewing angle of the 3D sensor 30 may be insufficient, the position detection accuracy for an observed marker 10 may be poor, and there may be a difference between the moment the command to acquire data is issued to the sensor and the moment the 3D sensor 30 actually scans the plant image, so that synchronization accuracy is insufficient and markers 10 are observed away from the positions predicted from the pose of the robot 20.
To minimize such pose inconsistencies of the 3D sensor 30, the sensor pose refinement unit 43 of the present invention re-searches the positions of the markers 10 within the captured image, taking the moving direction as a reference, to optimize the pose of the 3D sensor 30 and thereby determine the current position of the 3D sensor 30 precisely.
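One way to realize this re-search, sketched below under stated assumptions, is to look for the brightest non-plant pixel in a window shifted from the predicted marker position along the sensor's image-space direction of motion, using the depth channel to mask out the plant. The window size, depth range, and all names are illustrative, not from the patent.

```python
import numpy as np

def research_marker(gray, depth, pred_px, motion_dir, plant_depth=(0.2, 0.8), win=15):
    """Re-search one marker near its predicted pixel, biased along the motion direction."""
    # Shift the search window along the (unit) image-space motion direction.
    cx, cy = np.round(pred_px + win * motion_dir).astype(int)
    patch = gray[cy - win:cy + win, cx - win:cx + win].astype(float)
    d = depth[cy - win:cy + win, cx - win:cx + win]
    # Remove the plant region using depth information, as the text describes.
    patch[(d > plant_depth[0]) & (d < plant_depth[1])] = -1.0
    iy, ix = np.unravel_index(np.argmax(patch), patch.shape)
    return np.array([cx - win + ix, cy - win + iy])
```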
Fig. 4 is a flowchart of the 3D image generation method for analyzing the phenotype of a plant using the 3D image generation system of Fig. 1. Description that overlaps with the above embodiment is omitted where appropriate.
First, a predetermined pattern is formed on the inner wall of the imaging chamber 1 with retroreflective stickers. The positions of the stickers are values known in advance.
Referring to Fig. 4, the robot 20 is driven so that the 3D sensor 30 captures the pattern formed on the inner wall of the imaging chamber 1 from a variety of robot 20 poses (S10). For example, to generate a 3D image of the plant, images are captured at 10-degree intervals while rotating 360 degrees around the plant.
Next, the coordinate transformations between the robot 20 and the imaging chamber 1 and between the robot 20 and the 3D sensor 30 are estimated using the position information of the pattern in the images, and the coordinate systems are unified with one another (S20). The pose of the 3D sensor 30 is then estimated while the robot pose information is corrected (S30).
Fig. 5 is a flowchart showing the coordinate system unification and 3D sensor 30 pose estimation of Fig. 4 in more detail.
Referring to Fig. 5, the coordinate transformation between the robot base 21 and the end-effector 23 is first estimated using the kinematic information of the robot 20 (S21). The coordinate transformation between the imaging chamber 1 and the 3D sensor 30 is estimated using the position information of the pattern formed by the markers 10 (S23). Then, using the position information of the pattern detected while the robot 20 is moved through a variety of poses, the coordinate transformation between the end-effector 23 of the robot 20 and the 3D sensor 30 and the coordinate transformation between the imaging chamber 1 and the robot base 21 are estimated (S25).
Through the above steps, the coordinate system of the imaging chamber 1, the coordinate system of the robot 20, and the coordinate system of the 3D sensor 30 are unified with one another. The pose of the 3D sensor 30 based on the pose of the robot 20 is then estimated through the unified coordinate systems, and the positions of the markers 10 are re-searched within the captured image, taking the moving direction as a reference, to optimize the pose of the 3D sensor 30 (S31).
An example of the coordinate system unification and pose estimation of an embodiment of the invention is described below.
Mathematical expression 1 below is the initial-solution formula for the coordinate system unification performed by the coordinate unification unit 41 of an embodiment of the invention.
[mathematical expression 1]
$$T_i^{CS}\,T^{SE}\,T_i^{ER}\,T^{RC} = I$$
$$\rightarrow\; T_i^{CS}\,T^{SE}\,T_i^{ER} = (T^{RC})^{-1}$$
Here $(T^{RC})^{-1}$ is constant in all poses, therefore
$$T_i^{CS}\,T^{SE}\,T_i^{ER} = T_j^{CS}\,T^{SE}\,T_j^{ER}$$
$$\rightarrow\; (T_j^{CS})^{-1}\,T_i^{CS}\,T^{SE} = T^{SE}\,T_j^{ER}\,(T_i^{ER})^{-1}$$
(in the above, $T^{RC}$: coordinate transformation matrix between the imaging chamber 1 and the robot base 21; $T^{SE}$: coordinate transformation matrix between the end-effector 23 and the 3D sensor 30; $T_i^{ER}$: coordinate transformation matrix between the robot base 21 and the end-effector 23 at the i-th pose; $T_i^{CS}$: coordinate transformation matrix between the 3D sensor 30 and the imaging chamber 1 at the i-th pose; indices i and j denote two different poses)
Over the variety of poses obtained by driving the robot 20, the initial solution of $T^{SE}$ is found from mathematical expression 1. $T^{RC}$ and $T^{SE}$ are fixed but unknown values, while $T_i^{ER}$ and $T_i^{CS}$ can be measured: as described above, $T_i^{ER}$ is computed by the first coordinate transformation unit 41a from the kinematic information of the robot 20, and $T_i^{CS}$ is computed by the second coordinate transformation unit 41b from the pattern.
Therefore, in $(T_j^{CS})^{-1}\,T_i^{CS}\,T^{SE} = T^{SE}\,T_j^{ER}\,(T_i^{ER})^{-1}$ of mathematical expression 1, only $T^{SE}$ is unknown, and its initial solution can be found from this matrix equation.
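This matrix equation is the classic hand-eye calibration form $A_i X = X B_i$ with $X = T^{SE}$, for which closed-form initial solutions are well known. The following is a minimal sketch of one such solution, in the style of Park and Martin, given paired relative motions as 4x4 homogeneous matrices; it illustrates the initial-solution step and is not the patent's exact implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def solve_ax_xb(A_list, B_list):
    """Solve A_i X = X B_i for X, given paired relative motions (4x4 matrices)."""
    # Rotation part: align the rotation axes in least squares (Park-Martin style).
    M = np.zeros((3, 3))
    for A, B in zip(A_list, B_list):
        alpha = R.from_matrix(A[:3, :3]).as_rotvec()  # log map of the rotation
        beta = R.from_matrix(B[:3, :3]).as_rotvec()
        M += np.outer(beta, alpha)
    u, s, vt = np.linalg.svd(M.T @ M)
    Rx = (u @ np.diag(s ** -0.5) @ vt) @ M.T          # (M^T M)^(-1/2) M^T
    # Translation part: stack (R_Ai - I) t_X = R_X t_Bi - t_Ai, solve least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.hstack([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_x, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, t_x
    return X
```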
After finding the initial solution, the coordinate unification unit 41 performs a precise calibration; here the pose of the 3D sensor 30 is estimated under the constraint of the robot drive equations. This exploits the principle that, since the position of the 3D sensor 30 depends on the motion of the robot 20, the sensor cannot be located at an arbitrary position.
The transformation between the imaging chamber 1 and the 3D sensor 30 at the i-th pose, i.e. $(T_i^{CS})^{-1}$, can be expressed as follows:
[mathematical expression 2]
$$(T_i^{CS})^{-1} = T^{SE}\,T_i^{ER}(V_i)\,T^{RC}$$
(where $V_i$: the robot drive parameters at the i-th pose)
When marker k among the markers 10 is observed by the 3D sensor 30 at the i-th pose, the actually observed position can be denoted $\tilde{P}_i^k$, and the predicted observation position can be expressed as follows.
[mathematical expression 3]
$$P_i^k = \mathrm{Proj}\left(K,\, d,\, (T_i^{CS})^{-1}\right)$$
(where $\mathrm{Proj}(\cdot)$: image projection function, $K$: 3D sensor intrinsic parameters (intrinsic parameter), $d$: sensor distortion parameters (distortion parameter))
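Mathematical expression 3 amounts to a standard pinhole projection of chamber-frame marker positions with lens distortion. A minimal sketch using OpenCV's projectPoints follows, assuming the 3D marker positions in the imaging chamber frame are known (they are, since the sticker positions are measured in advance); the function and variable names are illustrative.

```python
import cv2
import numpy as np

def predict_marker_pixels(T_cs_inv, markers_chamber, K, dist):
    """Project chamber-frame marker points through (T_i^CS)^-1 into the image."""
    rvec, _ = cv2.Rodrigues(T_cs_inv[:3, :3])   # chamber -> sensor rotation
    tvec = T_cs_inv[:3, 3]
    pixels, _ = cv2.projectPoints(markers_chamber.reshape(-1, 1, 3),
                                  rvec, tvec, K, dist)
    return pixels.reshape(-1, 2)
```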
The cost function can be expressed as follows, and $T^{SE}$ and $T^{RC}$ can be computed by nonlinear optimization.
[mathematical expression 4]
$$E\left(T^{SE}, T^{RC}\right) = \sum_i \sum_k \left\| \tilde{P}_i^k - P_i^k \right\|^2$$
[mathematical expression 5]
$$\left\{T^{SE*}, T^{RC*}\right\} = \underset{T^{SE},\,T^{RC}}{\arg\min}\; E\left(T^{SE}, T^{RC}\right)$$
The operation of the sensor pose refinement unit 43 of an embodiment of the invention is described below. When there is no difference between the moment the capture signal occurs and the actual capture moment, i.e. when the 3D sensor 30 stops at a specific position to capture, the pose of the 3D sensor 30 can be computed with mathematical expression 2.
However, when the 3D sensor 30 captures images while moving, there is a difference between the moment the capture signal occurs and the actual capture moment; to resolve the resulting pose inconsistency of the 3D sensor 30, the optimization of mathematical expression 5 is performed with the robot drive parameters at capture time treated as unknowns (unknown).
To this end, the observed positions $\tilde{P}_i^k$ are re-searched in the image, taking the moving direction of the 3D sensor 30 as a reference. Here, the plant region is removed using depth (depth) information, and any robot drive parameters that have fixed values are excluded from the optimization.
The sensor pose refinement unit 43 can use various optimization algorithms, for example local optimization (local optimization) methods such as the Levenberg-Marquardt (Levenberg-Marquardt) algorithm.
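As a concrete illustration of mathematical expressions 4 and 5, the sketch below stacks the reprojection residuals over all poses and markers and minimizes them with SciPy's Levenberg-Marquardt solver. The 6-vector (rotation vector plus translation) parameterization of $T^{SE}$ and $T^{RC}$ and all names are assumptions made for the sketch; predict_marker_pixels is the hypothetical projection helper above.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def to_mat(p):
    """6-vector (rotation vector, translation) -> 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R.from_rotvec(p[:3]).as_matrix()
    T[:3, 3] = p[3:]
    return T

def residuals(params, T_er_list, observed, markers_chamber, K, dist):
    """Stacked reprojection residuals of mathematical expression 4."""
    T_se, T_rc = to_mat(params[:6]), to_mat(params[6:])
    res = []
    for T_er, obs in zip(T_er_list, observed):
        T_cs_inv = T_se @ T_er @ T_rc   # (T_i^CS)^-1 per mathematical expression 2
        pred = predict_marker_pixels(T_cs_inv, markers_chamber, K, dist)
        res.append((pred - obs).ravel())
    return np.concatenate(res)

# Levenberg-Marquardt refinement of T^SE and T^RC from an initial guess x0:
# sol = least_squares(residuals, x0, method="lm",
#                     args=(T_er_list, observed, markers_chamber, K, dist))
```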
Fig. 6 shows an implementation example of the plant image generation system and method of the above embodiment. Fig. 6(a) is a photograph of the imaging chamber 1, whose inner wall is provided with a plurality of markers 10 (red dots), together with the robot 20 and camera that capture images while rotating 360 degrees around the plant inside the imaging chamber 1; Fig. 6(b) shows a 3D image of a plant obtained by this system and method.
As described above, the present invention unifies three coordinate systems, namely the spatial coordinate system of the imaging chamber 1, the coordinate system of the robot 20, and the coordinate system of the 3D sensor 30, with one another, so that the precise six-degree-of-freedom pose of the 3D sensor 30 at a particular moment can be estimated accurately in the spatial coordinate system of the imaging chamber 1.
All the components constituting the embodiments of the present invention have been described above as combined into one or operating in combination, but the invention is not limited to these embodiments; within the scope of the invention, all the components may alternatively be combined into one or more and operated together. Each component may be implemented as independent hardware, or some or all of the components may be selectively combined and implemented as a computer program having program modules that perform some or all of the combined functions on one or more pieces of hardware. The codes and code segments constituting the computer program can be easily derived by those skilled in the art. Such a computer program is stored in a computer-readable storage medium (Computer Readable Media), read and executed by a computer, and thereby implements embodiments of the invention. The storage medium of the computer program may include magnetic recording media, optical recording media, carrier-wave media, and the like.
In addition, unless specifically stated otherwise, terms such as "comprising", "consisting of", or "having" mean that the corresponding component may be included, and should be understood as possibly including other components rather than excluding them. Unless defined otherwise, all terms, including technical and scientific terms, have the same meaning as commonly understood by those of ordinary skill in the art to which this invention belongs. Commonly used terms, such as those defined in dictionaries, should be interpreted consistently with the context of the related art, and must not be interpreted in an idealized or excessively formal sense unless expressly so defined herein.
The above description merely illustrates the technical idea of the present invention, and those of ordinary skill in the art to which the invention belongs will understand that various modifications and variations can be made without departing from the essential characteristics of the invention. Therefore, the disclosed embodiments are intended not to limit but to illustrate the technical idea of the invention, and the scope of the technical idea is not limited by these embodiments. The scope of the invention should be interpreted according to the claims, and all technical ideas within their equivalent scope should be interpreted as included in the scope of the invention.

Claims (9)

1. A 3D image generation system for analyzing the phenotype of a plant, characterized by comprising:
a plurality of markers arranged on an inner wall of an imaging chamber so as to form a pattern;
a 3D sensor that captures images of the plant;
a robot comprising a robot base mounted on a mounting portion of the imaging chamber and an end-effector on which the 3D sensor is mounted, the robot being driven so that the 3D sensor can capture images while rotating around the plant; and
a sensor pose estimation unit that unifies a coordinate system of the imaging chamber, a coordinate system of the robot, and a coordinate system of the 3D sensor with one another using position information of the plurality of markers to estimate a pose of the 3D sensor.
2. The 3D image generation system for analyzing the phenotype of a plant according to claim 1, wherein the sensor pose estimation unit comprises:
a coordinate unification unit that estimates coordinate transformations between the 3D sensor and the robot and between the robot and the imaging chamber using the position information of the plurality of markers and unifies the coordinate systems; and
a sensor pose refinement unit that estimates the pose of the 3D sensor from pose information of the robot according to the unification by the coordinate unification unit, and optimizes the pose of the 3D sensor according to detected position information of the plurality of markers.
3. The 3D image generation system for analyzing the phenotype of a plant according to claim 2, wherein the coordinate unification unit comprises:
a first coordinate transformation unit that estimates a coordinate transformation between the robot base and the end-effector using kinematic information of the robot; and
a second coordinate transformation unit that estimates a coordinate transformation between the imaging chamber and the 3D sensor using the position information of the plurality of markers.
4. The 3D image generation system for analyzing the phenotype of a plant according to claim 2, wherein the coordinate unification unit comprises:
a third coordinate transformation unit that, using constrained motion information of the 3D sensor based on the pose information of the robot and using the position information of the plurality of markers detected while the robot is moved through a variety of poses, estimates a coordinate transformation between the end-effector of the robot and the 3D sensor and a coordinate transformation between the imaging chamber and the robot base.
5. The 3D image generation system for analyzing the phenotype of a plant according to claim 2, wherein:
the sensor pose refinement unit re-searches positions of the plurality of markers within a captured image to optimize the pose of the 3D sensor, thereby minimizing a pose inconsistency of the 3D sensor that occurs because of a difference between a moment a capture signal of the 3D sensor occurs and the capture moment.
6. The 3D image generation system for analyzing the phenotype of a plant according to claim 1 or 2, wherein:
the plurality of markers comprise retroreflective stickers (Retroreflector Sticker).
7. A 3D image generation method for analyzing the phenotype of a plant, characterized by comprising:
a step of driving a robot comprising an end-effector on which a 3D sensor is mounted so that the 3D sensor captures, from a variety of poses, a pattern formed on an inner wall of an imaging chamber; and
a step of unifying a coordinate system of the imaging chamber, a coordinate system of the robot, and a coordinate system of the 3D sensor with one another using position information of the pattern to estimate a pose of the 3D sensor.
8. The 3D image generation method for analyzing the phenotype of a plant according to claim 7, wherein the step of estimating the pose of the 3D sensor comprises:
a step of estimating coordinate transformations between the 3D sensor and the robot and between the robot and the imaging chamber using the position information of the pattern to unify the coordinate systems; and
a step of estimating the pose of the 3D sensor based on a pose of the robot and optimizing the pose of the 3D sensor according to detected position information of the pattern.
9. The 3D image generation method for analyzing the phenotype of a plant according to claim 8, wherein:
the step of unifying the coordinate systems comprises:
a step of estimating a coordinate transformation between a robot base of the robot and the end-effector using kinematic information of the robot;
a step of estimating a coordinate transformation between the imaging chamber and the 3D sensor using the position information of the pattern; and
a step of estimating, using position information of the pattern detected while the robot is moved through a variety of poses, a coordinate transformation between the end-effector of the robot and the 3D sensor and
a coordinate transformation between the imaging chamber and the robot base,
and the step of optimizing the pose of the 3D sensor re-searches positions of the pattern within a captured image to optimize the pose of the 3D sensor, thereby minimizing a pose inconsistency of the 3D sensor that occurs because of a difference between a moment a capture signal of the 3D sensor occurs and the capture moment.
CN201810202788.2A 2017-03-13 2018-03-12 3D image generation method and system for analyzing phenotype of plant Active CN108573504B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020170031378A KR101963643B1 (en) 2017-03-13 2017-03-13 3D Image Generating Method And System For A Plant Phenotype Analysis
KR10-2017-0031378 2017-03-13

Publications (2)

Publication Number Publication Date
CN108573504A true CN108573504A (en) 2018-09-25
CN108573504B CN108573504B (en) 2022-10-04

Family

ID=63573895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810202788.2A Active CN108573504B (en) 2017-03-13 2018-03-12 3D image generation method and system for analyzing phenotype of plant

Country Status (2)

Country Link
KR (1) KR101963643B1 (en)
CN (1) CN108573504B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114184123A (en) * 2021-12-15 2022-03-15 西南林业大学 Device and method for measuring and calculating three-dimensional green quantity of grassland sample

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102296308B1 (en) * 2018-12-31 2021-09-01 서울대학교산학협력단 Apparatus and method for plant analysis based on 3d plant structure model and ray tracing simulation


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1080882A (en) * 1996-09-06 1998-03-31 Fujitsu Ltd Coordinate transformation parameter measuring device for robot
US6033415A (en) * 1998-09-14 2000-03-07 Integrated Surgical Systems System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system
EP1120204A2 (en) * 2000-01-28 2001-08-01 Dürr Systems GmbH Method for calibrating an industrial robot
US20080014628A1 (en) * 2004-10-12 2008-01-17 Chuo Precision Industrial Co., Ltd. Cell incubator for single cell operation supporting robot
CN103056884A (en) * 2011-10-20 2013-04-24 株式会社安川电机 Robot system and processed object manufacturing method
CN104044132A (en) * 2013-03-14 2014-09-17 株式会社安川电机 Robot System And Method For Producing To-be-processed Material
CN104057457A (en) * 2013-03-19 2014-09-24 株式会社安川电机 Robot system and calibration method
US20160063708A1 (en) * 2013-04-12 2016-03-03 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi A system and method for optimizing fiducial marker and camera positions/orientations
CN104552292A (en) * 2013-10-10 2015-04-29 精工爱普生株式会社 Control system of robot, robot, program and control method of robot
CN106217372A (en) * 2015-06-02 2016-12-14 精工爱普生株式会社 Robot, robot controller and robot system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DANIEL PIZARRO et al.: "Localization of Mobile Robots Using Odometry and an External Vision Sensor", 《SENSORS》 *
KANG HOU et al.: "An Autonomous Positioning and Navigation System for Spherical Mobile Robot", 《2012 INTERNATIONAL WORKSHOP ON INFORMATION AND ELECTRONICS ENGINEERING》 *
GONG Yu et al.: "Automated 3D topography measurement system based on an industrial robot", 《NEW TECHNOLOGY & NEW PROCESS (新技术新工艺)》 *


Also Published As

Publication number Publication date
CN108573504B (en) 2022-10-04
KR20180104504A (en) 2018-09-21
KR101963643B1 (en) 2019-04-01

Similar Documents

Publication Publication Date Title
US11704833B2 (en) Monocular vision tracking method, apparatus and non-transitory computer-readable storage medium
CN110555889B (en) CALTag and point cloud information-based depth camera hand-eye calibration method
US10068344B2 (en) Method and system for 3D capture based on structure from motion with simplified pose detection
EP3067861B1 (en) Determination of a coordinate conversion parameter
CN110580723B (en) Method for carrying out accurate positioning by utilizing deep learning and computer vision
US7688381B2 (en) System for accurately repositioning imaging devices
JP6516558B2 (en) Position information processing method
US20180075614A1 (en) Method of Depth Estimation Using a Camera and Inertial Sensor
JP2012101320A (en) Image generation apparatus, image generation method and program
JP6288858B2 (en) Method and apparatus for estimating position of optical marker in optical motion capture
CN111678521B (en) Method and system for evaluating positioning accuracy of mobile robot
CN110517284B (en) Target tracking method based on laser radar and PTZ camera
CN112258574A (en) Method and device for marking pose information and computer readable storage medium
CN109657607A Face target distance measurement method, device and storage medium based on face recognition
CN113763479B (en) Calibration method of refraction and reflection panoramic camera and IMU sensor
JP2014514539A (en) Method for aligning at least a portion of a first image and at least a portion of a second image using a collinear transform warp function
CN111899276A (en) SLAM method and system based on binocular event camera
CN107977082A Method and system for presenting AR information
CN114766042A (en) Target detection method, device, terminal equipment and medium
US20070076096A1 (en) System and method for calibrating a set of imaging devices and calculating 3D coordinates of detected features in a laboratory coordinate system
CN115371665A (en) Mobile robot positioning method based on depth camera and inertia fusion
CN109781068A Ground simulation evaluation system and method for vision measurement systems for space applications
CN111105467B (en) Image calibration method and device and electronic equipment
JP6922348B2 (en) Information processing equipment, methods, and programs
CN108573504A (en) The 3D image generating methods and its system of phenotype for analyzing plant

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant