CN107292926A - Inspection robot movement trajectory straightness measuring method based on multiple image sequences - Google Patents
Inspection robot movement trajectory straightness measuring method based on multiple image sequences
- Publication number
- CN107292926A CN107292926A CN201710443460.5A CN201710443460A CN107292926A CN 107292926 A CN107292926 A CN 107292926A CN 201710443460 A CN201710443460 A CN 201710443460A CN 107292926 A CN107292926 A CN 107292926A
- Authority
- CN
- China
- Prior art keywords
- scraper plate
- positioning
- positioning mark
- receptive field
- inspection robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/181—Segmentation; Edge detection involving edge growing; involving edge linking
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
- Manipulator (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an inspection robot movement trajectory straightness measuring method based on multiple image sequences, comprising: placing a square positioning marker at the same position on each hydraulic support; acquiring images of the hydraulic supports with a camera, capturing an image each time the inspection robot walks to the end of a scraper section and to the junction of a scraper section and a hydraulic support; performing noise reduction on the captured target images and extracting the positioning markers from the denoised images; fitting edge lines to each positioning marker to obtain the four vertex coordinates of each marker; calculating the angle between adjacent scraper sections, fitting the movement trajectory of the inspection robot from the scraper length and the angles between adjacent scrapers, and measuring the straightness of the trajectory. The invention measures the movement-trajectory straightness of a working-face inspection robot using non-contact vision measurement technology; its accuracy is high, and it provides necessary technical support for the control and safe operation of the working face.
Description
Technical field
The invention belongs to the field of mining-equipment operating-state monitoring, and in particular relates to an inspection robot movement trajectory straightness measuring method based on multiple image sequences.
Background technology
The inspection robot serves as the monitoring component of the working face, and its running trajectory reflects the straightness of the current scraper conveyor; the running condition of the current fully mechanized mining face can therefore be judged by measuring the movement trajectory of the inspection robot. At present the displacement of the robot is detected by installing an encoder or the like on the inspection robot, but the straightness of its movement trajectory cannot be measured.
Summary of the invention
In view of the defects and deficiencies of the existing technology, the object of the present invention is to provide an inspection robot movement trajectory straightness measuring method based on multiple image sequences, solving the problem that an installed encoder cannot detect the straightness of the movement trajectory.
To achieve the above objects, the present invention adopts the following technical scheme:
An inspection robot movement trajectory straightness measuring method based on multiple image sequences, the method involving N hydraulic supports, one scraper section connected to each hydraulic support, the inspection robot moving along the scrapers, comprising the following steps:
Step 1: a square positioning marker plate is mounted on each hydraulic support, the position of the marker plate being identical on each support; an image is captured each time the inspection robot walks to the end of a scraper section and to the junction of a scraper section and a hydraulic support;
Step 2: noise reduction is performed on the captured target images, and the positioning markers in the denoised images are extracted;
Step 3: edge-line fitting is performed on each positioning marker using a receptive-field cell model to obtain the four vertex coordinates of each marker;
Step 4: using a visual imaging model, inverse-transformation calculation is performed on each vertex coordinate of each positioning marker obtained in Step 3 to obtain the coordinates of each vertex of each positioning marker in the optical-center (camera) coordinate system;
Step 5: the angle between adjacent scrapers is calculated using formula (1):
βk = θk − γk (1)
where k = 1, 2, ..., N−1, N is the number of marker plates, P(X(k+1)i, Y(k+1)i, Z(k+1)i) is the i-th vertex coordinate of the (k+1)-th marker plate, i being any one of the four vertices of the positioning marker plate; Lk is the projected distance, in the optical-center coordinate system, between the end of the k-th scraper section and the (k+1)-th positioning marker plate; lk is the projected distance, in the optical-center coordinate system, between the end of the (k+1)-th scraper section and the (k+1)-th positioning marker plate; a is the length of each scraper section;
Step 6: taking the first shooting point as the coordinate origin, the direction of motion of the inspection robot along the first scraper section as the X axis, and the perpendicular direction as the Y axis, a horizontal-plane two-dimensional coordinate system is established; the movement trajectory of the inspection robot is fitted from the scraper length a and the angles βk between adjacent scrapers, and the straightness of the trajectory is measured.
Further, said Step 2 comprises:
Step 2.1: adaptive binarization preprocessing is performed on each captured target-image frame;
Step 2.2: the positioning markers in the preprocessed target images are extracted using a connected-component-based method.
Further, in Step 3, edge-line fitting is performed on each positioning marker using the receptive-field cell model; the concrete steps are:
Step 3.1: receptive fields are distributed over the j-th positioning marker according to a set center spacing and size, using a (2r+1)×(2r+1) mask, where j = 1, 2, ..., N, N is the number of positioning marker plates and r is the receptive-field cell radius; the mask center coincides with the receptive-field cell center;
Step 3.2: the gradient direction of each receptive-field mask is calculated using a gradient operator; from the mask gradient direction, the direction of each receptive field is judged qualitatively, so that the receptive-field cells distributed on the top, bottom, left and right edges of the positioning marker are separated out;
Step 3.3: using the receptive-field model, the response of each receptive field in the top-edge direction is calculated; from the relation between the position of the contrast edge formed by the pixels within a single receptive field and the distance to the receptive-field center, the distance from each receptive-field center to the contrast edge within that field is determined; the straight line containing the top edge of the marker plate is then fitted using constrained least squares;
Step 3.4: Step 3.3 is repeated to fit the straight lines containing the bottom edge, left edge and right edge of the positioning marker;
Step 3.5: from the four fitted line equations of the top, bottom, left and right edges, the four vertex coordinates of the j-th positioning marker are obtained;
Step 3.6: Steps 3.1 to 3.5 are repeated to obtain the four vertex coordinates of all positioning markers.
Further, the coordinates P(Xji, Yji, Zji), i = 1, 2, 3, 4, of the i-th vertex of the j-th positioning marker in the optical-center coordinate system in said Step 4 are calculated by formula (2):
Bj1=xj2yj3-xj4yj3+xj4yj2-xj2yj4+xj3yj4-xj3yj2
Bj2=xj3yj4-xj4yj3-xj1yj4-xj4yj1+xj1yj3-xj3yj1
Bj3=xj1yj2-xj4yj2-xj2yj1-xj1yj4+xj4yj1-xj2yj4
Bj4=xj3yj2-xj2yj3+xj1yj3-xj2yj1-xj1yj2-xj3yj1
where lj is the side length of the j-th positioning marker if square, or the length of its longer side if rectangular; (xji, yji) are the image coordinates of the marker vertices obtained in Step 3; and C is the effective focal length of the camera.
Further, in Step 6, the first vertex of each positioning marker is taken and a straight line is fitted in the established XOY plane; the straightness H of the movement trajectory of the inspection robot is then:
where L is the straight-line distance between the first and last shooting points of the camera, dj is the distance between the first vertex of the j-th positioning marker and the fitted line, dmax1 and dmax2 are the maximum distances on the two sides of the fitted line, and b = dmax1 + dmax2.
Compared with the prior art, the beneficial effects of the invention are as follows:
The present invention measures the movement-trajectory straightness of a working-face inspection robot using non-contact vision measurement technology; its accuracy is high, and it provides necessary technical support for the control and safe operation of the working face.
Brief description of the drawings
Fig. 1 is the flow chart of the invention.
Fig. 2 shows an image acquired by the method of the invention and the images after its processing: (a) the original image, (b) the image after binarization preprocessing, (c) the extracted positioning marker.
Fig. 3 is the calculation diagram of the inspection-robot movement trajectory.
Fig. 4 shows the coordinate system used when fitting the inspection-robot movement trajectory.
The particular content of the invention is explained in further detail below with reference to the embodiments.
Embodiment
Optical-center coordinate system (camera coordinate system): the optical center of the camera is the coordinate origin; the X and Y axes are respectively parallel to the two perpendicular edges of the CCD plane, and the Z axis coincides with the optical axis of the camera.
Image coordinate system: the coordinate origin is at the center of the CCD image plane; the X and Y axes are respectively parallel to the two perpendicular edges of the CCD plane.
Pixel coordinate system: the coordinate origin is at the upper-left corner of the CCD image plane; the U and V axes are respectively parallel to the X and Y axes of the image coordinate system.
In the inspection robot movement trajectory straightness measuring method based on multiple image sequences of the present invention, the method involves N hydraulic supports, one scraper section connected to each hydraulic support, the scraper sections being joined end to end, and comprises the following steps:
Step 1: a square positioning marker plate is mounted on each hydraulic support, the mounting position of the marker plate being identical on each support; a camera is fixedly mounted on the inspection robot, which moves in a direction parallel to the working face, and the hydraulic-support images are acquired through this camera: an image is captured each time the inspection robot walks to the end of a scraper section and to the junction of a scraper section and a hydraulic support, yielding 2N target images, N being the number of positioning markers;
Step 2: noise reduction is performed on the captured target images and the positioning markers in the preprocessed images are extracted, each image containing at least one positioning marker;
Step 2.1: adaptive binarization preprocessing is performed on each captured target-image frame;
Step 2.2: the positioning markers in the preprocessed target images are extracted using a connected-component-based method;
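Steps 2.1 and 2.2 can be sketched as follows. The patent does not specify an implementation, so this is an illustrative pure-Python sketch: a mean-based adaptive threshold as a stand-in for the adaptive binarization, and 4-connected flood-fill labeling for the connected-component extraction. All function and parameter names are ours.

```python
from collections import deque

def adaptive_binarize(img, window=3, offset=0):
    """Set a pixel to 255 if it exceeds the local mean of a window x window
    neighborhood (a simple stand-in for adaptive binarization preprocessing)."""
    h, w = len(img), len(img[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - r), min(h, y + r + 1))
                    for i in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 255 if img[y][x] > mean + offset else 0
    return out

def connected_components(binary):
    """Label 4-connected foreground (255) regions; return a list of pixel lists."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    comps = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 255 and not seen[y][x]:
                comp, q = [], deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and binary[ny][nx] == 255 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                comps.append(comp)
    return comps
```

The markers would then be selected among the components by axial ratio and area, as described in the embodiment.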
Step 3: edge-line fitting is performed on each positioning marker to obtain its four vertex coordinates;
Step 3.1: receptive fields are distributed over the j-th positioning marker according to a set center spacing and size, using a (2r+1)×(2r+1) mask, where j = 1, 2, ..., N, N is the number of positioning marker plates and r is the receptive-field cell radius; the mask center coincides with the receptive-field cell center;
Step 3.2: the gradient direction of each receptive-field mask is calculated using a gradient operator; from the mask gradient direction, the direction of each receptive field is judged qualitatively, so that the receptive-field cells distributed on the top, bottom, left and right edges of the positioning marker are separated out;
Step 3.3: using the receptive-field model, the response S of each receptive field in the top-edge direction is calculated:
S = S1 − S2 (3)
where σD = rD/4 and σS = rS/4; rD and rS denote respectively the radii of the center region of the receptive field and of its surround region (the larger circle that also contains the center region); h(u, v, η) denotes the stimulus intensity at pixel (u, v) when the contrast-stimulus coverage is η, taking the value 1 at points whose value in the binary image is 255 and the value 0 at points whose value is 0;
S1 is the response of the receptive-field center and S2 the response of the receptive-field surround; Dl denotes a pixel in the receptive-field center and Se a pixel in the surround, with l = 0, 1, 2, ..., f and e = 1, 2, ..., m, where f is the number of pixels in the center region and m the number of pixels in the surround region.
From the relation between the position of the contrast edge formed by the pixels within a single receptive field and the distance to the receptive-field center, the distance dh from each receptive-field center to the contrast edge fitted within that field is determined, h = 1, 2, ..., n, where n denotes the number of receptive-field cells.
Suppose the straight line containing the edge has equation (a, b)(u, v)T + c = 0; according to the point-to-line distance formula, dh = |a·uh + b·vh + c| / sqrt(a² + b²), where (uh, vh) are the coordinates of the receptive-field center in the pixel coordinate system.
The straight line containing the edge is fitted using constrained least squares, where L(a, b, c, λ) denotes the Lagrangian function with multiplier λ; a, b, c are the coefficients of the fitted edge-line equation; U and V denote the n×1 matrices composed of the pixel coordinates of the receptive-field cell centers, and D the n×1 matrix composed of the dh; under the constraint a² + b² = 1, the optimal solution (a*, b*, c*) for a, b, c is found;
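Because the constraint a² + b² = 1 makes |a·uh + b·vh + c| an exact point-to-line distance, the constrained least-squares problem has a closed-form solution via the eigenvector of the 2×2 scatter matrix of the centered points. The sketch below fits the line directly through the receptive-field centers, a simplification that ignores the per-cell offsets dh of the patent's Lagrangian formulation; all names are ours.

```python
import math

def fit_line_constrained(points):
    """Fit a*u + b*v + c = 0 with a^2 + b^2 = 1 by minimizing the sum of
    squared point-to-line distances (total least squares). The normal (a, b)
    is the eigenvector of the smallest eigenvalue of the 2x2 scatter matrix
    of the centered points; c follows from passing through the centroid."""
    n = len(points)
    mu = sum(u for u, v in points) / n
    mv = sum(v for u, v in points) / n
    suu = sum((u - mu) ** 2 for u, v in points)
    svv = sum((v - mv) ** 2 for u, v in points)
    suv = sum((u - mu) * (v - mv) for u, v in points)
    # Principal-axis angle of the scatter matrix [[suu, suv], [suv, svv]].
    theta = 0.5 * math.atan2(2 * suv, suu - svv)
    # Line direction is the major axis; the unit normal is perpendicular.
    a, b = -math.sin(theta), math.cos(theta)
    # Keep whichever normal gives the smaller quadratic-form residual.
    if suu * a * a + 2 * suv * a * b + svv * b * b > \
       suu * b * b - 2 * suv * a * b + svv * a * a:
        a, b = b, -a
    c = -(a * mu + b * mv)
    return a, b, c
```

With the returned unit normal, dh for any point is simply |a·uh + b·vh + c|, matching the distance formula above.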
Step 3.4: Step 3.3 is repeated to fit the straight lines containing the bottom edge, left edge and right edge of the positioning marker;
Step 3.5: from the four fitted line equations of the top, bottom, left and right edges, the four vertex coordinates of the j-th positioning marker are obtained;
Step 3.6: Steps 3.1 to 3.5 are repeated to obtain the four vertex coordinates of all positioning markers.
Step 4: using the visual imaging model, inverse-transformation calculation is performed on each vertex coordinate of each positioning marker obtained in Step 3; the coordinates P(Xji, Yji, Zji), i = 1, 2, 3, 4, of each vertex of each positioning marker in the optical-center coordinate system are calculated by formula (2):
Bj1=xj2yj3-xj4yj3+xj4yj2-xj2yj4+xj3yj4-xj3yj2
Bj2=xj3yj4-xj4yj3-xj1yj4-xj4yj1+xj1yj3-xj3yj1
Bj3=xj1yj2-xj4yj2-xj2yj1-xj1yj4+xj4yj1-xj2yj4
Bj4=xj3yj2-xj2yj3+xj1yj3-xj2yj1-xj1yj2-xj3yj1
where lj is the side length of the j-th positioning marker, or the length of its longer side if the marker is rectangular; (xji, yji) are the image coordinates of the marker vertices obtained in Step 3; and C is the effective focal length of the camera;
Step 5: the angle between adjacent scrapers is calculated using formula (1):
βk = θk − γk (1)
where k = 1, 2, ..., N−1, N is the number of marker plates, P(X(k+1)i, Y(k+1)i, Z(k+1)i) is the i-th vertex coordinate of the (k+1)-th marker plate, i being any one of the four vertices of the positioning marker plate; Lk is the projected distance, in the optical-center coordinate system, between the end of the k-th scraper section and the (k+1)-th positioning marker plate; lk is the projected distance, in the optical-center coordinate system, between the end of the (k+1)-th scraper section and the (k+1)-th positioning marker plate; l0 is the length of each scraper section; the scraper sections beneath the inspection robot are of identical length, and each scraper section corresponds one-to-one with a positioning marker. For example, β1 is the angle between the first and second scraper sections, and so on.
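The expressions for θk and γk in formula (1) are lost with the source formula images (they use the projected distances Lk, lk and the scraper length). As a purely illustrative stand-in, not the patent's formula, the sketch below measures βk as the turn between the headings of consecutive scraper segments, given three successive marker positions projected into a common horizontal plane; all names are ours.

```python
import math

def adjacent_scraper_angle(p_prev, p_cur, p_next):
    """Hypothetical stand-in for formula (1): with the k-th, (k+1)-th and
    (k+2)-th marker positions (x, y) in a common horizontal plane, take
    beta_k as the change of heading between consecutive scraper segments.
    Returns the angle in degrees, wrapped to (-180, 180]."""
    t1 = math.atan2(p_cur[1] - p_prev[1], p_cur[0] - p_prev[0])
    t2 = math.atan2(p_next[1] - p_cur[1], p_next[0] - p_cur[0])
    beta = t2 - t1
    while beta <= -math.pi:
        beta += 2 * math.pi
    while beta > math.pi:
        beta -= 2 * math.pi
    return math.degrees(beta)
```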
Fig. 3 shows the calculation diagram of the inspection-robot movement trajectory, where P(X11, Y11, Z11) is the first vertex coordinate of the 1st marker plate and P(X21, Y21, Z21) is the first vertex coordinate of the 2nd marker plate; the coordinate origin O1 is the junction of the first scraper section and the first hydraulic support (i.e. the center of the first scraper section), O2 is the end of the first scraper section, and the distance O2O4 is the length l0 of one scraper section.
Step 6: taking the first shooting point as the coordinate origin, the direction of motion of the inspection robot along the first scraper section as the X axis, and the perpendicular direction as the Y axis, a horizontal-plane two-dimensional coordinate system is established, as shown in Fig. 4; the coordinate origin lies at the junction of the first scraper section and the first hydraulic support (i.e. the center of the first scraper section). The movement trajectory of the inspection robot is fitted from the scraper-section length l0 and the angles βk between adjacent scrapers.
The first vertex of each positioning marker is taken and a straight line is fitted in the established XOY plane; with this line as the datum line, the straightness H of the movement trajectory of the inspection robot is:
where L is the straight-line distance between the first and last shooting points of the camera, dj is the distance between the first vertex of the j-th positioning marker and the fitted line, dmax1 and dmax2 are the maximum distances on the two sides of the fitted line, and b = dmax1 + dmax2, as shown in Fig. 4.
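Step 6 can be sketched as follows. The formula image for H is lost from the source, so we assume the natural reading H = b / L = (dmax1 + dmax2) / L, reported in mm per metre as in the embodiment, and for simplicity fit the datum line through the first and last points rather than by least squares; both are assumptions, and all names are ours.

```python
import math

def fit_trajectory(l0, betas_deg):
    """Dead-reckon the polyline: start at the origin heading along +X,
    advance one scraper length l0 per section, turning by beta_k between
    sections. Returns the section-end points in the XOY plane."""
    x = y = heading = 0.0
    pts = [(0.0, 0.0)]
    for i in range(len(betas_deg) + 1):
        x += l0 * math.cos(heading)
        y += l0 * math.sin(heading)
        pts.append((x, y))
        if i < len(betas_deg):
            heading += math.radians(betas_deg[i])
    return pts

def straightness_mm_per_m(pts):
    """Datum line through the first and last points (a simplification of the
    patent's least-squares datum line); return H = (dmax1 + dmax2) / L,
    scaled to mm of deviation per metre of run length L."""
    (x0, y0), (x1, y1) = pts[0], pts[-1]
    L = math.hypot(x1 - x0, y1 - y0)
    dmax1 = dmax2 = 0.0
    for x, y in pts:
        # signed perpendicular distance of (x, y) to the datum chord
        d = ((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)) / L
        dmax1 = max(dmax1, d)
        dmax2 = max(dmax2, -d)
    return (dmax1 + dmax2) / L * 1000.0
```

With l0 = 480 and the embodiment's angles [4, −4, −2, 2, 3, −3] this yields a straightness of roughly 10 mm/m, the same order as the reported 9.845 mm/m; the residual difference comes from the simplified datum line.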
A specific embodiment of the invention is given below; it should be noted that the invention is not limited to the following embodiment, and all equivalent variants made on the basis of the technical scheme of the invention fall within the protection scope of the present invention.
Embodiment
In the inspection robot movement trajectory straightness measuring method based on multiple image sequences of this embodiment, first, a square positioning marker is placed at the same position on each hydraulic support, and a camera is mounted on the inspection robot, which moves in a direction parallel to the working face. In this embodiment the length of each scraper section is 480 mm and there are seven hydraulic supports with seven corresponding scraper sections, where the angle between the first and second scraper sections is 4°, between the second and third −4°, between the third and fourth −2°, between the fourth and fifth 2°, between the fifth and sixth 3°, and between the sixth and seventh −3°.
Then the hydraulic-support images are acquired through the camera fixedly mounted on the inspection robot, and adaptive binarization preprocessing is applied to the acquired images; the connected components in the binarized images are extracted, and the axial-ratio and area information of the connected components is used to segment the positioning markers out of the hydraulic-support images, as shown in Fig. 2.
Edge-line fitting is performed on each positioning marker in the segmented target images using the cell receptive-field model, and the coordinates of the four vertices of each positioning marker in the optical-center coordinate system are then calculated; the angles between adjacent scrapers obtained through formula (1) above are: β1 = 4.03°, β2 = −4.08°, β3 = −1.99°, β4 = 2.06°, β5 = 3.12°, β6 = −3.07°.
From the angles between adjacent scrapers obtained above and the scraper length, the movement trajectory of the inspection robot is fitted, as shown in Fig. 4, where the X axis represents the length of the scraper-formed track along the working face and the Y axis the offset of the track from a straight line during advancement; the dotted line in the figure is the actually measured track, the solid line is the movement trajectory measured by the method of the invention, and the straight dashed line is the fitted line of the movement trajectory, used as the datum line. The straightness of the inspection robot obtained by the method of the invention is 9.845 mm/m, while the actual straightness of the inspection robot is 9.768 mm/m. It can be seen that the straightness obtained by the method of the invention is almost identical to the actual straightness of the robot's movement trajectory; the method of the invention can therefore evaluate the straightness of the inspection robot accurately.
Claims (5)
1. An inspection robot movement trajectory straightness measuring method based on multiple image sequences, the method involving N hydraulic supports, one scraper section connected to each hydraulic support, the inspection robot moving along the scrapers, characterized by comprising the following steps:
Step 1: a square positioning marker plate is mounted on each hydraulic support, the position of the marker plate being identical on each support; an image is captured each time the inspection robot walks to the end of a scraper section and to the junction of a scraper section and a hydraulic support;
Step 2: noise reduction is performed on the captured target images, and the positioning markers in the denoised images are extracted;
Step 3: edge-line fitting is performed on each positioning marker using a receptive-field cell model to obtain the four vertex coordinates of each marker;
Step 4: using a visual imaging model, inverse-transformation calculation is performed on each vertex coordinate of each positioning marker obtained in Step 3 to obtain the coordinates of each vertex in the optical-center coordinate system;
Step 5: the angle between adjacent scrapers is calculated using formula (1):
βk = θk − γk (1)
where k = 1, 2, ..., N−1, N is the number of marker plates, P(X(k+1)i, Y(k+1)i, Z(k+1)i) is the i-th vertex coordinate of the (k+1)-th marker plate, i being any one of the four vertices of the positioning marker plate; Lk is the projected distance, in the optical-center coordinate system, between the end of the k-th scraper section and the (k+1)-th positioning marker plate; lk is the projected distance, in the optical-center coordinate system, between the end of the (k+1)-th scraper section and the (k+1)-th positioning marker plate; a is the length of each scraper section;
Step 6: taking the first shooting point as the coordinate origin, the direction of motion of the inspection robot along the first scraper section as the X axis, and the perpendicular direction as the Y axis, a horizontal-plane two-dimensional coordinate system is established; the movement trajectory of the inspection robot is fitted from the scraper length a and the angles βk between adjacent scrapers, and the straightness of the trajectory is measured.
2. The inspection robot movement trajectory straightness measuring method based on multiple image sequences as claimed in claim 1, characterized in that said Step 2 comprises:
Step 2.1: adaptive binarization preprocessing is performed on each captured target-image frame;
Step 2.2: the positioning markers in the preprocessed target images are extracted using a connected-component-based method.
3. The inspection robot movement trajectory straightness measuring method based on multiple image sequences as claimed in claim 1, characterized in that in Step 3 edge-line fitting is performed on each positioning marker using the receptive-field cell model, the concrete steps being:
Step 3.1: receptive fields are distributed over the j-th positioning marker according to a set center spacing and size, using a (2r+1)×(2r+1) mask, where j = 1, 2, ..., N, N is the number of positioning marker plates and r is the receptive-field cell radius; the mask center coincides with the receptive-field cell center;
Step 3.2: the gradient direction of each receptive-field mask is calculated using a gradient operator; from the mask gradient direction, the direction of each receptive field is judged qualitatively, so that the receptive-field cells distributed on the top, bottom, left and right edges of the positioning marker are separated out;
Step 3.3: using the receptive-field model, the response of each receptive field in the top-edge direction is calculated; from the relation between the position of the contrast edge formed by the pixels within a single receptive field and the distance to the receptive-field center, the distance from each receptive-field center to the contrast edge within that field is determined; the straight line containing the top edge of the marker plate is then fitted using constrained least squares;
Step 3.4: Step 3.3 is repeated to fit the straight lines containing the bottom edge, left edge and right edge of the positioning marker;
Step 3.5: from the four fitted line equations of the top, bottom, left and right edges, the four vertex coordinates of the j-th positioning marker are obtained;
Step 3.6: Steps 3.1 to 3.5 are repeated to obtain the four vertex coordinates of all positioning markers.
4. The inspection robot movement trajectory straightness measuring method based on multiple image sequences as claimed in claim 1, characterized in that the coordinates P(Xji, Yji, Zji), i = 1, 2, 3, 4, of the i-th vertex of the j-th positioning marker in the optical-center coordinate system in said Step 4 are calculated by formula (2):
Bj1=xj2yj3-xj4yj3+xj4yj2-xj2yj4+xj3yj4-xj3yj2
Bj2=xj3yj4-xj4yj3-xj1yj4-xj4yj1+xj1yj3-xj3yj1
Bj3=xj1yj2-xj4yj2-xj2yj1-xj1yj4+xj4yj1-xj2yj4
Bj4=xj3yj2-xj2yj3+xj1yj3-xj2yj1-xj1yj2-xj3yj1
where lj is the side length of the j-th positioning marker if square, or the length of its longer side if rectangular; (xji, yji) are the image coordinates of the marker vertices obtained in Step 3; and C is the effective focal length of the camera.
5. The inspection robot movement trajectory straightness measuring method based on multiple image sequences as claimed in claim 1, characterized in that in Step 6 the first vertex of each positioning marker is taken and a straight line is fitted in the established XOY plane; the straightness H of the movement trajectory of the inspection robot is then:
where L is the straight-line distance between the first and last shooting points of the camera, dj is the distance between the first vertex of the j-th positioning marker and the fitted line, dmax1 and dmax2 are the maximum distances on the two sides of the fitted line, and b = dmax1 + dmax2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710443460.5A CN107292926B (en) | 2017-06-13 | 2017-06-13 | Inspection robot movement trajectory straightness measuring method based on multiple image sequences
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710443460.5A CN107292926B (en) | 2017-06-13 | 2017-06-13 | Inspection robot movement trajectory straightness measuring method based on multiple image sequences
Publications (2)
Publication Number | Publication Date |
---|---|
CN107292926A true CN107292926A (en) | 2017-10-24 |
CN107292926B CN107292926B (en) | 2018-06-29 |
Family
ID=60097706
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710443460.5A Active CN107292926B (en) | 2017-06-13 | 2017-06-13 | Inspection robot movement trajectory straightness measuring method based on multiple image sequences |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107292926B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108106569A (en) * | 2017-11-28 | 2018-06-01 | 北京天地玛珂电液控制系统有限公司 | A kind of fully-mechanized mining working Linearity surveying method and system of view-based access control model |
CN108033203B (en) * | 2017-11-28 | 2019-01-25 | 天地科技股份有限公司 | Fully mechanized coal face rear portion drag conveyor straightness determining device draws shifting system |
CN109781036A (en) * | 2019-03-07 | 2019-05-21 | 山东科技大学 | A kind of hydraulic support linearity testing apparatus and detection method |
CN111127406A (en) * | 2019-12-10 | 2020-05-08 | 创维集团智能装备有限公司 | Back plate machining position adjusting method, terminal, system and storage medium |
CN112964173A (en) * | 2020-12-31 | 2021-06-15 | 四川和心亿科技有限公司 | Structural member quality detection method |
CN113029046A (en) * | 2021-03-11 | 2021-06-25 | 精英数智科技股份有限公司 | Method and device for detecting straightness of scraper conveyor based on video identification |
- 2017-06-13 CN CN201710443460.5A patent/CN107292926B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006266897A (en) * | 2005-03-24 | 2006-10-05 | Shimizu Corp | Method for observing and measuring linearity of pipe material |
CN102431784A (en) * | 2011-08-25 | 2012-05-02 | 北京天地玛珂电液控制系统有限公司 | Attitude control system and method for scraper conveyor based on wireless three-dimensional gyroscope technology |
CN106595557A (en) * | 2016-10-31 | 2017-04-26 | 中国矿业大学 | Detection device and detection method for straightness of scraper conveyor |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108106569A (en) * | 2017-11-28 | 2018-06-01 | 北京天地玛珂电液控制系统有限公司 | Vision-based straightness detection method and system for a fully mechanized coal mining face |
CN108033203B (en) * | 2017-11-28 | 2019-01-25 | 天地科技股份有限公司 | Straightness determining device and pulling-moving system for the rear scraper conveyor of a fully mechanized coal face |
CN109781036A (en) * | 2019-03-07 | 2019-05-21 | 山东科技大学 | Hydraulic support straightness detection apparatus and detection method |
CN111127406A (en) * | 2019-12-10 | 2020-05-08 | 创维集团智能装备有限公司 | Back plate machining position adjusting method, terminal, system and storage medium |
CN111127406B (en) * | 2019-12-10 | 2022-09-27 | 创维集团智能装备有限公司 | Back plate machining position adjusting method, terminal, system and storage medium |
CN112964173A (en) * | 2020-12-31 | 2021-06-15 | 四川和心亿科技有限公司 | Structural member quality detection method |
CN113029046A (en) * | 2021-03-11 | 2021-06-25 | 精英数智科技股份有限公司 | Method and device for detecting straightness of scraper conveyor based on video identification |
CN113029046B (en) * | 2021-03-11 | 2021-12-14 | 精英数智科技股份有限公司 | Method and device for detecting straightness of scraper conveyor based on video identification |
Also Published As
Publication number | Publication date |
---|---|
CN107292926B (en) | 2018-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107292926A (en) | Inspection robot movement trajectory verticality measuring method based on multiple image sequences | |
CN107316330B (en) | Underground hydraulic support group pose and verticality measuring method based on multiple image sequences | |
CN101943563B (en) | Rapid calibration method of line-structured light vision sensor based on spatial plane constraint | |
CN107152295B (en) | Automatic grabbing method and system with vision platform for shield segment erector | |
CN105113403A (en) | Intelligent detecting equipment and method for bottom of bridge | |
CN106996748A (en) | Wheel diameter measuring method based on binocular vision | |
CN113063795B (en) | Expressway tunnel defect position determining method and system | |
CN109443307A (en) | Measuring system and method for transmission tower settlement and inclination angle based on optical measurement | |
CN106643545A (en) | Calibration method for steel rail profile measurement using laser displacement technology | |
CN107316288B (en) | Vision measuring method for the cutting head pose of a boom-type roadheader | |
CN1566903A (en) | Laser vision online automatic measuring method for multiple tire geometrical parameters | |
CN107817044B (en) | Device and method for measuring plate vibration based on machine vision | |
CN108592816A (en) | Three-dimensional measuring apparatus and method for large-scale surfaces | |
CN105800464A (en) | Positioning method based on automatic lifting hook system | |
CN103810676B (en) | Monitoring method for the running speed of steel pipe | |
CN115018872B (en) | Intelligent control method of dust collection equipment for municipal construction | |
CN111452840B (en) | Railway steel rail crawling displacement detection method based on monocular vision measurement technology | |
CN112785654A (en) | Calibration method and device for track geometry detection system | |
CN104006804B (en) | Method for detecting offset of contact net based on observation benchmark instability compensation | |
CN111390913B (en) | Automatic detection method for bridge bottom structure | |
Liu et al. | A new measurement method of real-time pose estimation for an automatic hydraulic excavator | |
CN107388979A (en) | Tunnel surface deformation monitoring system and computer | |
CN107131837B (en) | Vision measuring method for the mining height of a shearer rocker arm | |
CN102779341B (en) | Novel method for identifying support position in pit construction process | |
CN111583241B (en) | Mobile detection method and device for regularity of ultra-large area honeycomb products |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||