CN108932477A - Inspection robot charging house vision positioning method - Google Patents
Inspection robot charging house vision positioning method
- Publication number
- CN108932477A (application CN201810555809.9A)
- Authority
- CN
- China
- Prior art keywords
- marker
- coordinate system
- inspection robot
- matrix
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
The invention discloses a vision positioning method for an inspection robot in a charging house. A camera is mounted on the inspection robot, and a marker (Marker) is set up in the charging house. The inspection robot scans and identifies the Marker with its camera; once the Marker is matched, the robot obtains its current physical position on the map from the Marker's coordinates and from the Marker's position and orientation relative to the camera, thereby achieving vision positioning. After vision positioning, the robot adjusts its position to mate with the charging pile, and charging is carried out. By visually identifying the Marker in the charging house and computing its own distance and angle to it, the robot accurately finds its own position, completes autonomous recharging, and safely and reliably enables unattended operation of the substation.
Description
Technical field
The invention belongs to the field of positioning and navigation technology, and in particular relates to a vision positioning method for an inspection robot in a charging house.
Background art
A substation inspection robot is an outdoor wheeled mobile robot. It carries visible-light and infrared cameras and performs round-the-clock autonomous inspection of outdoor substation equipment. Navigation and positioning are the key to the fully autonomous operation of the inspection robot: its navigation and positioning performance and precision directly determine the quality of the inspection task. At present, outdoor robots mainly use two positioning and navigation technologies.
The first is magnetic-track guidance combined with RFID positioning. A magnetic guide rail must be laid along the robot's inspection route in advance, and RFID tags are embedded at the positions where the robot needs to stop. While the robot is running, an on-board magnetic sensor array detects the offset of the robot's motion center relative to the track, and a motion controller drives the two wheels differentially so that the robot follows the preset path. When the RFID reader detects a tag embedded in the path, it notifies the robot that it has reached a preset position, where the robot, under the control of the on-board computer, can stop, turn, accelerate or decelerate, and inspect equipment. Although this method offers high repeat positioning accuracy and strong anti-interference capability, it faces serious problems in practice: laying the magnetic rail is expensive, the route is hard to change, the inspection route is inflexible, and the robot's obstacle-crossing ability is limited by the detection height of the magnetic sensor.
The second is laser positioning. With this navigation mode, the laser sensor alone reaches centimeter-level accuracy, and with additional control the positioning can reach an even finer level; laser equipment is inexpensive, works around the clock, is immune to electromagnetic interference, offers high positioning accuracy, and requires only simple ground construction. However, conventional laser positioning still has a major drawback: as is well known, it relies on matching against a previously built map, so if the environment changes significantly, basic laser positioning "no longer recognizes the road" and the robot loses itself during inspection.
In addition, a robot needs power to work, and that power is electricity, so whether the robot can correctly find its charging position is also a major factor in guaranteeing work quality. In today's era of rapid technological development, robots are increasingly important, and a robot must be accurately positioned before its work quality can be guaranteed. In an environment as demanding as a substation, therefore, precise positioning and autonomous recharging of the robot must be ensured before safe and reliable unattended operation of the substation can be achieved.
Summary of the invention
The purpose of the invention is to provide an inspection robot charging house vision positioning method that guarantees precise positioning and autonomous recharging of the robot.
To this end, the technical scheme of the invention is an inspection robot charging house vision positioning method, the inspection robot comprising a camera, characterized by the following steps:
1) Set up a marker (Marker) in the charging house. The Marker has a square black background, with various characteristic patterns marked in the white region at its center.
2) The inspection robot scans and identifies the Marker with its camera, specifically:
a) binarize the captured video image, turning it into a binary image;
b) segment the image and extract the dark border of the Marker;
c) trace the contour line of the marker and, on that basis, determine its four vertices;
d) fit the four corners of the square and judge from the angles whether the marker is tilted;
e) normalize the Marker image and match it against the Marker library to see whether a template corresponding to the current Marker is found;
f) if the matching rate exceeds a certain threshold, the current Marker is considered recognized, and the coordinate information saved for it in the world coordinate system is read.
3) Once the Marker is matched and its coordinates in the world coordinate system are obtained, compute the Marker's rotation-translation matrix in the camera coordinate system; after further calculation, obtain the distance and direction of the Marker relative to the camera.
4) Once the Marker is matched, the corresponding charging house can be identified. From the robot's travel route, determine how the inspection robot entered the charging house, and combine this with the Marker coordinates read in step 3) to obtain the robot's current physical position on the map, achieving vision positioning.
5) After vision positioning, the inspection robot moves toward the charging pile; during the movement it continually corrects its travel route according to the change of its position relative to the Marker, until it mates with the charging pile and charging is carried out.
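Detection steps a) through c) above can be illustrated with a toy sketch. The patent's actual pipeline uses OpenCV and ARToolKit in C/C++; this pure-NumPy Python version is only a stand-in (function names, threshold, and the synthetic image are illustrative assumptions), showing binarization of a frame and recovery of the four vertices of the dark square border:

```python
import numpy as np

def binarize(gray, thresh=128):
    """Step a): turn a grayscale image into a binary image (1 = dark)."""
    return (gray < thresh).astype(np.uint8)

def find_dark_border_corners(binary):
    """Steps b)-c): locate the dark square border and return its four corner
    coordinates (row, col). A real system traces the contour line; for an
    axis-aligned square marker the bounding box of the dark pixels yields
    the same four vertices."""
    rows, cols = np.nonzero(binary)
    r0, r1 = int(rows.min()), int(rows.max())
    c0, c1 = int(cols.min()), int(cols.max())
    return [(r0, c0), (r0, c1), (r1, c1), (r1, c0)]

# Synthetic 10x10 frame: white background with a dark square border whose
# white interior would carry the characteristic pattern.
img = np.full((10, 10), 255, dtype=np.uint8)
img[2:8, 2:8] = 0      # dark border ring
img[3:7, 3:7] = 255    # white interior

b = binarize(img)
corners = find_dark_border_corners(b)
print(corners)  # [(2, 2), (2, 7), (7, 7), (7, 2)]
```

Steps d) through f) (tilt check, normalization, and template matching) would then operate on the patch bounded by these vertices.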
The world coordinate system is the real-scene coordinate system of the system. A Marker coordinate system is a coordinate system bound to a particular object: each Marker has its own specific coordinate system, and when the Marker moves or changes, the coordinate system bound to it moves and changes direction with it. The camera coordinate system is the three-dimensional coordinate system whose origin is at the optical center of the camera lens. In an AR system, the conversion from the world coordinate system to the camera coordinate system is derived by capturing images with the camera and processing them with computer image analysis. In this process, the camera's intrinsic and extrinsic parameters must be acquired, the conversion relation between the Marker coordinate system and the world coordinate system determined by coordinate transformation, and the camera's orientation parameters in the real scene computed.
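The world-to-camera conversion described above applies the extrinsic parameters, a rotation followed by a translation, to a world point. A minimal numeric sketch (the rotation angle and translation are made-up values, not taken from the patent):

```python
import numpy as np

# Hypothetical extrinsics: a 90-degree rotation about the z-axis plus a
# translation of 5 units along the camera's z-axis.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.0, 0.0, 5.0])

def world_to_camera(p_world):
    """Transform a world-coordinate point into the camera coordinate system."""
    return R @ p_world + t

p = world_to_camera(np.array([1.0, 0.0, 0.0]))
print(np.round(p, 6))  # [0. 1. 5.]
```

The rotation matrix describes the orientation of the world axes relative to the camera axes, and the translation places the world origin in the camera frame, exactly the roles the text assigns them.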
Further, in step 3) above, the distance and direction of the Marker relative to the camera are computed as follows:
i) After the Marker is identified, its coordinates (xw, yw, zw) in the world coordinate system can be read.
ii) The physical meaning of ARToolKit's Trans matrix includes the three-dimensional orientation information of the Marker. The Trans matrix can be obtained from the arGetTransMatSquareCont/arGetTransMatSquare functions in ARToolKit. It is a 3 × 4 matrix of the canonical form
[ R1 R2 R3 xt ]
[ R4 R5 R6 yt ]
[ R7 R8 R9 zt ]
where R1 to R9 form the camera-extrinsic rotation matrix and xt, yt, zt form the camera-extrinsic translation vector. This matrix is the transition matrix that transforms the Marker from its own coordinate system into the camera coordinate system.
Denote the above matrix as matrix A, let the coordinates of the camera in the world coordinate system be (Xc, Yc, Zc), and let the coordinates of the Marker in the camera coordinate system be (XA, YA, ZA). Then the Marker's coordinates in the camera coordinate system can be expressed as
(XA, YA, ZA)ᵀ = A · (Xc, Yc, Zc, 1)ᵀ
where A denotes the Trans matrix, composed of the camera-extrinsic rotation matrix and translation vector. Together, the rotation and translation matrices describe how a point is transformed from the world coordinate system into the camera coordinate system: the rotation matrix describes the orientation of the world coordinate axes relative to the camera axes, and the translation matrix describes the position of the space origin in the camera coordinate system.
iii) Take the Trans matrix obtained in step ii) and compute its inverse. In the inverse matrix, the fourth number of the first row is xd, the fourth number of the second row is yd, and the fourth number of the third row is zd; (xd, yd, zd) is the coordinate of the Marker's center point in the world coordinate system and is used to compute the distance and direction of the Marker's center relative to the camera:
distance = sqrt(xd² + yd² + zd²), the actual distance from the Marker's center to the camera;
angle = atan2(xd, yd) · 180 / π, the angle from the Marker's center to the camera.
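The computation in step iii) can be sketched numerically. The 3 × 4 Trans matrix is augmented with a fourth row (0, 0, 0, 1) so it can be inverted; the entries of the Trans matrix below are made-up illustrative values, not data from the patent:

```python
import math
import numpy as np

# Hypothetical 3x4 Trans matrix [R | t]: identity rotation, with the marker
# offset 1 unit sideways and 2 units ahead in the camera frame.
trans = np.array([[1.0, 0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 2.0]])

# Augment to a 4x4 homogeneous matrix so the inverse exists.
T = np.vstack([trans, [0.0, 0.0, 0.0, 1.0]])
inv = np.linalg.inv(T)

# Fourth number of rows 1-3 of the inverse, as step iii) specifies.
xd, yd, zd = inv[0, 3], inv[1, 3], inv[2, 3]
distance = math.sqrt(xd**2 + yd**2 + zd**2)
angle = math.atan2(xd, yd) * 180 / math.pi

print(distance)  # sqrt(5), approximately 2.236
```

With an identity rotation the inverse simply negates the translation, so (xd, yd, zd) = (-1, 0, -2), giving distance √5 and angle atan2(-1, 0)·180/π = -90 degrees.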
Preferably, in step 4), the manner in which the inspection robot entered the charging house is judged from its travel route, i.e. whether it entered through the front door or the back door of the charging house, and the robot's coordinate value (x, y, z) in the world coordinate system, i.e. its coordinate on the map, is computed:
when the robot enters through the front door, x = xw + xd, y = yw + yd, z = zw + zd;
when the robot enters through the back door, x = xw - xd, y = yw - yd, z = zw - zd.
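The front-door/back-door arithmetic amounts to adding or subtracting the camera-derived offset from the Marker's stored world coordinates. A one-function sketch (the function name is illustrative; the coordinates reuse the example record given later in the description):

```python
def robot_world_position(marker_world, marker_offset, entered_front):
    """Combine the marker's stored world coordinate (xw, yw, zw) with the
    offset (xd, yd, zd) obtained from the inverse Trans matrix. The sign
    depends on whether the robot entered through the front or back door."""
    sign = 1 if entered_front else -1
    return tuple(w + sign * d for w, d in zip(marker_world, marker_offset))

marker_world = (333333, 300, 3333)   # example (xw, yw, zw) from the patent
offset = (10, 20, 0)                 # hypothetical (xd, yd, zd)
print(robot_world_position(marker_world, offset, True))   # front door
print(robot_world_position(marker_world, offset, False))  # back door
```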
Preferably, in step 5), after vision positioning, as the inspection robot moves toward the charging pile the relative position between the robot and the Marker changes continuously; taking the robot as the reference, the Marker's position changes continuously, and the coordinate system bound to the Marker changes with it.
Denote the Marker in its initial state as Marker A, and the Marker after the relative movement as Marker B, with coordinates (XB, YB, ZB) in the camera coordinate system. The positional relation of Marker B in the camera coordinate system is
(XB, YB, ZB)ᵀ = B · (Xc, Yc, Zc, 1)ᵀ
where B likewise denotes a Trans matrix composed of the camera-extrinsic rotation matrix and translation vector.
Then the conversion relation of Marker B in Marker A's coordinate system is given by the transformation matrix
T = A⁻¹ · B
T is the transition matrix of Marker B in the coordinate system of Marker A, so the information of a Marker at an arbitrary position can be transformed into the coordinate system of the initial-state Marker A. The inspection robot continually corrects its travel route according to this Marker-based positioning until it mates with the charging pile, and charging is carried out.
When its work is completed or its battery level falls below a set threshold, the robot autonomously returns to the charging house to charge. By visually identifying the Marker in the charging house and computing its own distance and angle to it, the robot accurately finds its own position and completes autonomous recharging. This guarantees that the intelligent inspection robot can position itself precisely in an environment as demanding as a substation, complete tasks such as autonomous recharging, achieve fully autonomous inspection, and safely and reliably enable unattended operation of the substation.
Detailed description of the invention
The invention and its embodiments are described in further detail below in conjunction with the accompanying drawings.
Fig. 1 shows several Markers of the invention;
Fig. 2 is the flow chart of Marker identification in the invention.
Specific embodiment
The present embodiment describes a vision positioning method for an inspection robot in a charging house. The inspection robot includes a camera and acquires images through the camera for self-positioning.
For image acquisition, ARToolKit by default uses a USB camera for calibration, positioning, and related operations, whereas we need to use a network camera, which ARToolKit does not support; OpenCV therefore has to be combined with ARToolKit. ARToolKit is a library written in C/C++ that lets us easily write augmented reality applications. Augmented reality (AR) overlays computer-generated imagery onto pictures of the real world, a technology with great potential in both industry and theoretical research. An image has different format types in OpenCV and ARToolKit: ARToolKit operates on images in ARUint8 (unsigned char) format, while OpenCV uses the IplImage class, so the IplImage must be converted into ARUint8 format.
The charging house vision positioning method of the present embodiment comprises the following steps:
1) Set up a marker (Marker) in the charging house. The Marker is designed with a square black background, with various characteristic patterns marked in the white region at its center; Fig. 1 gives examples of three Markers.
The inner square of the Marker is encoded by the ARToolKit library, and decoding is based on a highly simplified matching algorithm that computes the remaining unknown vertex coordinates from the known vertex coordinates of the Marker's identification region. As the figure shows, the square outline can easily be identified in the captured image; the distance between the Marker and the camera is then computed from the change of the outline's vertex coordinates and the coordinate conversion relations.
2) The inspection robot scans and identifies the Marker with its camera. As shown in Fig. 2, the captured video image is first binarized into a binary image; the image is then segmented and the dark border of the Marker extracted; next the contour line of the marker is traced and its four vertices determined on that basis; at the same time the four corners of the square are fitted, and the angles are used to judge whether the marker is tilted; the Marker image is then normalized and matched against the Marker library to see whether a template corresponding to the current Marker is found. If the matching rate exceeds a certain threshold, the current Marker is considered recognized, and the coordinate information saved for it is read, for example:
Marker_ID:00000003 xw=333333, yw=300, zw=3333
where xw, yw, zw are the coordinates in the world coordinate system.
This information is saved in the file "inFile.txt", which is read using C/C++. The FileStorage function of OpenCV is not used, because the "xml" and "yml" files that OpenCV generates may be misread, which would affect the judgment.
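Reading such a saved record is simple string parsing. The patent reads "inFile.txt" with C/C++; the Python sketch below only illustrates the idea and assumes the single-line record layout shown in the example above:

```python
import re

# Example record in the format shown above.
line = "Marker_ID:00000003 xw=333333, yw=300, zw=3333"

def parse_marker_record(text):
    """Extract the marker ID and its world coordinates (xw, yw, zw) from one
    record. The exact file layout is an assumption based on the example."""
    marker_id = re.search(r"Marker_ID:(\d+)", text).group(1)
    coords = {k: int(v) for k, v in re.findall(r"([xyz]w)=(-?\d+)", text)}
    return marker_id, (coords["xw"], coords["yw"], coords["zw"])

print(parse_marker_record(line))  # ('00000003', (333333, 300, 3333))
```

A plain-text record like this avoids the xml/yml read errors the text mentions, at the cost of a hand-written parser.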
3) Once the Marker is matched and its coordinates in the world coordinate system are read, compute the Marker's rotation-translation matrix in the camera coordinate system.
In an AR system, the conversion from the world coordinate system to the camera coordinate system is derived by capturing images with the camera and processing them with computer image analysis. In this process, the camera's intrinsic and extrinsic parameters must be acquired, the conversion relation between the camera coordinate system and the world coordinate system determined by coordinate transformation, and the camera's orientation parameters in the real scene computed. The camera's rotation and translation belong to the extrinsic parameters: they describe the motion of the camera in a static scene, or, when the camera is fixed, the rigid motion of a moving object. Therefore, in image stitching or three-dimensional reconstruction, the extrinsic parameters must be used to determine the relative motion between several images so that they can be registered in the same coordinate system. Together, the rotation and translation matrices describe how a point is transformed from the world coordinate system into the camera coordinate system: the rotation matrix describes the orientation of the world coordinate axes relative to the camera axes, and the translation matrix describes the position of the space origin in the camera coordinate system.
World coordinate system: the real-scene coordinate system, the reference benchmark of the system used to describe the other coordinate systems; any object can find its own position and direction in the world coordinate system.
Marker coordinate system: a coordinate system bound to a particular object. Each Marker has its own specific coordinate system, and when the Marker moves or changes, the coordinate system bound to it moves and changes direction with it.
Camera coordinate system: the three-dimensional coordinate system whose origin is at the optical center of the camera lens.
To determine the coordinates of the Marker's center point in the world coordinate system, the Marker's rotation-translation matrix in the camera coordinate system must be obtained. The physical meaning of ARToolKit's Trans matrix includes the three-dimensional orientation information of the Marker. The Trans matrix can be obtained from the arGetTransMatSquareCont/arGetTransMatSquare functions in ARToolKit. It is a 3 × 4 matrix of the canonical form
[ R1 R2 R3 xt ]
[ R4 R5 R6 yt ]
[ R7 R8 R9 zt ]
This matrix is the transition matrix that transforms the Marker from its own coordinate system into the camera coordinate system, where R1 to R9 form the camera-extrinsic rotation matrix and xt, yt, zt form the camera-extrinsic translation vector.
Compute the inverse of the above Trans matrix. In the inverse matrix, the fourth number of the first row is xd, the fourth number of the second row is yd, and the fourth number of the third row is zd; (xd, yd, zd) is the coordinate of the Marker's center point in the world coordinate system, used to compute the distance and deflection angle of the Marker's center relative to the camera, i.e. the pose information of the inspection robot, comprising both its position on the map and its deflection angle at that moment.
The actual distance from the Marker's center to the camera: distance = sqrt(xd² + yd² + zd²).
The deflection angle from the Marker's center to the camera: angle = atan2(xd, yd) · 180 / π.
Denote the above matrix as matrix A, and let the coordinates of the camera in the world coordinate system be (Xc, Yc, Zc). Denote the Marker in its initial state as Marker A, with coordinates (XA, YA, ZA) in the camera coordinate system. Then the coordinates of Marker A in the camera coordinate system can be expressed as
(XA, YA, ZA)ᵀ = A · (Xc, Yc, Zc, 1)ᵀ
where A denotes the Trans matrix, composed of the camera-extrinsic rotation matrix and translation vector.
4) Since at least one Marker is installed in each charging house, once a Marker is recognized the robot can also determine which charging house it is in, and the position of each charging house is stored in the inspection robot's information library. From this, the manner in which the robot entered the charging house can be known, i.e. whether it entered through the front door or the back door, and the robot's coordinate value (x, y, z) in the world coordinate system can be computed: when the robot enters through the front door, x = xw + xd, y = yw + yd, z = zw + zd; when the robot enters through the back door, x = xw - xd, y = yw - yd, z = zw - zd.
5) In the general case, so that the Marker can be placed anywhere in the world coordinate system, the Marker and the world coordinate system do not coincide. By changing the relative positional relation binding the Marker to the world coordinates, the information of a Marker at any position can be obtained.
As the inspection robot moves, the relative position between the robot and the Marker changes continuously; taking the robot as the reference, the Marker's position changes continuously, and the coordinate system bound to the Marker changes with it. Denote the Marker in its initial state as Marker A and the Marker after the relative movement as Marker B, with coordinates (XB, YB, ZB) in the camera coordinate system. Let the distance from Marker A to Marker B be a and the rotation angle be θ; the offsets X′, Y′ of the coordinate system can then be derived:
X′ = a sin θ
Y′ = a cos θ
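The offset formulas can be checked numerically; the distance a and angle θ below are made-up illustrative values:

```python
import math

def marker_offset(a, theta):
    """Offset (X', Y') of the moved marker's coordinate system, given the
    distance a from Marker A to Marker B and the rotation angle theta
    in radians: X' = a*sin(theta), Y' = a*cos(theta)."""
    return a * math.sin(theta), a * math.cos(theta)

x_off, y_off = marker_offset(2.0, math.pi / 6)  # a = 2, theta = 30 degrees
print(round(x_off, 3), round(y_off, 3))  # 1.0 1.732
```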
For a Marker at an arbitrary position to yield the actual distance, it must be converted through its relative relation to the camera. The positional relations of Marker A and Marker B in the camera coordinate system are
(XA, YA, ZA)ᵀ = A · (Xc, Yc, Zc, 1)ᵀ and (XB, YB, ZB)ᵀ = B · (Xc, Yc, Zc, 1)ᵀ
and the conversion relation of Marker B in Marker A's coordinate system is given by the transformation matrix
T = A⁻¹ · B
which is the transition matrix of Marker B in the coordinate system of Marker A. The information of a Marker at an arbitrary position can thus be transformed into the coordinate system of the initial-state Marker A, realizing Marker positioning; the inspection robot continually corrects its travel route according to this positioning until it mates with the charging pile, and charging is carried out.
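Expressing Marker B in Marker A's coordinate system amounts, under the standard homogeneous-matrix convention, to inverting A and composing it with B. A minimal NumPy sketch with made-up identity-rotation poses (not data from the patent):

```python
import numpy as np

def homogeneous(trans3x4):
    """Augment a 3x4 [R | t] Trans matrix to 4x4 so it can be inverted
    and composed with other transforms."""
    return np.vstack([trans3x4, [0.0, 0.0, 0.0, 1.0]])

# Hypothetical poses: Marker A 1 unit ahead of the camera, Marker B 3 units
# ahead, both with identity rotation.
A = homogeneous(np.hstack([np.eye(3), [[0.0], [0.0], [1.0]]]))
B = homogeneous(np.hstack([np.eye(3), [[0.0], [0.0], [3.0]]]))

# Transition matrix of Marker B expressed in Marker A's coordinate system.
T = np.linalg.inv(A) @ B
print(T[:3, 3])  # [0. 0. 2.]: B sits 2 units further along A's z-axis
```

The translation column of T directly gives Marker B's position relative to Marker A, which is the quantity the robot uses to keep correcting its route.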
Claims (4)
1. An inspection robot charging house vision positioning method, the inspection robot comprising a camera, characterized by comprising the following steps:
1) setting up a marker (Marker) in the charging house, the Marker having a square black background with various characteristic patterns marked in the white region at its center;
2) the inspection robot scanning and identifying the Marker with its camera, specifically:
a) binarizing the captured video image into a binary image;
b) segmenting the image and extracting the dark border of the Marker;
c) tracing the contour line of the marker and determining its four vertices on that basis;
d) fitting the four corners of the square and judging from the angles whether the marker is tilted;
e) normalizing the Marker image and matching it against the Marker library to see whether a template corresponding to the current Marker is found;
f) if the matching rate exceeds a certain threshold, considering the current Marker recognized and reading the coordinate information saved for it in the world coordinate system;
3) once the Marker is matched and its coordinates in the world coordinate system are obtained, computing the Marker's rotation-translation matrix in the camera coordinate system and, after further calculation, obtaining the distance and direction of the Marker relative to the camera;
4) once the Marker is matched, identifying the corresponding charging house, determining from the robot's travel route how the inspection robot entered the charging house, and combining this with the Marker coordinates read in step 3) to obtain the robot's current physical position on the map, thereby achieving vision positioning;
5) after vision positioning, the inspection robot moving toward the charging pile and, during the movement, continually correcting its travel route according to the change of its position relative to the Marker until it mates with the charging pile, whereupon charging is carried out.
2. The inspection robot charging house vision positioning method of claim 1, characterized in that in step 3) above, the distance and direction of the Marker relative to the camera are computed as follows:
i) after the Marker is identified, its coordinates (xw, yw, zw) in the world coordinate system can be read;
ii) the physical meaning of ARToolKit's Trans matrix includes the three-dimensional orientation information of the Marker; the Trans matrix can be obtained from the arGetTransMatSquareCont/arGetTransMatSquare functions in ARToolKit and is a 3 × 4 matrix of the canonical form
[ R1 R2 R3 xt ]
[ R4 R5 R6 yt ]
[ R7 R8 R9 zt ]
where R1 to R9 form the camera-extrinsic rotation matrix and xt, yt, zt form the camera-extrinsic translation vector; this matrix is the transition matrix that transforms the Marker from its own coordinate system into the camera coordinate system;
denoting the above matrix as matrix A, the coordinates of the camera in the world coordinate system as (Xc, Yc, Zc), and the coordinates of the Marker in the camera coordinate system as (XA, YA, ZA), the Marker's coordinates in the camera coordinate system can be expressed as
(XA, YA, ZA)ᵀ = A · (Xc, Yc, Zc, 1)ᵀ;
iii) computing the inverse of the Trans matrix obtained in step ii), in which the fourth number of the first row is xd, the fourth number of the second row is yd, and the fourth number of the third row is zd; (xd, yd, zd) is the coordinate of the Marker's center point in the world coordinate system, used to compute the distance and direction of the Marker's center relative to the camera:
distance = sqrt(xd² + yd² + zd²), the actual distance from the Marker's center to the camera;
angle = atan2(xd, yd) · 180 / π, the angle from the Marker's center to the camera.
3. The inspection robot charging house vision positioning method of claim 1, characterized in that in step 4), the manner in which the inspection robot entered the charging house is judged from its travel route, i.e. whether it entered through the front door or the back door of the charging house, and the robot's coordinate value (x, y, z) in the world coordinate system, i.e. its coordinate on the map, is computed:
when the robot enters through the front door, x = xw + xd, y = yw + yd, z = zw + zd;
when the robot enters through the back door, x = xw - xd, y = yw - yd, z = zw - zd.
4. A kind of crusing robot charging house vision positioning method as described in claim 1, characterised in that: in step 5), after vision positioning, as the crusing robot moves towards the charging pile position, the relative position between the crusing robot and the Marker changes continuously; taking the crusing robot as the reference object, the Marker position changes continuously, and the coordinate system associated with the Marker changes correspondingly;
The Marker in the original state is denoted Marker A, and the Marker after the relative movement is denoted Marker B; the coordinate of Marker B in the camera coordinate system is (XB, YB, ZB), and the positional relationship of Marker B under the camera coordinate system is:

(XB, YB, ZB)^T = B · (Xc, Yc, Zc, 1)^T

Wherein: B likewise denotes a Trans matrix, composed of the camera extrinsic rotation matrix and translation matrix;
Then the conversion relationship of Marker B under the Marker A coordinate system is given by the transformation matrix:

T = A^(-1) · B (with A and B extended to 4 × 4 homogeneous matrices)

The transformation matrix T is the transition matrix of Marker B under the coordinate system of Marker A; thereby the information of the Marker at an arbitrary position can be transformed into the coordinate system of the original-state Marker A. Finally, the crusing robot continuously corrects its travelling route according to the Marker positioning until it mates with the charging pile, and charging is realized.
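The relative-pose computation of claim 4 (expressing Marker B in the coordinate system of Marker A) can be sketched with NumPy; the helper name and the homogeneous extension of the two 3 × 4 Trans matrices are assumptions for illustration, not part of the patent:

```python
import numpy as np

def relative_transform(a_trans, b_trans):
    """T = A^-1 * B: pose of Marker B expressed in Marker A's coordinate
    system, given two 3x4 Trans matrices (marker -> camera). Both rotation
    parts are assumed valid."""
    def to_homogeneous(m):
        # Append the row [0 0 0 1] so the 3x4 matrix becomes invertible.
        return np.vstack([m, [0.0, 0.0, 0.0, 1.0]])
    a_h = to_homogeneous(a_trans)
    b_h = to_homogeneous(b_trans)
    return np.linalg.inv(a_h) @ b_h
```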
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810555809.9A CN108932477A (en) | 2018-06-01 | 2018-06-01 | A kind of crusing robot charging house vision positioning method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810555809.9A CN108932477A (en) | 2018-06-01 | 2018-06-01 | A kind of crusing robot charging house vision positioning method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108932477A true CN108932477A (en) | 2018-12-04 |
Family
ID=64449780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810555809.9A Pending CN108932477A (en) | 2018-06-01 | 2018-06-01 | A kind of crusing robot charging house vision positioning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108932477A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050221840A1 (en) * | 2004-03-24 | 2005-10-06 | Kabushiki Kaisha Toshiba | Mobile device and mobile device system therefor |
JP2009255263A (en) * | 2008-04-21 | 2009-11-05 | Morihiro Saito | Traveling robot |
CN106125724A (en) * | 2016-06-13 | 2016-11-16 | 华讯方舟科技有限公司 | A kind of method and system of robot autonomous charging |
CN106302798A (en) * | 2016-08-31 | 2017-01-04 | 杭州申昊科技股份有限公司 | A kind of substation inspection communication system |
CN106849355A (en) * | 2017-02-10 | 2017-06-13 | 云南电网有限责任公司电力科学研究院 | A kind of transformer station's detecting system based on X-ray machine people |
- 2018-06-01 CN CN201810555809.9A patent/CN108932477A/en active Pending
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110334648A (en) * | 2019-07-02 | 2019-10-15 | 北京云迹科技有限公司 | Charging pile identifying system and method suitable for robot |
CN110334648B (en) * | 2019-07-02 | 2022-01-11 | 北京云迹科技有限公司 | Charging pile identification system and method suitable for robot |
CN112406608A (en) * | 2019-08-23 | 2021-02-26 | 国创新能源汽车能源与信息创新中心(江苏)有限公司 | Charging pile and automatic charging device and method thereof |
CN111443717A (en) * | 2020-04-24 | 2020-07-24 | 张咏 | Patrol and examine robot system based on speech recognition control |
CN111572377A (en) * | 2020-05-13 | 2020-08-25 | 广州华立科技职业学院 | Visual guidance method for automatic alignment of mobile robot charging station |
CN111679671A (en) * | 2020-06-08 | 2020-09-18 | 南京聚特机器人技术有限公司 | Method and system for automatic docking of robot and charging pile |
CN111625005A (en) * | 2020-06-10 | 2020-09-04 | 浙江欣奕华智能科技有限公司 | Robot charging method, robot charging control device and storage medium |
CN111864920A (en) * | 2020-07-09 | 2020-10-30 | 中国电力科学研究院有限公司 | Transformer substation inspection robot, wireless charging room, wireless charging system and method |
CN112260353A (en) * | 2020-10-10 | 2021-01-22 | 南京飞舟科技有限公司 | Automatic charging system and method for inspection robot of transformer room |
CN113949142A (en) * | 2021-12-20 | 2022-01-18 | 广东科凯达智能机器人有限公司 | Inspection robot autonomous charging method and system based on visual identification |
CN114997195A (en) * | 2022-05-12 | 2022-09-02 | 安徽大学绿色产业创新研究院 | Component checking and positioning method based on inspection robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108932477A (en) | A kind of crusing robot charging house vision positioning method | |
Minaeian et al. | Vision-based target detection and localization via a team of cooperative UAV and UGVs | |
Tardif et al. | Monocular visual odometry in urban environments using an omnidirectional camera | |
Alonso et al. | Accurate global localization using visual odometry and digital maps on urban environments | |
Scaramuzza | Omnidirectional vision: from calibration to robot motion estimation | |
Zhao et al. | A vehicle-borne urban 3-D acquisition system using single-row laser range scanners | |
Nair et al. | Moving obstacle detection from a navigating robot | |
CN110119698A (en) | For determining the method, apparatus, equipment and storage medium of Obj State | |
Zhang et al. | High-precision localization using ground texture | |
Bao et al. | Vision-based horizon extraction for micro air vehicle flight control | |
Fernández et al. | Guidance of a mobile robot using an array of static cameras located in the environment | |
CN108362205A (en) | Space ranging method based on fringe projection | |
Bazin et al. | UAV attitude estimation by vanishing points in catadioptric images | |
Kim et al. | External vehicle positioning system using multiple fish-eye surveillance cameras for indoor parking lots | |
Aryal | Object detection, classification, and tracking for autonomous vehicle | |
CN104166995B (en) | Harris-SIFT binocular vision positioning method based on horse pace measurement | |
Ye et al. | Extrinsic calibration of a monocular camera and a single line scanning Lidar | |
Qian et al. | Survey on fish-eye cameras and their applications in intelligent vehicles | |
Bousaid et al. | Perspective distortion modeling for image measurements | |
Junejo et al. | Autoconfiguration of a dynamic nonoverlapping camera network | |
CN113487726A (en) | Motion capture system and method | |
Del Pizzo et al. | Reliable vessel attitude estimation by wide angle camera | |
Jiang et al. | ICP stereo visual odometry for wheeled vehicles based on a 1DOF motion prior | |
Mitsudome et al. | Autonomous mobile robot searching for persons with specific clothing on urban walkway | |
Van Hamme et al. | Robust visual odometry using uncertainty models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||