CN108919810A - Mobile robot localization and navigation system based on visual teaching - Google Patents
Mobile robot localization and navigation system based on visual teaching
- Publication number
- CN108919810A CN108919810A CN201810832511.8A CN201810832511A CN108919810A CN 108919810 A CN108919810 A CN 108919810A CN 201810832511 A CN201810832511 A CN 201810832511A CN 108919810 A CN108919810 A CN 108919810A
- Authority
- CN
- China
- Prior art keywords
- teaching
- camera
- industrial personal computer
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
Abstract
The present invention provides a mobile robot localization and navigation system based on visual teaching, and relates to the technical field of mobile robot teaching. The system comprises an industrial personal computer, a camera, a power supply, an on-board battery, a boost module, a basic motion control unit, and a gamepad. The basic motion control unit comprises a differential-drive cart, an embedded development board, photoelectric encoders, DC gear motors, and a TTL-to-USB module. The system operates in two stages, teaching and repeating. In the teaching stage, the cart is driven by the gamepad and stores a path composed of an arbitrary sequence of motions. In the repeating stage, odometer information is used as a distance index into the image information stored during the teaching stage, which is matched against the images currently collected by the camera; the deviation angular velocity of the vehicle heading is calculated and added to the stored speed to compensate the vehicle's steering, ensuring that it tracks the intended path. The present invention requires no repeated camera calibration, can autonomously learn and traverse a path of arbitrary shape, and can repeatedly run automatically along the desired path.
Description
Technical field
The present invention relates to the technical field of mobile robot teaching, and in particular to a mobile robot localization and navigation system based on visual teaching.
Background art
With rising labor costs and the development of science and technology, robots, and especially industrial robots, are being applied ever more widely in industrial automation. The automated guided vehicle (Automated Guided Vehicle, hereinafter AGV), also called a transfer robot, is an important link in modern intelligent logistics systems; however, the mainstream AGV guidance technologies currently in use cannot be called fully autonomous in any true sense.
Navigation, as one of the core technologies, is the most important marker of AGV development. By navigation mode, AGVs can be divided into two major classes: fixed-path methods and free-path methods.
Fixed-path methods, represented by electromagnetic guidance, set the AGV's path in advance: the AGV adjusts its driving direction according to the electromagnetic field generated by a cable embedded under the road surface. This guidance mode uses a physical path, its environment representation is simple, no automatic map-building technology is needed, and localization only requires a sensor to detect the deviation between the vehicle body and the physical path. A fixed track, however, is inflexible and inconvenient to maintain: if the path changes, the wiring must be dug up and re-laid, at excessive cost.
Free-path methods mainly include laser navigation, millimeter-wave radar navigation, inertial navigation, GPS, and so on. Inertial navigation systems suffer from systematic errors; GPS positioning error is relatively large, making it hard to satisfy AGVs with high positioning-accuracy requirements. With technological progress, laser sensors and image sensors have gradually been applied to AGV navigation. Compared with on-board sensors such as cameras, lidar has the advantages of high precision and high resolution and is widely carried on autonomous-driving test vehicles, but it also has a drawback that cannot be ignored: high cost. In recent years, thanks to the development of image processing and computer technology, plus the low cost of cameras, vision-based navigation has received significant attention. By comparison, the current "camera + software" vision navigation technology is easy to deploy industrially and lower in cost.
Compared with conventional navigation modes, however, vision navigation involves more aspects and more complex technology. First, no physical path exists in vision navigation, so the deviation between the vehicle body and a physical path cannot be detected directly as in conventional navigation. Instead, the AGV collects environmental data in real time through an on-board camera and estimates its pose in a map by comparing the processed environmental data against a pre-built environment map; this pose estimation is more complicated than the direct deviation calculation of conventional navigation. Second, the map used in vision navigation carries rich metric information and must be drawn accurately, since the precision of the created environment map determines the AGV's positioning accuracy.
In an unknown environment, because of the limited range and measurement accuracy of environment-sensing sensors, and because objects such as walls occlude the sensors' view, a robot cannot create a global environment map from a single measurement. It can only gather enough sensing data while continuously exploring the environment to complete the creation of a global map. For the local maps created at each position, the robot must know its pose at each position exactly before the local maps can be converted into a global map; that is, map creation depends on the robot's pose. In reality, however, the robot's pose is usually obtained from the environment map; that is, the pose depends on the map. This mutual dependence between pose and map estimation poses a new problem for robot navigation in unknown environments, called the simultaneous localization and mapping (Simultaneous Localization and Mapping, hereinafter SLAM) problem. The ultimate goal of SLAM is to create a consistent environment map, and whether a correct loop closure can be achieved is one measure of a method's performance. For map-creation methods that eliminate positioning error through loop closure, when the pose error estimated by dead reckoning is large, constraints can only be established with auxiliary methods; that is, only on the premise of a correctly established loop closure can the robot pose be corrected and the environment map created by optimization. A SLAM system of this kind therefore needs to build a globally consistent map and thus incurs a high computational cost.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the above shortcomings of the prior art, to provide a mobile robot localization and navigation system based on visual teaching that is computationally efficient, requires no repeated camera calibration, can quickly carry out indoor AGV navigation, is easy for laymen to pick up, simple to operate and use, can autonomously learn and traverse a path of arbitrary shape, and can repeatedly run automatically along the desired path without manual intervention.
In order to solve the above technical problems, the technical solution adopted by the present invention is as follows:
A mobile robot localization and navigation system based on visual teaching, comprising: an industrial personal computer, a camera, a power supply, an on-board battery, a boost module, a basic motion control unit, and a gamepad.

The basic motion control unit comprises a differential-drive cart, an embedded development board, photoelectric encoders, DC gear motors, and a TTL-to-USB module, and is used to realize the motion control of the cart platform, to display cart status information, and to read and send encoder data. The differential-drive cart and the power end of the embedded development board are connected to the power supply. The camera is fixed by screws to the upper platform of the differential-drive cart with its view angle facing the cart's direction of advance. A photoelectric encoder is mounted on each wheel of the differential-drive cart; a DC gear motor is mounted on each wheel of the differential-drive cart and controls the cart's motor speed.
The industrial personal computer communicates with the camera through a USB interface and obtains the image information the camera generates. The industrial personal computer is also connected to the boost module; the power end of the boost module is connected to the on-board battery and boosts the battery's 12 V output to the 19 V the industrial personal computer requires. The industrial personal computer is connected to the embedded development board through the TTL-to-USB module, through which it sends control instructions to the board and reads the data of the wheel photoelectric encoders to obtain the cart's odometer information.

The gamepad is connected to the industrial personal computer and provides enabled control of the differential-drive cart's movement.

A basic motion control program, comprising a data transceiving program and a motor motion control program, is stored on the embedded development board. The data transceiving program is written against the SCIP2.0 protocol and exchanges speed information and encoder data with the industrial personal computer; the motor motion control program resolves the speed information into PWM waves to control the speed of each DC gear motor.
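The velocity resolving done by the motor motion control program can be sketched as follows. This is a minimal illustration, not the patent's firmware: a commanded linear velocity v and angular velocity w are resolved into left/right wheel speeds for the differential-drive cart and then scaled to PWM duty. The wheel base, maximum wheel speed, and linear speed-to-duty mapping are illustrative assumptions.

```python
WHEEL_BASE_M = 0.30    # assumed distance between the two driven wheels
MAX_WHEEL_MPS = 1.0    # assumed wheel speed corresponding to 100% duty

def velocity_to_pwm(v_mps, w_radps):
    """Resolve (v, w) into (left_duty, right_duty), each in [-1.0, 1.0]."""
    # standard differential-drive kinematics: wheels split the turn rate
    v_left = v_mps - w_radps * WHEEL_BASE_M / 2.0
    v_right = v_mps + w_radps * WHEEL_BASE_M / 2.0
    clamp = lambda x: max(-1.0, min(1.0, x / MAX_WHEEL_MPS))
    return clamp(v_left), clamp(v_right)
```

With this split, equal duties drive straight, opposite duties produce the in-place rotation used during teaching.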
Stored on the industrial personal computer are a teaching software upper-layer navigator and a PID control program, written in C++ on the ROS (Robot Operating System) platform under the Linux operating system, which realize the overall operation of the whole system. The teaching software upper-layer navigator comprises a teaching stage and a repeating stage. In the teaching stage, the robot is placed at an initial position and the operator drives it with the gamepad along a free path built from forward, backward, and in-place rotation motions, recording the image information near each path point, the vehicle speed information, and the path-point position coordinates obtained from the odometer information; the image information near the path points is captured in real time by the camera. When finished, pressing Ctrl-C in the industrial personal computer's terminal stops recording, and the recorded information is stored on the industrial personal computer. In the repeating stage, the stored speed commands are played back: with the gamepad's enable key held down, the robot starts to drive automatically along the path of the teaching stage. The robot matches the camera images read in real time against the images saved in the teaching stage using the ORB algorithm, calculates the transformation from the read image to the recorded one, and sends this transformation to the PID control program for error compensation, which calculates the deviation angular velocity and speed magnitude of the vehicle heading and adds them to the stored speed, compensating the vehicle's steering and speed so that it tracks the intended path and finally repeats the route taught in the teaching stage automatically.

The basic motion control program receives the speed control commands sent by the teaching software upper-layer navigator and, by virtue of the differential-drive cart's characteristics, realizes the robot's forward, backward, and in-place rotation motions under gamepad operation.
Further, in the teaching stage, the timestamps used to store camera image information and path points should be kept consistent, and one image is recorded per spacing interval, with the spacing distance customized according to the size of the scene and the required positioning precision of the target location.
Further, the path points stored in the teaching stage are saved on the industrial personal computer in file form and are reusable; repeating the teaching stage yields a new path.
Further, the path points and camera image information in the loaded file are read into memory, and the system judges whether all points have been taught; if so, teaching ends, otherwise the next reference path point and camera image information are obtained and the subsequent operations executed.
Further, in the repeating stage, if the cart speed stays close to zero for more than five seconds, or the cart deviates from the reference path by 1 m or more, replay stops and an error or alarm is raised.
Further, the starting point of the repeating stage must lie within 1 m of the teaching-stage starting point, with the pose facing the same direction.
The beneficial effects of adopting the above technical scheme are as follows. The mobile robot localization and navigation system based on visual teaching provided by the present invention is computationally efficient, requires no repeated camera calibration, can quickly carry out indoor AGV navigation, is easy for laymen to pick up, simple to operate and use, and can autonomously learn and traverse a path of arbitrary shape. In the teaching stage the robot is driven by a human operator and stores its speed, image information, and odometer information; in autonomous navigation, no explicit robot localization in two- or even three-dimensional space is needed: it suffices to play back the speeds learned in the teaching stage while correcting speed magnitude and direction from the image information and the position information generated by the odometer. The robot can therefore repeatedly run automatically along the desired path without manual intervention.
Description of the drawings
Fig. 1 is a structural block diagram of the mobile robot localization and navigation system based on visual teaching provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the system software framework provided by an embodiment of the present invention;
Fig. 3 is a flowchart of the matching algorithm provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of feature-image generation provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the image to be processed in the FAST algorithm provided by an embodiment of the present invention.
Specific embodiment
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate, not to limit, the scope of the invention.

As shown in Fig. 1, the present embodiment provides a mobile robot localization and navigation system based on visual teaching, comprising: an industrial personal computer, a camera, a power supply, an on-board battery, a basic motion control unit, and a gamepad.
The basic motion control unit comprises a differential-drive cart, an embedded development board, DC gear motors (with photoelectric encoders), and a TTL-to-USB module, and is used to realize the motion control of the cart platform, to display cart status information, and to read and send encoder data.
In the present embodiment, the model of the embedded development board is STM32; the industrial personal computer is a GK400; the DC gear motor is an MD36 planetary gear motor (with a 500-line photoelectric encoder); the TTL-to-USB module is a CH340G; the camera is an ASUS Xtion Pro Live; the boost module is an MKX-DC12V~19V; the on-board battery is a Kai Meiwei 12V100A; the power supply is a SKYRC IMAX B6; and the gamepad is a Sony PS3 controller.
The differential-drive cart and the power end of the STM32 embedded development board are connected to the power supply, powered by the SKYRC IMAX B6. The camera is fixed by screws to the upper platform of the differential-drive cart with its view angle facing the cart's direction of advance. The photoelectric encoders are mounted on the cart's DC gear motors; a DC gear motor is mounted on each wheel of the differential-drive cart and controls the cart's motor speed.
The GK400 industrial personal computer communicates with the Xtion Pro Live camera through a USB 3.0 interface and obtains the image information the camera generates, ensuring the stability, safety, and real-time performance of data transmission between camera and industrial personal computer and providing effective support for the teaching software running on the industrial personal computer to process in real time the large volume of image data transmitted by the camera. The GK400 is also connected to the boost module and is powered by the Kai Meiwei 12V100A on-board battery, specifically through the battery's 12 V port; the boost module boosts the battery's 12 V output to the 19 V the industrial personal computer requires. The GK400 is connected to the STM32 embedded development board through the CH340G TTL-to-USB module, through which it sends control instructions to the board and reads the data of the wheel photoelectric encoders to obtain the cart's odometer information. The GK400 industrial personal computer, running Linux, is the brain of the whole system: on one hand it sends control instructions to the STM32 through the CH340G TTL-to-USB module to command forward, backward, and in-place rotation motions; on the other hand it reads the wheel photoelectric encoder data and the camera data, realizing the fusion of the two kinds of sensor data.
The Sony PS3 gamepad is connected to the industrial personal computer and provides enabled control of the differential-drive cart's movement.
A basic motion control program, comprising a data transceiving program and a motor motion control program, is stored on the STM32 embedded development board. The data transceiving program is written against the SCIP2.0 protocol and is responsible for data communication with the industrial personal computer; the motor motion control program is responsible for resolving the speed commands issued by the industrial personal computer in order to control motor speed.
Stored on the industrial personal computer is a teaching software upper-layer navigator, written in C++ on the ROS (Robot Operating System) platform under the Linux operating system, which realizes the overall operation of the whole system and comprises a teaching stage and a repeating stage. The basic motion control program receives the speed control commands sent by the teaching software upper-layer navigator and, by virtue of the differential-drive cart's characteristics, realizes the robot's forward, backward, and in-place rotation motions under gamepad operation. The system software framework of the present embodiment is shown in Fig. 2.
In the teaching stage, the robot is placed at an initial position and the operator drives it with the gamepad along a free path built from forward, backward, and in-place rotation motions, recording the image information near each path point, the vehicle speed information, and the path-point position coordinates obtained from the odometer information; the image information near the path points is captured in real time by the camera. When finished, pressing Ctrl+C in the Ubuntu terminal on the industrial personal computer stops recording. The stored path points are saved on the industrial personal computer in file form, are reusable, and repeating the teaching stage yields a new path. The timestamps used to store camera image information and path points should be kept consistent, and one image is recorded per spacing interval, with the spacing distance customized according to the size of the scene and the required positioning precision of the target location. The path points and camera image information in the loaded file are read into memory, and the system judges whether all points have been taught; if so, teaching ends, otherwise the next reference path point and camera image information are obtained and the subsequent operations executed.
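The distance-gated recording policy described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, the sample tuple layout, and the spacing value are not from the patent; a waypoint is kept only once the odometer says the cart has moved the chosen spacing distance since the last kept waypoint.

```python
import math

def record_teach_path(samples, spacing_m=0.2):
    """samples: iterable of (x, y, speed, image) read from odometer+camera.
    Returns the subset kept as taught waypoints, one per spacing interval."""
    kept, last = [], None
    for x, y, speed, image in samples:
        # keep the first sample, then one sample per spacing_m of travel
        if last is None or math.hypot(x - last[0], y - last[1]) >= spacing_m:
            kept.append((x, y, speed, image))
            last = (x, y)
    return kept
```

Enlarging `spacing_m` trades replay accuracy for memory, matching the text's note that the spacing is customized to scene size and required precision.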
In the repeating stage, the robot's starting point must lie within 1 m of the teaching-stage starting point, with the pose facing the same direction. The stored speed commands are played back: with the gamepad's enable key (customizable) held down, the robot starts to drive automatically along the path of the teaching stage. The robot matches the camera images read in real time against the images saved in the teaching stage using the ORB algorithm, calculates the transformation from the read image to the recorded one, and sends this transformation to the PID control program for error compensation, which calculates the deviation angular velocity and speed magnitude of the vehicle heading and adds them to the stored speed, compensating the vehicle's steering and speed so that it tracks the intended path and finally repeats the route taught in the teaching stage automatically. If the cart speed stays close to zero for more than five seconds, or the cart deviates from the reference path by 1 m or more, replay stops and an error or alarm is raised.
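The replay-stage abort conditions just stated can be sketched as a single check. The five-second and 1 m thresholds follow the text; the function and argument names are illustrative.

```python
import math

def should_abort(speed_mps, stall_time_s, pose_xy, ref_xy):
    """True if replay must stop: stalled > 5 s, or > 1 m off the path."""
    stalled = abs(speed_mps) < 1e-3 and stall_time_s > 5.0
    off_path = math.hypot(pose_xy[0] - ref_xy[0],
                          pose_xy[1] - ref_xy[1]) > 1.0
    return stalled or off_path
```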
The ORB (Oriented FAST and Rotated BRIEF) algorithm is a recent corner detection and feature description algorithm. Its flow, shown in Fig. 3, comprises the following steps: first locate keypoints with the FAST algorithm, select the best keypoints with the Harris response, perform the scale-pyramid transform, compute the centroid and orientation, extract the binary BRIEF descriptor, filter pixel blocks for low correlation, and finally obtain the final 256-bit descriptor.
Image feature matching is a very important intermediate step, with extremely important applications in problems such as surface reconstruction, three-dimensional object recognition, and camera localization. A generated feature image is shown in Fig. 4. Compared with other algorithms, ORB has similar or even better repeatability and robustness while computing faster. ORB is an image feature matching algorithm that combines and improves FAST corner detection and the BRIEF feature descriptor, raising execution efficiency and making it practical for real-time computer vision systems.
A FAST corner is defined by Rosten et al., the proposers of FAST, as follows: if a pixel differs sufficiently from enough pixels in its surrounding neighborhood, that pixel may be a corner. The steps of the FAST algorithm are:
Step 1: as shown in Fig. 5, take a circle of radius 3 centered on pixel p in the image to be processed; there are 16 pixels on this circle (p1, p2, ..., p16).

Step 2: define a threshold and compute the pixel differences between p1, p9, p5, p13 and the center p. If at least 3 of their absolute values exceed the threshold, treat p as a candidate corner and proceed to the next step; otherwise p cannot be a corner.

Step 3: if p is a candidate, compute the pixel differences between all 16 points p1-p16 and the center p. If at least 9 consecutive ones exceed the threshold, p is a corner; otherwise it cannot be a corner.
Step 4: apply non-maximum suppression to the image. Compute the FAST score (the s value) of each feature point; in a neighborhood centered on feature point p (for example 3x3 or 5x5), if there are several feature points, compute the s value of each (the sum of the absolute differences between the 16 circle points and the center). If p has the largest response among all feature points in the neighborhood, keep it, otherwise suppress it; if there is only one feature point (corner) in the neighborhood, keep it. The score formula (reconstructed here from the description above; the original equation image is not reproduced) is:

V = Σ |I(p_i) − I(p)|, summed over the circle pixels p_i whose absolute difference from the center exceeds the threshold,

where V denotes the score and t denotes the threshold.
The method above is FAST-9; FAST-10, FAST-11, and FAST-12 are of course analogous, differing only in the number of points that must exceed the threshold in step 3. The FAST algorithm is simple to implement and especially famed for its speed.
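The FAST-9 segment test of steps 1-3 can be sketched in pure Python as follows. This is an illustrative sketch, not a production detector: the image is a plain list of rows, the circle offsets approximate the radius-3 Bresenham circle of Fig. 5, and the early-out test on p1/p5/p9/p13 and the non-maximum suppression of step 4 are omitted for brevity.

```python
# 16 (dx, dy) offsets of the radius-3 circle around the candidate pixel
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast9_corner(img, x, y, t=20):
    """True if >= 9 contiguous circle pixels differ from p by more than t."""
    p = img[y][x]
    # +1 brighter than p+t, -1 darker than p-t, 0 similar
    flags = []
    for dx, dy in CIRCLE:
        q = img[y + dy][x + dx]
        flags.append(1 if q - p > t else -1 if p - q > t else 0)
    # scan the circle twice so a run may wrap around the start
    run, prev = 0, 0
    for f in flags + flags:
        run = run + 1 if f != 0 and f == prev else (1 if f != 0 else 0)
        prev = f
        if run >= 9:
            return True
    return False
```

Doubling the flag list is a simple way to honor the "consecutive on a circle" requirement without special-casing the wrap-around.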
But the algorithm only determines the positions of feature points and yields no other information. The ORB algorithm still uses FAST to detect feature point positions, but makes the following changes (taking FAST-9 as an example):

(1) Supposing N feature points are to be extracted from the image, lower the FAST threshold so that the FAST algorithm detects more than N feature points.

(2) At each feature point position, compute the Harris response R of the feature point, and take the N points with the largest responses as the FAST feature points (for the Harris corner response computation, see the mathematical derivation in Harris corner detection).
(3) Because the lack of rotation invariance of the BRIEF algorithm must be remedied, the principal direction of each feature point needs to be computed. ORB computes it with the intensity-centroid (center of gravity) method. With the patch moments defined as m_pq = Σ_(x,y) x^p y^q I(x, y) (the moment definition is restored here; the original equation image is not reproduced), the principal direction is

θ = atan2(m01, m10)

where (x, y) ranges over the points in the feature neighborhood and atan2 denotes the four-quadrant arc tangent; the resulting θ is the principal direction of the FAST feature point.
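The intensity-centroid orientation above can be sketched as follows. This is an illustrative sketch: a square patch is used for brevity (ORB proper uses a circular patch), and the patch radius is an assumption.

```python
import math

def patch_orientation(img, x, y, r=3):
    """Principal direction theta = atan2(m01, m10) of the patch around (x, y)."""
    m10 = m01 = 0.0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            i = img[y + dy][x + dx]
            m10 += dx * i   # first-order moment in x
            m01 += dy * i   # first-order moment in y
    return math.atan2(m01, m10)
```

A patch that brightens toward +x yields theta near 0; one that brightens toward +y yields theta near pi/2, so the angle points from the keypoint toward the patch's intensity centroid.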
The BRIEF algorithm describes the feature points already detected. It is a binary-coded descriptor that abandons the conventional method of describing feature points with regional grayscale histograms, greatly accelerating the construction of feature descriptors and greatly reducing feature matching time; it is a very fast and very promising algorithm.

Since BRIEF is only a feature descriptor, the positions of the feature points must be obtained beforehand, using the FAST feature detection algorithm, the Harris corner detection algorithm, or algorithms such as SIFT or SURF. A feature descriptor is then built in each feature point's neighborhood with the BRIEF algorithm.
The specific steps of the BRIEF algorithm are as follows:
Step 1: to reduce noise interference, first apply Gaussian filtering to the image (variance 2, 9×9 Gaussian window);
Step 2: centered on the feature point, take an S×S neighborhood window, randomly select a pair of points (two points) in the window, compare their pixel values, and make the binary assignment
τ = 1 if p(x) < p(y), otherwise τ = 0,
where p(x) and p(y) are the pixel values at the random points x = (u1, v1) and y = (u2, v2), respectively;
Step 3: randomly select N point pairs in the window and repeat the binary assignment of Step 2, forming an N-bit binary code. This code is the description of the feature point, i.e. the feature descriptor. Typically N = 256.
These are the steps of the BRIEF feature-description algorithm.
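The steps above can be sketched as follows. This is a hedged Python/NumPy illustration, not the patent's implementation; the uniform sampling of point pairs and the omission of the Gaussian pre-smoothing are simplifying assumptions:

```python
import numpy as np

def brief_descriptor(image, kp, pairs):
    """N-bit BRIEF code for keypoint kp = (row, col): for each pre-drawn
    pair of offsets, emit 1 if p(x) < p(y), else 0.
    Note: the described algorithm Gaussian-smooths the image first;
    that step is omitted here for brevity."""
    cy, cx = kp
    bits = []
    for (u1, v1), (u2, v2) in pairs:
        p_x = image[cy + v1, cx + u1]
        p_y = image[cy + v2, cx + u2]
        bits.append(1 if p_x < p_y else 0)
    return np.array(bits, dtype=np.uint8)

rng = np.random.default_rng(0)
S = 31
# N = 256 random point pairs inside the S x S window (drawn uniformly
# here; the original BRIEF paper also considers Gaussian sampling).
pairs = rng.integers(-(S // 2), S // 2 + 1, size=(256, 2, 2))
```

Because each bit only compares two pixels, the descriptor is unchanged by adding a constant brightness to the whole image, which the test below exploits.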
The feature-extraction algorithm above yields a 256-bit binary code for each feature point in an image. Next, two images with similar or overlapping content are registered. Feature pairing is decided using the Hamming distance:
1. If the number of identical bits in two feature codes is less than 128, the two features are certainly not a pair;
2. A feature point in one image is paired with the feature point in the other image whose code shares the largest number of identical bits with it.
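The two pairing rules above might look like this in code. This is an illustrative sketch: the brute-force nearest-neighbour search and the 128-identical-bit threshold follow the text, while the function names and data layout are assumptions:

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary descriptors (arrays of 0/1)."""
    return int(np.count_nonzero(a != b))

def match(descs1, descs2, max_dist=128):
    """For each descriptor in descs1, take the descriptor in descs2 with
    the most identical bits (smallest Hamming distance); reject pairs
    with fewer than 128 identical bits out of 256, i.e. distance > 128."""
    matches = []
    for i, d in enumerate(descs1):
        dists = [hamming(d, e) for e in descs2]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            matches.append((i, j))
    return matches
```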
The speed advantage of this algorithm is quite obvious, but it has three fatal shortcomings. For scale invariance, an image pyramid can be constructed in scale space, as in the SIFT algorithm, to solve the problem.
Matched point pairs in the two images are thus obtained, and the X, Y coordinates corresponding to each pair are computed. Because the camera faces the trolley's direction of advance and is perpendicular to the trolley platform, the difference in the X coordinates of the matched pairs, multiplied by a certain coefficient, is transmitted to the controller to correct the trolley's heading, so that the image seen by the forward-facing camera coincides with the image recorded in the teaching phase.
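The heading correction described here could be sketched as a simple proportional rule. This is illustrative only; the coefficient value and function names are hypothetical, and the sign convention is an assumption:

```python
def heading_correction(points_live, points_taught, gain=0.005):
    """Angular-velocity correction from the mean horizontal (X) offset
    between live feature positions and their matched taught positions.
    `gain` is the 'certain coefficient' of the text; its value here is
    purely illustrative."""
    dx = [xl - xt for (xl, _), (xt, _) in zip(points_live, points_taught)]
    mean_dx = sum(dx) / len(dx)
    return -gain * mean_dx  # steer so as to reduce the offset
```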
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention and do not limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments, or make equivalent replacements of some or all of the technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope defined by the claims of the present invention.
Claims (6)
1. A mobile robot localization and navigation system based on visual teaching, characterized by comprising: an industrial personal computer, a camera, a power supply, an on-vehicle battery, a boost module, a basic motion control unit, and a handle;
the basic motion control unit comprises a differential-wheel trolley, an embedded board, photoelectric encoders, DC gear motors, and a TTL-to-USB module, and is used for trolley platform motion control, display of trolley status information, and reading and sending of encoder data; the differential-wheel trolley connects the power supply to the power terminal of the embedded board; the camera is fixed by screws on the upper platform of the differential-wheel trolley, with its view facing the trolley's direction of advance; a photoelectric encoder is mounted on each wheel of the differential-wheel trolley; a DC gear motor is mounted on each wheel of the differential-wheel trolley to control the wheel motor speed;
the industrial personal computer communicates with the camera through a USB interface to obtain the image information produced by the camera; the industrial personal computer is also connected with the boost module, whose power terminal connects to the on-vehicle battery and which raises the battery's 12 V output to the 19 V required by the industrial personal computer; the industrial personal computer connects to the embedded board through the TTL-to-USB module, sending control instructions to the embedded board and at the same time reading the wheel photoelectric-encoder data to obtain the trolley's odometry information;
the handle is connected with the industrial personal computer to provide enable control of the differential-wheel trolley's motion;
a basic motion control program, comprising a data transmit-receive program and a motor motion control program, is stored on the embedded board; the data transmit-receive program is written on the basis of the SCIP2.0 protocol and exchanges speed information and encoder data with the industrial personal computer; the motor motion control program decodes the speed information into PWM waves to control the speed of each DC gear motor;
stored on the industrial personal computer are an upper-layer teaching navigation program and a PID control program, written in C++ on the ROS (Robot Operating System) platform under the Linux operating system, for realizing the entire operation of the whole system; the upper-layer teaching navigation program comprises a teaching phase and a repeat phase; in the teaching phase, the robot is placed at an initial position and driven by an operator with the handle along a free path composed of forward, backward and spinning motions, while the image information near each path point, the vehicle speed information, and the path-point position coordinates obtained from the odometry are recorded, the image information near each path point being captured in real time by the camera; when finished, Ctrl-C is pressed in the industrial-personal-computer terminal to stop recording, and the recorded information is stored on the industrial personal computer; in the repeat phase, the stored speed commands are replayed: with the handle enable key held down, the robot drives automatically along the teaching-phase path; the robot matches the camera image read in real time against the image saved in the teaching phase using the ORB algorithm, computes the transformation from the read image to the recorded image, and sends this transformation to the PID control program for error compensation, which computes the heading angular-velocity deviation and the speed magnitude and adds them to the stored speed, thereby compensating the vehicle's steering and speed to ensure the predetermined path is tracked, finally realizing automatic repetition of the route taught in the teaching phase;
the basic motion control program receives the speed control instructions sent by the upper-layer teaching navigation program and, exploiting the characteristics of the differential-wheel trolley, realizes the robot's forward, backward and spinning motions under handle operation.
2. The mobile robot localization and navigation system based on visual teaching according to claim 1, characterized in that: in the teaching phase, the timestamps for storing the camera image information and the path points are kept consistent, and one image is recorded at every spacing interval, the spacing distance being user-defined according to the size of the scene area and the required positioning accuracy.
3. The mobile robot localization and navigation system based on visual teaching according to claim 1 or 2, characterized in that: the path points stored in the teaching phase are saved on the industrial personal computer in the form of a file and are reusable, and a new path can be obtained by repeating the teaching phase.
4. The mobile robot localization and navigation system based on visual teaching according to claim 3, characterized in that: the path points and camera image information taught in the teaching phase are loaded from the file into memory; it is judged whether all points have been processed, and if so, the teaching ends; otherwise the next reference path point and camera image information are obtained and the subsequent operations are executed.
5. The mobile robot localization and navigation system based on visual teaching according to claim 1, characterized in that: in the repeat phase, if the speed of the differential-wheel trolley stays close to zero for more than five seconds, or the trolley deviates from the reference path by 1 m or more, the repetition is stopped and an error or alarm is raised.
6. The mobile robot localization and navigation system based on visual teaching according to claim 1 or 5, characterized in that: the starting point of the repeat phase must lie within 1 m of the teaching-phase starting point, with a consistent pose orientation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810832511.8A CN108919810A (en) | 2018-07-26 | 2018-07-26 | The localization for Mobile Robot and navigation system of view-based access control model teaching |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810832511.8A CN108919810A (en) | 2018-07-26 | 2018-07-26 | The localization for Mobile Robot and navigation system of view-based access control model teaching |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108919810A true CN108919810A (en) | 2018-11-30 |
Family
ID=64418105
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810832511.8A Pending CN108919810A (en) | 2018-07-26 | 2018-07-26 | The localization for Mobile Robot and navigation system of view-based access control model teaching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108919810A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110058594A (en) * | 2019-04-28 | 2019-07-26 | 东北大学 | The localization for Mobile Robot navigation system and method for multisensor based on teaching |
CN110097586A (en) * | 2019-04-30 | 2019-08-06 | 青岛海信网络科技股份有限公司 | A kind of Face datection method for tracing and device |
CN111624990A (en) * | 2019-02-28 | 2020-09-04 | 富华科精密工业(深圳)有限公司 | Automatic navigation method, server and storage medium |
CN112008690A (en) * | 2019-05-30 | 2020-12-01 | 精工爱普生株式会社 | Robot system and portable teaching device |
CN112572269A (en) * | 2020-12-25 | 2021-03-30 | 广州华立科技职业学院 | Raw material distribution device with tracking function for construction site and implementation method thereof |
CN113946156A (en) * | 2021-12-20 | 2022-01-18 | 广州朗国电子科技股份有限公司 | Motion path teaching control method and control system of wheeled robot |
CN116117799A (en) * | 2022-12-19 | 2023-05-16 | 广东建石科技有限公司 | Machine vision tracking compensation method and device, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102087530A (en) * | 2010-12-07 | 2011-06-08 | 东南大学 | Vision navigation method of mobile robot based on hand-drawing map and path |
CN103226924A (en) * | 2013-04-12 | 2013-07-31 | 华南理工大学广州学院 | Tour guiding and explaining service robot system and tour guiding and explaining method thereof |
CN105116886A (en) * | 2015-08-11 | 2015-12-02 | 余路 | Robot autonomous walking method |
CN107305381A (en) * | 2016-04-21 | 2017-10-31 | 上海慧流云计算科技有限公司 | A kind of self-navigation robot and automatic navigation method |
- 2018
  - 2018-07-26 CN CN201810832511.8A patent/CN108919810A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102087530A (en) * | 2010-12-07 | 2011-06-08 | 东南大学 | Vision navigation method of mobile robot based on hand-drawing map and path |
CN103226924A (en) * | 2013-04-12 | 2013-07-31 | 华南理工大学广州学院 | Tour guiding and explaining service robot system and tour guiding and explaining method thereof |
CN105116886A (en) * | 2015-08-11 | 2015-12-02 | 余路 | Robot autonomous walking method |
CN107305381A (en) * | 2016-04-21 | 2017-10-31 | 上海慧流云计算科技有限公司 | A kind of self-navigation robot and automatic navigation method |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111624990A (en) * | 2019-02-28 | 2020-09-04 | 富华科精密工业(深圳)有限公司 | Automatic navigation method, server and storage medium |
CN110058594A (en) * | 2019-04-28 | 2019-07-26 | 东北大学 | The localization for Mobile Robot navigation system and method for multisensor based on teaching |
CN110097586A (en) * | 2019-04-30 | 2019-08-06 | 青岛海信网络科技股份有限公司 | A kind of Face datection method for tracing and device |
CN112008690A (en) * | 2019-05-30 | 2020-12-01 | 精工爱普生株式会社 | Robot system and portable teaching device |
CN112572269A (en) * | 2020-12-25 | 2021-03-30 | 广州华立科技职业学院 | Raw material distribution device with tracking function for construction site and implementation method thereof |
CN113946156A (en) * | 2021-12-20 | 2022-01-18 | 广州朗国电子科技股份有限公司 | Motion path teaching control method and control system of wheeled robot |
CN116117799A (en) * | 2022-12-19 | 2023-05-16 | 广东建石科技有限公司 | Machine vision tracking compensation method and device, electronic equipment and storage medium |
CN116117799B (en) * | 2022-12-19 | 2023-08-04 | 广东建石科技有限公司 | Machine vision tracking compensation method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108919810A (en) | The localization for Mobile Robot and navigation system of view-based access control model teaching | |
US9989964B2 (en) | System and method for controlling vehicle using neural network | |
Li et al. | Springrobot: A prototype autonomous vehicle and its algorithms for lane detection | |
CN108196535A (en) | Automated driving system based on enhancing study and Multi-sensor Fusion | |
Walter et al. | A situationally aware voice‐commandable robotic forklift working alongside people in unstructured outdoor environments | |
CN114127806A (en) | System and method for enhancing visual output from a robotic device | |
CN106203341B (en) | A kind of Lane detection method and device of unmanned vehicle | |
Carballo et al. | End-to-end autonomous mobile robot navigation with model-based system support | |
US12100224B1 (en) | Key point detection | |
CN110058594A (en) | The localization for Mobile Robot navigation system and method for multisensor based on teaching | |
CN111708010B (en) | Mobile equipment positioning method, device and system and mobile equipment | |
CN113589685B (en) | Vehicle moving robot control system and method based on deep neural network | |
Newman et al. | Self-driving cars: A platform for learning and research | |
Gajjar et al. | A comprehensive study on lane detecting autonomous car using computer vision | |
CN112462762A (en) | Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit | |
Thorpe et al. | The new generation system for the CMU Navlab | |
Lim et al. | Evolution of a reliable and extensible high-level control system for an autonomous car | |
Mutz et al. | Following the leader using a tracking system based on pre-trained deep neural networks | |
Ferrein et al. | Controlling a fleet of autonomous LHD vehicles in mining operation | |
CN116991104A (en) | Automatic driving device for unmanned vehicle | |
TW202303183A (en) | Adaptive mobile manipulation apparatus and method | |
Jahoda et al. | Autonomous car chasing | |
Bayramoglu et al. | Mobile robot navigation in a corridor using visual odometry | |
Natan et al. | DeepIPC: Deeply integrated perception and control for an autonomous vehicle in real environments | |
Sahal et al. | Lane Keeping System using Convolutional Neural Network for Autonomous Car |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20181130 |