CN107167141A - Robot autonomous navigation system based on double line laser radars - Google Patents
- Publication number
- CN107167141A (application CN201710450742.8A)
- Authority
- CN
- China
- Prior art keywords
- robot
- radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0219—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
Abstract
The present invention provides a novel robot autonomous navigation system based on dual single-line laser radars. Compared with existing mainstream indoor robot navigation systems, the invention uses only two low-cost single-line laser radars as its information source, yet achieves navigation and obstacle-avoidance performance substantially better than existing schemes of equal cost. The system comprises a dual-line-radar system, an upper-layer navigation system, a bottom-layer control algorithm, and a motor execution system. The upper-layer navigation system includes a SLAM algorithm and a coordinate transformation algorithm. The SLAM algorithm dynamically constructs a two-dimensional map of the current space from the raw laser radar data collected by the horizontally mounted radar, while simultaneously computing the displacement of the robot platform.
Description
Technical field
The present invention is an autonomous navigation scheme for indoor robots and belongs to the field of robotics.
Technical background
With the continuing development of computer and information-processing technology, robot technology built around these technologies has been widely applied across industries. This includes indoor mobile robots intended to replace human labour in tasks such as home service, patrol, and goods transport: work that is not highly complex but requires continuous movement through indoor space.
Navigation is one of the core technologies of a mobile robot. In a complex and changeable indoor environment, a stable and effective navigation scheme is the key to whether a robot can complete its work. Mobile robot platforms working in open outdoor space, where accuracy requirements are modest, can generally be positioned and navigated by GPS. For robots working indoors, however, civilian GPS is too unstable and imprecise to provide sufficiently accurate information for movement within small, enclosed spaces.
Most current low-end indoor robots detect obstacles with radar or infrared sensors and rely on passive avoidance when an obstacle is encountered; their trajectories are therefore largely random, and they cannot move to a pre-specified destination. Higher-end schemes mostly use a single-line laser radar or a stereo vision system for avoidance and navigation. Although both can achieve goal-directed movement and automatic obstacle avoidance by acquiring richer environmental information, each has limitations in practice. A scheme navigating with one single-line laser radar can plan paths and avoid obstacles from the two-dimensional outline of the environment, but it can only scan and model the single plane of that radar and is helpless against obstacles above or below its scanning plane. Stereo vision can acquire three-dimensional environmental information, but the limited viewing angle of the cameras means its coverage cannot span enough of the space, and the approach also suffers from large data volume, heavy noise interference, and short range.
Content of the invention
In order to overcome the shortcomings of the existing schemes described above, the present invention provides a novel robot autonomous navigation system based on dual single-line laser radars. Compared with existing mainstream indoor robot navigation systems, the invention uses only two low-cost single-line laser radars as its information source, yet achieves navigation and obstacle-avoidance performance substantially better than existing schemes of equal cost.
The technical solution adopted by the present invention is:
A robot autonomous navigation system based on dual single-line laser radars, characterised by comprising a dual-line-radar system, an upper-layer navigation system, a bottom-layer control algorithm, and a motor execution system. The upper-layer navigation system includes a SLAM algorithm and a coordinate transformation algorithm. The SLAM algorithm dynamically constructs a two-dimensional map of the current space from the raw laser radar data collected by the horizontally mounted radar, while simultaneously resolving the displacement of the robot platform. Using the robot displacement obtained by the SLAM algorithm, the coordinate transformation algorithm transforms and superposes the raw laser radar data acquired at each moment by the dynamically scanning radar, so that the collected environmental information yields the three-dimensional information of obstacles in the space around the robot. Global path planning is performed on the constructed two-dimensional map to plan a feasible path between the current position and the destination; at the same time the real-time speed of the robot is collected. Local path planning is then carried out from the collected velocity information, the global path, and the three-dimensional data, producing control instructions and thereby realising upper-layer navigation. The bottom-layer control algorithm parses and coordinate-transforms the control instructions generated by the upper-layer navigation system and drives the motor execution system under PID control, so as to move the robot.
Compared with existing systems, the beneficial effects of this system are:
1) The system can be flexibly applied to many indoor scenes and is easy to deploy.
2) The two single-line laser radars cooperate to obtain three-dimensional information, acquiring sufficient environmental information from a comparatively small data volume; this improves data utilisation and reduces data-processing cost.
3) Navigation and obstacle avoidance based on three-dimensional spatial information effectively improve the robot's autonomy and avoidance performance, increasing the practicality and reliability of the system.
Brief description of the drawings
Fig. 1 is a schematic diagram of the dual-line laser radar arrangement.
Fig. 2 is a schematic diagram of the radar scanning range.
Fig. 3 is a schematic diagram of the robot system architecture.
Fig. 4 is a schematic diagram of the robot navigation system.
Fig. 5 is a schematic diagram of the bottom-layer control unit structure.
Fig. 6 shows the robot chassis velocities and the three wheel velocities.
Embodiment
The technical solution of the present invention is described further below in conjunction with the drawings and embodiments.
Part I: principles and methods of the technical scheme
1. A dual-line-radar system is constructed from two single-line laser radars and peripheral control circuits to collect environmental data. As shown in Fig. 1, the two laser radars are mounted differently. One is fixed horizontally and obtains the planar outline of the space the robot occupies. The other is scanned up and down by servo 1 under the control of a radar controller that supplies its power, and the radar controller sends the angle between the current radar scanning plane and the horizontal plane to the data processing system.
2. The robot hardware platform is built around an NVIDIA Jetson TX1 embedded platform as the core controller. By function, the platform divides into four major parts: a data acquisition unit, a data processing unit, a motor execution system, and accessories. The data acquisition unit is the data source of the whole system and collects environmental information with the dual-line laser radar system designed specifically for this project. The collected data are sent to the data processing unit, the core of the whole system, which uses one NVIDIA Jetson TX1 embedded platform as the carrier for the robot operating system and the various upper-layer algorithms. The motor execution system comprises the robot's mechanical platform, bottom-layer controller, and drive circuits. In addition, the system includes accessories such as a router and a power supply.
3. The robot upper-layer navigation system. Using a SLAM algorithm (prior art), the navigation system dynamically constructs a two-dimensional map of the current space from the raw laser radar data collected by the horizontally mounted radar, and at the same time resolves the displacement of the robot platform through feature matching during map building (a part of the SLAM algorithm that already belongs to the prior art).
The coordinate transformation algorithm in the navigation system uses the robot displacement obtained by the SLAM algorithm to transform and superpose the raw laser radar data acquired at each moment by the dynamically scanning radar, so that the collected environmental information finally yields the three-dimensional information of obstacles in the space around the robot.
Global path planning is carried out on the constructed two-dimensional map using the ROS navigation package (prior art, built into the ROS robot operating system running on Ubuntu) to plan a feasible path between the current position and the destination. At the same time the real-time speed of the robot is collected by the odometry processing unit; the ROS navigation package then performs local path planning from the collected velocity information, the global path, and the three-dimensional data generated by the coordinate transformation, and produces control instructions, thereby realising the robot upper-layer navigation system.
4. The bottom-layer control algorithm parses and coordinate-transforms the control instructions generated by the upper-layer navigation system and regulates the motor speeds with PID control, so as to drive the robot. The structure of the bottom-layer control unit is shown in Fig. 5.
Part II: further illustration with embodiments
1. The dual-line laser radar system
The present invention collects data with two single-line laser radars. Each radar can scan 360 degrees in its plane, obtaining the obstacle distances over the full 360-degree range within one cycle.
One radar (radar 2) is mounted 10 cm above the bottom of the robot with its scanning plane horizontal; while the robot runs it obtains the basic outline of the indoor environment, which is used for subsequent global map building. Constrained by the robot's structure, and to protect the radar from collision damage while the robot is running, radar 2 is embedded entirely inside the robot, with a space of about 150 degrees reserved at the front for scanning the external environment; data in the remaining angular range are discarded. Testing shows that this mounting fully meets the actual data collection requirements.
The other laser radar (radar 1) is mounted on the scanning radar controller at the top of the robot. The scanning radar controller consists of a mechanical device and a control circuit. The mechanical device comprises a bracket that fixes the whole device, a base on which radar 1 is installed, a connecting rod for transmission, and servo 1 that provides power; the whole device has a quadrilateral structure. While the acquisition system runs, servo 1 moves up and down under the control of the control circuit (i.e. the radar controller), driving radar 1 to scan the space about the base shaft.
The scanning range of the top-mounted radar 1 is shown in Fig. 2.
Because radar 1 sits at the top of the robot, its scanning range is mainly distributed below the plane of the robot's top. As shown in Fig. 2, to let the robot pass obstacles smoothly a safety distance is reserved around it. From the geometry, the scanning range θ1 of radar 1 below the horizontal plane is obtained from formula (1)

θ1 = arctan(H / δ)   (1)

where δ is the safety distance and H is the robot height. With a safety distance of 20 cm and a robot height of 120 cm, formula (1) gives a scanning angle of about 80.54° below the horizontal plane. A safety margin is also reserved above the robot, with the scanning range above the horizontal plane set to 10°, so radar 1 scans a space of roughly 90° in total about the horizontal plane.
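The geometry above reduces to a single arctangent. A minimal sketch (Python is used only for illustration; the 120 cm height and 20 cm safety distance are the worked numbers from the text):

```python
import math

def scan_angle_below_horizon(height_cm: float, safety_cm: float) -> float:
    """Angle (degrees) below the horizontal that the top-mounted radar must
    cover so an obstacle at the safety distance is still seen at floor level:
    theta_1 = arctan(H / delta), per formula (1)."""
    return math.degrees(math.atan2(height_cm, safety_cm))

# H = 120 cm robot height, delta = 20 cm safety distance
theta1 = scan_angle_below_horizon(120.0, 20.0)   # about 80.54 degrees
```

This reproduces the 80.54° figure quoted in the text, confirming the reconstructed formula (1).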
The dual-line radar system comprises a radar controller and a power module. The radar controller drives the angle of servo 1 by PWM, and radar 1 scans over the 90° range with a fixed step size under its control; at the same time the radar controller sends the angle between the current radar plane and the horizontal plane to the data processing system for use in the coordinate transformation. The power module outputs two voltages, 5 V and 6 V: the 5 V rail supplies the radar controller's normal operating voltage, and the 6 V output powers servo 1.
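As an illustrative sketch of the stepped sweep, the servo steps the scan plane through the 90° range and back, and the controller tags each scan with its angle. The 2° step size below is an assumption for illustration; the patent states only that a fixed step size over the 90° range is used:

```python
def sweep_angles(span_deg: float = 90.0, step_deg: float = 2.0):
    """Scan-plane angles for one up-and-down sweep of the tilting radar:
    step up through the span, then back down (endpoints not repeated)."""
    n = int(span_deg / step_deg)
    up = [i * step_deg for i in range(n + 1)]   # 0, 2, ..., 90
    return up + up[-2::-1]                      # then 88, 86, ..., 0

angles = sweep_angles()
```

Each angle in the sweep would accompany one radar scan sent to the data processing system.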
2. The robot hardware platform
The robot hardware platform of the present invention is shown in Fig. 3.
By function it divides into data acquisition, data processing, motor execution, and accessories. Data acquisition is realised by the dual-line laser radar system described above. Data processing uses an NVIDIA Jetson TX1 embedded platform running the Ubuntu operating system and the ROS robot operating system, which host the laser radar data processing and the robot autonomous navigation algorithms. Motor execution consists mainly of a drive circuit with an Arduino controller, the motors and servo, and the robot chassis. The chassis is the open-source three-wheel-drive robot platform HCR; its power comes from three 12 V DC gear motors, each independently driving one wheel, and the robot's movement is realised by combining the different speeds of the three wheels. The conversion from upper-layer control commands to actual motor speeds is completed by the control algorithm running on the Arduino. The peripheral auxiliary circuits include the battery, voltage-stabilising module, and router: the power supply and voltage-stabilising module provide the appropriate supplies for the system, and the router supports communication between the host computer and the robot.
3. The robot navigation system
3.1 The coordinate transformation method
The data collected by the radar scanning the space are distance measurements in a coordinate system fixed to the radar, and there is relative motion between the radar and the robot platform; obtaining the three-dimensional obstacle information in the robot coordinate system from the radar data therefore requires a coordinate transformation.
The raw data collected by the radar are expressed in cylindrical coordinates as (ρ, φ, z) and converted to rectangular coordinates according to formula (2)

x = ρ cos φ,  y = ρ sin φ,  z = z   (2)

Set up a coordinate system with the robot platform's direction of advance as the positive X axis, vertically upward as the positive Z axis, and the centre of the robot chassis as the origin. The radar data in the robot coordinate system are then calculated by the following formula

[x' y' z']^T = R(θ)·[x y z]^T + [x0 y0 z0]^T

where R(θ) is the rotation by the tilt angle θ about the tilt axis of the scanning mechanism (the Y axis),

R(θ) = |  cos θ   0   sin θ |
       |    0     1     0   |
       | -sin θ   0   cos θ |

[x y z]^T is the radar data transformed to rectangular coordinates, [x' y' z']^T is the radar data transformed to the robot coordinate system, θ is the angle between the current radar scanning plane and the horizontal plane sent back by the radar controller, and [x0 y0 z0]^T is the coordinate of the radar in the robot coordinate system.
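The two-stage conversion, cylindrical coordinates to rectangular coordinates by formula (2) and then into the robot frame, can be sketched as follows. Python is used only for illustration, and modelling the tilt as a rotation about the robot's Y axis is an assumption about the gimbal geometry:

```python
import math

def cylindrical_to_cartesian(rho, phi, z):
    """Formula (2): convert a raw radar point (rho, phi, z) to Cartesian."""
    return rho * math.cos(phi), rho * math.sin(phi), z

def radar_to_robot(point, tilt, offset):
    """Rotate a radar-frame point by the scan-plane tilt angle (modelled as a
    pitch about the Y axis) and translate by the radar mounting offset
    [x0, y0, z0] in the robot frame."""
    x, y, z = point
    c, s = math.cos(tilt), math.sin(tilt)
    xr = c * x + s * z          # pitch rotation about the Y axis
    yr = y
    zr = -s * x + c * z
    x0, y0, z0 = offset
    return xr + x0, yr + y0, zr + z0

# A point 2 m straight ahead in the radar plane, scan plane tilted 30 degrees
# below the horizontal, radar mounted 1.2 m above the chassis origin.
p = cylindrical_to_cartesian(2.0, 0.0, 0.0)
q = radar_to_robot(p, math.radians(30.0), (0.0, 0.0, 1.2))
```

With these assumed numbers the point lands about 1.73 m ahead of and 0.2 m above the chassis origin, as the geometry suggests.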
Considering that the robot keeps moving while scanning, there is relative motion between the robot coordinate system and the space coordinate system. To obtain more accurate spatial information, the radar data in the robot coordinate system must be further converted to the space coordinate system, which requires the robot's position in the space coordinate system. The system takes the fixed, horizontally mounted laser radar as the data source for this position information; unlike the scanning radar, it collects environmental information in a single horizontal plane.
The data processing algorithm first applies a basic coordinate translation to convert these data to the space coordinate system; the SLAM algorithm then performs feature matching on the data the robot collects at different times, and from that computes the robot's displacement in space. The displacements are accumulated, the accumulation is corrected with the robot's translational speed measured by the encoders, and the robot's position in the space coordinate system is obtained from its initial position.
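The accumulation-plus-correction step might look like the following sketch. The complementary blend factor is an illustrative assumption; the patent states only that the encoder-measured speed is used to correct the accumulated SLAM displacement:

```python
def fuse_position(slam_displacements, encoder_speeds, dt, blend=0.8):
    """Accumulate per-scan SLAM displacement estimates along one axis,
    blending each with the distance implied by the encoder speed over the
    same interval (blend weight is an assumed tuning parameter)."""
    x = 0.0
    for d_slam, v_enc in zip(slam_displacements, encoder_speeds):
        d_enc = v_enc * dt
        x += blend * d_slam + (1.0 - blend) * d_enc
    return x

# Three scan intervals of 0.1 s; SLAM and encoders roughly agree.
pos = fuse_position([0.05, 0.05, 0.04], [0.5, 0.5, 0.5], dt=0.1)
```

Adding the result to the robot's initial position gives its position in the space coordinate system.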
Once the robot's position in the space coordinate system is known, translating the radar data from the robot coordinate system yields the spatial information in the space coordinate system. After the robot completes a scan over the 90° range, all scanned points are superposed after the above conversion, finally giving the three-dimensional information of the space near the robot.
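The superposition of all scan points into one space-frame cloud can be sketched as follows, assuming a planar robot pose (x, y, heading) from the SLAM algorithm (Python for illustration):

```python
import math

def robot_to_world(point, pose):
    """Place a robot-frame point into the fixed space frame given the robot
    pose (x, y, heading) recovered by the SLAM algorithm."""
    px, py, pz = point
    rx, ry, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * px - s * py + rx, s * px + c * py + ry, pz)

def accumulate_scans(scans_with_poses):
    """Superpose every scan taken during one 90-degree sweep into a single
    3-D point cloud in the space frame."""
    cloud = []
    for points, pose in scans_with_poses:
        cloud.extend(robot_to_world(p, pose) for p in points)
    return cloud

# Two toy scans of one point each, robot moved 0.5 m forward in between.
cloud = accumulate_scans([
    ([(1.0, 0.0, 0.2)], (0.0, 0.0, 0.0)),
    ([(1.0, 0.0, 0.2)], (0.5, 0.0, 0.0)),
])
```

The same obstacle seen from two poses produces two world-frame points 0.5 m apart, which is exactly the superposition the text describes.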
3.2 The SLAM algorithm
Based on the practical application and performance requirements, the present invention adopts the Hector SLAM algorithm (prior art) for map building. The algorithm uses an occupancy grid map (Occupancy Grid Map) as the map model and computes the positional relationship between data points with a feature-matching method based on bilinear interpolation and the Gauss-Newton method, thereby constructing the two-dimensional environment map and obtaining the robot displacement.
3.3 The upper-layer navigation system
Robot navigation divides into three processes: environmental modelling, path planning, and robot control. Environmental modelling includes the coordinate transformation and map building described above, and ultimately produces the required two-dimensional environment map and a three-dimensional point cloud containing obstacle information. Path planning is the decision-making part of the whole system: from the environment map, the obstacle information, and the robot's motion state it performs multi-level path planning and motion simulation, generates the most reasonable desired motion state for the current situation, and encapsulates it into control instructions sent to the bottom-layer controller. Robot control centres on the bottom-layer control algorithm, which drives the robot platform according to the control instructions. The complete robot navigation system is shown in Fig. 4.
3.4 The bottom-layer control algorithm
The bottom-layer control algorithm is developed on the Arduino platform and mainly comprises instruction parsing, coordinate transformation, PID control, and data conversion; its structure is shown in Fig. 5.
On receiving a control instruction from the upper-layer navigation system over the serial port, the bottom-layer control algorithm first parses it. The instruction contains the robot's desired translational velocity and rotational angular velocity. Because the design uses a three-wheel-drive mechanical platform while the control instruction gives the velocity of the robot platform as a whole, the robot's desired velocity must be converted into the desired speeds of the three wheels.
Set up a coordinate system with the centre of the robot chassis as the origin, the robot's direction of advance as the positive x axis, and vertically upward as the positive z axis. Let the vector formed by the robot's desired velocities along the x and y axes and its rotational speed about the z axis be [vx vy ω]^T, and let the vector of the three wheel speeds be [v1 v2 v3]^T. Writing the transformation relation between them as [v1 v2 v3]^T = A·[vx vy ω]^T, and noting that [vx vy ω]^T is given by the control instruction, the velocity vector of the three wheels can be solved once the parameter matrix A is determined.
The robot chassis velocities and the three wheel velocities are shown in Fig. 6.
From the figure, each element of the parameter matrix can be derived using the principles of dynamics. Substituting the parameter matrix finally obtained in this design (corrected for actual conditions) into the formula gives the relation between the three wheel speeds and the robot's velocity. Once the desired speeds of the three wheels are obtained, the bottom-layer control unit controls each wheel separately with the PID algorithm, finally realising the motion control of the robot.
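A sketch of the wheel-speed conversion for a symmetric three-omni-wheel chassis follows. The 120° wheel placement and the wheel-centre radius are illustrative assumptions using the standard omni-wheel kinematic model; the patent's own parameter matrix, corrected for the real chassis, is not reproduced here:

```python
import math

# Assumed wheel orientations, 120 degrees apart, and chassis radius.
WHEEL_ANGLES = (math.pi / 2,
                math.pi / 2 + 2 * math.pi / 3,
                math.pi / 2 + 4 * math.pi / 3)
R = 0.15  # distance from chassis centre to each wheel, metres (assumed)

def inverse_kinematics(vx, vy, omega):
    """Chassis velocity [vx, vy, omega] to the three wheel rim speeds:
    v_i = -sin(a_i) * vx + cos(a_i) * vy + R * omega."""
    return [-math.sin(a) * vx + math.cos(a) * vy + R * omega
            for a in WHEEL_ANGLES]

def forward_kinematics(v1, v2, v3):
    """Recover the chassis velocity from measured wheel speeds, the reverse
    conversion the bottom controller applies to encoder data."""
    ws = (v1, v2, v3)
    vx = (2.0 / 3.0) * sum(-math.sin(a) * v for a, v in zip(WHEEL_ANGLES, ws))
    vy = (2.0 / 3.0) * sum(math.cos(a) * v for a, v in zip(WHEEL_ANGLES, ws))
    omega = sum(ws) / (3.0 * R)
    return vx, vy, omega

# Round trip: a commanded velocity survives inverse then forward conversion.
wheels = inverse_kinematics(0.3, 0.1, 0.5)
recovered = forward_kinematics(*wheels)
```

The forward formulas are the closed-form pseudo-inverse of the symmetric wheel matrix, which is why the round trip is exact under these assumptions.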
Besides controlling the chassis motors, the bottom-layer control unit also collects and converts the encoder data. The motor encoders used in the present invention output AB phases with a 90-degree phase difference between them: counting the rising edges of either phase gives the current motor speed, and the phase relation between the two phases determines the motor's direction of rotation. In the present design the encoder's A-phase pulses are counted through a GPIO interrupt of the bottom-layer controller, and the phase relation is determined from the level of phase B at each rising edge of phase A: if B is low when the A rising edge arrives, the motor is rotating forward; otherwise it is rotating in reverse.
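The rising-edge counting and direction test can be sketched as follows. Python stands in for the Arduino interrupt handler, and the input is assumed to be a sequence of sampled (A, B) logic levels:

```python
def decode_quadrature(samples):
    """Count encoder ticks from AB quadrature samples, mirroring the GPIO
    interrupt logic: on each rising edge of phase A, the level of phase B
    decides the direction (B low -> forward +1, B high -> reverse -1)."""
    count = 0
    prev_a = None
    for a, b in samples:
        if prev_a == 0 and a == 1:      # rising edge on phase A
            count += 1 if b == 0 else -1
        prev_a = a
    return count

# Forward rotation: B lags A by 90 degrees, so B is low at each A rising edge.
ticks = decode_quadrature([(0, 0), (1, 0), (1, 1), (0, 1)] * 3)
# Reverse rotation: B leads A, so B is high at each A rising edge.
reverse_ticks = decode_quadrature([(0, 1), (1, 1), (1, 0), (0, 0)] * 3)
```

Dividing the tick count by the encoder resolution and the sampling interval then gives the motor speed used in the velocity conversion below.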
After the motor speeds and directions are obtained from the encoders, the speeds of the three motors must be converted into the robot's velocity; this conversion is the reverse of the conversion from the robot's desired velocity to the motors' desired speeds. Once the conversion is complete, the bottom-layer control unit publishes the result on the /velocity topic through the ROS communication mechanism, and the upper-layer algorithms subscribe to that topic to obtain the robot's real-time velocity.
Innovative points of the present invention
1) A new robot autonomous navigation system based on laser radar positioning has been invented; the cooperation of two single-line laser radars realises indoor robot autonomous navigation that is more reliable and stable than traditional schemes.
2) The cooperation of the dual radars is realised with a mechanical device and an embedded controller.
3) The invention is highly portable and can be flexibly applied to many different indoor scenes.
Claims (8)
1. A robot autonomous navigation system based on dual single-line laser radars, characterised by comprising a dual-line-radar system, an upper-layer navigation system, a bottom-layer control algorithm, and a motor execution system;
the upper-layer navigation system includes a SLAM algorithm and a coordinate transformation algorithm;
the SLAM algorithm dynamically constructs a two-dimensional map of the current space from the raw laser radar data collected by the horizontally mounted radar, while simultaneously resolving the displacement of the robot platform;
the coordinate transformation algorithm uses the robot displacement obtained by the SLAM algorithm to transform and superpose the raw laser radar data acquired at each moment by the dynamically scanning radar, so that the collected environmental information yields the three-dimensional information of obstacles in the space around the robot;
global path planning is performed on the constructed two-dimensional map, planning a feasible path between the current position and the destination; at the same time the real-time speed of the robot is collected; local path planning is then carried out from the collected velocity information, the global path, and the three-dimensional data, producing control instructions and thereby realising robot upper-layer navigation;
the bottom-layer control algorithm parses and coordinate-transforms the control instructions generated by the upper-layer navigation system and controls the motor execution system with PID control, so as to drive the robot.
2. the system as claimed in claim 1, it is characterised in that double wire laser radar systems:Wherein one radar (thunder
Up at 2) mounting robot distance from bottom ground 10cm, the plane of scanning motion horizontal positioned of radar 2, where can obtaining in the process of running
The elementary contour information of indoor environment, the follow-up global map of user is built;Another laser radar (radar 1) is then arranged on machine
On scanning radar controller at the top of people.
3. system as claimed in claim 2, it is characterised in that limited by robot architecture, to protect radar not in robot
Damaged in running because being collided, the overall embedded robot interior of radar 2, and reserve in front end about 150 degree of space
For scanning external environment condition, data are dropped in remaining angular range.
4. The system as claimed in claim 2, characterised in that the scanning radar controller consists of a mechanical device and a control circuit; the mechanical device comprises a bracket that fixes the whole device, a base on which radar 1 is installed, a connecting rod for transmission, and servo 1 that provides power, the whole device having a quadrilateral structure; when the acquisition system runs, servo 1 moves up and down under the control of the control circuit (i.e. the radar controller), driving radar 1 to scan the space about the base shaft.
5. The system as claimed in claim 2, characterised in that, because radar 1 is located at the top of the robot, its scanning range is mainly distributed below the plane of the robot's top. As shown in Fig. 2, to enable the robot to bypass obstacles smoothly, a certain safety distance is reserved around it; from the geometric relationship, the scanning range θ1 of radar 1 below the horizontal plane can be obtained from formula (1), where δ is the safety distance and H is the robot height.
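Formula (1) itself is not reproduced in this text. The sketch below assumes one plausible geometric reading — the top-mounted radar at height H must see down to the ground no closer than the safety margin δ, giving θ1 = arctan(H/δ) — so the relation and all names here are illustrative assumptions, not the patent's actual formula:

```python
import math

def downward_scan_angle(H, delta):
    """Assumed reading of formula (1): angle (radians) below the
    horizontal that the top radar must cover to see the ground at
    horizontal offset delta from a mounting height H."""
    return math.atan2(H, delta)

# With H = delta the assumed relation gives exactly 45 degrees.
print(math.degrees(downward_scan_angle(H=0.5, delta=0.5)))  # 45.0
```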
6. The system as claimed in claim 2, characterised in that the coordinate transformation method is as follows:
The data collected by the radar scanning the space are range information based on a coordinate system fixed on the radar, and there is relative motion between the radar and the robot platform; obtaining three-dimensional obstacle information in the robot coordinate system from the radar data therefore requires a coordinate transformation of the data.
The raw data collected by the radar are expressed in the cylindrical coordinate system as (ρ, φ, z) and converted to rectangular coordinates according to formula (2):
$$
\begin{cases}
x = \rho\cos\varphi \\
y = \rho\sin\varphi \\
z = z
\end{cases}
\qquad (2)
$$
Taking the robot platform's direction of advance as the positive X-axis, vertically upward as the positive Z-axis, and the centre of the robot chassis as the origin, a coordinate system is established; the radar data in the robot coordinate system can then be calculated by the following formula, where [x y z]^T is the radar data transformed into the rectangular coordinate system, [x' y' z']^T is the radar data transformed into the robot coordinate system, the tilt angle reported by the radar controller is the angle between the current radar scanning plane and the horizontal plane, and [x0 y0 z0]^T is the coordinate of the radar in the robot coordinate system;
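A minimal sketch of formula (2) followed by the robot-frame transform just described. The patent's formula for the robot frame is not reproduced in this text, so the rotation convention (tilting the scan plane about the radar's Y-axis) is an illustrative assumption; only the cylindrical-to-rectangular step and the offset [x0, y0, z0] come from the text:

```python
import math

def radar_to_robot(rho, phi, z, tilt, offset):
    """Convert one cylindrical radar return (rho, phi, z) into the
    robot frame: rectangular conversion per formula (2), then an
    assumed tilt rotation of the scan plane about the radar Y-axis,
    then translation by the radar's mounting offset [x0, y0, z0]."""
    # Formula (2): cylindrical -> rectangular, still in the radar frame
    x = rho * math.cos(phi)
    y = rho * math.sin(phi)
    # Assumed tilt rotation about the Y-axis by the reported angle
    xr = x * math.cos(tilt) + z * math.sin(tilt)
    zr = -x * math.sin(tilt) + z * math.cos(tilt)
    # Translation by the radar's position in the robot frame
    x0, y0, z0 = offset
    return (xr + x0, y + y0, zr + z0)

# A point 2 m straight ahead of an untilted radar mounted 0.5 m up:
print(radar_to_robot(2.0, 0.0, 0.0, 0.0, (0.0, 0.0, 0.5)))  # (2.0, 0.0, 0.5)
```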
Considering that the robot keeps moving while scanning, there is relative motion between the robot coordinate system and the space coordinate system; to obtain more accurate spatial information, the radar data in the robot coordinate system must be further converted into the space coordinate system, which requires the robot's position in the space coordinate system. The system uses the fixedly mounted laser radar 1 as the data source for this position information; the data gathered by that radar lie in a single horizontal plane.
A basic coordinate translation is first applied to these data to convert them into the space coordinate system; a SLAM algorithm then performs feature matching on data the robot collected at different times, and from this calculates the robot's displacement in space. The displacement information is accumulated, the accumulated result is corrected with the robot's moving speed measured by the encoders, and the position in the space coordinate system is then obtained from the robot's initial position.
Having obtained the robot's position in the space coordinate system, the radar data in the robot coordinate system are translated to yield the spatial information in the space coordinate system. After the robot completes a scan over the 90° range, all scanned points are subjected to the above conversion and superimposed, finally giving the three-dimensional information of the space near the robot's current position.
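The dead-reckoning step above (accumulate scan-matching displacements, correct with the encoder speed) can be sketched as follows. The patent does not specify how the encoder correction is applied, so the blending weight and all names here are illustrative assumptions:

```python
def accumulate_pose(initial_xy, slam_steps, encoder_steps, alpha=0.8):
    """Accumulate per-interval (dx, dy) displacements from scan
    matching, correcting each with the displacement implied by the
    encoder speed. alpha weights the SLAM estimate against the
    encoder estimate; this blending scheme is an assumption, not the
    patent's stated method."""
    x, y = initial_xy
    for (sx, sy), (ex, ey) in zip(slam_steps, encoder_steps):
        x += alpha * sx + (1 - alpha) * ex
        y += alpha * sy + (1 - alpha) * ey
    return (x, y)

# Two intervals where SLAM and the encoders agree exactly; the result
# is approximately (0.2, 0.0) regardless of alpha.
print(accumulate_pose((0.0, 0.0), [(0.1, 0.0), (0.1, 0.0)],
                      [(0.1, 0.0), (0.1, 0.0)]))
```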
7. The system as claimed in claim 2, characterised in that the path planning is the decision-making part of the whole system; the process plans a path according to the environment map, the obstacle information, and the robot's motion state, encapsulates the result into control instructions, and sends them to the bottom-layer controller.
8. The system as claimed in claim 2, characterised in that the bottom-layer control algorithm operates as follows:
The control instruction generated by the upper-layer navigation system is first parsed after being received over the serial port. The control instruction contains the robot's desired translational speed and rotational angular velocity. The design uses a three-wheel-drive mechanical platform, while the control instruction gives the velocity of the robot platform as a whole, so the robot's desired velocity must be converted into the desired speeds of the three wheels.
Taking the centre of the robot chassis as the origin, the robot's direction of advance as the positive x-axis, and vertically upward as the positive z-axis, a coordinate system is established. Let the vector formed by the robot's desired speeds along the x- and y-axes and its rotation rate about the z-axis be [vx vy ω]^T, and let the vector formed by the speeds of the three wheels be [v1 v2 v3]^T. Suppose the transformation relation between them is
$$
\begin{bmatrix} v_x \\ v_y \\ \omega \end{bmatrix}
=
\begin{bmatrix}
K_{11} & K_{12} & K_{13} \\
K_{21} & K_{22} & K_{23} \\
K_{31} & K_{32} & K_{33}
\end{bmatrix}
\begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}
$$
The above formula can be rewritten as
$$
\begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}
=
\begin{bmatrix}
K_{11} & K_{12} & K_{13} \\
K_{21} & K_{22} & K_{23} \\
K_{31} & K_{32} & K_{33}
\end{bmatrix}^{-1}
\begin{bmatrix} v_x \\ v_y \\ \omega \end{bmatrix}
$$
In the formula, [vx vy ω]^T is given by the control instruction, so only the parameter matrix needs to be determined to solve for the wheel-speed vector.
The parameter matrix finally obtained (after correction for actual conditions) is
Substituting it into the formula gives the relation between the three wheel speeds and the robot's translational speed. After obtaining the desired speeds of the three wheels, the bottom-layer control unit controls each of the three wheels with a PID algorithm, finally realizing motion control of the robot.
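The inversion above can be sketched with NumPy. The patent's corrected parameter matrix is not reproduced in this text, so the sketch uses the textbook kinematic matrix for three omni-wheels spaced 120° apart at chassis radius L — an illustrative assumption, not the patent's matrix:

```python
import numpy as np

# Wheel mounting angles (120 degrees apart) and the chassis radius L
# are illustrative assumptions; the patent's own corrected parameter
# matrix is not reproduced in its text.
L = 0.15  # wheel distance from chassis centre, metres (assumed)
theta = np.radians([0.0, 120.0, 240.0])

# Forward kinematics M: wheel speeds = M @ [vx, vy, omega]
M = np.column_stack([-np.sin(theta), np.cos(theta), np.full(3, L)])

# The patent writes [vx vy w]^T = K [v1 v2 v3]^T, i.e. K = M^{-1} ...
K = np.linalg.inv(M)

# ... and recovers the wheel speeds as K^{-1} @ [vx, vy, omega]:
def wheel_speeds(vx, vy, omega):
    return np.linalg.inv(K) @ np.array([vx, vy, omega])

# Pure rotation: every wheel turns at the same rim speed L * omega,
# approximately 0.15 for omega = 1.
print(wheel_speeds(0.0, 0.0, 1.0))
```

Each resulting wheel speed would then be the setpoint of that wheel's PID loop in the bottom-layer controller.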
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710450742.8A CN107167141B (en) | 2017-06-15 | 2017-06-15 | Robot autonomous navigation system based on double laser radars |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107167141A true CN107167141A (en) | 2017-09-15 |
CN107167141B CN107167141B (en) | 2020-08-14 |
Family
ID=59818567
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710450742.8A Active CN107167141B (en) | 2017-06-15 | 2017-06-15 | Robot autonomous navigation system based on double laser radars |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107167141B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101975951A (en) * | 2010-06-09 | 2011-02-16 | 北京理工大学 | Field environment barrier detection method fusing distance and image information |
US9002511B1 (en) * | 2005-10-21 | 2015-04-07 | Irobot Corporation | Methods and systems for obstacle detection using structured light |
CN105074600A (en) * | 2013-02-27 | 2015-11-18 | 夏普株式会社 | Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method |
CN106094836A (en) * | 2016-08-08 | 2016-11-09 | 成都希德电子信息技术有限公司 | A kind of microrobot control system based on two-dimensional laser radar and method |
CN106227218A (en) * | 2016-09-27 | 2016-12-14 | 深圳乐行天下科技有限公司 | The navigation barrier-avoiding method of a kind of Intelligent mobile equipment and device |
CN106325275A (en) * | 2016-09-14 | 2017-01-11 | 广州今甲智能科技有限公司 | Robot navigation system, robot navigation method and robot navigation device |
CN106406338A (en) * | 2016-04-14 | 2017-02-15 | 中山大学 | Omnidirectional mobile robot autonomous navigation apparatus and method based on laser range finder |
CN206105865U (en) * | 2016-08-31 | 2017-04-19 | 路琨 | Barrier system that keeps away in robot |
CN106708058A (en) * | 2017-02-16 | 2017-05-24 | 上海大学 | Robot object conveying method and control system based on ROS (Robot Operating System) |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107576950B (en) * | 2017-09-28 | 2020-10-16 | 西安电子科技大学 | Optimization processing method for pulse compression radar echo signals |
CN107576950A (en) * | 2017-09-28 | 2018-01-12 | 西安电子科技大学 | A kind of optimized treatment method of pulse compression radar echo-signal |
US11275178B2 (en) | 2017-11-28 | 2022-03-15 | Shenzhen 3Irobotix Co., Ltd. | Method and device for drawing region outline and computer readable storage medium |
CN108008409A (en) * | 2017-11-28 | 2018-05-08 | 深圳市杉川机器人有限公司 | Region contour method for drafting and device |
CN108008409B (en) * | 2017-11-28 | 2019-12-10 | 深圳市杉川机器人有限公司 | Region contour drawing method and device |
WO2019104866A1 (en) * | 2017-11-28 | 2019-06-06 | 深圳市杉川机器人有限公司 | Method and device for drawing region outline and computer readable storage medium |
CN108170144A (en) * | 2017-12-27 | 2018-06-15 | 北斗七星(重庆)物联网技术有限公司 | A kind of control system and security robot applied to security robot |
CN108170145A (en) * | 2017-12-28 | 2018-06-15 | 浙江捷尚人工智能研究发展有限公司 | Robot obstacle-avoiding system and its application process based on laser radar |
WO2019127445A1 (en) * | 2017-12-29 | 2019-07-04 | 深圳前海达闼云端智能科技有限公司 | Three-dimensional mapping method, apparatus and system, cloud platform, electronic device, and computer program product |
CN108401461A (en) * | 2017-12-29 | 2018-08-14 | 深圳前海达闼云端智能科技有限公司 | Three-dimensional mapping method, device and system, cloud platform, electronic equipment and computer program product |
CN108375373A (en) * | 2018-01-30 | 2018-08-07 | 深圳市同川科技有限公司 | Robot and its air navigation aid, navigation device |
CN110411435B (en) * | 2018-04-26 | 2021-06-29 | 北京京东尚科信息技术有限公司 | Robot positioning method and device and robot |
CN110411435A (en) * | 2018-04-26 | 2019-11-05 | 北京京东尚科信息技术有限公司 | Robot localization method, apparatus and robot |
CN109001752A (en) * | 2018-05-18 | 2018-12-14 | 威海晶合数字矿山技术有限公司 | A kind of three-dimensional measurement modeling and its method |
CN108958267A (en) * | 2018-08-10 | 2018-12-07 | 洛阳中科龙网创新科技有限公司 | A kind of unmanned vehicle barrier-avoiding method based on laser radar |
CN109389053A (en) * | 2018-09-20 | 2019-02-26 | 同济大学 | High performance vehicle detection system based on vehicle perpendicular type feature |
CN109389053B (en) * | 2018-09-20 | 2021-08-06 | 同济大学 | Method and system for detecting position information of vehicle to be detected around target vehicle |
CN113168184A (en) * | 2018-10-12 | 2021-07-23 | 波士顿动力公司 | Terrain-aware step planning system |
CN109358342A (en) * | 2018-10-12 | 2019-02-19 | 东北大学 | Three-dimensional laser SLAM system and control method based on 2D laser radar |
CN109358342B (en) * | 2018-10-12 | 2022-12-09 | 东北大学 | Three-dimensional laser SLAM system based on 2D laser radar and control method |
CN109507686A (en) * | 2018-11-08 | 2019-03-22 | 歌尔科技有限公司 | A kind of control method wears display equipment, electronic equipment and storage medium |
CN110095786A (en) * | 2019-04-30 | 2019-08-06 | 北京云迹科技有限公司 | Three-dimensional point cloud based on a line laser radar ground drawing generating method and system |
CN110456785A (en) * | 2019-06-28 | 2019-11-15 | 广东工业大学 | A kind of autonomous heuristic approach in interior based on caterpillar robot |
CN112394359A (en) * | 2019-08-15 | 2021-02-23 | 北醒(北京)光子科技有限公司 | Laser radar and one-dimensional scanning method thereof |
CN110398751A (en) * | 2019-09-11 | 2019-11-01 | 北京云迹科技有限公司 | The system and method for map is generated based on laser radar |
CN111007528A (en) * | 2019-12-17 | 2020-04-14 | 深圳市优必选科技股份有限公司 | Inspection robot |
CN111067180A (en) * | 2020-01-08 | 2020-04-28 | 中国人民武装警察部队工程大学 | Map drawing and positioning system based on tactical command and helmet |
CN111552296B (en) * | 2020-05-14 | 2021-03-26 | 宁波智能装备研究院有限公司 | Local smooth track planning method based on curved cylindrical coordinate system |
CN111552296A (en) * | 2020-05-14 | 2020-08-18 | 宁波智能装备研究院有限公司 | Local smooth track planning method based on curved cylindrical coordinate system |
CN111947657B (en) * | 2020-06-12 | 2024-04-19 | 南京邮电大学 | Mobile robot navigation method suitable for compact shelving environment |
CN111947657A (en) * | 2020-06-12 | 2020-11-17 | 南京邮电大学 | Mobile robot navigation method suitable for dense bent frame environment |
CN111830977A (en) * | 2020-07-02 | 2020-10-27 | 中国兵器科学研究院 | Autonomous navigation software framework and navigation method for mobile robot |
CN111984017A (en) * | 2020-08-31 | 2020-11-24 | 苏州三六零机器人科技有限公司 | Cleaning equipment control method, device and system and computer readable storage medium |
CN112113565A (en) * | 2020-09-22 | 2020-12-22 | 温州科技职业学院 | Robot positioning system for agricultural greenhouse environment |
CN112198491B (en) * | 2020-09-30 | 2023-06-09 | 广州赛特智能科技有限公司 | Robot three-dimensional sensing system and method based on low-cost two-dimensional laser radar |
CN112198491A (en) * | 2020-09-30 | 2021-01-08 | 广州赛特智能科技有限公司 | Robot three-dimensional sensing system and method based on low-cost two-dimensional laser radar |
CN112629520A (en) * | 2020-11-25 | 2021-04-09 | 北京集光通达科技股份有限公司 | Robot navigation and positioning method, system, equipment and storage medium |
CN112797990A (en) * | 2020-12-24 | 2021-05-14 | 深圳市优必选科技股份有限公司 | Storage medium, robot and navigation bitmap generation method and device thereof |
WO2022134937A1 (en) * | 2020-12-24 | 2022-06-30 | 深圳市优必选科技股份有限公司 | Storage medium, robot, and navigation bitmap generation method and device therefor |
CN112882481A (en) * | 2021-04-28 | 2021-06-01 | 北京邮电大学 | Mobile multi-mode interactive navigation robot system based on SLAM |
CN113253731A (en) * | 2021-05-26 | 2021-08-13 | 常州市工业互联网研究院有限公司 | Mobile embedded automatic platform of self-organizing path based on SLAM algorithm |
CN113253731B (en) * | 2021-05-26 | 2022-11-11 | 常州市工业互联网研究院有限公司 | Mobile embedded automatic platform of self-organizing path based on SLAM algorithm |
WO2022252335A1 (en) * | 2021-06-01 | 2022-12-08 | 南京驭逡通信科技有限公司 | Industrial robot and industrial robot obstacle avoidance system thereof |
CN114415661A (en) * | 2021-12-15 | 2022-04-29 | 中国农业大学 | Planar laser SLAM and navigation method based on compressed three-dimensional space point cloud |
CN114415661B (en) * | 2021-12-15 | 2023-09-22 | 中国农业大学 | Planar laser SLAM and navigation method based on compressed three-dimensional space point cloud |
CN114194685A (en) * | 2021-12-23 | 2022-03-18 | 山东新华医疗器械股份有限公司 | Stacking AGV control system, method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107167141A (en) | Robot autonomous navigation system based on double line laser radars | |
CN105892489A (en) | Multi-sensor fusion-based autonomous obstacle avoidance unmanned aerial vehicle system and control method | |
CN112147999B (en) | Automatic driving experiment AGV vehicle platform | |
CN108663681A (en) | Mobile Robotics Navigation method based on binocular camera Yu two-dimensional laser radar | |
CN104914865A (en) | Transformer station inspection tour robot positioning navigation system and method | |
Li et al. | Localization and navigation for indoor mobile robot based on ROS | |
CN206833250U (en) | A kind of unmanned investigation dolly based on laser radar | |
CN107097228A (en) | Autonomous traveling robot system | |
CN106383517A (en) | Control system used for autonomous mobile robot platform and control method and device thereof | |
CN113282083B (en) | Unmanned vehicle formation experiment platform based on robot operating system | |
CN109002046A (en) | A kind of Navigation System for Mobile Robot and air navigation aid | |
CN108955677A (en) | A kind of topological map creation method based on laser radar and GPS and build map device | |
CN108205325A (en) | A kind of round-the-clock unmanned cruiser system of four-wheel drive low speed | |
CN108469820A (en) | A kind of round-the-clock unmanned cruiser system of two-wheel drive low speed | |
CN107045342B (en) | A kind of autonomous guidance system of interactive mode based on three-wheel Omni-mobile and method | |
CN108646759B (en) | Intelligent detachable mobile robot system based on stereoscopic vision and control method | |
Asadi et al. | An integrated aerial and ground vehicle (UAV-UGV) system for automated data collection for indoor construction sites | |
Son et al. | The practice of mapping-based navigation system for indoor robot with RPLIDAR and Raspberry Pi | |
Lamon et al. | Mapping with an autonomous car | |
CN108279683A (en) | A kind of round-the-clock unmanned cruiser system of six wheel drive low speed | |
Jensen et al. | Laser range imaging using mobile robots: From pose estimation to 3D-models | |
CN108227720A (en) | A kind of round-the-clock unmanned cruiser system of four-wheel drive high speed | |
CN214174915U (en) | Navigation control system for wall-climbing robot | |
Nuñez et al. | A design of an autonomous mobile robot base with omnidirectional wheels and plane-based navigation with Lidar sensor | |
CN108490936A (en) | A kind of round-the-clock unmanned cruiser system of two-wheel drive high speed |
Legal Events

Date | Code | Title | Description
---|---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |