CN108037756A - A multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system - Google Patents
A multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system Download PDF Info
- Publication number
- CN108037756A (application CN201711229561.9A)
- Authority
- CN
- China
- Prior art keywords
- unmanned vehicle
- laser radar
- obstacle avoidance
- medium speed
- group
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0225—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Electromagnetism (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Acoustics & Sound (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention belongs to the technical field of unmanned driving and discloses a multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system. The system comprises a control system, a roof laser radar, a vehicle-mounted monocular vision camera, a front laser radar group and a rear laser radar group. The roof laser radar detects the undulation of the road ahead of the unmanned vehicle and, together with the front laser radar group, detects obstacles in the forward path; the front laser radar group additionally detects obstacles in the left-front and right-front motion paths; the rear laser radar group detects obstacles behind the vehicle; the vehicle-mounted monocular vision camera identifies markers ahead of the vehicle and cooperates with the roof and front laser radars in obstacle detection. The control system comprises a host computer and a lower computer, the lower computer being a dual-core controller composed of an FPGA and an ARM. The invention is inexpensive, offers relatively high cost performance, and has strong practicality.
Description
Technical field
The invention belongs to the technical field of unmanned driving, and in particular relates to a multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system.
Background technology
With rapid economic development, the automobile has become an ever more important part of daily life. Driver negligence causes many accidents: roughly one million people die in traffic accidents worldwide every year, and in China nearly one hundred thousand people die in traffic accidents annually. Because driver errors are so numerous, automobile manufacturers concentrate on designing systems that ensure vehicle safety, and safety is one of the main factors driving demand for self-driving cars. Furthermore, severe traffic congestion makes driving in large Chinese cities unpleasant; replacing human driving with artificially intelligent unmanned vehicles could help resolve problems such as congestion. In addition, poor air quality is also a "catalyst" for the development of driverless cars.
A driverless car is an intelligent vehicle that perceives the road environment through an on-board sensor system, plans its route automatically, and controls the vehicle to reach a preset destination. It uses on-board sensors to perceive the surrounding environment and, from the road, vehicle-position and obstacle information thus obtained, controls the steering and speed of the vehicle so that it can travel reliably and safely on the road. Driverless cars integrate numerous technologies including automatic control, system architecture, artificial intelligence and computer vision. They are a product of the high development of computer science, pattern recognition and intelligent control technology, an important symbol of a nation's research strength and industrial level, and have broad application prospects in national defence and the national economy.
At present, unmanned vehicle development is still in its infancy, and countries have successively begun research into intelligent unmanned driving. Whatever the degree of intelligence, the first step is always perception: sensing the complex road environment around the vehicle. Only on this basis can corresponding path planning and driving-behaviour decisions be made, so the choice of detection sensors is a precondition for successful obstacle avoidance. Common ranging sensors include ultrasonic ranging sensors, infrared ranging sensors, CCD vision systems, millimetre-wave radar, microwave radar and laser radar.
Laser radar is in fact a radar operating in the optical region (a special wave band). It is an active detection method: independent of external illumination conditions and of the radiation characteristics of the target itself, it only needs to emit its own laser beam and obtain target information by detecting the echo of the emitted beam. Because the laser wavelength is short, a beam with a very small divergence angle can be emitted; multipath effects are small, and low-altitude/treetop-level targets can be detected. The single-line laser radar is one type of laser radar. Since it transmits and receives on a single channel, its structure is relatively simple and it is convenient to use. Its scan period is short, its scan rate over the environment in the direction of travel is fast, and its angular resolution is high; the unit itself is small, light, low in power consumption, high in reliability and relatively cheap. Its detection range is relatively wide and it provides a large number of environmental scan-point distance readings, which greatly facilitates control decisions. These advantages make the single-line laser radar a preferred choice for unmanned vehicles perceiving unknown environments.
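The scan-point distance readings described above arrive as ranges over a fan of bearings. A minimal sketch of converting one single-line scan into Cartesian obstacle points in the sensor frame follows; the field of view, point count and maximum range are illustrative, not the specification of any particular radar:

```python
import math

def scan_to_points(ranges, fov_deg=270.0, max_range=50.0):
    """Convert a single-line lidar scan (ranges in metres, evenly spaced
    over fov_deg, centred on the sensor's forward axis) into (x, y)
    points in the sensor frame, dropping invalid or out-of-range echoes."""
    n = len(ranges)
    start = math.radians(-fov_deg / 2.0)
    step = math.radians(fov_deg) / (n - 1)
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r <= max_range:  # zero means no echo; beyond max_range is noise
            a = start + i * step
            points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

Such a point list is what a fusion layer would cluster into obstacle candidates before deciding on avoidance.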
The structure of a typical simple unmanned vehicle is shown in Fig. 1, and the principle of its detection and obstacle avoidance system in Fig. 2. The driverless car detects the environment with a (single-line or multi-line) laser radar detection system and passes the data to a PC (the host computer). The PC encodes and processes the data, then sends control instructions over a communication link to a microcontroller-based lower computer; after decoding, the microcontroller module sends control instructions to the DC brushless motor controllers, which drive the DC brushless motors. The microcontroller system regulates motor speed according to changes in the surrounding environment and thereby controls the position of the unmanned vehicle in the real environment, realising movement and obstacle avoidance under actual working conditions. Existing simple unmanned vehicle control uses a single microcontroller controlling a single single-line laser radar sensor or one multi-line laser radar sensor to achieve the above functions.
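The speed-regulation loop described above, in which the microcontroller adjusts motor drive against measured speed, can be sketched as a simple proportional controller; the gain and the interface are illustrative, not taken from the patent:

```python
class MotorLoop:
    """Minimal proportional speed loop for one DC brushless motor (sketch).
    kp is an illustrative gain, not a value from the patent."""

    def __init__(self, kp=0.5):
        self.kp = kp
        self.command = 0.0

    def step(self, target_speed, measured_speed):
        # Nudge the drive command in proportion to the speed error,
        # as the microcontroller would each control cycle.
        self.command += self.kp * (target_speed - measured_speed)
        return self.command
```

A real lower computer would add integral/derivative terms, saturation limits and fault handling, but the feedback-regulation idea is the same.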
However, long-term operation of the above technical scheme reveals the following problems:
(1) Because the unmanned vehicle is subject to destabilising interference from the surrounding environment, a microcontroller-based controller has poor anti-interference ability; anomalies frequently occur, causing the vehicle to go out of control.
(2) Existing driverless cars use low-end DSP and ARM family chips with working frequencies of at most about 100 MHz, which cannot satisfy the rapid computation of the vehicle's complex data.
(3) Limited by the performance of the on-board PC, the sensor data gathered by the unmanned vehicle cannot be computed and stored quickly.
(4) The data obtained by a single-line laser radar are 2D and cannot distinguish information such as target height; some low objects are overlooked and eventually become obstacles, so navigation by a single single-line laser radar sensor has become a bottleneck in the automotive field.
(5) A single single-line laser radar cannot obtain road-surface information; other sensors are needed to read and discriminate ground information.
(6) Although a multi-line laser radar yields 2.5D or 3D data and can judge obstacle height, process ground information and so on, it is relatively expensive: a 64-beam laser radar costs up to 700,000 RMB and cannot be popularised on a large scale.
(7) A single single-line laser radar cannot detect features such as corners and cliff-edge roads; other sensors are needed before peripheral obstacle signals or positioning-sensor markers can be read.
(8) Current unmanned vehicles basically consider only forward detection and avoidance, not obstacle information to the rear; an obstacle appearing behind can sometimes strike the vehicle body, and the vehicle cannot accelerate away to evade it.
(9) An unmanned vehicle based on a single single-line laser radar has a detection blind zone at the moment of starting; if an obstacle lies in the blind zone, a traffic accident is easily caused.
(10) Detection blind zones also appear during actual travel; an obstacle entering a moving blind zone can likewise cause a traffic accident.
(11) Image acquisition of the road ahead based on a single-line laser radar is slow, which hampers the unmanned vehicle's rapid travel.
(12) Over long-distance travel, an unmanned vehicle based on a single-line laser radar recognises its surroundings poorly and cannot achieve accurate positioning.
(13) In regular traffic there are various traffic markings on the ground along the driving path, but a single-line laser radar cannot recognise them, losing navigation assistance during fast travel.
(14) In regular traffic there are overhead signs such as traffic lights along the driving path, but a single-line laser radar cannot recognise them, weakening the safety of fast travel.
(15) Constrained by laser radar price and performance, laser radars with a generally high cost-performance ratio have detection ranges of less than 100 m, a distance unfavourable for judging obstacles during fast travel.
The principle and structure of a vision sensor are similar to the sensory organs of a human being. Vision sensors are small, cheap, easy to install and unobtrusive, with a wide detection range and a large amount of contained information. Adding a camera to the unmanned vehicle's environment-detection system allows it to sense the surroundings in real time, collect data, identify, detect and track static and dynamic objects, and, combined with navigation map data, perform system computation and analysis, letting the controller perceive possible danger in advance and effectively improving the comfort and safety of driving.
Therefore it is necessary to redesign the existing unmanned detection system based on a DSP- or ARM-controlled single-line or multi-line laser radar, introducing a video-acquisition sensor with a relatively wide detection range and a higher cost-performance ratio, so that the unmanned vehicle can discover obstacles farther away and achieve rapid obstacle avoidance.
The content of the invention
The object of the present invention is as follows: in order to overcome the deficiencies of the prior art, the present invention provides a multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system.
Specifically, the present invention is realised by the following technical scheme, comprising a control system, a roof laser radar, a vehicle-mounted monocular vision camera, and a front laser radar group and a rear laser radar group arranged on the lower body of the unmanned vehicle. The roof laser radar detects the undulation of the road ahead of the unmanned vehicle and, together with the front laser radar group, detects obstacles in the forward path. The front laser radar group additionally detects obstacles in the left-front and right-front motion paths. The rear laser radar group detects obstacles behind the vehicle. The vehicle-mounted monocular vision camera identifies markers ahead of the vehicle and cooperates with the roof and front laser radars in obstacle detection. The control system comprises a host computer and a lower computer: the host computer receives and decodes the feedback signals of each laser radar in real time, then communicates with the lower computer and transmits input control signals to it. The lower computer is a dual-core controller composed of an FPGA and an ARM, in which the FPGA acquires and processes the image data of the vehicle-mounted monocular vision camera, and the ARM performs image matching on the data processed by the FPGA and controls the travel of the unmanned vehicle in combination with the decoded input control signals.
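One way to picture how the host computer's fusion of decoded radar feedback might condense into an input control signal is a toy decision rule over the minimum obstacle distances reported by each radar group; the thresholds and action names are assumptions for illustration only, not the patent's actual control protocol:

```python
def avoidance_decision(front_min, left_min, right_min,
                       stop_dist=1.0, slow_dist=3.0):
    """Toy fusion rule over minimum obstacle distances (metres) from the
    roof/front lidar groups. Thresholds are illustrative only."""
    if front_min < stop_dist:
        return "stop"
    if front_min < slow_dist:
        # Steer toward whichever side currently reports more clearance.
        return "veer_left" if left_min > right_min else "veer_right"
    return "cruise"
```

A production system would of course fuse full point clouds and vehicle dynamics, not three scalars, but the shape of the host-to-lower-computer command flow is the same.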
Furthermore, the roof laser radar is a single single-line laser radar, located slightly above the roof at the front centre of the roof and inclined downward at approximately 5 to 15 degrees to the horizontal plane.
Furthermore, the roof laser radar is an LMS151 single-line laser radar.
Furthermore, the front laser radar group consists of 3 single-line laser radars, of which two are located at the left-front and right-front of the vehicle head, their centre directions each at an angle of approximately 30 degrees from the unmanned vehicle's direction of advance; the remaining one is located between them, with its centre direction aligned with the direction of advance.
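Because the two outer radars are yawed roughly 30 degrees away from the direction of advance, their scans must be rotated and translated into a common vehicle frame before the groups can be fused. A minimal sketch of that transform (the mounting poses used in the test are illustrative, not the patent's exact layout):

```python
import math

def to_vehicle_frame(point, mount_xy, mount_yaw_deg):
    """Rotate a sensor-frame (x, y) point by the radar's mounting yaw and
    translate by its mounting position to get vehicle-frame coordinates."""
    x, y = point
    a = math.radians(mount_yaw_deg)
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + mount_xy[0], yr + mount_xy[1])
```

Applying this per radar puts all scan points into one frame, after which overlap between the crossed fields of view gives the redundancy the patent relies on.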
Furthermore, the installation height of the front laser radar group is approximately 40 cm above the ground.
Furthermore, the front laser radar group consists of LMS151 single-line laser radars.
Furthermore, the rear laser radar group consists of two single-line laser radars parallel to the horizontal plane, located on either side of the rear of the vehicle.
Furthermore, the installation height of the rear laser radar group is approximately 40 cm to 60 cm above the ground.
Furthermore, the rear laser radar group consists of LMS122 single-line laser radars.
Furthermore, the system further comprises a front ultrasonic sensor group and a rear ultrasonic sensor group arranged on the bottom of the unmanned vehicle. The front ultrasonic sensor group performs blind-zone detection and avoidance ahead of the vehicle, and the rear ultrasonic sensor group performs blind-zone detection and avoidance behind it; the lower computer communicates with both the front and rear ultrasonic sensor groups.
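A blind-zone check over such an ultrasonic group can be as simple as requiring every echo to exceed a clearance threshold; the threshold value and the treatment of missing echoes below are assumptions for illustration, not values from the patent:

```python
def blind_zone_clear(echo_distances_m, min_clearance=0.3):
    """True if every ultrasonic sensor in the group reports more clearance
    than min_clearance (metres). A reading of None is treated as 'no echo',
    i.e. nothing within the sensor's range."""
    return all(d is None or d > min_clearance for d in echo_distances_m)
```

The lower computer would gate starting, reversing or acceleration on this predicate while the laser radars cover the longer ranges.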
Furthermore, the front ultrasonic sensor group consists of 5 ultrasonic sensors.
Furthermore, the rear ultrasonic sensor group consists of 5 ultrasonic sensors.
Furthermore, the host computer is an Intel NUC microcomputer.
Furthermore, the lower computer uses an STM32F7 MCU.
Furthermore, the control system communicates with the unmanned vehicle's master station through a wireless device; when the unmanned vehicle loses communication with the master station, the lower computer performs an automatic stop.
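The loss-of-communication stop can be modelled as a heartbeat watchdog on the master-station link; the timeout value below is an assumption, since the patent does not specify one:

```python
class LinkWatchdog:
    """Auto-stop rule for loss of communication with the master station.
    timeout_s is an assumed value, not one given in the patent."""

    def __init__(self, timeout_s=2.0):
        self.timeout_s = timeout_s
        self.last_rx = None  # time of last message, None before first contact

    def on_message(self, now_s):
        self.last_rx = now_s

    def should_stop(self, now_s):
        # Stop if we have never heard from the station, or it has gone quiet.
        return self.last_rx is None or (now_s - self.last_rx) > self.timeout_s
```

The lower computer would poll `should_stop` each control cycle and command a halt when it returns true.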
Furthermore, the lower computer also reads site markers on the ground through the vehicle-mounted monocular vision camera.
Furthermore, the unmanned vehicle is an electric vehicle, and the control system detects the terminal voltage of the storage battery taking the battery's internal resistance and temperature parameters into account.
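A terminal-voltage estimate that accounts for internal resistance and temperature might look like the following; the linear temperature model and its coefficient are illustrative assumptions, not values from the patent:

```python
def terminal_voltage(v_ocv, current_a, r0_ohm, temp_c, t_ref_c=25.0, k=0.01):
    """Estimate battery terminal voltage under load, correcting the internal
    resistance for temperature: V_term = V_ocv - I * R(T).
    The linear model R(T) = R0 * (1 + k*(T_ref - T)) and the coefficient k
    are illustrative assumptions (resistance rises as the cell cools)."""
    r = r0_ohm * (1.0 + k * (t_ref_c - temp_c))
    return v_ocv - current_a * r
```

Feeding measured current and temperature into such a model brings the reported terminal voltage closer to the battery's real state, which is the point of introducing these parameters.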
Furthermore, the vehicle-mounted monocular vision camera is a CCD black-and-white camera.
Beneficial effects of the present invention are as follows:
1. During motion, the system fully takes the behaviour of the storage battery into account: the ARM+FPGA+NUC triple-core controller constantly monitors and computes the vehicle's operating state and avoids the occurrence of large currents, fundamentally eliminating the impact of large currents on the battery and preventing the premature battery ageing caused by heavy-current discharge.
2. During rapid discharge, parameters such as the internal resistance and temperature of the battery are introduced into the terminal-voltage detection, so that the measured terminal voltage is closer to the true value, which benefits the use of the battery at low voltage.
3. The NUC processes the data fusion of the vehicle's multiple single-line laser radars, making control relatively simple and greatly raising computation speed; this removes the bottleneck of slow software execution on a single ARM, shortens the development cycle, and yields highly portable programs.
4. The invention saves control-board space and achieves effective detection and avoidance over multiple isolated regions around the vehicle, which benefits the stability and dynamic performance of the unmanned vehicle system.
5. Because the controller uses the NUC to process the data and algorithms of the numerous single-line laser radar sensors, with full consideration of surrounding interference sources, the ARM is freed from heavy work; "runaway" of the motion-control main program is effectively prevented, and the vehicle's anti-interference ability is greatly enhanced.
6. Because the controller uses the FPGA to process the large volume of monocular-vision image data from the CCD camera, again with full consideration of surrounding interference sources, the ARM is freed from heavy image-processing work; computation speed increases, "runaway" of the motion-control main program is effectively prevented, and anti-interference ability is greatly enhanced.
7. The CCD monocular vision system sees farther than a practical single-line laser radar, widening the vehicle's obstacle-detection range, which benefits acceleration and deceleration and improves the vehicle's dynamic performance and travel speed.
8. Using the stored sample-feature library to match the image data collected by the CCD camera, the ARM controller can effectively judge whether an obstacle is a person or some other kind of object, effectively estimate the distance to such obstacles, and realise obstacle-avoidance early warning in advance.
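The sample-library matching described here could, in the simplest case, be a nearest-neighbour search over feature vectors; the Euclidean metric and the feature format are assumptions for illustration, since the patent does not name the matching method:

```python
def classify(feature, library):
    """Nearest-neighbour match of an extracted image feature vector against
    a stored sample-feature library {label: vector}. Euclidean distance is
    an illustrative choice of metric."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(library, key=lambda label: dist(feature, library[label]))
```

A deployed system would use learned features and a proper classifier, but the library-lookup structure is the same.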
9:It can effectively be detected around unmanned vehicle traffic direction based on black-white CCD video camera single camera vision system and protrude ground
Barrier, can not only improve the accuracy of avoidance, and these barriers can also be provided for unmanned vehicle navigation it is accurate fixed
Position.
10:Lane detection line in regular traffic, straight can be effectively told based on black-white CCD video camera single camera vision system
Row and turn etc. road sign, unmanned vehicle can rely on these mark correct oneself position and posture, improve unmanned vehicle from
By the stability and accuracy of independent navigation when driving.
11:Green light in regular traffic, amber light and red can be effectively told based on black-white CCD video camera single camera vision system
The traffic such as lamp are prompted, and unmanned vehicle can adjust the speed of itself according to these information and meet the needs such as traveling, parking, improve nothing
The security of people's car freely when driving.
12:Since the single line laser radar of unmanned bus-top has certain angle with ground, this angle can help
Top layer single line laser radar is accurately positioned the fluctuating on the movement road surface of CCD camera discovery in advance, prevents caused by road surface breakage
Dell influence unmanned vehicle and normally travel.
13:Since the single line laser radar of unmanned bus-top has certain angle with ground, this angle can help
Top layer single line laser radar be accurately positioned CCD camera find movement road surface temporarily lose fall small obstacle, prior notice without
People's vehicle control, which is realized, to be avoided, and has ensured that unmanned vehicle normally travels.
14:The more single line laser radar sensor fusion systems in front, can be accurately positioned the barrier of CCD camera discovery
Position, the unmanned vehicle control of prior notice, which is realized, to be avoided, and is conducive to improve the rapidity and security of unmanned vehicle traveling.
15: In the front multi-single-line-laser-radar sensor fusion system, the directions of the single-line laser radar sensors cross; columnar objects on both sides found by the CCD camera can therefore be accurately detected, providing some assistance to the positioning of the advancing unmanned vehicle.
16: Likewise, because the directions of the single-line laser radar sensors cross, clear areas on both sides found by the CCD camera can be accurately detected, providing some assistance to the turning and obstacle avoidance of the advancing unmanned vehicle.
17: The rear multi-single-line-laser-radar sensor fusion system can effectively detect the distance between the unmanned vehicle and moving obstacles behind it; in an emergency, the unmanned vehicle can accelerate out of the danger zone with the help of the controller, protecting the vehicle body.
18: The front blind-zone detection system composed of multiple ultrasonic sensors can effectively eliminate the short-distance blind zone that appears when the unmanned vehicle has just started accelerating forward, improving the safety and reliability of forward starting acceleration.
19: The rear blind-zone detection system composed of multiple ultrasonic sensors can effectively eliminate the short-distance blind zone that appears when the unmanned vehicle has just started reversing, improving the safety and reliability of reversing.
20: The front blind-zone detection system composed of multiple ultrasonic sensors can also effectively eliminate the short-distance blind zones that appear in real time during normal travel, further improving the safety and reliability of the unmanned vehicle.
21: The rear blind-zone detection system composed of multiple ultrasonic sensors can also effectively eliminate the short-distance blind zones that appear in real time during reversing, further improving the safety and reliability of the unmanned vehicle.
22: For an unmanned vehicle of this structure, site sensors with a certain degree of redundancy are added in order to support wide-range, multi-site operation; this benefits both the positioning of the unmanned vehicle and the tracking of the vehicle by the master station.
23: In an emergency, the CCD-camera-based image acquisition system of the unmanned vehicle can transmit on-site images to the master station through the wireless device, and the master station anticipates the situation and formulates the required emergency management scheme.
Brief description of the drawings
Fig. 1 is a two-dimensional structure diagram of a common simple unmanned vehicle.
Fig. 2 is a schematic diagram of a common unmanned vehicle detection and obstacle avoidance system.
Fig. 3 is a schematic diagram of the ARM and FPGA connection for image processing.
Fig. 4 is a two-dimensional structure diagram of the multi-sensor fusion unmanned vehicle.
Fig. 5 is a two-dimensional layout diagram of the front single-line laser radar group and the CCD camera.
Fig. 6 is a two-dimensional layout diagram of the front blind-zone ultrasonic sensor group.
Fig. 7 is a two-dimensional layout diagram of the rear single-line laser radar group and ultrasonic group.
Fig. 8 is a schematic diagram of the multi-sensor fusion unmanned vehicle detection and obstacle avoidance system.
Fig. 9 is a schematic diagram of multi-sensor fusion unmanned vehicle operation.
Embodiment
The present invention is described in further detail below with reference to embodiments and the accompanying drawings.
Embodiment 1:
A specific scheme of one embodiment of the present invention is described below.
The sensor arrangement of the unmanned vehicle of this embodiment is shown in Fig. 4, Fig. 5, Fig. 6 and Fig. 7. Specifically, the laser radars from SICK use the mature laser time-of-flight principle together with multi-echo technology for non-contact detection; protection zones of various shapes can be configured according to on-site needs and easily modified at any time, and internal filtering combined with multi-echo technology gives the sensors reliable immunity to interference. The LMS151 and LMS122 are high-performance laser radars newly released by SICK for proximity detection: for an object of 10% reflectivity the LMS151 series reaches a range of 50 meters, while the maximum detection range of the LMS122 is 20 meters. In view of these features, this embodiment uses laser radar groups based on the LMS1xx series to form the close-range front and rear obstacle detection and protection systems of the unmanned vehicle. One LMS151-10100 single-line laser radar L1, mounted slightly above the roof at the front center of the roof and inclined downward at approximately 5 ~ 15 degrees to the horizontal plane, together with a group of LMS151-10100 single-line laser radars mounted parallel to the horizontal plane about 40 cm above the ground (generally three, denoted L2, L3 and L4), forms the accurate front proximity detection and obstacle avoidance system. In the horizontal radar group, L2 and L4 are located at the left front and right front of the vehicle head respectively; their center directions deviate from the direction of motion by an angle of approximately 30 degrees, so that they can effectively detect obstacles on the left and right sides of the unmanned vehicle. L3 is located at the center between L2 and L4, with its center direction aligned with the direction of motion. This embodiment further uses a group of LMS122-10100 laser radars mounted parallel to the horizontal plane about 40 cm ~ 60 cm above the ground (generally two, denoted L5 and L6) to form the rear detection and protection system of the unmanned vehicle.
The camera must satisfy several requirements. It should see far enough: the farther it sees, the more time is available to judge and react, so that losses caused by accidents can be avoided or reduced; however, the farther it sees, the narrower the field of view, so a trade-off must be made. It should have good dynamic performance: a black-and-white camera with high dynamic range effectively suppresses halation, improves image quality, and greatly benefits subsequent image processing. In areas with insufficient light, at night in areas where lighting equipment cannot be installed, and when only the position or motion of a scene needs to be monitored, a black-and-white camera is clearly superior to a color camera. Compared with binocular vision, monocular vision has the advantages of a small computational load and better real-time performance. This embodiment therefore uses the monocular vision of a CCD black-and-white camera, in coordination with the laser radars, for long-range environment detection and obstacle avoidance.
Owing to the sensor arrangement, a blind zone generally exists in the forward region when the unmanned vehicle starts moving forward. To prevent collisions at start-up, this embodiment adds at the bottom of the unmanned vehicle a front blind-zone detection and obstacle avoidance system composed of ultrasonic sensors US1, US2, US3, US4 and US5. At the moment the unmanned vehicle starts moving forward, the front blind-zone detection system operates; if no obstacle is present in the safety zone while the unmanned vehicle accelerates forward, the vehicle switches to the fused sensing and navigation state of the multi-single-line-laser-radar group and CCD monocular vision. Similarly, a blind zone generally exists in the rearward region when the unmanned vehicle starts reversing. To prevent collisions at the start of reversing, this embodiment adds at the bottom of the vehicle a rear blind-zone detection and obstacle avoidance system composed of ultrasonic sensors US7, US8, US9, US10 and US11. At the moment the unmanned vehicle starts reversing, the rear blind-zone detection system operates; if no obstacle is present in the safety zone while the vehicle accelerates in reverse, it likewise switches to the fused sensing and navigation state of the multi-single-line-laser-radar group and CCD monocular vision.
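The start-up gate described above amounts to requiring every ultrasonic sensor in the relevant blind-zone group to report a clear safety zone before acceleration is allowed. A minimal sketch (the safety radius and the use of `None` for "no echo" are illustrative assumptions, not values from the embodiment):

```python
def blind_zone_clear(readings_m, safety_radius_m=0.5):
    """True only if no ultrasonic sensor reports an echo inside the safety zone.
    A reading of None means no echo at all (nothing in range)."""
    return all(r is None or r > safety_radius_m for r in readings_m)

# Five front sensors US1..US5: one detects something 0.3 m away,
# so the vehicle holds, alarms, and waits for the obstacle to clear.
front = [None, 1.2, 0.3, None, 2.0]
if blind_zone_clear(front):
    state = "accelerate, hand over to lidar + CCD fusion navigation"
else:
    state = "alarm and wait for the obstacle to clear"
```

The same check, applied to the rear sensor group, gates the start of reversing.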
The brand-new STM32F7 MCU series produced by ST is the world's first volume-produced microcontroller family equipped with the 32-bit ARM Cortex-M7 processor. All products carry a Cortex-M7 core with a floating-point unit and DSP extensions, running at up to 216 MHz; an AXI and multi-AHB bus matrix interconnecting the core, peripherals and memory; a 6-stage superscalar pipeline with floating point unit (Floating Point Unit, FPU); two general-purpose DMA controllers plus a DMA dedicated to the graphics accelerator; and peripheral speeds independent of the CPU speed (dual-clock support), so that changes in the system clock do not affect peripheral operation. Compared with previous STM32 series, it offers richer peripherals. This outstanding efficiency is attributed to STMicroelectronics' market-leading 90-nanometer manufacturing process, its exclusive flash access-time reduction, and its advanced clock and power optimization techniques; in stop mode, with all registers and SRAM contents retained, the typical current consumption is 100 μA. At the same time, the STM32F7 offers excellent instruction and pin compatibility: the Cortex-M7 is backward compatible with the Cortex-M4 instruction set, and the STM32F7 series is pin-compatible with the STM32F4 series. The STM32F7 MCU series pushes the efficiency advantage of the ARM Cortex-M7 over earlier cores (such as the Cortex-M4) to the extreme, reaching nearly twice the efficiency of a DSP. These characteristics make the STM32F7 well suited to replace STM32F4 family chips for the multi-sensor fusion data processing of the unmanned vehicle.
Image processing can be roughly divided into low-level processing and high-level processing. Low-level processing involves a large amount of data, simple algorithms and substantial parallelism; high-level processing involves complex algorithms but a small amount of data. In software, the low-level image processing stage is very time-consuming, but with hardware it is possible to process massive data in parallel and greatly improve processing speed. An FPGA is in itself an array of standard cells without the fixed functions of a general-purpose integrated circuit, but users can reconfigure its internal connections with specific place-and-route tools according to their own design needs, and thus design their own application-specific circuit in the shortest time. Because the FPGA realizes hardware circuit design through a software-oriented design philosophy, FPGA-based systems have good reusability and modifiability, and this brand-new design approach is gradually being applied to high-performance, rapidly developing image processing.
Combining the advantages of ARM and FPGA, this embodiment uses the FPGA for the low-level image processing stage and the ARM for the high-level image processing stage, with the data-processing connection between the two shown in Fig. 3. This not only realizes real-time image data processing but also maximizes the capabilities of both the ARM and the FPGA.
Referring to Fig. 3, the CCD camera transfers the captured image data to the FPGA, which first performs image preprocessing, image acquisition logic, image segmentation and data processing and stores the corresponding data; the FPGA then transfers the processed data to the ARM, which performs image analysis and recognition and compares the result with the data feature library to achieve image matching.
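The division of labor above can be illustrated in miniature: a data-parallel per-pixel stage followed by a compact matching stage. The operations below (thresholding, bounding box, aspect-ratio matching) are deliberately simplified stand-ins for the FPGA and ARM stages, not the patent's actual algorithms:

```python
def fpga_preprocess(image, threshold=128):
    """Low-level stage: binarize the image (large data volume, simple per-pixel op)."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def fpga_segment(binary):
    """Low-level stage: bounding box (height, width) of the foreground pixels."""
    coords = [(r, c) for r, row in enumerate(binary)
              for c, v in enumerate(row) if v]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (max(rows) - min(rows) + 1, max(cols) - min(cols) + 1)

def arm_match(bbox, feature_library):
    """High-level stage: nearest aspect-ratio match against the feature library."""
    if bbox is None:
        return "clear"
    aspect = bbox[0] / bbox[1]
    return min(feature_library, key=lambda name: abs(feature_library[name] - aspect))

library = {"pedestrian": 3.0, "vehicle": 0.5}    # illustrative height/width ratios
image = [[0, 200, 0],
         [0, 210, 0],
         [0, 190, 0]]                            # a tall, thin bright blob
label = arm_match(fpga_segment(fpga_preprocess(image)), library)
```

The point of the split is that the first two functions touch every pixel and map naturally onto parallel hardware, while the last touches only a handful of numbers and suits a sequential processor.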
Therefore, to overcome the shortcomings of existing unmanned vehicles, namely poor stability, poor rapidity and poor cost-effectiveness, this embodiment abandons the single single-line laser radar or multi-line laser radar operating mode used by existing unmanned vehicles, and adopts a brand-new tri-core control mode based on a 7th-generation NUC microcomputer + ARM (the latest embedded STM32F767) + FPGA. To reduce the overall hardware cost of the unmanned vehicle while increasing its detection range, the fusion of multiple single-line laser radars + CCD camera + ultrasonic sensors is used to realize obstacle detection and avoidance. The control board takes the STM32F767 as its processing core; it receives in real time the multi-sensor digital fusion signals from the NUC7-based host computer and the FPGA-based CCD image acquisition signals, responds in real time to the various interrupts, and realizes real-time data communication with, and storage at, the master station.
To improve computing speed and ensure the rapidity, stability and reliability of unmanned vehicle control, this embodiment introduces an FPGA and an Intel 7th-generation NUC microcomputer alongside the STM32F767-based ARM controller, forming a tri-core controller based on ARM + FPGA + NUC. This controller integrates the design of the multi-single-line-laser-radar detection and obstacle avoidance system controller, takes full account of the role of the storage battery in the system, and realizes detection and avoidance of the unmanned vehicle in each region. The multi-single-line-laser-radar signal processing, which is the heaviest workload in the unmanned vehicle control system, is assigned to the NUC microcomputer to exploit its fast data processing; the monocular vision image data of the CCD black-and-white camera are processed jointly by the ARM and the FPGA, playing to their respective advantages at different stages of image processing; and functions such as blind-zone detection and avoidance, the human-machine interface and online output are handled independently by the STM32F767. The division of labor among the ARM, FPGA and NUC microcomputer is thus realized, while the three communicate in real time to exchange and call data.
For the ARM + NUC + FPGA tri-core controller of this embodiment, after power-on the ARM, FPGA and NUC first complete initialization, and the on-board computer NUC then retrieves the unmanned vehicle driving path and map information through the unmanned vehicle control master station. The blind-zone sensors, the CCD-based monocular vision and the single-line laser radars subsequently start working and communicate with the ARM + FPGA controller. After confirming that the working region is clear, the ARM + FPGA controller starts the unmanned vehicle walking mode and processes the image acquisition data of the CCD camera in real time, while communicating with the NUC controller. The NUC receives and decodes the single-line laser radar feedback signals in real time, then communicates with the ARM + FPGA controller and transmits the input control signals to it; the ARM + FPGA controller decodes the input control signals to precisely control the brushless DC servo motors, which drive the unmanned vehicle through a mechanical transmission, and signals such as displacement, speed and acceleration are fed back in real time to the ARM + FPGA controller.
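The signal path just described (lidar feedback decoded by the NUC, then converted by the ARM + FPGA controller into a motor command) can be sketched as two small functions. The fusion rule (take the nearest return) and the distance-to-speed ramp are illustrative assumptions; the patent does not specify them:

```python
def nuc_fuse(lidar_ranges_m):
    """NUC stage: decode/fuse single-line lidar feedback into the nearest range."""
    return min(lidar_ranges_m)

def arm_speed_command(nearest_m, cruise_mps=2.0, stop_m=1.0, slow_m=5.0):
    """ARM+FPGA stage: map nearest obstacle distance to a servo speed command.
    Full stop inside stop_m, linear ramp up to cruise speed at slow_m."""
    if nearest_m <= stop_m:
        return 0.0
    if nearest_m >= slow_m:
        return cruise_mps
    return cruise_mps * (nearest_m - stop_m) / (slow_m - stop_m)

# L1..L6 report ranges; the closest return (3 m) yields a reduced speed command.
cmd = arm_speed_command(nuc_fuse([12.0, 3.0, 8.0, 20.0, 15.0, 9.0]))
```

In the real system the speed command would be converted to PWM for the brushless DC servo motors and closed against the displacement, speed and acceleration feedback mentioned above.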
With reference to Fig. 8, the specific implementation steps are as follows:
The unmanned vehicle control is divided into two parts: the host computer system based on the on-board computer NUC, and the ARM + FPGA dual-core lower computer system based on the STM32F767. The NUC-based host computer system completes functions such as path and map input, multi-sensor data fusion and online output; the ARM + FPGA-based lower computer control system completes functions such as the servo control of the unmanned vehicle system, the monocular vision data processing of the CCD and the I/O control. The multi-axis brushless DC servo system, which carries the heaviest workload, and the CCD-based monocular vision data processing are assigned to the ARM + FPGA, giving full play to its fast data processing. The division of labor between the NUC and the ARM + FPGA is thus realized, while the two communicate with each other in real time to exchange and call data.
With reference to Fig. 3, Fig. 4, Fig. 5, Fig. 6, Fig. 7, Fig. 8 and Fig. 9, the specific functions are realized as follows:
1) Before the unmanned vehicle receives a motion command, it generally waits in the waiting area for the departure command sent by the master station; if the battery voltage is low, the unmanned vehicle automatically docks with the charging unit to charge.
2) When the unmanned vehicle receives a departure task in the waiting area, the on-board computer NUC retrieves the driving path and navigation map information from the master station, and the ARM + FPGA controller then turns on the blind-zone sensors US1 ~ US5 to scan the blind zone. If an obstacle has entered the motion blind zone, the ARM + FPGA controller issues an alarm and waits for the obstacle to be removed; if no obstacle has entered the motion blind zone, the unmanned vehicle starts to accelerate automatically.
3) Once the unmanned vehicle has started, the ARM + FPGA controller turns on the CCD black-and-white camera and the single-line laser radar sensors L1 ~ L6, begins to navigate by them, and starts walking along the fixed route.
4) After the unmanned vehicle enters the moving route, the CCD camera starts long-range detection. The first task of the CCD black-and-white camera is to find the index points of the road in combination with the existing road map information; these index points may be road junctions, turning places, or stop sites. After finding these points, the CCD communicates with the FPGA; the FPGA decodes the image of the CCD black-and-white camera and then communicates with the STM32F767, which continues to process the CCD image data according to its internal algorithm and converts the result into PWM control signals for the brushless DC servo motors, so that the controller drives the unmanned vehicle to complete positioning and attitude adjustment before normal travel. After positioning and attitude adjustment are completed with the aid of CCD camera image acquisition, the unmanned vehicle travels normally according to the on-board map information. While travelling, the CCD camera transfers real-time images to the FPGA; after FPGA decoding, the data are transmitted to the STM32F767, and the ARM controller matches the real-time decoded image data against the target feature library, thereby identifying whether the obstacle ahead is a vehicle, an object, a pedestrian, or one of the various path indicators. The ARM then estimates the approximate distance between the unmanned vehicle and the obstacle from the size of the target in the image, after which the ARM + FPGA controller starts to fine-tune the control of the brushless DC servo motors so that the unmanned vehicle carries out long-range obstacle avoidance, while communicating in real time with the on-board computer NUC. The NUC receives the real-time proximity detection signals of the single-line laser radars; once a suspected target enters the detection range of the single-line laser radars, they confirm and precisely locate the suspected target in combination with the CCD black-and-white camera data.
The forward-inclined laser sensor L1, at an angle of approximately 5 ~ 15 degrees to the ground, and the ground-parallel front single-line laser radar group (L2, L3, L4) detect the suspected environment ahead at every moment. The inclined sensor L1 can work independently; because of its inclination, it can detect the undulations of the road ahead very well. Using the CCD image data processed by the FPGA together with the data of L1, the ARM can easily determine the depth and width of an undulation, as well as the accurate distance between an obstacle and the unmanned vehicle. Using the detection data of L1 and L3 together with the FPGA-processed CCD camera image data, the ARM can accurately detect the presence or absence of an obstacle directly ahead and its accurate distance from the unmanned vehicle; using the detection data of L2 and L3, it can do the same for obstacles to the left front; and using the detection data of L4 and L3, for obstacles to the right front.
If L1 and the CCD camera accurately detect an undulating pit of a certain height in the forward path, and its height and width exceed what the unmanned vehicle can cross, an interrupt request is sent to the ARM + FPGA controller while the pit data are transferred to the NUC for processing and storage; the STM32F767 gives the interrupt priority and enters the front avoidance protection subroutine. If the height and width of the pit are within the tolerance of the unmanned vehicle, the vehicle travels at the set normal speed and crosses the pit.
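The traversability decision above reduces to comparing the measured pit dimensions against the vehicle's tolerance. A minimal sketch; the threshold values are illustrative, since the patent leaves the tolerance unspecified:

```python
def pit_action(depth_m, width_m, max_depth_m=0.08, max_width_m=0.30):
    """Decide whether a detected pit is within the vehicle's crossing tolerance.
    Thresholds are illustrative assumptions, not values from the embodiment."""
    if depth_m <= max_depth_m and width_m <= max_width_m:
        return "cross at normal speed"
    return "interrupt: enter front avoidance protection subroutine"

small = pit_action(0.05, 0.20)   # within tolerance
large = pit_action(0.20, 0.20)   # too deep: triggers the interrupt path
```

In the real system the depth and width inputs would come from the fused L1 and CCD data described above.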
If L1, L3 and the CCD camera accurately detect an obstacle in the forward path, an interrupt request is sent to the ARM + FPGA controller while the obstacle data are transferred to the NUC for processing; the STM32F767 gives the interrupt priority and enters the front obstacle avoidance protection subroutine, giving way to the left or to the right according to the data communicated by the NUC. If no obstacle enters the operating range, the unmanned vehicle travels at the set normal speed.
If L2, L3 and the CCD camera accurately detect an obstacle in the left-front motion path, an interrupt request is sent to the ARM + FPGA controller while the obstacle data are transferred to the NUC for processing; the STM32F767 gives the interrupt priority and enters the left-front avoidance protection subroutine. If the long-range CCD image feedback servo control succeeds in keeping the unmanned vehicle away from the obstacle, the vehicle travels at the set normal speed; if after the long-range CCD image feedback adjustment an obstacle is still present within the safe motion range of the unmanned vehicle, the ARM + FPGA controller gives way to the right according to the data communicated by the NUC.
If L4, L3 and the CCD camera accurately detect an obstacle in the right-front motion path, an interrupt request is sent to the ARM + FPGA controller while the obstacle data are transferred to the NUC for processing; the STM32F767 gives the interrupt priority and enters the right-front avoidance protection subroutine. If the long-range CCD image feedback servo control succeeds in keeping the unmanned vehicle away from the obstacle, the vehicle travels at the set normal speed; if after the long-range CCD image feedback adjustment an obstacle is still present within the safe motion range of the unmanned vehicle, the ARM + FPGA controller gives way to the left according to the data communicated by the NUC.
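The three avoidance cases above follow a simple sector rule: a left-front detection forces a move to the right, a right-front detection a move to the left, and a detection directly ahead allows either side. A minimal sketch; the all-sectors-blocked branch is an assumption added for completeness, as the patent does not state it:

```python
def give_way(front, left_front, right_front):
    """Choose an avoidance direction from the lidar sectors reporting an obstacle.
    Mirrors the rule in the text; the fully-blocked case is an assumption."""
    if front and left_front and right_front:
        return "stop and alarm"
    if left_front:
        return "give way to the right"
    if right_front:
        return "give way to the left"
    if front:
        return "give way to either side"
    return "continue at normal speed"
```

The sector flags would come from the L2/L3, L1/L3 and L4/L3 pairings described above, each cross-checked against the CCD image data.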
5) After the unmanned vehicle enters the moving route, the ground-parallel rear detection single-line laser radar group (L5, L6) detects the rear environment at every moment. If L5 or L6 judges that an obstacle behind is approaching the unmanned vehicle, an interrupt request is sent to the STM32F767 while the obstacle data are transferred to the NUC for processing; the STM32F767 gives the interrupt priority, and the ARM + FPGA controller then enters the rear avoidance protection subroutine and issues an alarm. If no obstacle behind enters the protection range, the unmanned vehicle travels at the set normal speed.
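Judging that a rear obstacle is "approaching" can be done from two successive range readings: a positive closing speed combined with a short gap triggers the alarm. A minimal sketch; the gap and closing-speed thresholds are illustrative assumptions:

```python
def rear_approach_speed(ranges_m, dt_s):
    """Closing speed (m/s, positive = approaching) from two successive readings."""
    return (ranges_m[0] - ranges_m[1]) / dt_s

def rear_alarm(ranges_m, dt_s, min_gap_m=3.0, max_closing_mps=1.0):
    """Trigger the rear avoidance subroutine if an obstacle is near and closing fast.
    Thresholds are illustrative, not specified by the patent."""
    closing = rear_approach_speed(ranges_m, dt_s)
    return ranges_m[1] < min_gap_m and closing > max_closing_mps

# An object at 4.0 m then 2.5 m one second later closes at 1.5 m/s: alarm.
alarm = rear_alarm([4.0, 2.5], dt_s=1.0)
```

On an alarm the controller can also command the escape acceleration described in advantage 17.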
6) After the unmanned vehicle enters the moving route, the ground-parallel front and rear blind-zone sensors US1 ~ US5 and US6 ~ US10 monitor the blind-zone environment at every moment. If US1 ~ US5 or US6 ~ US10 judges that a temporary obstacle is approaching the blind zone of the unmanned vehicle, an interrupt request is sent to the STM32F767 while the obstacle data are transferred to the NUC for processing; the STM32F767 gives the interrupt priority, and the ARM + FPGA controller then enters the blind-zone avoidance protection subroutine and issues an alarm. If no obstacle enters the blind-zone protection range, the unmanned vehicle travels at the set normal speed.
7) Once the unmanned vehicle has entered normal running speed and meets the requirements, its navigation sensors L1 ~ L6, US1 ~ US10 and the CCD camera are all in operation, feeding their signals to the NUC and the ARM + FPGA controllers: the NUC first performs laser radar data fusion, the FPGA processes the CCD image data, and the STM32F767 responds to the various interrupt protections. The NUC and the FPGA then communicate with the STM32F767, and the STM32F767 controller generates control signals for the brushless DC servo motors from the decoded sensor signals; by adjusting the motion of the servo motors, the movement speed and direction of the unmanned vehicle are changed, so that the vehicle can easily follow the on-board input path.
8) While the unmanned vehicle is in normal operation, the CCD camera reads in real time the various navigation markers on the ground in certain key marker regions; the acquired images are transferred directly to the FPGA and, after FPGA decoding, to the ARM. The STM32F767 matches the decoded image data against the standard library; after a successful match, the unmanned vehicle relies on these markers as one of its forward navigation marks to perform a secondary attitude adjustment.
9) While the unmanned vehicle is in normal operation, the CCD camera also reads in real time the various aerial navigation markers in certain key marker regions; the acquired images are transferred directly to the FPGA and, after FPGA decoding, to the ARM. The STM32F767 matches the decoded image data against the standard library; after a successful match, the unmanned vehicle relies on these markers to perform tasks such as stopping, starting and turning.
10) Since the unmanned vehicle, as a rule, does not operate in a one-stop service mode but visits many places, this embodiment adds ground marks at the site locations to realize the site function of the unmanned vehicle. When the vehicle is about to reach a site, the ARM + FPGA controller reads the site mark through the CCD black-and-white camera and increments the site count automatically after the reading. To realize the automatic circular travel function of the unmanned vehicle, after the vehicle reaches the last site the count is automatically cleared and counting starts again from site 1.
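The circular site count described in step 10) is a counter that wraps back to 1 after the last site. A minimal sketch (class and method names are illustrative):

```python
class StationCounter:
    """Increments on each site mark read; wraps back to site 1 after the last."""
    def __init__(self, n_stations):
        self.n = n_stations
        self.current = 0            # 0 = not yet at any site

    def mark_arrival(self):
        self.current = self.current % self.n + 1   # 1..n, then back to 1
        return self.current

c = StationCounter(3)
visits = [c.mark_arrival() for _ in range(5)]      # wraps after site 3
```

Each `mark_arrival` call corresponds to the CCD black-and-white camera reading one ground site mark.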
11) After the unmanned vehicle enters a stop site, the STM32F767 stores and generates an arrival information record table, which is then sent to the master station through the wireless device; this facilitates the master station's tracking of the unmanned vehicle's position and its scheduling of the vehicle.
12) To meet the actual functional needs of unmanned vehicles in special situations such as scenic areas, this embodiment adds a stop-site selection function: at the initial operating stage of the unmanned vehicle, the master station can freely set the stop sites the vehicle needs to visit, after which the vehicle completes this setting entirely on its own through its own sensors. If an emergency is encountered during operation and the master station needs to change the operating path or the stop sites, the master station communicates with the unmanned vehicle's ARM + FPGA controller through the wireless device and sends the changed travel information wirelessly to the ARM + FPGA controller; the ARM + FPGA controller communicates with the on-board computer NUC and transmits the updated route and stop-site information, and the unmanned vehicle completes its task according to the new requirements.
13) When the unmanned vehicle travels along its fixed route, the various acousto-optic alarm systems operate, making it easy to remind surrounding pedestrians of the presence of the unmanned vehicle. When the unmanned vehicle loses communication with the master station, the ARM + FPGA controller sends an automatic stop signal and directly locks the motion servo motors in place, so that collisions with other unmanned vehicles are unlikely; since the master station can then no longer collect the transmitted information of the unmanned vehicle, it performs fast tracking according to the last positioning point information and resolves the fault.
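The loss-of-communication behavior in step 13) is a classic watchdog: if no master-station message arrives within a timeout, the motors are locked in place. A minimal sketch (the timeout value and method names are illustrative, not from the patent):

```python
class CommWatchdog:
    """Locks the servo motors if no master-station message arrives in time."""
    def __init__(self, timeout_s=2.0):
        self.timeout_s = timeout_s
        self.last_rx_s = 0.0
        self.locked = False

    def on_message(self, now_s):
        """Called whenever a master-station message is received."""
        self.last_rx_s = now_s

    def poll(self, now_s):
        """Called periodically; latches the motor lock once the link times out."""
        if now_s - self.last_rx_s > self.timeout_s:
            self.locked = True      # automatic stop: lock motors in place
        return self.locked

wd = CommWatchdog(timeout_s=2.0)
wd.on_message(0.0)
ok = wd.poll(1.0)       # still in contact
lost = wd.poll(3.5)     # link lost: motors locked
```

Latching the lock (rather than releasing it when a late message arrives) matches the text, where recovery is driven by the master station tracking the last known position.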
Although the present invention is disclosed above by way of preferred embodiments, the embodiments are not intended to limit the present invention. Any equivalent change or modification made without departing from the spirit and scope of the present invention also falls within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be determined by the contents of the claims of this application.
Claims (18)
1. A multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system, comprising a control system and a roof laser radar, characterized in that it further comprises a vehicle-mounted monocular vision camera, a front laser radar group arranged on the lower vehicle body of the unmanned vehicle, and a rear laser radar group, wherein:
the roof laser radar is used to detect the undulation of the road ahead of the unmanned vehicle and, together with the front laser radar group, to detect the obstacle situation in the forward path of the unmanned vehicle; the front laser radar group is additionally used to detect the obstacle situation in the left-front and right-front motion paths of the unmanned vehicle; the rear laser radar group is used to detect the obstacle situation behind the unmanned vehicle; the vehicle-mounted monocular vision camera is used to identify markers ahead of the unmanned vehicle and, in fusion with the roof laser radar and the front laser radar group, to detect the obstacle situation;
the control system comprises a host computer and a lower computer, the host computer receiving and decoding each laser radar feedback signal in real time, then communicating with the lower computer and transmitting input control signals to it; the lower computer is a dual-core controller composed of an FPGA and an ARM, wherein the FPGA acquires and processes the image data of the vehicle-mounted monocular vision camera, and the ARM performs image matching on the data processed by the FPGA and controls the travel of the unmanned vehicle in combination with the decoded input control signals.
2. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the roof laser radar is one single-line laser radar, located slightly above the roof at the front center portion of the roof and inclined downward at approximately 5 ~ 15 degrees to the horizontal plane.
3. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 2, characterized in that the roof laser radar is an LMS151 single-line laser radar.
4. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the front laser radar group consists of 3 single-line laser radars, of which two are located at the left-front and right-front of the vehicle head respectively, each with its center orientation at an angle of approximately 30 degrees from the unmanned vehicle's direction of advance, and the remaining one is located at the center between the two, with its center orientation aligned with the unmanned vehicle's direction of advance.
5. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 4, characterized in that the installation height of the front laser radar group is approximately 40 cm above the ground.
6. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 4, characterized in that the front laser radar group consists of LMS151 single-line laser radars.
7. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the rear laser radar group consists of two single-line laser radars parallel to the horizontal plane, located on either side of the vehicle tail respectively.
8. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 7, characterized in that the installation height of the rear laser radar group is approximately 40 cm to 60 cm above the ground.
9. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 7, characterized in that the rear laser radar group consists of LMS122 single-line laser radars.
10. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the system further comprises a front ultrasonic sensor group and a rear ultrasonic sensor group arranged at the bottom of the unmanned vehicle; the front ultrasonic sensor group is used for blind-zone detection and obstacle avoidance in front of the unmanned vehicle, the rear ultrasonic sensor group is used for blind-zone detection and obstacle avoidance behind the unmanned vehicle, and the slave computer communicates with both the front and rear ultrasonic sensor groups.
11. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 10, characterized in that the front ultrasonic sensor group consists of 5 ultrasonic sensors.
12. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 10, characterized in that the rear ultrasonic sensor group consists of 5 ultrasonic sensors.
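Claims 10–12 describe ultrasonic groups covering the close-range blind zones that the lidars cannot see. A minimal sketch of how a slave computer might gate motion on those readings; the 30 cm stop threshold is an illustrative assumption, not a value from the patent:

```python
def blind_zone_clear(front_cm: list, rear_cm: list, direction: str,
                     stop_threshold_cm: float = 30.0) -> bool:
    """Return True when the blind zone in the direction of travel is clear.

    front_cm / rear_cm: readings from the 5 front and 5 rear ultrasonic
    sensors, in centimetres; direction: "forward" or "reverse".
    """
    readings = front_cm if direction == "forward" else rear_cm
    return min(readings) > stop_threshold_cm

# Forward motion blocked: one front sensor sees an object at 12 cm,
# while reversing remains clear.
front = [120, 85, 12, 90, 110]
rear = [200, 200, 200, 200, 200]
forward_ok = blind_zone_clear(front, rear, "forward")
reverse_ok = blind_zone_clear(front, rear, "reverse")
```

Taking the minimum over the group means a single sensor is enough to veto motion, which is the conservative choice for a near-field safety check.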
13. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the host computer is an Intel NUC microcomputer.
14. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the slave computer is an STM32F7 MCU.
15. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the control system communicates with the unmanned vehicle's master station through a wireless device, and when the unmanned vehicle loses communication with the master station, the slave computer executes automatic parking control.
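The loss-of-communication fallback in claim 15 is essentially a watchdog timer on master-station traffic. A minimal sketch; the 2-second timeout and the class interface are illustrative assumptions:

```python
import time

class CommWatchdog:
    """Trigger automatic parking when no master-station message arrives
    within the timeout window."""

    def __init__(self, timeout_s: float = 2.0, now=time.monotonic):
        self.timeout_s = timeout_s
        self.now = now            # injectable clock, for testability
        self.last_rx = now()

    def on_message(self):
        """Call whenever a master-station packet is received."""
        self.last_rx = self.now()

    def should_auto_park(self) -> bool:
        return self.now() - self.last_rx > self.timeout_s

# Simulated clock: communication stops, the watchdog trips after 2 s.
t = [0.0]
wd = CommWatchdog(timeout_s=2.0, now=lambda: t[0])
wd.on_message()
t[0] = 1.5
still_ok = wd.should_auto_park()   # within the window
t[0] = 3.0
must_park = wd.should_auto_park()  # timeout exceeded
```

Using a monotonic clock (rather than wall time) avoids spurious trips when the system clock is adjusted.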
16. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the slave computer also reads the station markers on the ground through the vehicle-mounted monocular vision camera.
17. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the unmanned vehicle is an electric vehicle, and the control system detects the terminal voltage of the storage battery according to the internal resistance and temperature parameters of the electric vehicle's battery.
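Claim 17's terminal-voltage check follows the usual battery model V_term = V_oc − I·R_int(T), where internal resistance varies with temperature. The temperature coefficient and all numeric values below are illustrative assumptions, not parameters from the patent:

```python
def terminal_voltage(open_circuit_v: float, current_a: float,
                     r_int_25c_ohm: float, temp_c: float,
                     temp_coeff: float = 0.004) -> float:
    """Estimate battery terminal voltage under load.

    Internal resistance is modelled as rising as the cell cools below
    25 C (temp_coeff per degree, an illustrative value)."""
    r_int = r_int_25c_ohm * (1.0 + temp_coeff * (25.0 - temp_c))
    return open_circuit_v - current_a * r_int

# An assumed 48 V pack drawing 20 A through 50 mOhm at 25 C sags to 47 V;
# the same load at 5 C sags further because internal resistance is higher.
v_warm = terminal_voltage(48.0, 20.0, 0.05, 25.0)
v_cold = terminal_voltage(48.0, 20.0, 0.05, 5.0)
```

Comparing the measured terminal voltage against this model lets the control system distinguish genuine low charge from a cold-weather voltage sag.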
18. The multi-sensor fusion medium-speed unmanned vehicle detection and obstacle avoidance system according to claim 1, characterized in that the vehicle-mounted monocular vision camera is a black-and-white CCD camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711229561.9A CN108037756A (en) | 2017-11-29 | 2017-11-29 | A kind of Multi-sensor Fusion middling speed unmanned vehicle detects obstacle avoidance system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108037756A true CN108037756A (en) | 2018-05-15 |
Family
ID=62094505
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711229561.9A Pending CN108037756A (en) | 2017-11-29 | 2017-11-29 | A kind of Multi-sensor Fusion middling speed unmanned vehicle detects obstacle avoidance system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108037756A (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109213162A (en) * | 2018-09-01 | 2019-01-15 | 哈尔滨工程大学 | A kind of autonomous berthing offshore method in unmanned surface vehicle pond combined of multi-sensor information |
KR102471010B1 (en) * | 2018-12-28 | 2022-11-25 | 아폴로 인텔리전트 드라이빙 테크놀로지(베이징) 컴퍼니 리미티드 | Vehicle-mounted control unit, FPGA-based vehicle automatic driving method and device |
WO2020135730A1 (en) * | 2018-12-28 | 2020-07-02 | 百度在线网络技术(北京)有限公司 | Vehicle-mounted control unit, and fpga-based vehicle automatic driving method and device |
KR20200115594A (en) * | 2018-12-28 | 2020-10-07 | 바이두 온라인 네트웍 테크놀러지 (베이징) 캄파니 리미티드 | Vehicle-mounted control unit, FPGA-based vehicle automatic driving method and device |
US12022237B2 (en) | 2018-12-28 | 2024-06-25 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Vehicle-mounted control unit, and method and apparatus for FPGA based automatic driving of vehicle |
EP3839686A4 (en) * | 2018-12-28 | 2022-05-04 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Vehicle-mounted control unit, and fpga-based vehicle automatic driving method and device |
CN111824180A (en) * | 2020-06-29 | 2020-10-27 | 安徽海博智能科技有限责任公司 | Unmanned mine car automatic driving control system with fusion obstacle avoidance function |
CN111962442A (en) * | 2020-06-29 | 2020-11-20 | 长沙中联重科环境产业有限公司 | Multifunctional self-following sanitation robot and self-following method thereof |
CN111962442B (en) * | 2020-06-29 | 2021-12-17 | 长沙中联重科环境产业有限公司 | Multifunctional self-following sanitation robot and self-following method thereof |
CN111522350A (en) * | 2020-07-06 | 2020-08-11 | 深圳裹动智驾科技有限公司 | Sensing method, intelligent control equipment and automatic driving vehicle |
CN111522350B (en) * | 2020-07-06 | 2020-10-09 | 深圳裹动智驾科技有限公司 | Sensing method, intelligent control equipment and automatic driving vehicle |
CN111880548A (en) * | 2020-09-02 | 2020-11-03 | 北京云迹科技有限公司 | Carrying robot |
CN113895543A (en) * | 2021-10-09 | 2022-01-07 | 西安电子科技大学 | Intelligent unmanned vehicle driving system based on park environment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107957583A (en) | A kind of round-the-clock quick unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion | |
CN108021133A (en) | A kind of Multi-sensor Fusion high speed unmanned vehicle detects obstacle avoidance system | |
WO2022082843A1 (en) | Multi-sensor integrated unmanned vehicle detection and obstacle avoidance system and obstacle avoidance method | |
CN108177651A (en) | A kind of quick unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion | |
CN108037756A (en) | A kind of Multi-sensor Fusion middling speed unmanned vehicle detects obstacle avoidance system | |
CN109606354B (en) | Automatic parking method and auxiliary system based on hierarchical planning | |
CN107614349B (en) | Controller of vehicle and control method for vehicle | |
CN107977004A (en) | A kind of round-the-clock high speed unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion | |
CN206532138U (en) | A kind of unmanned vehicle automatic Pilot intelligence system | |
WO2020029462A1 (en) | Self-driving system for electric vehicle | |
CN110568852A (en) | Automatic driving system and control method thereof | |
CN108189834A (en) | A kind of Multi-sensor Fusion low speed unmanned vehicle detects obstacle avoidance system | |
CN113085896B (en) | Auxiliary automatic driving system and method for modern rail cleaning vehicle | |
US11325524B2 (en) | Collaborative vehicle headlight directing | |
CN207301793U (en) | A kind of unmanned intelligent vehicle of image recognition processing | |
CN110060467A (en) | Prediction meanss, prediction technique and storage medium | |
CN110103962A (en) | Controller of vehicle, control method for vehicle and storage medium | |
CN114442101B (en) | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar | |
CN112068574A (en) | Control method and system for unmanned vehicle in dynamic complex environment | |
US20210213869A1 (en) | Collaborative Vehicle Headlight Directing | |
CN108205325A (en) | A kind of round-the-clock unmanned cruiser system of four-wheel drive low speed | |
CN110371123A (en) | Controller of vehicle, control method for vehicle and storage medium | |
CN108061903A (en) | A kind of round-the-clock low speed unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion | |
CN108469820A (en) | A kind of round-the-clock unmanned cruiser system of two-wheel drive low speed | |
Mei et al. | Development of 'Intelligent Pioneer' unmanned vehicle | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
20180530 | TA01 | Transfer of patent application right | Effective date of registration: 20180530. Address after: 211106 first floor, block C4, Kowloon Lake International Business Park, 19 Jiangning economic and Technological Development Zone, Nanjing, Jiangsu. Applicant after: Zhang Haoming. Address before: 211106 first floor, block C4, Kowloon Lake International Business Park, 19 Jiangning economic and Technological Development Zone, Nanjing, Jiangsu. Applicant before: Jiangsu Ruobo Robot Technology Co., Ltd. |
20180515 | WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20180515 |