CN107161141A - Pilotless automobile system and automobile - Google Patents
- Publication number
- CN107161141A (application CN201710136501.6A)
- Authority
- CN
- China
- Prior art keywords
- information
- barrier
- pilotless automobile
- subsystem
- radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- B60W40/105—Estimation or calculation of non-directly measurable driving parameters related to vehicle motion: speed
- B60W2520/10—Input parameters relating to overall vehicle dynamics: longitudinal speed
- B60W2554/00—Input parameters relating to objects
- B60W2555/60—Input parameters relating to exterior conditions: traffic rules, e.g. speed limits or right of way
- B60W2556/50—External transmission of data to or from the vehicle for navigation systems
- Y02T10/40—Engine management systems (climate change mitigation technologies related to transportation)
Abstract
The present invention relates to a pilotless automobile system and to an automobile. The pilotless automobile system includes an environment sensing subsystem, a data fusion subsystem, a path planning and decision subsystem and a travel control subsystem. The data fusion subsystem fuses surrounding-environment information that includes image information and three-dimensional coordinate information, and extracts obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, which improves the ability to recognise the surrounding environment and the precision of that recognition. The path planning and decision subsystem plans a driving path according to the information extracted by the data fusion subsystem and the travel destination information; the travel control subsystem generates control instructions according to the driving path and controls the pilotless automobile according to those instructions, so that unmanned driving with high safety performance can be realised.
Description
Technical field
The present invention relates to the field of automotive technology, and in particular to a pilotless automobile system and an automobile.
Background art
Current autonomous vehicle technology already provides basic automatic operation and driving capability. For example, advanced instruments such as cameras, radar sensors and laser detectors mounted on the automobile can perceive highway speed limits, roadside traffic signs and the movement of surrounding vehicles, so that the driver need only set a destination through map navigation. An unmanned driving system mainly perceives the vehicle's surroundings with on-board sensors and, according to the perceived road, vehicle-position and obstacle information, controls the steering and speed of the vehicle so that it can travel reliably and safely on the road.
At present, a pilotless automobile is a kind of intelligent automobile that relies primarily on an in-vehicle computer-based intelligent pilot system to realise unmanned driving. A key difficulty of such systems, however, is that their ability to discriminate roadside traffic and the surrounding environment is limited, so the data they collect may be inaccurate.
Summary of the invention
In view of the above problems, it is necessary to provide a pilotless automobile system, and an automobile, that recognise surrounding-environment information strongly and precisely and can travel safely.
A pilotless automobile system, including:
an environment sensing subsystem for collecting vehicle information and surrounding-environment information of the pilotless automobile, the surrounding-environment information including image information and three-dimensional coordinate information of the surroundings;
a data fusion subsystem for fusing the image information and the three-dimensional coordinate information and extracting lane line information, obstacle information, traffic sign information and tracking information of dynamic obstacles;
a path planning and decision subsystem for planning a driving path according to the vehicle information, the information extracted by the data fusion subsystem and the travel destination information;
a travel control subsystem for generating control instructions according to the driving path and controlling the pilotless automobile according to the control instructions.
In one of the embodiments, the environment sensing subsystem includes:
a vision sensor for collecting image information of the surroundings of the pilotless automobile;
a radar for collecting three-dimensional coordinate information of the surroundings of the pilotless automobile.
In one of the embodiments, the data fusion subsystem includes:
a lane line fusion module for superimposing or excluding the surrounding-environment information collected by the vision sensor and the radar, and extracting the lane line information;
an obstacle recognition fusion module for fusing the surrounding-environment information collected by the vision sensor and the radar, and extracting the obstacle information;
a traffic sign fusion module for detecting the surrounding-environment information collected by the vision sensor and the radar, and extracting the traffic sign information;
an obstacle dynamic tracking fusion module for fusing the surrounding-environment information collected by the vision sensor and the radar, and extracting the tracking information of the dynamic obstacles.
In one of the embodiments, the lane line fusion module includes a vision lane line detection unit and a radar lane line detection unit. The vision lane line detection unit processes the image information and extracts vision lane line information; the radar lane line detection unit extracts road surface information along the route of the pilotless automobile and obtains lane contour information from that road surface information. The lane line fusion module superimposes or excludes the vision lane line information and the lane contour information to obtain the lane line information.
In one of the embodiments, the obstacle recognition fusion module includes a vision obstacle recognition unit and a radar obstacle recognition unit. The vision obstacle recognition unit segments the image information into background information and foreground information, and identifies the foreground information to obtain vision obstacle information carrying colour information. The radar obstacle recognition unit recognises radar obstacle information carrying three-dimensional coordinate information within a first preset height range. The obstacle recognition fusion module fuses the vision obstacle information and the radar obstacle information to obtain the obstacle information.
In one of the embodiments, the traffic sign fusion module includes a vision traffic sign detection unit and a radar traffic sign detection unit. The vision traffic sign detection unit detects the image information and extracts vision traffic sign information. The radar traffic sign detection unit extracts ground traffic sign information, and also detects suspended traffic sign information within a second preset height range. The traffic sign fusion module determines the position of the traffic sign information according to the ground traffic sign information and the suspended traffic sign information, and obtains the class of the traffic sign information in that position area.
In one of the embodiments, the obstacle dynamic tracking fusion module includes a vision dynamic tracking unit and a radar dynamic tracking unit. The vision dynamic tracking unit identifies the image information, locates dynamic obstacles across adjacent consecutive frames and obtains the colour information of the dynamic obstacles. The radar dynamic tracking unit tracks the three-dimensional coordinate information of the dynamic obstacles. The obstacle dynamic tracking fusion module fuses the colour information and the three-dimensional coordinate information of the dynamic obstacles to obtain the tracking information of the dynamic obstacles.
In one of the embodiments, the environment sensing subsystem also includes:
a GPS positioning navigator for collecting the current geographic position and time of the pilotless automobile;
an inertial measurement unit for measuring the vehicle attitude of the pilotless automobile;
a speed acquisition module for obtaining the current running speed of the pilotless automobile.
In one of the embodiments, the system also includes:
a supervision communication subsystem for transmitting the driving path planned by the path planning and decision subsystem to an external monitoring centre in real time.
In addition, an automobile including the above pilotless automobile system is also provided.
In the pilotless automobile system of the embodiments of the present invention, the data fusion subsystem fuses surrounding-environment information that includes image information and three-dimensional coordinate information, and extracts obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, which improves the ability to recognise the surrounding environment and the precision of that recognition. The path planning and decision subsystem plans a driving path according to the information extracted by the data fusion subsystem and the travel destination information; the travel control subsystem generates control instructions according to the driving path and controls the pilotless automobile according to those instructions, so that unmanned driving with high safety performance can be realised.
Brief description of the drawings
Fig. 1 is a structural block diagram of the pilotless automobile system in one embodiment;
Fig. 2 is a structural block diagram of the environment sensing subsystem in one embodiment;
Fig. 3 is a structural block diagram of the data fusion subsystem in one embodiment.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be appreciated that the specific embodiments described here serve only to illustrate the invention and are not intended to limit it.
Fig. 1 is a structural block diagram of the pilotless automobile system in one embodiment. The pilotless automobile system includes an environment sensing subsystem 10, a data fusion subsystem 20, a path planning and decision subsystem 30 and a travel control subsystem 40.
The environment sensing subsystem 10 collects vehicle information and surrounding-environment information of the pilotless automobile, where the surrounding-environment information includes image information and three-dimensional coordinate information of the surroundings.
The data fusion subsystem 20 fuses the surrounding-environment information and extracts obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles.
The path planning and decision subsystem 30 plans a driving path according to the vehicle information, the information extracted by the data fusion subsystem 20 and the travel destination information.
The travel control subsystem 40 generates control instructions according to the driving path and controls the pilotless automobile according to the control instructions.
In this pilotless automobile system, the data fusion subsystem 20 fuses surrounding-environment information that includes image information and three-dimensional coordinate information, and extracts obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, which improves the ability to recognise the surrounding environment and the precision of that recognition. The path planning and decision subsystem 30 plans a driving path according to the information extracted by the data fusion subsystem 20 and the travel destination information; the travel control subsystem 40 generates control instructions according to the driving path and controls the pilotless automobile according to those instructions, so that unmanned driving with very high safety performance can be realised.
In one embodiment, with reference to Fig. 2, the environment sensing subsystem 10 includes a vision sensor 110 and a radar 120. The vision sensor 110 mainly consists of one or two image sensors, sometimes also equipped with a light projector and other auxiliary devices. The image sensor may be a laser scanner, a linear-array or area-array CCD camera, a TV camera, a newer digital camera or the like. The vision sensor 110 is mounted on the pilotless automobile and collects its surrounding-environment information, that is, the near-real-time road condition information around the vehicle, including obstacle information, lane line information, traffic sign information and dynamic tracking information of obstacles. The surrounding-environment information it collects is image information of the surroundings, which may also be called video information.
The radar 120 collects the three-dimensional coordinate information of the surroundings of the pilotless automobile. The pilotless automobile system includes multiple radars 120. In one embodiment, the radars 120 include a lidar and a millimetre-wave radar. The lidar is a mechanical multi-beam lidar that detects features such as the position and speed of a target mainly by emitting laser beams; obstacle detection and tracking can also use the echo intensity information of the lidar. The lidar has the advantages of a wide detection range and high detection accuracy. The wavelength of the millimetre-wave radar lies between centimetre waves and light waves, so it combines the advantages of microwave guidance and photoelectric guidance; its seeker is small, light and of high spatial resolution, and penetrates fog, smoke and dust well. In one example, using a lidar and a millimetre-wave radar at the same time overcomes the drawback that the lidar cannot perform well in extreme weather, and can greatly improve the detection performance of the pilotless automobile.
In one embodiment, the environment sensing subsystem 10 also collects the vehicle information of the pilotless automobile, which includes the current geographic position and time, the vehicle attitude and the current running speed. The environment sensing subsystem 10 further includes a GPS positioning navigator 130, an inertial measurement unit (IMU) 140 and a speed acquisition module 150. The GPS positioning navigator 130 collects the current geographic position and time of the pilotless automobile; during travel, the global positioning device installed in the vehicle obtains the accurate position of the automobile at any time, which further improves safety. The inertial measurement unit 140 measures the vehicle attitude of the pilotless automobile, and the speed acquisition module 150 obtains its current running speed.
In one embodiment, with reference to Fig. 3, the data fusion subsystem 20 includes a lane line fusion module 210, an obstacle recognition fusion module 220, a traffic sign fusion module 230 and an obstacle dynamic tracking fusion module 240.
The lane line fusion module 210 superimposes or excludes the surrounding-environment information collected by the vision sensor 110 and the radar 120 and extracts the lane line information. The obstacle recognition fusion module 220 fuses the surrounding-environment information collected by the vision sensor 110 and the radar 120 and extracts the obstacle information. The traffic sign fusion module 230 detects the surrounding-environment information collected by the vision sensor 110 and the radar 120 and extracts the traffic sign information. The obstacle dynamic tracking fusion module 240 fuses the surrounding-environment information collected by the vision sensor 110 and the radar 120 and extracts the tracking information of the dynamic obstacles.
In one embodiment, the lane line fusion module 210 includes a vision lane line detection unit 211 and a radar lane line detection unit 213.
The vision lane line detection unit 211 processes the image information and extracts the vision lane line information: it pre-processes the image information obtained by the vision sensor 110 (denoising, enhancement, segmentation and so on) and then extracts the vision lane line information.
The radar lane line detection unit 213 extracts the road surface information along the route of the pilotless automobile and obtains the lane contour information from it. When obtaining the lane contour information, the radar lane line detection unit 213 calibrates the three-dimensional coordinate information of the ground over which the pilotless automobile travels, as obtained by the lidar, and computes the discrete points in that information, where a discrete point is defined as a point whose distance to an adjacent point exceeds a preset range. The discrete points are filtered out, and the position information of the ground is fitted with a random sample consensus method to obtain the lane contour information, that is, the lane line information of the radar 120.
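The fitting step above (discard outlying "discrete points", then fit the remaining ground points by random sample consensus) can be illustrated with a minimal 2-D RANSAC line fit. The function name, parameters and data are illustrative assumptions, not taken from the patent.

```python
import random

def ransac_line(points, iters=200, tol=0.1, seed=0):
    """Fit y = a*x + b to 2-D points with random sample consensus:
    repeatedly fit a line through two random points and keep the
    candidate supported by the most inliers (points within tol)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                      # vertical pair: skip candidate
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(1 for x, y in points if abs(y - (a * x + b)) < tol)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best

# ground samples on the line y = 0.5x + 1, plus two outlying "discrete points"
pts = [(x, 0.5 * x + 1) for x in range(10)] + [(3, 40.0), (7, -25.0)]
a, b = ransac_line(pts)
```

Because the consensus step counts inliers, the two outliers never dominate the fit, which is why the patent filters discrete points before fitting.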
The lane line fusion module 210 fuses (superimposes) or excludes the obtained vision lane line information and lane contour information to obtain real-time lane line information. The lane line fusion module 210 improves the accuracy of lane line recognition and avoids situations in which lane line information is missed.
In one embodiment, the obstacle recognition fusion module 220 includes a vision obstacle recognition unit 221 and a radar obstacle recognition unit 223. The vision obstacle recognition unit 221 segments the image information into background information and foreground information, and identifies the foreground information to obtain vision obstacle information carrying colour information. It processes the image information with methods such as pattern recognition or machine learning, builds a background model with a background update algorithm, segments out the foreground, and identifies the segmented foreground to obtain the vision obstacle information with colour information.
The radar obstacle recognition unit 223 recognises radar obstacle information carrying three-dimensional coordinate information within the first preset height range.
The radar obstacle recognition unit 223 pre-processes the surrounding-environment information obtained by the lidar: it removes the ground information and screens out the three-dimensional coordinate information of the surroundings within the first preset height range. It then detects a region of interest (ROI) with the lane line information as a constraint, where the region of interest outlines the area to be processed with a box, circle, ellipse, irregular polygon or the like. The data in the identified region of interest are rasterised and segmented into clustered obstacle blocks. The original lidar point cloud corresponding to each obstacle block is clustered a second time to handle under-segmentation. The point clouds of the secondary clustering serve as a training sample set from which a classifier model is generated; the trained model then classifies the obstacle blocks after the secondary clustering to obtain the radar obstacle information with three-dimensional coordinate information.
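The rasterise-then-cluster step can be illustrated by mapping above-ground points into grid cells and grouping occupied cells into obstacle blocks by connectivity. This is a toy stand-in for the patent's clustering, with hypothetical names and an invented cell size.

```python
from collections import deque

def rasterize(points, cell=1.0):
    """Map (x, y) points into occupied grid cells of the given size."""
    cells = {}
    for x, y in points:
        cells.setdefault((int(x // cell), int(y // cell)), []).append((x, y))
    return cells

def cluster_cells(cells):
    """Group occupied cells into obstacle blocks by 8-connectivity (BFS)."""
    seen, clusters = set(), []
    for start in cells:
        if start in seen:
            continue
        block, queue = [], deque([start])
        seen.add(start)
        while queue:
            cx, cy = queue.popleft()
            block.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (cx + dx, cy + dy)
                    if n in cells and n not in seen:
                        seen.add(n)
                        queue.append(n)
        clusters.append(block)
    return clusters

# two well-separated groups of above-ground points: 3 points and 2 points
pts = [(0.1, 0.2), (0.4, 0.8), (1.1, 0.9), (10.0, 10.0), (10.5, 10.2)]
clusters = cluster_cells(rasterize(pts))
```

Each resulting block would then be re-clustered and classified, as the description states.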
The obstacle recognition fusion module 220 fuses the vision obstacle information and the radar obstacle information to obtain the obstacle information. Vision obstacle information can fail in strong light or in scenes where the light changes quickly, whereas the radar 120 detects obstacle information with an active light source, so its stability is high. When the pilotless automobile travels in strong light or in a scene with quickly changing light, the obstacle recognition fusion module 220 superimposes the vision obstacle information and the radar obstacle information, so accurate obstacle information can still be obtained.
Because the radar 120 has a relatively low resolution in the vertical direction, it collects the three-dimensional coordinate information of obstacles but no RGB colour information, and misidentification can occur at long range or when obstacles occlude one another. The obstacle information obtained by the vision obstacle recognition unit 221, by contrast, contains rich RGB information at high pixel resolution. Superimposing the colour information of an obstacle on its three-dimensional coordinate information yields obstacle information that contains both colour and three-dimensional information. The obstacle recognition fusion module 220 thus reduces the false recognition rate, improves the recognition accuracy and further ensures safe driving.
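One common way to realise the colour/3-D superposition described above is to project each lidar point into the camera image and read off the pixel colour. A minimal pinhole-camera sketch follows; the intrinsics (fx, fy, cx, cy) and the point are made-up values, and the patent does not prescribe this particular method.

```python
def project(point, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a camera-frame 3-D point to pixel coordinates."""
    x, y, z = point
    return int(fx * x / z + cx), int(fy * y / z + cy)

def colorize(points, image):
    """Attach the RGB pixel under each projected lidar point,
    discarding points that fall outside the image."""
    h, w = len(image), len(image[0])
    fused = []
    for p in points:
        u, v = project(p)
        if 0 <= u < w and 0 <= v < h:
            fused.append((p, image[v][u]))   # (3-D coords, RGB colour)
    return fused

# a 480x640 image that is red everywhere, and one point 5 m straight ahead
image = [[(255, 0, 0)] * 640 for _ in range(480)]
fused = colorize([(0.0, 0.0, 5.0)], image)
```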
In one embodiment, the traffic sign fusion module 230 includes a vision traffic sign detection unit 231 and a radar traffic sign detection unit 233.
The vision traffic sign detection unit 231 detects the image information and extracts the vision traffic sign information: it processes the image information with methods such as pattern recognition or machine learning to obtain the vision traffic sign information, which contains RGB colour information.
The radar traffic sign detection unit 233 extracts ground traffic sign information and also detects suspended traffic sign information within the second preset height range. It extracts traffic sign points according to the gradient of the reflected intensity and fits a curve through them to obtain the ground traffic sign information (the ground traffic sign lines); it can also, according to the obstacle clustering principle, obtain objects within the second preset height range that are shaped as standard rectangles or circles, and define those objects as the suspended traffic sign information.
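The reflected-intensity-gradient extraction and curve fitting for ground markings might look like this in miniature: pick the scan positions where the intensity jumps sharply (painted markings reflect far more than asphalt), then least-squares-fit a line through marking points. The thresholds and data here are invented for illustration.

```python
def marking_points(scan, grad_thresh=50):
    """Indices where the reflected intensity jumps sharply between
    neighbouring returns, i.e. edges of painted markings."""
    return [i for i in range(1, len(scan))
            if abs(scan[i] - scan[i - 1]) > grad_thresh]

def fit_line(xs, ys):
    """Least-squares fit y = a*x + b through marking points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# intensity profile: asphalt (~10) with a painted stripe (~200) at indices 4-6
scan = [10, 12, 11, 10, 200, 205, 198, 11, 10]
edges = marking_points(scan)                 # stripe boundaries
a, b = fit_line([0, 1, 2], [1.0, 1.2, 1.4])  # fit a marking line
```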
The traffic sign fusion module 230 determines the position of the traffic sign information according to the ground traffic sign information and the suspended traffic sign information. Within the obtained position area, it identifies the class or species of the traffic sign according to the vision traffic sign information obtained by the vision traffic sign detection unit 231. The traffic sign fusion module 230 can accurately obtain the various traffic signs on the ground and suspended overhead, ensuring that the pilotless automobile travels safely on the premise of obeying the traffic rules.
In one embodiment, the obstacle dynamic tracking fusion module 240 includes a vision dynamic tracking unit 241 and a radar dynamic tracking unit 243.
The vision dynamic tracking unit 241 identifies the image information, locates dynamic obstacles across adjacent consecutive frames and obtains their colour information: it processes the image (video) sequence with methods such as pattern recognition or machine learning, recognises and locates dynamic obstacles in the consecutive frames of the video, and obtains the colour information of the obstacles.
The radar dynamic tracking unit 243 tracks the three-dimensional coordinate information of the dynamic obstacles. According to a target association algorithm, it combines nearest-neighbour matching with a multiple hypothesis tracking algorithm to determine that obstacles in two adjacent frames, or across several frames, are the same target. It obtains the three-dimensional position and speed information of the target from the lidar measurements and then tracks the associated target. At the same time, filtering algorithms such as Kalman filtering and particle filtering can filter the measured and predicted states of the target to obtain more accurate three-dimensional coordinate information of the dynamic obstacle.
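The Kalman filtering mentioned above can be sketched as a one-dimensional constant-velocity filter over range measurements of a tracked obstacle. The state layout and noise values are illustrative assumptions, not taken from the patent.

```python
def kalman_step(x, v, p, z, dt=0.1, q=0.01, r=1.0):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.
    x: position estimate, v: velocity, p: position variance,
    z: range measurement, q/r: process/measurement noise variances."""
    x_pred = x + v * dt            # predict with the motion model
    p_pred = p + q
    k = p_pred / (p_pred + r)      # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # correct with the measurement
    return x_new, v, (1 - k) * p_pred

# track an obstacle moving at 2 m/s, observed once per 0.1 s
x, v, p = 0.0, 2.0, 1.0
for step in range(1, 6):
    z = 0.2 * step                 # true range at t = step * dt
    x, v, p = kalman_step(x, v, p, z)
```

Each cycle shrinks the position variance p, so the filtered three-dimensional coordinates grow steadily more accurate, as the description claims.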
The obstacle dynamic tracking fusion module 240 fuses the colour information of a dynamic obstacle with its three-dimensional coordinate information to obtain the tracking information of the dynamic obstacle. Vision dynamic obstacle information is easily disturbed by strong light or changing illumination and carries no accurate three-dimensional coordinates, but it contains rich RGB colour information. The dynamic obstacle information obtained by the radar carries no RGB colour information, and when moving objects occlude one another and then separate, the radar alone cannot identify which dynamic target is which; however, the dynamic obstacle information obtained by the lidar is highly stable, is not disturbed by changes in light intensity, and carries accurate three-dimensional coordinate information, giving a more accurate motion model for tracking moving objects. The obstacle dynamic tracking fusion module 240 therefore fuses the colour information of the dynamic obstacle obtained from the image information with the three-dimensional coordinate information of the dynamic obstacle obtained by the lidar, so that a dynamic obstacle carrying both colour and three-dimensional coordinate information is obtained and can be tracked accurately.
In one embodiment, the path planning decision subsystem 30 is configured to plan a driving path according to the vehicle information, the information extracted by the data fusion subsystem 20, and travel destination information. Specifically, the path planning decision subsystem 30 plans the driving path according to the vehicle information obtained by the environment sensing subsystem 10 (the current geographic position and time of the unmanned vehicle, the vehicle attitude, and the current running speed), the surrounding environment information extracted by the data fusion subsystem 20 (obstacle information, lane line information, traffic sign information, and dynamic obstacle tracking information), and the travel destination information of the unmanned vehicle. Based on the planned driving path, the path planning decision subsystem 30 plans the position of the unmanned vehicle at the next moment and calculates control data for the unmanned vehicle, including angular velocity, linear velocity, and travel direction.
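The control data for the next moment (angular velocity, linear velocity, travel direction) can be sketched under strongly simplifying assumptions — a point-mass model and hypothetical function names not taken from the patent:

```python
import math

def control_data(pose, next_point, dt=0.1, v_max=15.0):
    """Compute control data toward the next planned waypoint:
    linear velocity (m/s), angular velocity (rad/s), travel direction (rad).
    pose = (x, y, heading); next_point = (x, y) from the planned path."""
    x, y, heading = pose
    dx, dy = next_point[0] - x, next_point[1] - y
    travel_direction = math.atan2(dy, dx)
    # heading error wrapped to [-pi, pi], to be turned within one control period
    err = (travel_direction - heading + math.pi) % (2 * math.pi) - math.pi
    angular_velocity = err / dt
    linear_velocity = min(v_max, math.hypot(dx, dy) / dt)
    return linear_velocity, angular_velocity, travel_direction
```

For a waypoint one metre straight ahead this yields zero angular velocity and a forward linear velocity capped by `v_max`; a waypoint to the left yields a positive (counter-clockwise) angular velocity.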
In one embodiment, the travel control subsystem 40 is configured to generate control instructions according to the driving path and to control the unmanned vehicle according to the control instructions. The travel control subsystem 40 generates control instructions from the control data calculated by the path planning decision subsystem 30; the control instructions cover the travel speed of the vehicle, the travel direction (forward, backward, left, right), the throttle, and the gear of the vehicle, thereby ensuring that the unmanned vehicle travels safely and smoothly and realizing the unmanned driving function.
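A control instruction of this shape can be sketched as follows (the field names, thresholds, and throttle model are illustrative assumptions; the patent does not specify a format):

```python
def make_control_instruction(linear_velocity, angular_velocity, current_speed):
    """Map the planner's control data onto travel speed, travel direction
    (forward/backward/left/right), throttle and gear."""
    if angular_velocity > 0.05:
        direction = "left"
    elif angular_velocity < -0.05:
        direction = "right"
    elif linear_velocity >= 0:
        direction = "forward"
    else:
        direction = "backward"
    # simple proportional throttle toward the target speed, clamped to [0, 1]
    throttle = max(0.0, min(1.0, (abs(linear_velocity) - current_speed) / 10.0))
    gear = "D" if linear_velocity >= 0 else "R"
    return {"speed": linear_velocity, "direction": direction,
            "throttle": throttle, "gear": gear}
```

A real travel control subsystem would replace the proportional throttle with closed-loop speed control, but the mapping from control data to instruction fields is the same in outline.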
In one embodiment, the unmanned vehicle system further includes a communication subsystem 50, which transmits the driving path planned by the path planning decision subsystem 30 to an external monitoring center in real time, so that the external monitoring center can monitor the running state of the unmanned vehicle.
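A minimal sketch of such a transmission, serialising the planned path into a message for the monitoring centre (the JSON layout and field names are assumptions, not from the patent):

```python
import json
import time

def path_message(vehicle_id, waypoints):
    """Serialise the planned driving path for real-time transmission
    to an external monitoring centre."""
    return json.dumps({
        "vehicle": vehicle_id,
        "timestamp": time.time(),
        "path": [{"x": x, "y": y} for x, y in waypoints],
    })
```

The resulting string could be pushed to the centre over any real-time channel (e.g. a cellular link), with the timestamp letting the centre order updates as they arrive.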
In the unmanned vehicle system described above, the data fusion subsystem 20 fuses the surrounding environment information, which includes image information and three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information, and dynamic obstacle tracking information, improving the capability and precision of recognizing the surrounding environment. The path planning decision subsystem 30 plans the driving path according to the information extracted by the data fusion subsystem 20 and the travel destination information; the travel control subsystem 40 generates control instructions according to the driving path and controls the unmanned vehicle according to the control instructions, thereby realizing an unmanned driving function with extremely high safety.
In addition, embodiments of the present invention also provide an automobile including the unmanned vehicle system of any of the above embodiments. In an automobile according to an embodiment of the present invention, the data fusion subsystem 20 of the onboard unmanned vehicle system fuses the surrounding environment information, which includes image information and three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information, and dynamic obstacle tracking information, improving the capability and precision of recognizing the surrounding environment. The path planning decision subsystem 30 plans the driving path according to the information extracted by the data fusion subsystem 20 and the travel destination information; the travel control subsystem 40 generates control instructions according to the driving path and controls the unmanned vehicle according to the control instructions, thereby realizing an unmanned driving function with high safety.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features involves no contradiction, it is considered to fall within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present invention, and all of these fall within the protection scope of the present invention. Therefore, the protection scope of the present patent shall be determined by the appended claims.
Claims (10)
1. An unmanned vehicle system, characterized by comprising:
an environment sensing subsystem, configured to collect vehicle information and surrounding environment information of an unmanned vehicle, the surrounding environment information including image information and three-dimensional coordinate information of the surrounding environment;
a data fusion subsystem, configured to fuse the image information and the three-dimensional coordinate information and to extract lane line information, obstacle information, traffic sign information, and tracking information of dynamic obstacles;
a path planning decision subsystem, configured to plan a driving path according to the vehicle information, the information extracted by the data fusion subsystem, and travel destination information; and
a travel control subsystem, configured to generate control instructions according to the driving path and to control the unmanned vehicle according to the control instructions.
2. The unmanned vehicle system according to claim 1, characterized in that the environment sensing subsystem comprises:
a vision sensor, configured to collect the image information of the surroundings of the unmanned vehicle; and
a radar, configured to collect the three-dimensional coordinate information of the surroundings of the unmanned vehicle.
3. The unmanned vehicle system according to claim 2, characterized in that the data fusion subsystem comprises:
a lane line fusion module, configured to superimpose or exclude the surrounding environment information collected by the vision sensor and the radar and to extract the lane line information;
an obstacle recognition fusion module, configured to fuse the surrounding environment information collected by the vision sensor and the radar and to extract the obstacle information;
a traffic sign fusion module, configured to detect the surrounding environment information collected by the vision sensor and the radar and to extract the traffic sign information; and
an obstacle dynamic tracking fusion module, configured to fuse the surrounding environment information collected by the vision sensor and the radar and to extract the tracking information of the dynamic obstacles.
4. The unmanned vehicle system according to claim 3, characterized in that the lane line fusion module comprises a vision lane line detection unit and a radar lane line detection unit;
the vision lane line detection unit is configured to process the image information and extract vision lane line information; the radar lane line detection unit is configured to extract road surface information along the travel of the unmanned vehicle and to obtain lane contour information according to the road surface information; and
the lane line fusion module is further configured to superimpose or exclude the vision lane line information and the lane contour information to obtain the lane line information.
5. The unmanned vehicle system according to claim 3, characterized in that the obstacle recognition fusion module comprises a vision obstacle recognition unit and a radar obstacle recognition unit;
the vision obstacle recognition unit is configured to segment the image information into background information and foreground information and to identify, from the foreground information, vision obstacle information carrying color information; the radar obstacle recognition unit is configured to identify, within a first preset height range, radar obstacle information carrying three-dimensional coordinate information; and
the obstacle recognition fusion module is further configured to fuse the vision obstacle information and the radar obstacle information to obtain the obstacle information.
6. The unmanned vehicle system according to claim 3, characterized in that the traffic sign fusion module comprises a vision traffic sign detection unit and a radar traffic sign detection unit;
the vision traffic sign detection unit is configured to detect the image information and extract vision traffic sign information; the radar traffic sign detection unit is configured to extract ground traffic sign information and is further configured to detect suspended traffic sign information within a second preset height range; and
the traffic sign fusion module is further configured to determine the position of the traffic sign information according to the ground traffic sign information and the suspended traffic sign information and to obtain the classification of the traffic sign information within the determined position region.
7. The unmanned vehicle system according to claim 3, characterized in that the obstacle dynamic tracking fusion module comprises a vision dynamic tracking unit and a radar dynamic tracking unit;
the vision dynamic tracking unit is configured to identify the image information, to locate a dynamic obstacle in two adjacent successive frames, and to obtain the color information of the dynamic obstacle; the radar dynamic tracking unit is configured to track the three-dimensional coordinate information of the dynamic obstacle; and
the obstacle dynamic tracking fusion module is further configured to fuse the color information of the dynamic obstacle and the three-dimensional coordinate information of the dynamic obstacle to obtain the tracking information of the dynamic obstacle.
8. The unmanned vehicle system according to claim 1, characterized in that the environment sensing subsystem further comprises:
a GPS positioning navigator, configured to collect the current geographic position and time of the unmanned vehicle;
an inertial measurement unit, configured to measure the vehicle attitude of the unmanned vehicle; and
a speed acquisition module, configured to obtain the current running speed of the unmanned vehicle.
9. The unmanned vehicle system according to claim 1, characterized by further comprising:
a communication subsystem, configured to transmit, in real time, the driving path planned by the path planning decision subsystem to an external monitoring center.
10. An automobile, characterized by comprising the unmanned vehicle system according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710136501.6A CN107161141B (en) | 2017-03-08 | 2017-03-08 | Unmanned automobile system and automobile |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710136501.6A CN107161141B (en) | 2017-03-08 | 2017-03-08 | Unmanned automobile system and automobile |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107161141A true CN107161141A (en) | 2017-09-15 |
CN107161141B CN107161141B (en) | 2023-05-23 |
Family
ID=59848697
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710136501.6A Active CN107161141B (en) | 2017-03-08 | 2017-03-08 | Unmanned automobile system and automobile |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107161141B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100106356A1 (en) * | 2008-10-24 | 2010-04-29 | The Gray Insurance Company | Control and systems for autonomously driven vehicles |
CN104267721A (en) * | 2014-08-29 | 2015-01-07 | 陈业军 | Unmanned driving system of intelligent automobile |
CN104943684A (en) * | 2014-03-31 | 2015-09-30 | 比亚迪股份有限公司 | Pilotless automobile control system and automobile with same |
CN106441319A (en) * | 2016-09-23 | 2017-02-22 | 中国科学院合肥物质科学研究院 | System and method for generating lane-level navigation map of unmanned vehicle |
CN206691107U (en) * | 2017-03-08 | 2017-12-01 | 深圳市速腾聚创科技有限公司 | Pilotless automobile system and automobile |
Cited By (98)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107600062A (en) * | 2017-09-06 | 2018-01-19 | 深圳市招科智控科技有限公司 | A kind of whole-control system and method |
CN109597404A (en) * | 2017-09-30 | 2019-04-09 | 徐工集团工程机械股份有限公司 | Road roller and its controller, control method and system |
CN107826115A (en) * | 2017-10-26 | 2018-03-23 | 杨晓艳 | A kind of automobile recognition methods |
CN109829351A (en) * | 2017-11-23 | 2019-05-31 | 华为技术有限公司 | Detection method, device and the computer readable storage medium of lane information |
CN108256413A (en) * | 2017-11-27 | 2018-07-06 | 科大讯飞股份有限公司 | It can traffic areas detection method and device, storage medium, electronic equipment |
CN108256413B (en) * | 2017-11-27 | 2022-02-25 | 科大讯飞股份有限公司 | Passable area detection method and device, storage medium and electronic equipment |
CN108021132A (en) * | 2017-11-29 | 2018-05-11 | 芜湖星途机器人科技有限公司 | Paths planning method |
CN108008727A (en) * | 2017-12-11 | 2018-05-08 | 梁金凤 | A kind of pilotless automobile that can be run at high speed |
WO2019134110A1 (en) * | 2018-01-05 | 2019-07-11 | Driving Brain International Ltd. | Autonomous driving methods and systems |
WO2019134389A1 (en) * | 2018-01-08 | 2019-07-11 | 北京图森未来科技有限公司 | Automatic driving system |
US11648958B2 (en) | 2018-01-08 | 2023-05-16 | Beijing Tusen Weilai Technology Co., Ltd. | Autonomous driving system |
US11346926B2 (en) | 2018-01-17 | 2022-05-31 | Hesai Technology Co., Ltd. | Detection device and method for adjusting parameter thereof |
CN108375775A (en) * | 2018-01-17 | 2018-08-07 | 上海禾赛光电科技有限公司 | The method of adjustment of vehicle-mounted detection equipment and its parameter, medium, detection system |
CN108416257A (en) * | 2018-01-19 | 2018-08-17 | 北京交通大学 | Merge the underground railway track obstacle detection method of vision and laser radar data feature |
CN110162026B (en) * | 2018-02-11 | 2022-06-21 | 北京图森智途科技有限公司 | Object recognition system, method and device |
CN110162026A (en) * | 2018-02-11 | 2019-08-23 | 北京图森未来科技有限公司 | A kind of object identification system, method and device |
US11869249B2 (en) | 2018-02-11 | 2024-01-09 | Beijing Tusen Zhitu Technology Co., Ltd. | System, method and apparatus for object identification |
US11532157B2 (en) | 2018-02-11 | 2022-12-20 | Beijing Tusen Zhitu Technology Co., Ltd. | System, method and apparatus for object identification |
CN108469817A (en) * | 2018-03-09 | 2018-08-31 | 武汉理工大学 | The unmanned boat obstruction-avoiding control system merged based on FPGA and information |
CN110389359A (en) * | 2018-04-19 | 2019-10-29 | 法拉第未来公司 | System and method for ground level detection |
CN108896994A (en) * | 2018-05-11 | 2018-11-27 | 武汉环宇智行科技有限公司 | A kind of automatic driving vehicle localization method and equipment |
CN108646739A (en) * | 2018-05-14 | 2018-10-12 | 北京智行者科技有限公司 | A kind of sensor information fusion method |
CN108725452A (en) * | 2018-06-01 | 2018-11-02 | 湖南工业大学 | A kind of automatic driving vehicle control system and control method based on the perception of full audio frequency |
CN109061655B (en) * | 2018-06-01 | 2022-09-06 | 湖南工业大学 | Full-audio sensing system of intelligent driving vehicle and intelligent control method thereof |
CN109061655A (en) * | 2018-06-01 | 2018-12-21 | 湖南工业大学 | A kind of full audio frequency sensory perceptual system of intelligent driving vehicle and its intelligent control method |
CN108873013A (en) * | 2018-06-27 | 2018-11-23 | 江苏大学 | A kind of road using multi-line laser radar can traffic areas acquisition methods |
CN108873013B (en) * | 2018-06-27 | 2022-07-22 | 江苏大学 | Method for acquiring passable road area by adopting multi-line laser radar |
CN110667591A (en) * | 2018-07-02 | 2020-01-10 | 百度(美国)有限责任公司 | Planned driving perception system for autonomous vehicles |
CN110667591B (en) * | 2018-07-02 | 2022-11-04 | 百度(美国)有限责任公司 | Planned driving perception system for autonomous vehicles |
CN108919805B (en) * | 2018-07-04 | 2021-09-28 | 江苏大块头智驾科技有限公司 | Vehicle unmanned auxiliary system |
CN108628320A (en) * | 2018-07-04 | 2018-10-09 | 广东猪兼强互联网科技有限公司 | A kind of intelligent automobile Unmanned Systems |
CN108919805A (en) * | 2018-07-04 | 2018-11-30 | 广东猪兼强互联网科技有限公司 | A kind of unmanned auxiliary system of vehicle |
CN108961749A (en) * | 2018-07-12 | 2018-12-07 | 南方科技大学 | A kind of intelligent transportation system and intellectual traffic control method |
CN109002053A (en) * | 2018-08-17 | 2018-12-14 | 河南科技大学 | Unmanned equipment Intellectualized space positioning and environmental perception device and method |
CN110908366A (en) * | 2018-08-28 | 2020-03-24 | 大陆泰密克汽车系统(上海)有限公司 | Automatic driving method and device |
CN110908366B (en) * | 2018-08-28 | 2023-08-25 | 大陆智行科技(上海)有限公司 | Automatic driving method and device |
CN110969178A (en) * | 2018-09-30 | 2020-04-07 | 长城汽车股份有限公司 | Data fusion system and method for automatic driving vehicle and automatic driving system |
CN110969178B (en) * | 2018-09-30 | 2023-09-12 | 毫末智行科技有限公司 | Data fusion system and method for automatic driving vehicle and automatic driving system |
WO2020083349A1 (en) * | 2018-10-24 | 2020-04-30 | 长沙智能驾驶研究院有限公司 | Method and device for data processing for use in intelligent driving equipment, and storage medium |
WO2020118623A1 (en) * | 2018-12-13 | 2020-06-18 | Continental Automotive Gmbh | Method and system for generating an environment model for positioning |
CN111247525A (en) * | 2019-01-14 | 2020-06-05 | 深圳市大疆创新科技有限公司 | Lane detection method and device, lane detection equipment and mobile platform |
WO2020146983A1 (en) * | 2019-01-14 | 2020-07-23 | 深圳市大疆创新科技有限公司 | Lane detection method and apparatus, lane detection device, and mobile platform |
CN109817021A (en) * | 2019-01-15 | 2019-05-28 | 北京百度网讯科技有限公司 | A kind of laser radar trackside blind area traffic participant preventing collision method and device |
CN109817021B (en) * | 2019-01-15 | 2021-11-02 | 阿波罗智能技术(北京)有限公司 | Method and device for avoiding traffic participants in roadside blind areas of laser radar |
CN112703423A (en) * | 2019-01-31 | 2021-04-23 | 动态Ad有限责任公司 | Merging data from multiple LiDAR devices |
US11333762B2 (en) | 2019-01-31 | 2022-05-17 | Motional Ad Llc | Merging data from multiple LiDAR devices |
CN109883439A (en) * | 2019-03-22 | 2019-06-14 | 百度在线网络技术(北京)有限公司 | A kind of automobile navigation method, device, electronic equipment and storage medium |
CN111746557B (en) * | 2019-03-26 | 2024-03-29 | 通用汽车环球科技运作有限责任公司 | Path plan fusion for vehicles |
CN111746557A (en) * | 2019-03-26 | 2020-10-09 | 通用汽车环球科技运作有限责任公司 | Path planning fusion for vehicles |
US20220204025A1 (en) * | 2019-04-30 | 2022-06-30 | Autocore Intellegent Technology (Nanjing) Co., Ltd. | Distributed centralized automatic driving method |
WO2020207504A1 (en) * | 2019-04-30 | 2020-10-15 | 奥特酷智能科技(南京)有限公司 | Distributed centralized automatic driving system |
US11926340B2 (en) * | 2019-04-30 | 2024-03-12 | Autocore Technology (Nanjing) Co., Ltd. | Distributed centralized automatic driving method |
CN109855646A (en) * | 2019-04-30 | 2019-06-07 | 奥特酷智能科技(南京)有限公司 | It is distributed centralized automated driving system and method |
CN111174805A (en) * | 2019-04-30 | 2020-05-19 | 奥特酷智能科技(南京)有限公司 | Distributed centralized automatic driving system |
CN112101069A (en) * | 2019-06-18 | 2020-12-18 | 华为技术有限公司 | Method and device for determining driving area information |
US20220108552A1 (en) | 2019-06-18 | 2022-04-07 | Huawei Technologies Co., Ltd. | Method and Apparatus for Determining Drivable Region Information |
WO2020253764A1 (en) * | 2019-06-18 | 2020-12-24 | 华为技术有限公司 | Method and apparatus for determining running region information |
US11698459B2 (en) | 2019-06-18 | 2023-07-11 | Huawei Technologies Co., Ltd. | Method and apparatus for determining drivable region information |
EP3975042A4 (en) * | 2019-06-18 | 2022-08-17 | Huawei Technologies Co., Ltd. | Method and apparatus for determining running region information |
CN110435648A (en) * | 2019-07-26 | 2019-11-12 | 中国第一汽车股份有限公司 | Travel control method, device, vehicle and the storage medium of vehicle |
CN110435648B (en) * | 2019-07-26 | 2021-02-26 | 中国第一汽车股份有限公司 | Vehicle travel control method, device, vehicle, and storage medium |
CN110764108A (en) * | 2019-11-05 | 2020-02-07 | 畅加风行(苏州)智能科技有限公司 | Obstacle detection method and device for port automatic driving scene |
CN110764108B (en) * | 2019-11-05 | 2023-05-02 | 畅加风行(苏州)智能科技有限公司 | Obstacle detection method and device for port automatic driving scene |
CN111307162A (en) * | 2019-11-25 | 2020-06-19 | 奥特酷智能科技(南京)有限公司 | Multi-sensor fusion positioning method for automatic driving scene |
CN110843792A (en) * | 2019-11-29 | 2020-02-28 | 北京百度网讯科技有限公司 | Method and apparatus for outputting information |
CN110843792B (en) * | 2019-11-29 | 2021-05-25 | 北京百度网讯科技有限公司 | Method and apparatus for outputting information |
CN111127701A (en) * | 2019-12-24 | 2020-05-08 | 武汉光庭信息技术股份有限公司 | Vehicle failure scene detection method and system |
CN111142528A (en) * | 2019-12-31 | 2020-05-12 | 天津职业技术师范大学(中国职业培训指导教师进修中心) | Vehicle dangerous scene sensing method, device and system |
CN111223354A (en) * | 2019-12-31 | 2020-06-02 | 塔普翊海(上海)智能科技有限公司 | Unmanned trolley, and AR and AI technology-based unmanned trolley practical training platform and method |
CN111142528B (en) * | 2019-12-31 | 2023-10-24 | 天津职业技术师范大学(中国职业培训指导教师进修中心) | Method, device and system for sensing dangerous scene for vehicle |
WO2021134742A1 (en) * | 2020-01-02 | 2021-07-08 | 华为技术有限公司 | Predicted motion trajectory processing method and device, and restriction barrier displaying method and device |
CN111242986A (en) * | 2020-01-07 | 2020-06-05 | 北京百度网讯科技有限公司 | Cross-camera obstacle tracking method, device, equipment, system and medium |
CN111242986B (en) * | 2020-01-07 | 2023-11-24 | 阿波罗智能技术(北京)有限公司 | Cross-camera obstacle tracking method, device, equipment, system and medium |
CN111414848B (en) * | 2020-03-19 | 2023-04-07 | 小米汽车科技有限公司 | Full-class 3D obstacle detection method, system and medium |
CN111427349A (en) * | 2020-03-27 | 2020-07-17 | 齐鲁工业大学 | Vehicle navigation obstacle avoidance method and system based on laser and vision |
CN113002396A (en) * | 2020-04-14 | 2021-06-22 | 青岛慧拓智能机器有限公司 | An environmental perception system and mining vehicle for an automatic-driving mining vehicle |
CN111551976A (en) * | 2020-05-20 | 2020-08-18 | 四川万网鑫成信息科技有限公司 | Method for automatically completing abnormal positioning by combining various data |
CN111768621A (en) * | 2020-06-17 | 2020-10-13 | 北京航空航天大学 | Urban road and vehicle fusion global perception method based on 5G |
CN111775934A (en) * | 2020-07-21 | 2020-10-16 | 湖南汽车工程职业学院 | Intelligent driving obstacle avoidance system of automobile |
CN112130153A (en) * | 2020-09-23 | 2020-12-25 | 的卢技术有限公司 | Method for realizing edge detection of unmanned vehicle based on millimeter wave radar and camera |
CN112519799A (en) * | 2020-11-10 | 2021-03-19 | 深圳市豪恩汽车电子装备股份有限公司 | Motor vehicle road auxiliary driving device and method |
CN112829768A (en) * | 2021-03-02 | 2021-05-25 | 刘敏 | Unmanned automobile and control system thereof |
CN113298910A (en) * | 2021-05-14 | 2021-08-24 | 阿波罗智能技术(北京)有限公司 | Method, apparatus and storage medium for generating traffic sign line map |
CN113252053B (en) * | 2021-06-16 | 2021-09-28 | 中智行科技有限公司 | High-precision map generation method and device and electronic equipment |
CN113252053A (en) * | 2021-06-16 | 2021-08-13 | 中智行科技有限公司 | High-precision map generation method and device and electronic equipment |
CN113486836B (en) * | 2021-07-19 | 2023-06-06 | 安徽江淮汽车集团股份有限公司 | Automatic driving control method for low-pass obstacle |
CN113486836A (en) * | 2021-07-19 | 2021-10-08 | 安徽江淮汽车集团股份有限公司 | Automatic driving control method for low-pass obstacle |
CN113415289B (en) * | 2021-07-30 | 2022-09-13 | 佛山市顺德区中等专业学校(佛山市顺德区技工学校) | Identification device and method for unmanned vehicle |
CN113415289A (en) * | 2021-07-30 | 2021-09-21 | 佛山市顺德区中等专业学校(佛山市顺德区技工学校) | Identification device and method for unmanned vehicle |
CN113753052A (en) * | 2021-09-01 | 2021-12-07 | 苏州莱布尼茨智能科技有限公司 | Whole-vehicle safe intelligent driving control system for a new energy automobile |
CN114019978A (en) * | 2021-11-08 | 2022-02-08 | 陕西欧卡电子智能科技有限公司 | Unmanned pleasure boat and unmanned method |
CN113848956A (en) * | 2021-11-09 | 2021-12-28 | 盐城工学院 | Unmanned vehicle system and unmanned method |
CN114281075A (en) * | 2021-11-19 | 2022-04-05 | 岚图汽车科技有限公司 | Service-oriented emergency obstacle avoidance system, and control method and equipment thereof |
CN115145272A (en) * | 2022-06-21 | 2022-10-04 | 大连华锐智能化科技有限公司 | Coke oven vehicle environment sensing system and method |
CN115145272B (en) * | 2022-06-21 | 2024-03-29 | 大连华锐智能化科技有限公司 | Coke oven vehicle environment sensing system and method |
CN115339453A (en) * | 2022-10-19 | 2022-11-15 | 禾多科技(北京)有限公司 | Vehicle lane change decision information generation method, device, equipment and computer medium |
CN115339453B (en) * | 2022-10-19 | 2022-12-23 | 禾多科技(北京)有限公司 | Vehicle lane change decision information generation method, device, equipment and computer medium |
CN116166033A (en) * | 2023-04-21 | 2023-05-26 | 深圳市速腾聚创科技有限公司 | Vehicle obstacle avoidance method, device, medium and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN107161141B (en) | 2023-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN206691107U (en) | | Pilotless automobile system and automobile |
CN107161141A (en) | | Pilotless automobile system and automobile |
JP7125214B2 (en) | | Programs and computing devices |
JP7432285B2 (en) | | Lane mapping and navigation |
CN106681353B (en) | | UAV obstacle avoidance method and system based on fusion of binocular vision and optical flow |
US11573090B2 (en) | | LIDAR and REM localization |
CN107235044B (en) | | Restoration method for road traffic scenes and driver driving behavior based on multi-sensor data |
US9057609B2 (en) | | Ground-based camera surveying and guiding method for aircraft landing and unmanned aerial vehicle recovery |
EP2372310B1 (en) | | Image processing system and position measurement system |
CN109405824A (en) | | Multi-source perception and positioning system suitable for intelligent connected vehicles |
US11280630B2 (en) | | Updating map data |
CN110345961A (en) | | Controlling a host vehicle based on detected parked-vehicle characteristics |
DE112020004133T5 (en) | | Systems and methods for identifying possible communication barriers |
CN108108750A (en) | | Metric-space reconstruction method based on deep learning and monocular vision |
CN113255520B (en) | | Vehicle obstacle avoidance method based on binocular vision and deep learning, and electronic equipment |
CN110176156A (en) | | Airborne ground early-warning system |
CN109583415A (en) | | Traffic light detection and recognition method based on fusion of laser radar and camera |
GB2614379A (en) | | Systems and methods for vehicle navigation |
CN115661204B (en) | | Collaborative search, tracking and positioning method for a moving target by an unmanned aerial vehicle cluster |
CN111796602A (en) | | Obstacle detection and early-warning system for a plant-protection unmanned aerial vehicle |
CN107451988A (en) | | Method for the synthesized representation of elements of interest in an aircraft inspection system |
WO2018161278A1 (en) | | Driverless automobile system and control method thereof, and automobile |
KR101510745B1 (en) | | Autonomous vehicle system |
Kawamura et al. | | Ground-Based Vision Tracker for Advanced Air Mobility and Urban Air Mobility |
KR101441422B1 (en) | | Decision-making device and method using ontology technique to predict a collision with an obstacle during take-off and landing of aircraft |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||