CN107161141B - Unmanned automobile system and automobile - Google Patents

Unmanned automobile system and automobile

Info

Publication number
CN107161141B
Authority
CN
China
Prior art keywords
information
obstacle
subsystem
radar
traffic sign
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710136501.6A
Other languages
Chinese (zh)
Other versions
CN107161141A (en)
Inventor
邱纯鑫
刘乐天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suteng Innovation Technology Co Ltd
Original Assignee
Suteng Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suteng Innovation Technology Co Ltd filed Critical Suteng Innovation Technology Co Ltd
Priority to CN201710136501.6A priority Critical patent/CN107161141B/en
Publication of CN107161141A publication Critical patent/CN107161141A/en
Application granted granted Critical
Publication of CN107161141B publication Critical patent/CN107161141B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/105 Estimation or calculation of non-directly measurable driving parameters related to vehicle motion: speed
    • B60W2520/10 Input parameters relating to overall vehicle dynamics: longitudinal speed
    • B60W2554/00 Input parameters relating to objects
    • B60W2555/60 Input parameters relating to exterior conditions: traffic rules, e.g. speed limits or right of way
    • B60W2556/45 Input parameters relating to data: external transmission of data to or from the vehicle
    • B60W2556/50 External transmission of positioning data to or from the vehicle, e.g. GPS [Global Positioning System] data
    • Y02T10/40 Climate change mitigation technologies related to transportation: engine management systems of internal combustion engine [ICE] based vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to an unmanned automobile system and an automobile. The unmanned automobile system comprises an environment sensing subsystem, a data fusion subsystem, a path planning decision subsystem and a driving control subsystem. The data fusion subsystem fuses surrounding environment information comprising image information and three-dimensional coordinate information, and extracts obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, improving the recognition capability and accuracy of the surrounding environment information. The path planning decision subsystem plans a driving path according to the information extracted by the data fusion subsystem and the driving destination information, and the driving control subsystem generates a control instruction according to the driving path and controls the unmanned automobile accordingly, so that an unmanned driving function with extremely high safety can be realized.

Description

Unmanned automobile system and automobile
Technical Field
The invention relates to the technical field of automobiles, in particular to an unmanned automobile system and an automobile.
Background
Current autonomous driving technology already provides basic automatic operation and driving capability. For example, instruments such as cameras, radar sensors and laser detectors installed on the automobile can sense highway speed limits, roadside traffic signs and the movement of surrounding vehicles, and once the vehicle sets off it can navigate by means of a map. An unmanned system mainly uses vehicle-mounted sensors to perceive the vehicle's surroundings, and controls the steering and speed of the vehicle according to the sensed road, vehicle position and obstacle information, so that the vehicle can travel on the road safely and reliably.
At present, an unmanned automobile is an intelligent automobile that realizes driverless operation mainly through an in-vehicle "intelligent driver" built around a computer system. The main difficulty for such a system, however, lies in recognizing traffic conditions and the surrounding environment; weak recognition capability can make the data collected by the unmanned system inaccurate.
Disclosure of Invention
In view of the above, it is necessary to provide an unmanned automobile system, and an automobile, that recognize surrounding environment information with high capability and high accuracy so that the vehicle can travel safely.
An unmanned vehicle system comprising:
the environment sensing subsystem is used for collecting vehicle information and surrounding environment information of the unmanned automobile, wherein the surrounding environment information comprises image information and three-dimensional coordinate information of the surrounding environment;
the data fusion subsystem is used for fusing the image information and the three-dimensional coordinate information and extracting lane line information, obstacle information, traffic sign information and tracking information of dynamic obstacles;
the path planning decision subsystem is used for planning a driving path according to the vehicle information, the information extracted by the data fusion subsystem and the driving destination information;
and the driving control subsystem is used for generating a control instruction according to the driving path and controlling the unmanned automobile according to the control instruction.
In one embodiment, the context awareness subsystem includes:
the visual sensor is used for collecting image information of the surrounding environment of the unmanned automobile;
and the radar is used for acquiring three-dimensional coordinate information of the surrounding environment of the unmanned automobile.
In one embodiment, the data fusion subsystem comprises:
the lane line fusion module is used for superposing or excluding surrounding environment information acquired by the visual sensor and the radar and extracting the lane line information;
the obstacle recognition fusion module is used for fusing the surrounding environment information acquired by the vision sensor and the radar and extracting the obstacle information;
the traffic sign fusion module is used for detecting surrounding environment information acquired by the visual sensor and the radar and extracting traffic sign information;
and the obstacle dynamic tracking fusion module is used for fusing the surrounding environment information acquired by the vision sensor and the radar and extracting tracking information of the dynamic obstacle.
In one embodiment, the lane line fusion module comprises a visual lane line detection unit and a radar lane line detection unit; the visual lane line detection unit is used for processing the image information and extracting visual lane line information; the radar lane line detection unit is used for extracting road surface information of the unmanned automobile and acquiring lane outline information according to the road surface information; the lane line fusion module is also used for superposing or excluding the visual lane line information and the lane outline information to acquire the lane line information.
In one embodiment, the obstacle recognition fusion module includes a visual obstacle recognition unit and a radar obstacle recognition unit; the visual obstacle recognition unit is used for segmenting background information and foreground information according to the image information, and recognizing the foreground information to obtain visual obstacle information with color information; the radar obstacle recognition unit is used for recognizing radar obstacle information with three-dimensional coordinate information within a first preset height range; the obstacle recognition fusion module is used for fusing the visual obstacle information and the radar obstacle information and acquiring the obstacle information.
In one embodiment, the traffic sign fusion module comprises a visual traffic sign detection unit and a radar traffic sign detection unit; the visual traffic sign detection unit detects the image information and extracts visual traffic sign information; the radar traffic sign detection unit is used for extracting ground traffic sign information and for detecting suspended traffic sign information within a second preset height range; the traffic sign fusion module is also used for determining the position of the traffic sign information according to the ground traffic sign information and the suspended traffic sign information and obtaining the category of the traffic sign information in the position area.
In one embodiment, the obstacle dynamic tracking fusion module includes a visual dynamic tracking unit and a radar dynamic tracking unit, where the visual dynamic tracking unit is configured to identify the image information, locate a dynamic obstacle in two consecutive frames, and obtain color information of the dynamic obstacle; the radar dynamic tracking unit is used for tracking three-dimensional coordinate information of dynamic obstacles; the obstacle dynamic tracking fusion module is also used for fusing the color information of the dynamic obstacle and the three-dimensional coordinate information of the dynamic obstacle and acquiring the tracking information of the dynamic obstacle.
In one embodiment, the context awareness subsystem further comprises:
the GPS positioning navigator is used for acquiring the current geographic position and time of the unmanned automobile;
an inertia measurement unit for measuring a vehicle posture of the unmanned vehicle;
the vehicle speed acquisition module is used for acquiring the current running speed of the unmanned automobile.
In one embodiment, the unmanned vehicle system further comprises:
and the communication subsystem is used for transmitting the running path planned by the path planning decision subsystem to an external monitoring center in real time.
In addition, an automobile is provided, which includes the unmanned automobile system described above.
According to the unmanned automobile system provided by the embodiments of the invention, the data fusion subsystem fuses the surrounding environment information comprising the image information and the three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, so that the recognition capability and accuracy of the surrounding environment information are improved. The path planning decision subsystem plans a driving path according to the information extracted by the data fusion subsystem and the driving destination information, and the driving control subsystem generates a control instruction according to the driving path and controls the unmanned automobile accordingly, so that an unmanned driving function with extremely high safety can be realized.
Drawings
FIG. 1 is a structural framework diagram of an unmanned vehicle system in one embodiment;
FIG. 2 is a structural framework diagram of an environmental awareness subsystem in one embodiment;
FIG. 3 is a structural framework diagram of a data fusion subsystem in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a structural framework diagram of an unmanned vehicle system in one embodiment, which includes an environmental awareness subsystem 10, a data fusion subsystem 20, a path planning decision subsystem 30, and a travel control subsystem 40.
The environment sensing subsystem 10 is configured to collect vehicle information and surrounding environment information of the unmanned vehicle, where the surrounding environment information includes image information and three-dimensional coordinate information of the surrounding environment.
The data fusion subsystem 20 is used for fusing the surrounding environment information and extracting obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles.
The path planning decision subsystem 30 is configured to plan a driving path according to the vehicle information, the information extracted by the data fusion subsystem 20, and the driving destination information.
The driving control subsystem 40 is configured to generate a control instruction according to the driving path, and control the unmanned vehicle according to the control instruction.
In the unmanned vehicle system above, the data fusion subsystem 20 fuses the surrounding environment information comprising the image information and the three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, thereby improving the recognition capability and accuracy of the surrounding environment information. The path planning decision subsystem 30 plans a driving path according to the information extracted by the data fusion subsystem 20 and the driving destination information, and the driving control subsystem 40 generates a control instruction according to the driving path and controls the unmanned automobile accordingly, so that an unmanned driving function with extremely high safety can be realized.
In one embodiment, referring to FIG. 2, the environmental awareness subsystem 10 includes a vision sensor 110 and a radar 120. The vision sensor 110 is mainly composed of one or two image sensors, sometimes combined with a light projector and other auxiliary devices. The image sensor may be a laser scanner, a linear-array or area-array CCD camera, a TV camera, or a digital camera. The vision sensor 110 is installed on the unmanned vehicle and collects surrounding environment information of the unmanned vehicle, that is, real-time road condition information near the vehicle, including obstacle information, lane line information, traffic sign information and dynamic tracking information of obstacles. The collected surrounding environment information is image information of the surrounding environment, which may also be called video information.
The radar 120 is used to collect three-dimensional coordinate information of the surrounding environment of the unmanned vehicle. The unmanned vehicle system includes a plurality of radars 120. In one embodiment, the plurality of radars 120 includes a lidar and a millimeter wave radar. The lidar is a mechanical multi-beam lidar; it mainly detects characteristic quantities such as the position and speed of a target by emitting laser beams, and can also detect and track obstacles using its echo intensity information. The lidar offers a wide detection range and high detection precision. Millimeter waves lie between centimetre waves and light waves, so a millimeter wave radar combines the advantages of microwave guidance and photoelectric guidance: small size, light weight, high spatial resolution and a strong ability to penetrate fog, smoke and dust. In one example, the lidar and the millimeter wave radar are used together, which compensates for the lidar's weakness in extreme weather and greatly improves the detection performance of the unmanned vehicle.
In one embodiment, the environment awareness subsystem 10 is also used to collect vehicle information of the unmanned vehicle. The vehicle information comprises the current geographic position and time of the unmanned vehicle, the vehicle posture, the current running speed and the like. The environment awareness subsystem 10 further includes a GPS positioning navigator 130, an inertial measurement unit 140 (IMU) and a vehicle speed acquisition module 150. The GPS positioning navigator 130 collects the current geographic position and time of the unmanned vehicle; during driving, the positioning instrument installed in the vehicle can acquire the vehicle's accurate position at any time, further improving safety. The inertial measurement unit 140 measures the vehicle posture of the unmanned vehicle. The vehicle speed acquisition module 150 collects the current running speed of the unmanned vehicle.
In one embodiment, referring to FIG. 3, the data fusion subsystem 20 includes: lane line fusion module 210, obstacle recognition fusion module 220, traffic sign fusion module 230, and obstacle dynamic tracking fusion module 240.
The lane line fusion module 210 is configured to superimpose or exclude the surrounding environment information acquired by the vision sensor 110 and the radar 120, and extract lane line information. The obstacle recognition fusion module 220 is configured to fuse the surrounding environment information acquired by the vision sensor 110 and the radar 120, and extract obstacle information. The traffic sign fusion module 230 is configured to detect the surrounding environment information collected by the vision sensor 110 and the radar 120, and extract traffic sign information. The obstacle dynamic tracking fusion module 240 is configured to fuse the surrounding environment information acquired by the vision sensor 110 and the radar 120, and extract tracking information of dynamic obstacles.
In one embodiment, the lane line fusion module 210 includes a visual lane line detection unit 211 and a radar lane line detection unit 213.
The visual lane line detection unit 211 is used for processing the image information and extracting visual lane line information. It performs preprocessing such as denoising, enhancement and segmentation on the image information acquired by the vision sensor 110, and extracts the visual lane line information.
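By way of illustration only (the following sketch is not part of the original disclosure), the denoising, enhancement and segmentation steps above might be realized in Python as follows; the choice of Canny edge detection and a probabilistic Hough transform, and all thresholds, are assumptions rather than anything the patent specifies:

    import cv2
    import numpy as np

    def detect_visual_lane_lines(bgr_image: np.ndarray) -> np.ndarray:
        """Return candidate lane line segments as an (N, 4) array of x1, y1, x2, y2."""
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        denoised = cv2.GaussianBlur(gray, (5, 5), 0)    # denoising
        enhanced = cv2.equalizeHist(denoised)           # contrast enhancement
        edges = cv2.Canny(enhanced, 50, 150)            # segmentation into edge pixels
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                minLineLength=40, maxLineGap=20)
        return np.empty((0, 4), dtype=int) if lines is None else lines.reshape(-1, 4)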
The radar lane line detection unit 213 is used for extracting road surface information along the unmanned vehicle's path and acquiring lane outline information from it. To acquire the lane outline information, the radar lane line detection unit 213 calibrates the three-dimensional coordinate information of the driving ground acquired by the lidar and computes the discrete points in that information, where a discrete point may be defined as a point whose distance to an adjacent point is greater than a preset range. The discrete points are filtered out, the position information of the ground is fitted using a random sample consensus (RANSAC) method, and the outer contour information of the lane, namely the lane line information of the radar 120, is acquired.
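A minimal Python sketch of the discrete-point filtering and RANSAC ground fitting described above is given below for illustration; it is not part of the original disclosure, and the gap threshold, iteration count and inlier tolerance are assumed values:

    import numpy as np

    def filter_discrete_points(points: np.ndarray, max_gap: float = 0.5) -> np.ndarray:
        """Drop 'discrete' points: points whose distance to an adjacent point in
        scan order exceeds max_gap, per the definition in the description."""
        gaps = np.linalg.norm(np.diff(points, axis=0), axis=1)
        keep = np.ones(len(points), dtype=bool)
        keep[1:] &= gaps <= max_gap        # drop points far from the previous point
        keep[:-1] &= gaps <= max_gap       # drop points far from the next point
        return points[keep]

    def ransac_ground(points: np.ndarray, iterations: int = 200,
                      tol: float = 0.05) -> np.ndarray:
        """Fit the ground plane z = a*x + b*y + c by random sample consensus and
        return the boolean inlier mask (the ground points)."""
        rng = np.random.default_rng(0)
        best = np.zeros(len(points), dtype=bool)
        for _ in range(iterations):
            sample = points[rng.choice(len(points), 3, replace=False)]
            A = np.c_[sample[:, :2], np.ones(3)]
            try:
                a, b, c = np.linalg.solve(A, sample[:, 2])
            except np.linalg.LinAlgError:
                continue                   # degenerate (collinear) sample
            resid = np.abs(points[:, 0] * a + points[:, 1] * b + c - points[:, 2])
            inliers = resid < tol
            if inliers.sum() > best.sum():
                best = inliers
        return best  # the boundary of these points gives the lane outer contour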
The lane line fusion module 210 fuses (superimposes) or excludes the acquired visual lane line information and lane outer contour information to obtain real-time lane line information. The lane line fusion module 210 improves the accuracy of lane line recognition and avoids situations in which lane line information is missed.
In one embodiment, the obstacle recognition fusion module 220 includes a visual obstacle recognition unit 221 and a radar obstacle recognition unit 223. The visual obstacle recognition unit 221 is configured to segment background information and foreground information from the image information, and to recognize the foreground information to obtain visual obstacle information with color information. The visual obstacle recognition unit 221 processes the image information by pattern recognition or machine learning methods, establishes a background model using a background update algorithm, and segments the foreground. The segmented foreground is then recognized to obtain visual obstacle information with color information.
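For illustration (not part of the original disclosure), one possible realization of the background-update and foreground-segmentation step is OpenCV's MOG2 background subtractor; the disclosure does not name a specific algorithm, so this choice and its parameters are assumptions:

    import cv2

    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

    def segment_foreground(bgr_frame):
        """Return a binary mask of foreground (candidate obstacle) pixels; calling
        apply() also updates the background model frame by frame."""
        mask = subtractor.apply(bgr_frame)
        mask = cv2.medianBlur(mask, 5)                              # suppress speckle noise
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadow pixels (value 127)
        return mask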
The radar obstacle recognition unit 223 is configured to recognize radar obstacle information having three-dimensional coordinate information within a first preset height range.
The radar obstacle recognition unit 223 preprocesses the surrounding environment information of the unmanned vehicle acquired by the lidar, removes the ground information, and screens the three-dimensional coordinate information of the surrounding environment within the first preset height range. A region of interest (ROI) is then detected according to the constraints of the lane line information; the region of interest is an area to be processed, outlined as a square, circle, ellipse, irregular polygon or the like. The data in the detected region of interest is rasterized, and the obstacle blocks are clustered and segmented. The original lidar point cloud data corresponding to each obstacle block is clustered a second time to prevent under-segmentation. The secondarily clustered point cloud data serves as a training sample set from which a classifier model is generated; the trained model is then used to classify and recognize the secondarily clustered obstacle blocks and acquire radar obstacle information with three-dimensional coordinate information.
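The rasterization and clustering steps might be sketched as follows; this is illustrative only, and the cell size, ROI ranges and the use of connected-component labelling are assumptions, not the patent's method:

    import numpy as np
    from scipy import ndimage

    def cluster_obstacle_blocks(points: np.ndarray, cell: float = 0.2,
                                x_range=(0.0, 40.0), y_range=(-10.0, 10.0)):
        """Rasterize ROI points (ground already removed) into a 2-D occupancy grid,
        label connected cells as obstacle blocks, and return one point set per block."""
        roi = ((points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1]) &
               (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1]))
        pts = points[roi]
        ix = ((pts[:, 0] - x_range[0]) / cell).astype(int)
        iy = ((pts[:, 1] - y_range[0]) / cell).astype(int)
        grid = np.zeros((int((x_range[1] - x_range[0]) / cell),
                         int((y_range[1] - y_range[0]) / cell)), dtype=bool)
        grid[ix, iy] = True
        labels, n = ndimage.label(grid)                 # connected-component clustering
        return [pts[labels[ix, iy] == k] for k in range(1, n + 1)]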
The obstacle recognition fusion module 220 is configured to fuse the visual obstacle information and the radar obstacle information, and obtain the obstacle information. Visual obstacle detection may fail in strong light or in scenes where the lighting changes rapidly, whereas the radar 120 detects obstacles with its own active light source and is highly stable. When the unmanned vehicle drives in such conditions, the obstacle recognition fusion module 220 superimposes the visual obstacle information and the radar obstacle information, so accurate obstacle information can still be obtained in strong light or rapidly changing light.
Because the resolution of the radar 120 in the vertical direction is low, it acquires the three-dimensional coordinate information of an obstacle but no red-green-blue (RGB) color information, and misrecognition can occur for distant or small obstacles. The obstacle information acquired by the visual obstacle recognition unit 221, by contrast, contains rich RGB information at high pixel resolution. Superimposing and fusing the color information of an obstacle with its three-dimensional coordinate information yields obstacle information that contains both color and three-dimensional information. The obstacle recognition fusion module 220 thus reduces the false recognition rate, improves recognition accuracy and further ensures safe driving.
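A sketch of the superposition itself, projecting radar obstacle points into the camera image to attach a per-point color, is given below for illustration; the calibration matrices K and T_cam_from_lidar are assumed known from sensor calibration, and none of this code comes from the disclosure:

    import numpy as np

    def colorize_radar_obstacle(points_xyz, image, K, T_cam_from_lidar):
        """Project lidar obstacle points into the camera image and attach the color
        of the pixel each point lands on. K: 3x3 camera intrinsics;
        T_cam_from_lidar: 4x4 transform from the lidar frame to the camera frame."""
        homo = np.c_[points_xyz, np.ones(len(points_xyz))]
        cam = (T_cam_from_lidar @ homo.T).T[:, :3]         # lidar frame -> camera frame
        front = cam[:, 2] > 0.1                            # keep points in front of the camera
        uvw = (K @ cam[front].T).T
        uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)        # pixel coordinates
        h, w = image.shape[:2]
        ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        return cam[front][ok], image[uv[ok, 1], uv[ok, 0]]  # 3-D points + their colors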
In one embodiment, the traffic sign fusion module 230 includes a visual traffic sign detection unit 231 and a radar traffic sign detection unit 233.
The visual traffic sign detection unit 231 detects the image information and extracts visual traffic sign information. It processes the image information by pattern recognition or machine learning methods and obtains visual traffic sign information, which contains red-green-blue (RGB) color information.
The radar traffic sign detection unit 233 is used for extracting ground traffic sign information, and also for detecting suspended traffic sign information within a second preset height range. The radar traffic sign detection unit 233 extracts traffic sign line points according to the reflection intensity gradient and fits them to obtain the ground traffic sign information (ground traffic sign lines); alternatively, following the obstacle clustering principle, it obtains target objects within the second preset height range with standard rectangular or round shapes and defines them as suspended traffic sign information.
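For illustration, extracting candidate sign line points from the reflection intensity gradient might look like the sketch below (the gradient threshold is an assumed value; the physical basis is that road paint reflects far more strongly than asphalt):

    import numpy as np

    def sign_line_points(scan: np.ndarray, grad_threshold: float = 10.0) -> np.ndarray:
        """scan is an (N, 4) array ordered along a scan line: x, y, z, intensity.
        Return the x, y, z of points where the reflection intensity jumps sharply;
        these candidates are then fitted into ground traffic sign lines."""
        grad = np.abs(np.gradient(scan[:, 3]))
        return scan[grad > grad_threshold, :3]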
The traffic sign fusion module 230 is configured to determine the position of traffic sign information based on the ground traffic sign information and the suspended traffic sign information. Within the acquired position area, the category or kind of the traffic sign is identified from the visual traffic sign information acquired by the visual traffic sign detection unit 231. The traffic sign fusion module 230 thus accurately acquires the various ground or suspended traffic signs and ensures that the unmanned vehicle drives safely on the premise of obeying the traffic rules.
In one embodiment, obstacle dynamic tracking fusion module 240 includes a visual dynamic tracking unit 241 and a radar dynamic tracking unit 243.
The visual dynamic tracking unit 241 is used for recognizing the image information, locating a dynamic obstacle in two consecutive frames, and obtaining the color information of the dynamic obstacle. The visual dynamic tracking unit 241 processes the image (video) sequence by pattern recognition or machine learning methods, recognizes and locates the dynamic obstacle in consecutive frames of the video, and acquires the color information of the obstacle.
The radar dynamic tracking unit 243 is used for tracking the three-dimensional coordinate information of dynamic obstacles. Using a target association algorithm that combines nearest neighbor matching with multiple hypothesis tracking, the radar dynamic tracking unit 243 determines that the obstacles in two or more adjacent frames are the same target. It acquires the three-dimensional position information and speed information of the target from the lidar measurement data and tracks the associated target. Meanwhile, the measured and predicted states of the target can be filtered using Kalman filtering and particle filtering algorithms to obtain accurate three-dimensional coordinate information of the dynamic obstacle.
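As a sketch of the filtering step only, the following is a minimal constant-velocity Kalman filter over one tracked obstacle; the time step and noise covariances are assumptions, and data association is left to the nearest-neighbor / multiple-hypothesis stage described above:

    import numpy as np

    class ObstacleKalmanTracker:
        """Constant-velocity Kalman filter over the state [x, y, z, vx, vy, vz];
        measurements are lidar obstacle positions [x, y, z]."""

        def __init__(self, dt: float = 0.1):
            self.x = np.zeros(6)                      # state estimate
            self.P = np.eye(6)                        # state covariance
            self.F = np.eye(6)
            self.F[:3, 3:] = dt * np.eye(3)           # position += velocity * dt
            self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
            self.Q = 0.01 * np.eye(6)                 # process noise (assumed)
            self.R = 0.05 * np.eye(3)                 # lidar measurement noise (assumed)

        def predict(self) -> np.ndarray:
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            return self.x[:3]                         # predicted position, used for association

        def update(self, z: np.ndarray) -> None:
            y = z - self.H @ self.x                   # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ self.H) @ self.P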
The obstacle dynamic tracking fusion module 240 is configured to fuse the color information of the dynamic obstacle with its three-dimensional coordinate information, and obtain the tracking information of the dynamic obstacle. Visual dynamic obstacle information is easily disturbed by strong light or illumination changes, so its three-dimensional coordinate information is inaccurate, but it contains rich RGB color information. The dynamic obstacle information acquired by the lidar has no RGB color information, and when a dynamic obstacle is occluded during movement and separates again afterwards, the lidar alone cannot identify which dynamic object it is; however, the lidar data is highly stable, immune to external disturbances such as changes in light intensity, carries accurate three-dimensional coordinate information, and supports a more accurate motion model for dynamically tracking a moving object. Therefore, the obstacle dynamic tracking fusion module 240 fuses the color information of the dynamic obstacle obtained from the image information with the three-dimensional coordinate information obtained by the lidar, yielding dynamic obstacle information that includes both color and three-dimensional coordinates, so the dynamic obstacle can be tracked accurately.
In one embodiment, the path planning decision subsystem 30 is configured to plan a travel path based on the vehicle information, the information extracted by the data fusion subsystem 20, and the travel destination information. The path planning decision subsystem 30 may plan the travel path from the vehicle information acquired by the environment awareness subsystem 10 (the unmanned vehicle's current geographic position and time, vehicle posture and current running speed), the surrounding environment information extracted by the data fusion subsystem 20 (obstacle information, lane line information, traffic sign information and dynamic tracking information of obstacles), and the travel destination information of the unmanned vehicle. Combining the planned travel path, the path planning decision subsystem 30 plans the position of the unmanned vehicle at the next moment and calculates the control data of the unmanned vehicle, including angular speed, linear speed, driving direction and the like.
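The patent does not specify how the angular and linear speeds are derived from the planned path, so the following is purely an illustrative proportional pursuit law toward the next planned waypoint, with an assumed steering gain:

    import numpy as np

    def control_toward_waypoint(pose_xy, heading, target_xy, v_desired=5.0):
        """Compute (linear speed, angular speed) steering the vehicle toward the
        next waypoint of the planned path."""
        dx, dy = target_xy[0] - pose_xy[0], target_xy[1] - pose_xy[1]
        err = np.arctan2(dy, dx) - heading
        err = np.arctan2(np.sin(err), np.cos(err))       # wrap heading error to [-pi, pi]
        angular = 1.5 * err                              # proportional steering gain (assumed)
        linear = v_desired * max(0.0, np.cos(err))       # slow down for sharp turns
        return linear, angular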
In one embodiment, the travel control subsystem 40 is configured to generate control commands according to the travel path and control the unmanned vehicle according to those commands. The travel control subsystem 40 generates the control commands from the control data calculated by the path planning decision subsystem 30; the commands cover the vehicle's driving speed, driving direction (forward, backward, left and right), accelerator and gear position, so that the unmanned vehicle drives safely and smoothly, realizing the unmanned driving function.
In one embodiment, the unmanned vehicle system further comprises a communication subsystem 50, the communication subsystem 50 being configured to transmit the planned travel path of the path planning decision subsystem 30 to an external monitoring center in real time. The running condition of the unmanned automobile is monitored by an external monitoring center.
In the unmanned vehicle system, the data fusion subsystem 20 fuses the surrounding environment information comprising the image information and the three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, thereby improving the recognition capability and accuracy of the surrounding environment information. The path planning decision subsystem 30 plans a driving path according to the information extracted by the data fusion subsystem 20 and the driving destination information, and the driving control subsystem 40 generates a control instruction according to the driving path and controls the unmanned automobile accordingly, so that an unmanned driving function with extremely high safety can be realized.
In addition, the embodiments of the invention also provide an automobile comprising the unmanned automobile system of the embodiments above. In this automobile, the data fusion subsystem 20 of the unmanned automobile system fuses the surrounding environment information comprising the image information and the three-dimensional coordinate information, and extracts the obstacle information, lane line information, traffic sign information and tracking information of dynamic obstacles, so that the recognition capability and accuracy of the surrounding environment information are improved. The path planning decision subsystem 30 plans a driving path according to the information extracted by the data fusion subsystem 20 and the driving destination information, and the driving control subsystem 40 generates a control instruction according to the driving path and controls the unmanned automobile accordingly, so that an unmanned driving function with extremely high safety can be realized.
The technical features of the embodiments above may be combined arbitrarily. For brevity, not every possible combination is described; however, any combination of these technical features that contains no contradiction should be considered within the scope of this description.
The examples above illustrate only a few embodiments of the invention, and while they are described in detail, they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within its scope. Accordingly, the scope of protection of the invention is determined by the appended claims.

Claims (9)

1. An unmanned vehicle system, comprising:
the environment sensing subsystem is used for collecting vehicle information and surrounding environment information of the unmanned automobile, wherein the surrounding environment information comprises image information and three-dimensional coordinate information of the surrounding environment;
the data fusion subsystem is used for fusing the image information and the three-dimensional coordinate information and extracting lane line information, obstacle information, traffic sign information and tracking information of dynamic obstacles;
the path planning decision subsystem is used for planning a driving path according to the vehicle information, the information extracted by the data fusion subsystem and the driving destination information;
the driving control subsystem is used for generating a control instruction according to the driving path and controlling the unmanned automobile according to the control instruction;
the data fusion subsystem comprises a lane line fusion module;
the lane line fusion module is used for processing image information of the surrounding environment of the unmanned automobile and extracting visual lane line information; calibrating the acquired three-dimensional coordinate information of the running ground of the unmanned automobile, and calculating discrete points in the three-dimensional coordinate information, wherein the discrete points are points with a distance between two adjacent points being greater than a preset range; performing filtering processing on the discrete points, fitting out the position information of the ground by using a random sampling consistency method, and obtaining the lane outline information; and superposing or excluding the visual lane line information and the lane outline information to acquire real-time lane line information.
2. The unmanned vehicle system of claim 1, wherein the context awareness subsystem comprises:
the visual sensor is used for collecting image information of the surrounding environment of the unmanned automobile;
and the radar is used for acquiring three-dimensional coordinate information of the surrounding environment of the unmanned automobile.
3. The unmanned vehicle system of claim 2, wherein the data fusion subsystem further comprises:
the obstacle recognition fusion module is used for fusing the surrounding environment information acquired by the vision sensor and the radar and extracting the obstacle information;
the traffic sign fusion module is used for detecting surrounding environment information acquired by the visual sensor and the radar and extracting traffic sign information;
and the obstacle dynamic tracking fusion module is used for fusing the surrounding environment information acquired by the vision sensor and the radar and extracting tracking information of the dynamic obstacle.
4. The unmanned vehicle system of claim 3, wherein the obstacle recognition fusion module comprises a visual obstacle recognition unit and a radar obstacle recognition unit;
the visual obstacle recognition unit is used for dividing background information and foreground information according to the image information, and recognizing the foreground information to obtain visual obstacle information with color information; the radar obstacle recognition unit is used for recognizing radar obstacle information with three-dimensional coordinate information in a first preset height range;
the obstacle recognition fusion module is also used for fusing the vision obstacle information and the radar obstacle information and acquiring the obstacle information.
5. The unmanned vehicle system of claim 3, wherein the traffic sign fusion module comprises a visual traffic sign detection unit and a radar traffic sign detection unit;
the visual traffic sign detection unit detects the image information and extracts visual traffic sign information; the radar traffic sign detection unit is used for extracting ground traffic sign information and detecting suspended traffic sign information within a second preset height range;
the traffic sign fusion module is also used for determining the position of the traffic sign information according to the ground traffic sign information and the suspended traffic sign information and obtaining the category of the traffic sign information in the position area.
6. The unmanned vehicle system of claim 3, wherein the obstacle dynamic tracking fusion module comprises a visual dynamic tracking unit and a radar dynamic tracking unit;
the visual dynamic tracking unit is used for recognizing the image information, locating dynamic obstacles in two consecutive frames, and acquiring color information of the dynamic obstacles; the radar dynamic tracking unit is used for tracking three-dimensional coordinate information of dynamic obstacles;
the obstacle dynamic tracking fusion module is also used for fusing the color information of the dynamic obstacle and the three-dimensional coordinate information of the dynamic obstacle and acquiring the tracking information of the dynamic obstacle.
7. The unmanned vehicle system of claim 1, wherein the context awareness subsystem further comprises:
the GPS positioning navigator is used for acquiring the current geographic position and time of the unmanned automobile;
an inertia measurement unit for measuring a vehicle posture of the unmanned vehicle;
the vehicle speed acquisition module is used for acquiring the current running speed of the unmanned automobile.
8. The unmanned vehicle system of claim 1, further comprising:
and the communication subsystem is used for transmitting the running path planned by the path planning decision subsystem to an external monitoring center in real time.
9. An automobile comprising the unmanned automobile system according to any one of claims 1 to 8.
CN201710136501.6A 2017-03-08 2017-03-08 Unmanned automobile system and automobile Active CN107161141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710136501.6A CN107161141B (en) 2017-03-08 2017-03-08 Unmanned automobile system and automobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710136501.6A CN107161141B (en) 2017-03-08 2017-03-08 Unmanned automobile system and automobile

Publications (2)

Publication Number Publication Date
CN107161141A CN107161141A (en) 2017-09-15
CN107161141B true CN107161141B (en) 2023-05-23

Family

ID=59848697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710136501.6A Active CN107161141B (en) 2017-03-08 2017-03-08 Unmanned automobile system and automobile

Country Status (1)

Country Link
CN (1) CN107161141B (en)

Also Published As

Publication number Publication date
CN107161141A (en) 2017-09-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant