CN108458746A - Sensor-based adaptive fusion method - Google Patents

Sensor-based adaptive fusion method

Info

Publication number
CN108458746A
CN108458746A (application CN201711411205.9A)
Authority
CN
China
Prior art keywords
information
road
sensor
self
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711411205.9A
Other languages
Chinese (zh)
Inventor
李纪先
张远清
李振
王成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Of Tianjin Ka Yip Medical Technology Development Co Ltd
Original Assignee
State Of Tianjin Ka Yip Medical Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Of Tianjin Ka Yip Medical Technology Development Co Ltd
Priority to CN201711411205.9A
Publication of CN108458746A
Legal status: Pending

Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
                • G01D 21/00 Measuring or testing not otherwise provided for
                    • G01D 21/02 Measuring two or more variables by means not covered by a single other subclass
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
                    • G01C 21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
                    • G01C 21/10 Navigation by using measurements of speed or acceleration
                        • G01C 21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
                            • G01C 21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
                                • G01C 21/165 Inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a sensor-based adaptive fusion method in which an automatic driving vehicle obtains a preliminary position from the GPS receiver loaded on it; road information is retrieved from map data according to the accurate position, the retrieved road information is analyzed, the corresponding sensors are switched on accordingly, and the final result is output. According to the given driving scene, the invention dynamically optimizes the sensor configuration, effectively reducing the vehicle's computing load and saving power.

Description

Sensor-based adaptive fusion method
Technical field
The invention belongs to the technical field of automatic driving and relates in particular to a sensor-based adaptive fusion method.
Background technology
An automatic driving vehicle needs "eyes" of its own to perceive the surrounding environment while driving on the road; these eyes are sensors of various types, such as lidar, millimeter-wave radar and cameras. An autonomous vehicle must pass sufficient road testing before it may drive on public roads. When the automatic driving vehicle travels in different road environments, such as structured roads, unstructured roads, mountain roads or desert roads, using one and the same environment-perception model often either occupies a large amount of computing resources or leaves the sensor resources over-redundant; in addition, the detection range is short, the false-detection rate is high, and the system is strongly affected by illumination changes.
Invention content
In view of this, the invention aims to provide a sensor-based adaptive fusion method that is not limited to a particular environment.
To achieve the above objective, the technical solution of the invention is realized as follows:
A sensor-based adaptive fusion method comprises the following steps:
(1) obtaining preliminary positioning information through a GPS receiver loaded on the vehicle;
(2) acquiring images of the vehicle's surroundings through an on-board camera, and removing environmental noise with an image feature-extraction algorithm to obtain surrounding-environment information;
(3) obtaining the vehicle's attitude information through the on-board inertial navigation equipment;
(4) fusing the preliminary positioning information of step (1), the environment information of step (2) and the attitude information of step (3) by a PF (particle filter) fusion algorithm to obtain the vehicle's precise position information;
(5) matching the precise position information of step (4) against a map database by feature comparison to obtain road information, analyzing the road information together with weather information, and controlling which sensors are switched on;
(6) identifying and tracking obstacles with the corresponding sensors.
Further, the error of the preliminary positioning result of step (1) is not more than 10 meters.
Further, the attitude information of step (3) includes the vehicle's position information, velocity information, acceleration information and angular-velocity information.
Further, the road information of step (5) includes structured roads and unstructured roads: a structured road has a flat, even surface, carries traffic marking lines, and is regular with no obstacles on either side;
an unstructured road has a rough surface, carries no traffic marking lines, and has irregular obstacles on both sides.
Further, the sensors of step (5) include lidar, millimeter-wave radar and cameras.
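Steps (1) to (6) can be sketched end to end as follows; this is a minimal illustration in which a crude weighted correction stands in for the particle-filter fusion of step (4), and the map lookup and sensor sets are hypothetical, not the patent's interfaces:

```python
def localize(gps_xy, visual_offset, imu_offset):
    """Steps (1)-(4): refine a coarse GPS fix (error up to ~10 m) with
    camera-based and inertial corrections. A crude stand-in for the
    particle-filter fusion described in the patent."""
    return (gps_xy[0] + 0.5 * (visual_offset[0] + imu_offset[0]),
            gps_xy[1] + 0.5 * (visual_offset[1] + imu_offset[1]))

def adaptive_fusion_step(gps_xy, visual_offset, imu_offset, road_db):
    """One pass of the adaptive fusion method (hypothetical interfaces)."""
    # Step (5): match the refined position against the map database.
    pos = localize(gps_xy, visual_offset, imu_offset)
    road = road_db.get((round(pos[0]), round(pos[1])), "unstructured")
    # Steps (5)-(6): the road type decides which sensors are switched on
    # to identify and track obstacles.
    sensors = {"structured": ("camera",),
               "unstructured": ("camera", "lidar")}[road]
    return pos, road, sensors
```

A map database keyed by rounded coordinates is of course a simplification; the patent matches map features rather than grid cells.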
Compared with the prior art, the sensor-based adaptive fusion method of the invention has the following advantages:
(1) according to the given driving scene, the invention dynamically optimizes the sensor configuration, effectively reducing the vehicle's computing load and saving power;
(2) according to the different driving scenes, the sensors can back up and check one another; the invention performs sensor-integrity and working-state checks and switches between different sensor models, enhancing the robustness of the system.
Description of the drawings
The accompanying drawings, which constitute a part of the invention, are provided to aid understanding of the invention; the illustrative embodiments of the invention and their description serve to explain the invention and do not unduly limit it. In the drawings:
Fig. 1 is a structural schematic diagram of an embodiment of the invention;
Fig. 2 is an attitude-correction schematic diagram of an embodiment of the invention;
Fig. 3 is an automatic-driving schematic diagram of an embodiment of the invention;
Fig. 4 is an automatic-driving operation schematic diagram of an embodiment of the invention.
Specific implementation mode
It should be noted that, in the absence of conflict, the embodiments of the invention and the features in the embodiments can be combined with each other.
In the description of the invention, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore must not be understood as limiting the invention. In addition, the terms "first", "second", etc. are used for description only and must not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features concerned; a feature defined with "first", "second", etc. may thus explicitly or implicitly include one or more such features. In the description of the invention, unless otherwise stated, "plurality" means two or more.
A sensor-based adaptive fusion method comprises the following steps:
(1) obtaining preliminary positioning information through a GPS receiver loaded on the vehicle;
(2) acquiring images of the vehicle's surroundings through an on-board camera, and removing environmental noise with an image feature-extraction algorithm to obtain surrounding-environment information;
(3) obtaining the vehicle's attitude information through the on-board inertial navigation equipment;
(4) fusing the preliminary positioning information of step (1), the environment information of step (2) and the attitude information of step (3) by a PF (particle filter) fusion algorithm to obtain the vehicle's precise position information;
(5) matching the precise position information of step (4) against the map database by feature comparison to obtain road information, analyzing the road information together with weather information, and controlling which sensors are switched on;
(6) identifying and tracking obstacles with the corresponding sensors.
The error of the preliminary positioning result of step (1) is not more than 10 meters.
The attitude information of step (3) includes the vehicle's position information, velocity information, acceleration information and angular-velocity information.
The road information of step (5) includes structured roads and unstructured roads: a structured road has a flat, even surface, carries traffic marking lines, and is regular with no obstacles on either side;
an unstructured road has a rough surface, carries no traffic marking lines, and has irregular obstacles on both sides. The sensors include lidar, millimeter-wave radar and cameras.
The logical processing of the fusion method of the present invention is as follows.
As shown in Fig. 1, the automatic driving vehicle positions itself accurately by means of its on-board sensing equipment:
The automatic driving vehicle obtains its preliminary global position from the on-board GPS receiver; the error of this positioning is of the order of 10 m. The on-board camera provides information about the vehicle's surroundings, from which a feature-extraction algorithm removes environmental noise to obtain the main information around the vehicle. The inertial navigation equipment can directly provide the vehicle's attitude information, but because it accumulates error, the particle filter fusion algorithm continuously corrects the attitude provided by the inertial navigation, yielding relatively accurate vehicle position, velocity, acceleration, angular velocity and other information, as shown in Fig. 3. Feature comparison against the vehicle's map data then yields the accurate position of the automatic driving vehicle and the lane-line information around it. The specific PF particle filter fusion is computed as follows.
An arbitrary probability distribution $p(x_k)$ can be approximated over a discrete particle set by Monte Carlo sampling:

$$p(x_k) \approx \sum_{i=1}^{N_k} w_k^i \, \delta(x_k - x_k^i)$$

where $x_k^i$, $w_k^i$ and $N_k$ denote the state of particle $i$ at time $k$, its weight and the total number of particles, and $\delta(\cdot)$ denotes the Dirac delta function.
Assuming there are $N$ globally sampled particles, the state of each particle can be expressed as

$$x_k^i = [\,p_k^i,\ v_k^i,\ a_k^i,\ \omega_k^i\,]$$

where the state set characterizes the vehicle's position, velocity, acceleration, angular velocity, etc.
The weight of each corresponding particle is

$$w_k^i \propto p(y_k \mid x_k^i)$$

where $y_k$ is the observation at time $k$, i.e. the observations of all types of sensors.
The state-transition equation of the particle filter model is

$$x_k^i = f(x_{k-1}^i) + u_{k-1}$$

after which the weights are normalized, $w_k^i \leftarrow w_k^i \,/\, \sum_{j=1}^{N_k} w_k^j$.
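A toy one-dimensional version of this predict / re-weight / normalize cycle, using only the Python standard library; the Gaussian motion and observation models and their standard deviations are illustrative assumptions, not the patent's models:

```python
import math
import random

def particle_filter_step(particles, weights, observation,
                         obs_std=1.0, motion_std=0.5):
    """One cycle of the particle filter sketched above, for a 1-D state."""
    # State transition: propagate each particle with Gaussian process noise.
    particles = [x + random.gauss(0.0, motion_std) for x in particles]
    # Re-weight each particle by the likelihood p(y_k | x_k^i).
    weights = [w * math.exp(-0.5 * ((observation - x) / obs_std) ** 2)
               for x, w in zip(particles, weights)]
    # Weight normalization.
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Monte Carlo point estimate: the weighted mean of the particle states.
    estimate = sum(x * w for x, w in zip(particles, weights))
    return particles, weights, estimate
```

A full implementation would also resample when the effective particle count drops; that step is omitted here for brevity.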
Then, the sensor data are fused adaptively according to the feature-map information and the current situation.
The road information includes, but is not limited to, the number of lane lines, the lane-line width, solid/dashed-line information, whether there are sign boards, stop lines, height/weight-limit signs, traffic lights and speed-limit signs, and the current road type (including but not limited to structured roads, unstructured roads, urban roads and mountain highways).
Specifically: (1) structured road: the road surface is clean and tidy, the lane lines and stop lines are clear, and both sides of the road are neat and regular; (2) unstructured road: the road surface is rough and blurred, the lane lines and stop lines are essentially unrecognizable, and both sides of the road contain irregular objects such as temporarily parked fruit stalls.
The current condition information includes traffic-congestion conditions and weather information (e.g. snow and fog), with concrete cases as in the following table:
According to the vehicle's accurate position and the current road condition, the vehicle's current driving state is classified as simple, general or complex; according to the driving state and the available sensor types, the corresponding sensors are turned on or off, including the following cases:
1) On a structured road with unobstructed traffic and bright, clear light, the full camera perception model can be opened.
2) On a structured road with congested traffic, the camera and the lidar are opened.
3) On a structured road with unobstructed traffic but rapidly changing light (e.g. entering or leaving a tunnel) or rather dark conditions, the lidar and millimeter-wave radar perception models can be opened.
4) On an unstructured road with unobstructed traffic and bright, clear light, the camera mode is opened, optionally with partial lidar.
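The four cases above amount to a small decision function from road type and conditions to a sensor set; an illustrative sketch (the boolean flags and the returned sensor names are assumptions, not the patent's interface):

```python
def select_sensors(road_type, congested=False, low_light=False):
    """Cases 1)-4) above as a decision function. The flag names and the
    returned sensor sets are illustrative stand-ins."""
    if road_type == "structured":
        if low_light:           # case 3: tunnels, dusk -> active sensing
            return {"lidar", "radar"}
        if congested:           # case 2: dense traffic -> add depth from lidar
            return {"camera", "lidar"}
        return {"camera"}       # case 1: clear, well-lit structured road
    # case 4: unstructured but clear road -> camera, optionally partial lidar
    return {"camera", "lidar"}
```

Encoding the policy as one function keeps the "simple / general / complex" classification and the sensor switching in a single auditable place.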
Thereafter, the lidar opened above clusters its laser point cloud, gathering points that are close to one another in distance; when the point-cloud distances satisfy a regular Gaussian distribution, the cluster is regarded as an obstacle to track. Meanwhile, the camera captures data around the vehicle and, through feature extraction, obtains information about the surrounding obstacles. The two kinds of information are fused and associated by the particle filter model to generate a dynamic obstacle track list, and an extended Kalman filter model dynamically tracks the position, velocity, heading, angular velocity and other information of each obstacle.
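The distance-based point-cloud grouping described above can be illustrated with a toy greedy clusterer; the threshold and the single-pass greedy strategy are illustrative simplifications, and the patent's additional Gaussian-distribution check on each cluster is omitted here:

```python
import math

def cluster_points(points, max_gap=1.0):
    """Greedily group points whose distance to some member of an existing
    cluster is at most max_gap; each resulting cluster is a candidate
    obstacle to hand to the tracker."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= max_gap for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])  # no nearby cluster: start a new one
    return clusters
```

A production system would use a proper spatial index (e.g. a grid or k-d tree) instead of this quadratic scan, but the grouping logic is the same.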
Kalman filtering assumes that the posterior probability at each time is Gaussian (the premise of Kalman filtering). The discrete model of the current system is

$$x_k = F_k x_{k-1} + w_{k-1}, \qquad z_k = H_k x_k + v_k$$

where $F_k$ and $H_k$ are the known system matrix and measurement matrix, and $w_{k-1}$ and $v_k$ are mutually independent process and measurement noises with zero mean and variances $Q_{k-1}$ and $R_k$ respectively. Kalman filtering is a typical minimum-variance estimation method. In the present scheme, $x_k$ denotes the position, velocity, acceleration, angular velocity and other information of the tracked obstacle, and $z_k$ denotes the measured values of all types of sensors at time $k$.
The time-update equations of the Kalman filter are

$$\hat{x}_{k+1|k} = F_{k+1} \hat{x}_{k|k}, \qquad P_{k+1|k} = F_{k+1} P_{k|k} F_{k+1}^T + Q_k$$

and the measurement-update equations are

$$K_{k+1} = P_{k+1|k} H_{k+1}^T \left( H_{k+1} P_{k+1|k} H_{k+1}^T + R_{k+1} \right)^{-1}$$
$$\hat{x}_{k+1|k+1} = \hat{x}_{k+1|k} + K_{k+1} \left( z_{k+1} - H_{k+1} \hat{x}_{k+1|k} \right)$$
$$P_{k+1|k+1} = (I - K_{k+1} H_{k+1}) P_{k+1|k}$$

where $K$ is the gain matrix of the Kalman filter.
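For a scalar (one-dimensional) state these time- and measurement-update equations collapse to a few lines, since the matrix inverse becomes a division; the noise values and the constant-position model below are illustrative, not the patent's:

```python
def kalman_step(x, P, z, F=1.0, H=1.0, Q=0.01, R=1.0):
    """One scalar Kalman time update plus measurement update."""
    # Time update: x_{k+1|k} = F x_{k|k},  P_{k+1|k} = F P F + Q.
    x_pred = F * x
    P_pred = F * P * F + Q
    # Measurement update: gain K = P H / (H P H + R), then correct.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```

Feeding a constant measurement repeatedly drives the estimate toward that value while the covariance settles at a small steady state, which is a convenient sanity check.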
The working process of the automatic driving vehicle of the present invention is as follows:
After the vehicle starts, it detects whether all devices (including but not limited to the sensing equipment) are in a normal working state; if any device is not in a normal state, detection is repeated. If the result is still abnormal after three repeated detections, the result is reported to the user. If the detection result is that the devices are only partially usable, the vehicle judges whether the minimum perception capability is still satisfied; if it is not, the result is reported to the user, and the user can enter the manual driving mode.
If the above state is normal, the autonomous vehicle enters the environment-perception stage and adaptively configures the perception model according to the information described in the feature map and the traffic information.
While the vehicle is travelling, the working state of the on-board sensors must be checked at intervals, e.g. for knocks, skew or even damage. The preceding process is then repeated according to the latest sensor state.
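The start-up self-test loop described above (detect, retry up to three times, then report) can be sketched as follows; the callback interface and the status strings are hypothetical stand-ins for reporting the result to the user:

```python
def startup_check(check_device, max_retries=3):
    """Run the device self-test up to max_retries times; report "normal"
    as soon as a check passes, otherwise "abnormal"."""
    for _ in range(max_retries):
        if check_device():
            return "normal"
    return "abnormal"
```

The same loop can be reused for the periodic in-drive sensor checks by calling it on a timer with each sensor's own test callback.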
The foregoing are merely preferred embodiments of the invention and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall be included within its scope of protection.

Claims (7)

1. A sensor-based adaptive fusion method, characterized by comprising the following steps:
(1) obtaining preliminary positioning information through a GPS receiver loaded on the vehicle;
(2) acquiring images of the vehicle's surroundings through an on-board camera, and removing environmental noise with an image feature-extraction algorithm to obtain surrounding-environment information;
(3) obtaining the vehicle's attitude information through the on-board inertial navigation equipment;
(4) fusing the preliminary positioning information of step (1), the environment information of step (2) and the attitude information of step (3) by a PF (particle filter) fusion algorithm to obtain the vehicle's precise position information;
(5) matching the precise position information of step (4) against a map database by feature comparison to obtain road information, analyzing the road information together with weather information, and controlling which sensors are switched on;
(6) identifying and tracking obstacles with the corresponding sensors.
2. The sensor-based adaptive fusion method according to claim 1, characterized in that the error of the preliminary positioning result of step (1) is not more than 10 meters.
3. The sensor-based adaptive fusion method according to claim 1, characterized in that the attitude information of step (3) includes the vehicle's position information, velocity information, acceleration information and angular-velocity information.
4. The sensor-based adaptive fusion method according to claim 1, characterized in that the road information of step (5) includes structured roads and unstructured roads; a structured road has a flat, even surface, carries traffic marking lines, and is regular with no obstacles on either side,
while an unstructured road has a rough surface, carries no traffic marking lines, and has irregular obstacles on both sides.
5. The sensor-based adaptive fusion method according to claim 1, characterized in that the sensors of step (5) include lidar, millimeter-wave radar and cameras.
6. The sensor-based adaptive fusion method according to claim 1, characterized in that the lidar includes single-line lidar, multi-line lidar and three-dimensional omnidirectional lidar.
7. The sensor-based adaptive fusion method according to claim 1, characterized in that the millimeter-wave radar is used to monitor short-range driving blind spots.
CN201711411205.9A 2017-12-23 2017-12-23 Sensor-based adaptive fusion method Pending CN108458746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711411205.9A CN108458746A (en) 2017-12-23 2017-12-23 Sensor-based adaptive fusion method


Publications (1)

Publication Number Publication Date
CN108458746A (en) 2018-08-28

Family

ID=63220729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711411205.9A Pending CN108458746A (en) 2017-12-23 2017-12-23 One kind being based on sensor method for self-adaption amalgamation

Country Status (1)

Country Link
CN (1) CN108458746A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140032167A1 (en) * 2011-04-01 2014-01-30 Physical Sciences, Inc. Multisensor Management and Data Fusion via Parallelized Multivariate Filters
CN104573733A (en) * 2014-12-26 2015-04-29 上海交通大学 High-precision map generation system and method based on high-definition ortho-photo map
CN105109484A (en) * 2015-08-21 2015-12-02 奇瑞汽车股份有限公司 Target-barrier determining method and device
CN205940608U (en) * 2016-04-29 2017-02-08 沈阳承泰科技有限公司 Multi -sensor fusion's car radar system
CN106441319A (en) * 2016-09-23 2017-02-22 中国科学院合肥物质科学研究院 System and method for generating lane-level navigation map of unmanned vehicle
CN106840242A (en) * 2017-01-23 2017-06-13 驭势科技(北京)有限公司 The sensor self-checking system and multi-sensor fusion system of a kind of intelligent driving automobile
CN107328410A (en) * 2017-06-30 2017-11-07 百度在线网络技术(北京)有限公司 Method and automobile computer for positioning automatic driving vehicle
CN107422730A (en) * 2017-06-09 2017-12-01 武汉市众向科技有限公司 The AGV transportation systems of view-based access control model guiding and its driving control method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
An Jiyao et al., "Multi-sensor data fusion method for autonomous vehicle navigation", Automotive Engineering *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110969059A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Lane line identification method and system
CN110969178A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Data fusion system and method for automatic driving vehicle and automatic driving system
CN110969178B (en) * 2018-09-30 2023-09-12 毫末智行科技有限公司 Data fusion system and method for automatic driving vehicle and automatic driving system
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 Road vehicle detection method based on fusion of roadside millimeter-wave radar and machine vision
CN110532896B (en) * 2019-08-06 2022-04-08 北京航空航天大学 Road vehicle detection method based on fusion of road side millimeter wave radar and machine vision
CN112650212A (en) * 2019-10-11 2021-04-13 丰田自动车株式会社 Remote automatic driving vehicle and vehicle remote indicating system
CN111077556A (en) * 2020-01-02 2020-04-28 东南大学 Airport luggage tractor positioning device and method integrating Beidou and multiple sensors
CN111192295A (en) * 2020-04-14 2020-05-22 中智行科技有限公司 Target detection and tracking method, related device and computer readable storage medium
CN111192295B (en) * 2020-04-14 2020-07-03 中智行科技有限公司 Target detection and tracking method, apparatus, and computer-readable storage medium
CN111966108A (en) * 2020-09-02 2020-11-20 成都信息工程大学 Extreme weather unmanned control system based on navigation system

Similar Documents

Publication Publication Date Title
CN108458746A (en) One kind being based on sensor method for self-adaption amalgamation
US11982540B2 (en) Infrastructure mapping and layered output
US11685360B2 (en) Planning for unknown objects by an autonomous vehicle
US11874119B2 (en) Traffic boundary mapping
US10281920B2 (en) Planning for unknown objects by an autonomous vehicle
US10234864B2 (en) Planning for unknown objects by an autonomous vehicle
CN109641589B (en) Route planning for autonomous vehicles
CN102208012B (en) Landscape coupling reference data generation system and position measuring system
US9037403B2 (en) Intensity map-based localization with adaptive thresholding
US20150307107A1 (en) Driver performance determination based on geolocation
CN106462727A (en) Systems and methods for lane end recognition
CN105620489A (en) Driving assistance system and real-time warning and prompting method for vehicle
JP7024610B2 (en) Detection device and detection system
US11734880B2 (en) Sensor calibration with environment map
JP2023106536A (en) System for vehicle navigation based on image analysis
CN108458745A (en) A kind of environment perception method based on intelligent detection equipment
CN116783455A (en) System and method for detecting an open door
CN110307841A (en) One kind measuring incomplete vehicle movement parameter estimation method based on information
CN109241855A (en) Intelligent vehicle based on stereoscopic vision can travel area detection method
WO2018165199A1 (en) Planning for unknown objects by an autonomous vehicle
CN114426030A (en) Pedestrian passing intention estimation method, device and equipment and automobile
CN117198078A (en) Signal lamp regulation and control method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180828