CN114545385A - Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar


Info

Publication number
CN114545385A
Authority
CN
China
Prior art keywords
target, radar, camera, split, initial
Prior art date
2022-02-18
Legal status
Pending
Application number
CN202210152602.3A
Other languages
Chinese (zh)
Inventor
郑艳 (Zheng Yan)
唐为林 (Tang Weilin)
胡益汀 (Hu Yiting)
陈斌 (Chen Bin)
Current Assignee
Huayu Automotive Systems Co Ltd
Original Assignee
Huayu Automotive Systems Co Ltd
Priority date
2022-02-18
Filing date
2022-02-18
Publication date
2022-05-27
Application filed by Huayu Automotive Systems Co Ltd
Priority to CN202210152602.3A
Publication of CN114545385A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/017 - Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175 - Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar, comprising the following steps: step S1, the vehicle-mounted forward-looking camera collects an image of the area in front of the vehicle and outputs camera target information; if the camera target type is a truck, step S2 is performed, otherwise the process ends; step S2, an initial radar target is searched for and initial radar target information is output; step S3, the main target and the split targets within the initial radar target are distinguished; if the initial radar target includes no split target, the main target information is taken as the final radar target information and step S5 is performed; if the initial radar target includes split targets, step S4 is performed; step S4, the main target information and the split target information are acquired respectively, and the final radar target information is acquired from them; step S5, the fusion target is acquired by using a fusion algorithm. The invention enables the ADAS system to enter an acceleration mode or a braking mode in advance, so that each function can be executed more smoothly and comfortably.

Description

Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar
Technical Field
The invention relates to the technical field of automotive driver assistance, and in particular to a fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar.
Background
In an application scenario of an advanced driver assistance system (ADAS), when an obstacle appears in front of the vehicle or the distance to the vehicle ahead becomes too short, the ADAS can trigger a braking function to keep the journey safe. Large trucks, however, are a special class of targets: the vehicle body is long, wide and large, so when the millimeter wave radar detects such a truck, many reflection points are generated and are often clustered into several separate targets.
As shown for the large truck target of fig. 1, the radar reflection points are clustered into 3 targets, located at the rear of the vehicle, the side of the body and the side of the cab respectively, i.e. the solid points R1, R2 and R3 in the figure. The camera target is output as a recognition rectangle formed at the rear of the vehicle; C1 in the figure is the position of the truck target output by the camera. Because the distance deviation between R2 or R3 and C1 is too large, selecting R2 or R3 as the radar target point would make the distance judgment of the fusion target wrong, so the fusion algorithm usually excludes them from the association range of C1 and associates C1 with R1 to form the fusion target. However, when such a truck changes lane into the lane of the ego vehicle ahead of it, i.e. R3 or R2 has already entered the lane, the system may still consider the target fused from C1 and R1 to be in the adjacent lane; when the ADAS receives such information, especially with the ACC function turned on, braking starts too late and becomes dangerous.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar for fusing large-truck targets, so that when the vehicle detects a large truck ahead, the advanced driver assistance system (ADAS) can enter an acceleration mode or a braking mode in advance and driving safety is guaranteed.
The invention provides a fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar, which comprises the following steps:
step S1, the vehicle-mounted front-view camera collects the image in front of the vehicle, acquires a camera target and outputs camera target information, if the type of the camera target is a truck, the step S2 is carried out, otherwise, the flow is ended;
step S2, finding an initial radar target formed by a forward millimeter wave radar on a truck in the track range of the vehicle-mounted forward-looking camera, and outputting initial radar target information;
step S3, distinguishing a main target and a split target in the initial radar target according to the initial radar target information, if the initial radar target does not include the split target, acquiring the main target information as final radar target information, and performing step S5; if the initial radar target comprises the split target, the step S4 is carried out;
step S4, respectively acquiring main target information and split target information, and acquiring final radar target information according to the main target information and the split target information;
and step S5, acquiring a fusion target by using a fusion algorithm according to the final radar target information and the camera target information.
Further, the camera target information includes a camera target type, a width of the camera target, a longitudinal distance of the camera target, a transverse distance of the camera target, a longitudinal speed of the camera target, a transverse speed of the camera target, and a camera target state.
Further, the initial radar target information includes an energy level of an electromagnetic wave reflected by the initial radar target, a width of the initial radar target, a longitudinal distance of the initial radar target, a lateral distance of the initial radar target, a longitudinal speed of the initial radar target, a lateral speed of the initial radar target, and an initial radar target state.
Further, the step S3 further includes: judging whether the number of the split targets exceeds two, if so, ending the process; if not, the process proceeds to step S4.
Further, the information of the main target includes the energy of the electromagnetic wave reflected by the main target, the width of the main target, the longitudinal distance of the main target, the transverse distance of the main target, the longitudinal velocity of the main target, the transverse velocity of the main target and the state of the main target.
Further, the split target information includes the energy of the electromagnetic wave reflected by the split target, the width of the split target, the longitudinal distance of the split target, the transverse distance of the split target, the longitudinal speed of the split target, the transverse speed of the split target, and the state of the split target.
Further, the method for acquiring the final radar target information in step S4 specifically includes: and taking the electromagnetic wave energy reflected by the main target as the electromagnetic wave energy of the final radar target, taking the width of the main target as the width of the final radar target, taking the longitudinal distance of the main target as the longitudinal distance of the final radar target, taking the transverse distance of the main target as the transverse distance of the final radar target, taking the longitudinal speed of the main target as the longitudinal speed of the final radar target, taking the transverse speed of the main target as the transverse speed of the final radar target, and taking the state of the split target positioned at the foremost end of the truck as the state of the final radar target.
Aiming at the special target type of trucks, the invention adopts a one-primary, multiple-attached representation to describe the fusion target, so that the ADAS system can enter an acceleration mode or a braking mode in advance, and each function can be executed more smoothly and comfortably.
Drawings
FIG. 1 is a schematic diagram of large truck target fusion.
FIG. 2 is a flowchart of the fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar according to the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in fig. 2, the method for detecting a fusion target based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar provided by the invention comprises the following steps:
Step S1, the vehicle-mounted forward-looking camera collects an image of the area in front of the vehicle, acquires a camera target, and outputs camera target information, where the camera target information comprises the camera target type (such as pedestrian, bicycle, car, truck and the like), the width of the camera target, the longitudinal distance of the camera target, the transverse distance of the camera target, the longitudinal speed of the camera target, the transverse speed of the camera target and the camera target state (such as stationary, accelerating, decelerating, cutting in, cutting out and the like); when the camera target type is truck, step S2 is performed, otherwise the process ends.
Camera target information C_m can be expressed by the following formula:
C_m = {CT_type, CT_width, CT_dx, CT_dy, CT_vx, CT_vy, CT_status}
In the formula, CT_type represents the camera target type; CT_width represents the width of the camera target; CT_dx represents the longitudinal distance of the camera target; CT_dy represents the lateral distance of the camera target; CT_vx represents the longitudinal velocity of the camera target; CT_vy represents the lateral velocity of the camera target; CT_status represents the camera target state.
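Purely as an illustration, the camera target record C_m described above can be grouped into a plain data structure. The C sketch below is an assumption for readability: the struct and enum names, the enumerated values and the float units are not part of the patent.
```c
/* Hypothetical layout of the camera target record C_m; field names mirror
 * the symbols in the formula above, everything else is assumed. */
typedef enum { CT_PEDESTRIAN, CT_BICYCLE, CT_CAR, CT_TRUCK } CameraTargetType;
typedef enum { ST_STATIONARY, ST_ACCELERATING, ST_DECELERATING, ST_CUT_IN, ST_CUT_OUT } TargetStatus;

typedef struct {
    CameraTargetType CT_type;   /* camera target type, e.g. truck    */
    float CT_width;             /* width of the camera target [m]    */
    float CT_dx;                /* longitudinal distance [m]         */
    float CT_dy;                /* lateral distance [m]              */
    float CT_vx;                /* longitudinal velocity [m/s]       */
    float CT_vy;                /* lateral velocity [m/s]            */
    TargetStatus CT_status;     /* motion state of the camera target */
} CameraTarget;
```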
And step S2, finding the initial radar target formed by the forward millimeter wave radar on the truck within the track range of the vehicle-mounted forward-looking camera, and outputting the initial radar target information. The initial radar target information includes the energy of the electromagnetic wave reflected by the initial radar target, the width of the initial radar target, the longitudinal distance of the initial radar target, the lateral distance of the initial radar target, the longitudinal velocity of the initial radar target, the lateral velocity of the initial radar target, and the initial radar target state (e.g. stationary, accelerating, decelerating, cutting in, cutting out, etc.). The initial radar target found must be capable of being fused with the camera target, i.e. the distance difference, speed difference, track initiation and other parameters of the camera target and the radar target satisfy the association strategy of the current fusion algorithm. For example, when the longitudinal distance CT_dx of the current camera target lies within the range (D1, D2), a target eligible for fusion association must have a distance difference from the camera target within Δd, a speed difference from the camera target within Δv, and a track that has been associated at least once by the fusion track.
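The association condition in the example above (camera longitudinal distance within (D1, D2), distance difference within Δd, speed difference within Δv) could be checked with a small gate such as the one sketched below; the function and parameter names are assumptions, and the patent does not prescribe this implementation.
```c
#include <math.h>
#include <stdbool.h>

/* Hypothetical association gate between one camera target and one radar
 * target; D1, D2, delta_d and delta_v are assumed calibration parameters. */
bool can_associate(float ct_dx, float ct_vx,       /* camera target distance / speed */
                   float rt_dx, float rt_vx,       /* radar target distance / speed  */
                   float d1, float d2,             /* valid camera range (D1, D2)    */
                   float delta_d, float delta_v)   /* gating thresholds              */
{
    if (ct_dx <= d1 || ct_dx >= d2)
        return false;                               /* camera target outside (D1, D2) */
    return fabsf(ct_dx - rt_dx) <= delta_d &&       /* distance difference gate       */
           fabsf(ct_vx - rt_vx) <= delta_v;         /* speed difference gate          */
}
```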
Initial radar target information R_m can be expressed by the following formula:
R_m = {RT_rcs, RT_width, RT_dx, RT_dy, RT_vx, RT_vy, RT_status, RT_num, RT_expandInfo[num-1]}
In the formula, RT_rcs represents the energy of the electromagnetic wave reflected by the initial radar target; RT_width represents the width of the initial radar target; RT_dx represents the longitudinal distance of the initial radar target; RT_dy represents the lateral distance of the initial radar target; RT_vx represents the longitudinal velocity of the initial radar target; RT_vy represents the lateral velocity of the initial radar target; RT_status represents the initial radar target state; RT_num represents the number of sub-targets in the initial radar target. If RT_num > 1, the initial radar target comprises a main target and split targets, and the initial radar target information R_m becomes a new record that contains the main target and all split targets; RT_expandInfo[num-1] represents the split target information.
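As a hypothetical counterpart of R_m, the sketch below keeps the main-target fields alongside RT_num and the RT_expandInfo split-target list; the array bound, the integer status code and the field types are assumptions rather than details given in the patent.
```c
#define MAX_SPLIT_TARGETS 2   /* assumption: at most two split targets are kept (see step S3) */

/* One radar detection, used for the main target and for each split target. */
typedef struct {
    float RT_rcs;     /* reflected electromagnetic-wave energy (RCS)                            */
    float RT_width;   /* width [m]                                                              */
    float RT_dx;      /* longitudinal distance [m]                                              */
    float RT_dy;      /* lateral distance [m]                                                   */
    float RT_vx;      /* longitudinal velocity [m/s]                                            */
    float RT_vy;      /* lateral velocity [m/s]                                                 */
    int   RT_status;  /* state code: stationary / accelerating / decelerating / cut-in / cut-out */
} RadarSubTarget;

/* Initial radar target R_m: the main target plus up to RT_num - 1 split targets. */
typedef struct {
    RadarSubTarget main_target;                       /* RT_rcs .. RT_status of the main target */
    int RT_num;                                       /* number of sub-targets, main included   */
    RadarSubTarget RT_expandInfo[MAX_SPLIT_TARGETS];  /* split target information               */
} InitialRadarTarget;
```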
Step S3, distinguishing a main target and a split target in the initial radar target according to the initial radar target information, if the initial radar target does not include the split target, acquiring the main target information as final radar target information, and performing step S5; if the initial radar target includes the split target, the process proceeds to step S4.
The main target and the split targets within the initial radar target are distinguished as follows: when a plurality of targets with consistent speed and distance exist within the initial radar target, they are judged to be split targets, and among them the target with the smallest longitudinal distance is taken as the main target.
In order to avoid erroneous judgment, step S3 further includes: judging whether the number of split targets exceeds two; if so, the found initial radar target is actually a vehicle in front of the truck and the process ends; if not, the process proceeds to step S4.
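One possible implementation of this partition and of the two-split limit is sketched below. The tolerances that define "consistent speed and distance" are not specified in the patent, so the values used here are assumptions.
```c
#include <math.h>
#include <stdbool.h>

#define SPEED_TOL 0.5f    /* [m/s], assumed tolerance for "consistent speed"  */
#define DIST_TOL  15.0f   /* [m], assumed tolerance, roughly one truck length */

typedef struct { float dx, vx; } Detection;   /* longitudinal distance / velocity */

/* Returns the index of the main target (smallest longitudinal distance) and
 * fills split_idx with the indices of the split targets; returns -1 when more
 * than two split targets are found, i.e. the detections belong to a vehicle
 * in front of the truck and the flow ends as described in step S3. */
int partition_targets(const Detection *det, int n, int split_idx[], int *n_split)
{
    if (n <= 0)
        return -1;

    int main_idx = 0;
    for (int i = 1; i < n; ++i)                 /* main target: smallest dx */
        if (det[i].dx < det[main_idx].dx)
            main_idx = i;

    *n_split = 0;
    for (int i = 0; i < n; ++i) {
        if (i == main_idx)
            continue;
        bool consistent = fabsf(det[i].vx - det[main_idx].vx) <= SPEED_TOL &&
                          fabsf(det[i].dx - det[main_idx].dx) <= DIST_TOL;
        if (consistent)
            split_idx[(*n_split)++] = i;        /* same truck: mark as split target */
    }
    return (*n_split > 2) ? -1 : main_idx;      /* more than two splits: abort */
}
```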
For example, if the initial radar target includes one main target and two split targets (as shown in fig. 1), the initial radar target information R_m is expressed as:
R_m = {Rm.RT_rcs, Rm.RT_width, Rm.RT_dx, Rm.RT_dy, Rm.RT_vx, Rm.RT_vy, Rm.RT_status, 3, RT_expandInfo[2]}
In the formula, 3 denotes that there are three targets in total in the initial radar target; Rm.RT_rcs through Rm.RT_status denote the electromagnetic wave energy reflected by the main target R1, the width of the main target R1, the longitudinal distance of the main target R1, the lateral distance of the main target R1, the longitudinal velocity of the main target R1, the lateral velocity of the main target R1 and the state of the main target R1, respectively; RT_expandInfo[2] represents the information of the two split targets R2 and R3 in the initial radar target, and comprises RT_expandInfo[0] and RT_expandInfo[1]. The information of the first split target R2 is represented by RT_expandInfo[0], and the information of the second split target R3 is represented by RT_expandInfo[1], that is:
RT_expandInfo[0] = {R1_rcs, R1_width, R1_dx, R1_dy, R1_vx, R1_vy, R1_status}
In the formula, R1_rcs represents the energy of the electromagnetic wave reflected by the first split target R2; R1_width represents the width of the first split target R2; R1_dx represents the longitudinal distance of the first split target R2; R1_dy represents the lateral distance of the first split target R2; R1_vx represents the longitudinal velocity of the first split target R2; R1_vy represents the lateral velocity of the first split target R2; R1_status represents the state of the first split target R2.
RT_expandInfo[1] = {R2_rcs, R2_width, R2_dx, R2_dy, R2_vx, R2_vy, R2_status}
In the formula, R2_rcs represents the energy of the electromagnetic wave reflected by the second split target R3; R2_width represents the width of the second split target R3; R2_dx represents the longitudinal distance of the second split target R3; R2_dy represents the lateral distance of the second split target R3; R2_vx represents the longitudinal velocity of the second split target R3; R2_vy represents the lateral velocity of the second split target R3; R2_status represents the state of the second split target R3.
And step S4, respectively acquiring main target information and split target information, and acquiring final radar target information according to the main target information and the split target information.
The method for acquiring the final radar target information according to the main target information and the split target information specifically comprises the following steps: the method comprises the steps of taking the electromagnetic wave energy reflected by a main target as the electromagnetic wave energy of a final radar target, taking the width of the main target as the width of the final radar target, taking the longitudinal distance of the main target as the longitudinal distance of the final radar target, taking the transverse distance of the main target as the transverse distance of the final radar target, taking the longitudinal speed of the main target as the longitudinal speed of the final radar target, taking the transverse speed of the main target as the transverse speed of the final radar target, and taking the state of a split target positioned at the foremost end of a truck as the state of the final radar target.
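In code, this rule could look like the sketch below: all kinematic fields are copied from the main target, and the state is taken from the split target nearest the truck head, assumed here to be the sub-target with the largest longitudinal distance. The types and names are illustrative only, not the patent's implementation.
```c
#include <stddef.h>

/* Minimal final-radar-target record; the status field uses an assumed
 * integer code (stationary / accelerating / decelerating / cut-in / cut-out). */
typedef struct {
    float rcs, width, dx, dy, vx, vy;
    int   status;
} RadarTargetInfo;

/* Build the final radar target from the main target and the split targets. */
RadarTargetInfo build_final_radar_target(const RadarTargetInfo *main_t,
                                         const RadarTargetInfo *splits,
                                         size_t n_split)
{
    RadarTargetInfo final_t = *main_t;          /* rcs, width, dx, dy, vx, vy from the main target */
    if (n_split > 0) {
        size_t front = 0;                       /* split target at the foremost end of the truck   */
        for (size_t i = 1; i < n_split; ++i)
            if (splits[i].dx > splits[front].dx)
                front = i;
        final_t.status = splits[front].status;  /* state taken from the foremost split target      */
    }
    return final_t;
}
```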
And step S5, acquiring a fusion target by using a fusion algorithm according to the final radar target information and the camera target information. The acquired fusion target may be combined with other algorithms in the ADAS system to allow the vehicle to take further action.
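Putting steps S1 to S5 together, a top-level flow consistent with fig. 2 might look like the sketch below; every type and helper function here is a placeholder assumed for illustration and does not come from the patent.
```c
#include <stdbool.h>
#include <stddef.h>

/* Placeholder records; see the earlier sketches for the full field lists. */
typedef struct { int type; float width, dx, dy, vx, vy; int status; } CameraTarget;
typedef struct { float rcs, width, dx, dy, vx, vy; int status; } RadarTarget;
typedef struct { CameraTarget camera; RadarTarget radar; } FusionTarget;

/* Hypothetical helpers standing in for the individual steps. */
bool get_camera_truck_target(CameraTarget *ct);                          /* step S1 */
bool find_initial_radar_targets(const CameraTarget *ct,
                                RadarTarget *rt, size_t *n);             /* step S2 */
int  partition_main_and_split(const RadarTarget *rt, size_t n);          /* step S3, -1 if > 2 splits */
RadarTarget build_final_radar_target(const RadarTarget *rt, size_t n,
                                     int main_idx);                      /* step S4 */
FusionTarget fuse(const RadarTarget *final_rt, const CameraTarget *ct);  /* step S5 */

/* Returns true and writes the fusion target when a truck is detected and fused. */
bool detect_fusion_target(FusionTarget *out)
{
    CameraTarget ct;
    RadarTarget  rt[8];
    size_t n = 0;

    if (!get_camera_truck_target(&ct))             /* S1: camera target is not a truck -> end */
        return false;
    if (!find_initial_radar_targets(&ct, rt, &n))  /* S2: no associable initial radar target  */
        return false;

    int main_idx = partition_main_and_split(rt, n);
    if (main_idx < 0)                              /* S3: more than two split targets -> end  */
        return false;

    RadarTarget final_rt = build_final_radar_target(rt, n, main_idx);  /* S4 */
    *out = fuse(&final_rt, &ct);                                       /* S5 */
    return true;
}
```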
Aiming at the special target type of trucks, the invention adopts a one-primary, multiple-attached representation to describe the fusion target. When a fusion target has a primary target point and attached target points, the cut-in or cut-out state of the foremost target point represents the state of the fusion target, so that the ADAS system can enter an acceleration mode or a braking mode in advance, and each function can be executed more smoothly and comfortably.
The above embodiments are merely preferred embodiments of the present invention and are not intended to limit its scope; various changes may be made to them. All simple and equivalent changes and modifications made according to the claims and the content of the specification of the present application fall within the scope of the claims of this patent application. Details that are well known in the art have not been described in order to avoid obscuring the invention.

Claims (7)

1. A fusion target detection method based on a vehicle-mounted forward-looking camera and a forward millimeter wave radar is characterized by comprising the following steps:
step S1, the vehicle-mounted front-view camera collects the image in front of the vehicle, acquires a camera target and outputs camera target information, if the type of the camera target is a truck, the step S2 is carried out, otherwise, the flow is ended;
step S2, searching an initial radar target formed by a forward millimeter wave radar on a truck in the track range of the vehicle-mounted forward-looking camera, and outputting initial radar target information;
step S3, distinguishing a main target and a split target in the initial radar target according to the initial radar target information, if the initial radar target does not include the split target, acquiring the main target information as final radar target information, and performing step S5; if the initial radar target comprises the split target, the step S4 is carried out;
step S4, respectively acquiring main target information and split target information, and acquiring final radar target information according to the main target information and the split target information;
and step S5, acquiring a fusion target by using a fusion algorithm according to the final radar target information and the camera target information.
2. The method according to claim 1, wherein the camera target information comprises camera target type, width of camera target, longitudinal distance of camera target, transverse distance of camera target, longitudinal speed of camera target, transverse speed of camera target and camera target state.
3. The method of claim 1, wherein the initial radar target information comprises an amount of electromagnetic wave energy reflected by the initial radar target, a width of the initial radar target, a longitudinal distance of the initial radar target, a lateral distance of the initial radar target, a longitudinal velocity of the initial radar target, a lateral velocity of the initial radar target, and an initial radar target state.
4. The method for detecting the fusion target based on the vehicle-mounted forward-looking camera and the forward millimeter wave radar as claimed in claim 1, wherein the step S3 further comprises: judging whether the number of the split targets exceeds two, if so, ending the process; if not, the process proceeds to step S4.
5. The method as claimed in claim 1, wherein the primary target information includes electromagnetic wave energy reflected by the primary target, width of the primary target, longitudinal distance of the primary target, transverse distance of the primary target, longitudinal velocity of the primary target, transverse velocity of the primary target, and status of the primary target.
6. The method according to claim 5, wherein the split target information comprises the energy of the electromagnetic wave reflected by the split target, the width of the split target, the longitudinal distance of the split target, the transverse distance of the split target, the longitudinal speed of the split target, the transverse speed of the split target and the state of the split target.
7. The method for detecting the fusion target based on the vehicle-mounted forward-looking camera and the forward millimeter wave radar according to claim 6, wherein the method for acquiring the final radar target information in the step S4 specifically comprises: and taking the electromagnetic wave energy reflected by the main target as the electromagnetic wave energy of the final radar target, taking the width of the main target as the width of the final radar target, taking the longitudinal distance of the main target as the longitudinal distance of the final radar target, taking the transverse distance of the main target as the transverse distance of the final radar target, taking the longitudinal speed of the main target as the longitudinal speed of the final radar target, taking the transverse speed of the main target as the transverse speed of the final radar target, and taking the state of the split target positioned at the foremost end of the truck as the state of the final radar target.
CN202210152602.3A 2022-02-18 2022-02-18 Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar Pending CN114545385A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210152602.3A CN114545385A (en) 2022-02-18 2022-02-18 Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210152602.3A CN114545385A (en) 2022-02-18 2022-02-18 Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar

Publications (1)

Publication Number Publication Date
CN114545385A (en) 2022-05-27

Family

ID=81676173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210152602.3A Pending CN114545385A (en) 2022-02-18 2022-02-18 Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar

Country Status (1)

Country Link
CN (1) CN114545385A (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007232411A (en) * 2006-02-27 2007-09-13 Toyota Motor Corp Object detecting apparatus
US20190293758A1 (en) * 2016-03-31 2019-09-26 Denso Corporation Object recognition apparatus and object recognition method
KR20180006006A (en) * 2016-07-07 2018-01-17 주식회사 만도 Target selection apparatus and target selection method
KR20180025552A (en) * 2016-09-01 2018-03-09 주식회사 만도 Vehicle control apparatus and vehicle control method
CN106428001A (en) * 2016-09-28 2017-02-22 浙江吉利控股集团有限公司 Forealarming method and system for lane changing of vehicle
CN106926779A (en) * 2017-03-09 2017-07-07 吉利汽车研究院(宁波)有限公司 A kind of vehicle lane change accessory system
CN107933475A (en) * 2017-11-21 2018-04-20 重庆电讯职业学院 A kind of car collision avoidance System for reducing collsion damage
CN111226132A (en) * 2019-03-18 2020-06-02 深圳市大疆创新科技有限公司 Target detection method and device, millimeter wave radar and movable platform
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision
CN111324120A (en) * 2020-02-26 2020-06-23 中汽研汽车检验中心(天津)有限公司 Cut-in and cut-out scene extraction method for automatic driving front vehicle
CN111891124A (en) * 2020-06-08 2020-11-06 福瑞泰克智能系统有限公司 Method, system, computer device and readable storage medium for target information fusion
CN111798698A (en) * 2020-06-24 2020-10-20 中国第一汽车股份有限公司 Method and device for determining front target vehicle and vehicle
CN113034972A (en) * 2021-03-03 2021-06-25 江苏琥珀汽车科技有限公司 Highway automatic lane changing method based on travelable area
CN113139607A (en) * 2021-04-27 2021-07-20 苏州挚途科技有限公司 Obstacle detection method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Valentin Magnier et al.: "Automotive LIDAR objects Detection and Classification Algorithm Using the Belief Theory", 2017 IEEE Intelligent Vehicles Symposium (IV), 31 July 2017 (2017-07-31), pages 746-751 *
闫凌 (Yan Ling) et al.: "Research on unmanned driving system for mining trucks" (矿用卡车无人驾驶系统研究), Industry and Mine Automation (工矿自动化), vol. 47, no. 4, 30 April 2021 (2021-04-30), pages 19-29 *
马晓晨 (Ma Xiaochen): "Research on radar plot condensation method based on PowerPC" (基于PowerPC的雷达点迹凝聚方法研究), China Master's Theses Full-text Database (中国优秀硕士学位论文全文数据库), no. 3, 15 March 2017 (2017-03-15) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination