CN115476923A - System and method for active blind zone assistance


Info

Publication number
CN115476923A
Authority
CN
China
Prior art keywords
time
vehicle
host vehicle
target vehicle
time periods
Prior art date
Legal status
Pending
Application number
CN202110801860.5A
Other languages
Chinese (zh)
Inventor
Z·赖恩
F·博罗尔基
Current Assignee
Steering Solutions IP Holding Corp
Original Assignee
Steering Solutions IP Holding Corp
Priority date
Filing date
Publication date
Application filed by Steering Solutions IP Holding Corp
Publication of CN115476923A

Classifications

    • B62D15/0265: Steering aids; active steering aids; automatic obstacle avoidance by steering
    • B62D15/0255: Steering aids; active steering aids; automatic changing of lane, e.g. for passing another vehicle
    • B62D5/0463: Power-assisted or power-driven electrical steering; controlling the motor; calculating assisting torque from the motor based on driver input

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

Systems and methods for active blind zone assistance are disclosed. The method includes receiving a plurality of sensor values prior to a first time and identifying a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values. The method also includes determining, at a first time, that the host vehicle initiated a steering maneuver and identifying a plurality of time periods between the first time and a second time. The method also includes updating the plurality of sensor values and determining a heading angle of the target vehicle relative to the host vehicle. The method also includes estimating a position of the target vehicle at each of the plurality of time periods, and estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods.

Description

System and method for active blind zone assistance
Technical Field
The present disclosure relates to vehicle active blind spot assist, and more particularly to systems and methods for camera-less active blind spot assist.
Background
Vehicles such as cars, trucks, sport utility vehicles, cross-over vehicles, minivans, boats, airplanes, all terrain vehicles, recreational vehicles, or other suitable vehicles increasingly include blind spot assist systems. Such systems may be configured to use various sensors, including image capture devices, such as cameras disposed adjacent a front portion of a corresponding vehicle (e.g., host vehicle). Typically, such systems are configured to warn an operator of the vehicle and avoid a potential collision in response to a potential risk of collision detected during, for example, performance of a lane change.
In such systems, short-range sensors, such as radio detection and ranging (radar) sensors, housed on both sides of the rear bumper of the host vehicle can monitor the area directly beside and behind the host vehicle. An image capture device (e.g., a camera) may be forward facing and may be used to detect the lane indicator, and based on the lane indicator, a controller of the vehicle may determine a position of the target vehicle at a blind spot of the host vehicle. Such position information of the target vehicle may be used by the controller during a lane change maneuver of the host vehicle to avoid a collision between the host vehicle and the target vehicle.
Disclosure of Invention
The present disclosure relates generally to vehicle blind spot assist.
One aspect of the disclosed embodiments includes a method for active blind spot assistance. The method includes receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time, and identifying a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values. The method also includes determining, at a first time, that the host vehicle initiated a steering maneuver and identifying a plurality of time periods between the first time and a second time. The method also includes updating the plurality of sensor values using the at least one transformation function, and determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values. The method also includes estimating a position of the target vehicle at each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle, and estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods.
In some embodiments, a system for active blind spot assistance without using an image capture device includes a processor and a memory. The memory includes instructions that, when executed by the processor, cause the processor to: receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time; identifying a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values; determining, at a first time, that a host vehicle initiated a steering maneuver; identifying a plurality of time periods between a first time and a second time; updating the plurality of sensor values using at least one transformation function; determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values; estimating a position of the target vehicle for each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle; and estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods.
In some embodiments, an apparatus for active blind spot assistance includes a processor and a memory. The memory includes instructions that, when executed by the processor, cause the processor to: receiving a plurality of sensor values from at least one radio detection and ranging sensor disposed proximate a rear portion of the host vehicle prior to a first time; identifying a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values; determining, at a first time, that a host vehicle initiated a steering maneuver; identifying a plurality of time periods between a first time and a second time; updating the plurality of sensor values using at least one transformation function; determining a heading angle of the target vehicle relative to the host vehicle by applying a linear regression to the updated plurality of sensor values; estimating a position of the target vehicle for each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle; estimating a location of the target vehicle at a second time using each location of the target vehicle at each corresponding time period of the plurality of time periods; determining a time at which a collision is to occur between the host vehicle and the target vehicle using at least the position of the target vehicle at the second time; and in response to determining that the time to collision is less than the threshold, applying a torque overlay to at least one motor of a steering system of the host vehicle to steer the host vehicle away from the target vehicle.
These and other aspects of the disclosure are disclosed in the following detailed description of the embodiments, the appended claims and the accompanying drawings.
Drawings
The disclosure is best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that, according to common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
FIG. 1 generally illustrates a vehicle according to the principles of the present disclosure.
FIG. 2 generally illustrates an active blind spot assist system including a controller according to the principles of the present disclosure.
Fig. 3A and 3B generally illustrate a vehicle lane change maneuver according to the principles of the present disclosure.
FIG. 4 is a flow chart generally illustrating an active blind spot assist method according to the principles of the present disclosure.
Detailed Description
Various embodiments of the present disclosure are discussed below. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. Furthermore, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
As noted above, vehicles such as cars, trucks, sport utility vehicles, cross-over vehicles, minivans, boats, airplanes, all terrain vehicles, recreational vehicles, or other suitable vehicles increasingly include blind spot assistance systems. Such systems may be configured to use various sensors, including image capture devices, such as cameras disposed adjacent a front portion of a corresponding vehicle (e.g., host vehicle). Typically, such systems are configured to warn an operator of the vehicle and avoid a potential collision in response to a potential risk of collision detected during, for example, performance of a lane change.
In such systems, short-range sensors, such as radio detection and ranging (radar) sensors, housed on both sides of the rear bumper of the host vehicle can monitor the area directly beside and behind the host vehicle. An image capture device (e.g., a camera) may be forward facing and may be used to detect a lane indicator, and based on the lane indicator, a controller of the vehicle may determine a position of the target vehicle at a blind spot of the host vehicle. Such position information of the target vehicle may be used by the controller during a lane change maneuver of the host vehicle to avoid a collision between the host vehicle and the target vehicle.
However, the performance of the image capture device depends on the conditions of the surroundings of the host vehicle (e.g., because the image capture device may be a passive sensor). For example, fog, direct sunlight, dust on the image capture device lens or covering lane markings, heavy rain blocking the image capture device lens, snow on the image capture device lens or covering lane markings, faded or missing lane markings on roads adjacent to the host vehicle, and/or other conditions of the environment of the host vehicle may reduce the efficacy of the image capture device (e.g., because the image capture device may not be able to capture images of the environment of the host vehicle that may be used to identify lane markings, other vehicles, etc.), which may reduce the efficacy of the blind spot assist feature of the host vehicle. Additionally or alternatively, the image capture device may experience a malfunction and/or a malfunction may occur in a communication path between the controller and the image capture device. Further, some host vehicles may not include an image capture device. Such a system may use the image capture device to determine a lane marker, given as the polynomial y = a0 + a1·x + a2·x^2 + a3·x^3. Such systems may use the polynomial to identify objects that are proximate to the vehicle.
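For reference, such a cubic lane model is inexpensive to evaluate once its coefficients are known. The sketch below is illustrative only; the coefficient values are made up, and a camera-based system would fit them from detected lane markings.

```python
def lane_marker_y(x, a0=0.0, a1=0.01, a2=0.002, a3=0.0001):
    """Evaluate the cubic lane-marker polynomial y = a0 + a1*x + a2*x^2 + a3*x^3.
    Coefficients here are placeholders for demonstration."""
    return a0 + a1 * x + a2 * x ** 2 + a3 * x ** 3

# Lateral offset of the lane marking 20 distance units ahead of the camera:
print(lane_marker_y(20.0))  # 1.8
```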
Accordingly, systems and methods such as those described herein that perform blind spot assist features without the use of an image capture device may be desirable. In some embodiments, the systems and methods described herein may be configured to provide blind spot assistance using one or more sensors, such as one or more radar sensors or other suitable sensors (e.g., without the use of passive image capture sensors or devices).
In some embodiments, the systems and methods described herein may be configured to: the steering system of the host vehicle is used in response to measurements or sensed values of one or more sensors to avoid and/or mitigate the consequences of a collision between the host vehicle and the target vehicle. The systems and methods described herein may be configured to track an object (e.g., such as a target vehicle or other suitable object) at a blind spot of a host vehicle using corner sensors (e.g., radar or other suitable sensors disposed at or near each side of the rear of the host vehicle). The systems and methods described herein may be configured to use measured or sensed values of corner sensors of a host vehicle to predict a possible collision during a steering maneuver performed by the host vehicle.
In some embodiments, the systems and methods described herein may be configured to receive a plurality of sensor values from at least one sensor disposed proximate a rear portion of the host vehicle prior to the first time. The at least one sensor may comprise at least one radar sensor. For example, the host vehicle may include a respective radar sensor disposed on each side of the rear of the host vehicle. The host vehicle may include an electronic power steering system, a steer-by-wire steering system, or other suitable steering system.
The systems and methods described herein may be configured to identify a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values. The systems and methods described herein may be configured to determine, at a first time, that a host vehicle initiated a steering maneuver. The steering maneuver includes a lane change maneuver or other suitable steering maneuver.
The systems and methods described herein may be configured to identify a plurality of time periods between a first time and a second time. The systems and methods described herein may be configured to update the plurality of sensor values using at least one transformation function. In some embodiments, the at least one transformation function comprises a homogeneous transformation matrix or other suitable transformation function. In some embodiments, the systems and methods described herein may be configured to apply linear regression to the updated plurality of sensor values.
In some embodiments, the systems and methods described herein may be configured to determine a heading angle of the target vehicle relative to the host vehicle using a plurality of sensor values updated after applying the linear regression. The systems and methods described herein may be configured to estimate a position of the target vehicle for each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle. The systems and methods described herein may be configured to estimate the location of the target vehicle at the second time using each location of the target vehicle at each corresponding time period of the plurality of time periods.
In some embodiments, the systems and methods described herein may be configured to estimate the speed of the target vehicle relative to the host vehicle using the updated plurality of sensor values and a constant speed model. The systems and methods described herein may be configured to determine a speed of the host vehicle. The systems and methods described herein may be configured to estimate a location of the host vehicle at each of the plurality of time periods using a constant turn radius model.
In some embodiments, the systems and methods described herein may be configured to determine, for each of the plurality of time periods, a time at which a collision is to occur between the host vehicle and the target vehicle based on one or more of an estimated velocity of the target vehicle, a velocity of the host vehicle, a position of the target vehicle at each of the plurality of time periods, a position of the host vehicle at each of the plurality of time periods, other suitable information, or any combination thereof.
In some embodiments, the systems and methods described herein may be configured to determine whether a time to collision for a respective time period of the plurality of time periods is less than a threshold. The systems and methods described herein may be configured to initiate at least one steering control operation in response to determining that a time to collision for a respective time period of the plurality of time periods is less than a threshold. The at least one steering control operation includes applying a torque overlay to the at least one motor of the steering system of the host vehicle, other suitable steering control operations, or a combination thereof. Steering control operations (e.g., including applying a torque overlay to the at least one motor of the steering system) may be configured to direct the host vehicle away from the target vehicle.
FIG. 1 generally illustrates a vehicle 10 according to the principles of the present disclosure. Vehicle 10 may include any suitable vehicle, such as a car, truck, sport utility vehicle, minivan, cross-over vehicle, any other passenger vehicle, any suitable commercial vehicle, or any other suitable vehicle. Although the vehicle 10 is illustrated as a passenger car having wheels and being used on a road, the principles of the present disclosure may be applied to other vehicles, such as airplanes, boats, trains, drones, or other suitable vehicles.
The vehicle 10 includes a body 12 and a hood 14. The passenger compartment 18 is at least partially defined by the vehicle body 12. Another portion of the vehicle body 12 defines an engine compartment 20. The hood 14 is movably attached to a portion of the vehicle body 12 such that the hood 14 provides access to the engine compartment 20 when the hood 14 is in a first or open position and the hood 14 covers the engine compartment 20 when the hood 14 is in a second or closed position. In some embodiments, the engine compartment 20 may be disposed in a rear portion of the vehicle 10 (as compared to what is generally shown).
The passenger compartment 18 may be disposed rearward of the engine compartment 20, but in embodiments where the engine compartment 20 is disposed in a rear portion of the vehicle 10, the passenger compartment 18 may be disposed forward of the engine compartment 20. The vehicle 10 may include any suitable propulsion system, including an internal combustion engine, one or more electric motors (e.g., an electric vehicle), one or more fuel cells, a hybrid propulsion system (e.g., a combination of an internal combustion engine and one or more electric motors, as in a hybrid vehicle), and/or any other suitable propulsion system.
In some embodiments, the vehicle 10 may include a gasoline-fueled engine, such as a spark-ignition engine. In some embodiments, the vehicle 10 may include a diesel-fueled engine, such as a compression-ignition engine. The engine compartment 20 houses and/or encloses at least some components of the propulsion system of the vehicle 10. Additionally or alternatively, propulsion control devices, such as an accelerator actuator (e.g., an accelerator pedal), a brake actuator (e.g., a brake pedal), a steering wheel, and other such components, are disposed in the passenger compartment 18 of the vehicle 10. The propulsion control devices may be actuated or controlled by a driver of the vehicle 10 and may be directly connected to corresponding components of the propulsion system, such as a throttle, a brake, an axle, and a vehicle transmission, among others. In some embodiments, the propulsion control devices may transmit signals to a vehicle computer (e.g., drive-by-wire), which in turn may control the corresponding propulsion components of the propulsion system. As such, in some embodiments, the vehicle 10 may be an autonomous vehicle.
In some embodiments, the vehicle 10 includes a transmission in communication with the crankshaft via a flywheel or clutch or fluid coupling. In some embodiments, the transmission comprises a manual transmission. In some embodiments, the transmission comprises an automatic transmission. In the case of an internal combustion engine or hybrid vehicle, the vehicle 10 may include one or more pistons that operate in conjunction with a crankshaft to generate a force that is transmitted through a transmission to one or more axles to rotate the wheels 22. When the vehicle 10 includes one or more electric motors, the vehicle battery and/or fuel cell provides energy to the electric motors to rotate the wheels 22.
The vehicle 10 may include an automatic vehicle propulsion system, such as cruise control, adaptive cruise control, automatic brake control, other automatic vehicle propulsion systems, or a combination thereof. The vehicle 10 may be an autonomous vehicle or a semi-autonomous vehicle, or other suitable type of vehicle. The vehicle 10 may include additional or fewer features than those generally illustrated and/or disclosed herein.
In some embodiments, the vehicle 10 may include an ethernet component 24, a Controller Area Network (CAN) bus 26, a media oriented system transfer component (MOST) 28, a FlexRay component 30 (e.g., steer-by-wire, etc.), and a local interconnect network component (LIN) 32. The vehicle 10 may use the CAN bus 26, the MOST 28, the FlexRay component 30, the LIN 32, other suitable network or communication system, or a combination thereof to communicate various information from sensors, e.g., inside or outside the vehicle, to various processors or controllers, e.g., inside or outside the vehicle. The vehicle 10 may include additional or fewer features than those generally illustrated and/or disclosed herein.
In some embodiments, the vehicle 10 may include a steering system, such as an electric power steering (EPS) system, a steer-by-wire steering system (e.g., which may include or communicate with one or more controllers that control components of the steering system without using a mechanical connection between a hand-held steering wheel of the vehicle 10 and the wheels 22), or other suitable steering system. The steering system may include an open-loop feedback control system or mechanism, a closed-loop feedback control system or mechanism, or a combination thereof. The steering system may be configured to receive various inputs including, but not limited to, a hand-held steering wheel position, an input torque, one or more wheel positions, other suitable inputs or information, or a combination thereof. Additionally or alternatively, the inputs may include a hand-held steering wheel torque, a hand-held steering wheel angle, a motor speed, a vehicle speed, an estimated motor torque command, other suitable inputs, or a combination thereof. The steering system may be configured to provide steering functionality and/or control to the vehicle 10. For example, the steering system may generate an assist torque based on various inputs. The steering system may be configured to selectively control a motor of the steering system using the assist torque to provide steering assistance to an operator of the vehicle 10.
In some embodiments, the vehicle 10 may include a controller, such as the controller 100 as generally shown in fig. 2. The controller 100 may include any suitable controller, such as an electronic control unit or other suitable controller. The controller 100 may be configured to control various functions of the steering system and/or various functions of the vehicle 10, for example. The controller 100 may include a processor 102 and a memory 104. The processor 102 may include any suitable processor, such as those described herein. Additionally or alternatively, the controller 100 may include any suitable number of processors in addition to the processor 102 or in place of the processor 102. The memory 104 may comprise a single disk or multiple disks (e.g., hard disk drives) and include a storage management module that manages one or more partitions within the memory 104. In some embodiments, the memory 104 may include flash memory, semiconductor (solid state) memory, or the like. The memory 104 may include Random Access Memory (RAM), Read-Only Memory (ROM), or a combination thereof. The memory 104 may include instructions that, when executed by the processor 102, cause the processor 102 to control at least various aspects of the vehicle 10.
The controller 100 may receive one or more signals from various measurement devices or sensors 106 indicative of sensed or measured characteristics of the vehicle 10. The sensors 106 may include any suitable sensors, measurement devices, and/or other suitable mechanisms. For example, the sensors 106 may include one or more torque sensors or devices, one or more hand-held steering wheel position sensors or devices, one or more motor position sensors or devices, one or more position sensors or devices, other suitable sensors or devices, or a combination thereof. The one or more signals may indicate hand-held steering wheel torque, steering wheel angle, motor speed, vehicle speed, other suitable information, or a combination thereof.
In some embodiments, the sensors 106 may include one or more image capture devices (e.g., such as a camera), one or more audio input devices (e.g., such as a microphone), one or more global positioning devices, one or more proximity sensing devices, one or more radar sensors, one or more light detection and ranging sensors, one or more ultrasonic sensors, other suitable sensors or devices, or a combination thereof.
In some embodiments, the controller 100 may be configured to implement a blind spot assist feature of the vehicle 10. For example, the controller 100 may receive measured or sensed values from the sensors 106. As described above, the sensors 106 may include one or more radar sensors disposed adjacent the rear of the vehicle 10. For example, the first sensor 106 may be disposed on a first side of the vehicle 10 at or near a first rear corner of the vehicle 10, and the second sensor 106 may be disposed on a second side of the vehicle 10 opposite the first side at or near a second rear corner of the vehicle 10 opposite the first corner. It should be appreciated that although first and second sensors are described, the vehicle 10 may include any suitable number of sensors 106 or other suitable sensors. The controller 100 may use the measured or sensed values of the sensors 106 to determine one or more locations of objects proximate the vehicle 10. The controller 100 may selectively control various aspects of the steering system of the vehicle 10 based on the measurements or sensed values of the sensors 106 to avoid or mitigate the consequences of a collision with an object.
In some embodiments, as generally shown in fig. 3A, at time t (which may be referred to as a first time, for example), controller 100 may track (e.g., prior to time t) one or more locations of target vehicle 200 using values received from sensors 106. In some embodiments, the controller 100 may use a ring buffer or other suitable mechanism to track values from the sensors 106 indicative of the location of the target vehicle 200. The controller 100 determines (e.g., based on inputs provided by various components of the steering system or other suitable components of the vehicle 10) that the vehicle 10 (e.g., which may be referred to as the host vehicle 10 or the host vehicle) is initiating a steering maneuver, such as a lane change maneuver or other suitable steering maneuver, at time t + dt (e.g., which may be referred to as a second time).
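A ring buffer of the kind mentioned above keeps only the most recent position reports from the sensors 106. The following is a minimal sketch; the capacity and the (timestamp, x, y) sample format are assumptions, not taken from the disclosure.

```python
from collections import deque

# Fixed-capacity ring buffer: when full, appending a new sample
# automatically discards the oldest one.
track_buffer = deque(maxlen=32)

def on_radar_sample(timestamp, x, y):
    """Store one position report for the target vehicle from a rear corner sensor."""
    track_buffer.append((timestamp, x, y))
```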
At time t + dt, controller 100 may update the values received from sensors 106 (e.g., and the corresponding position or state of target vehicle 200) using one or more transformation functions. For example, the controller 100 may update the values of the sensors 106 using any suitable transformation function (including, but not limited to, a homogeneous transformation matrix), which may be defined as:
T = [ cos(α)   -sin(α)   h ]
    [ sin(α)    cos(α)   k ]
    [ 0         0        1 ]
where h indicates the vehicle speed of the vehicle 10 at time t + dt, k is a constant set to 0 (e.g., or other suitable value), and α indicates the yaw rate of the vehicle 10 at time t + dt. The vehicle speed, yaw rate, and/or other suitable characteristics of the vehicle 10 may be provided to the controller 100 by various sensors or components of the vehicle 10.
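As a concrete illustration, the following sketch applies a planar homogeneous transformation of the form above to buffered target positions. It assumes the rotation angle is the yaw rate integrated over the elapsed interval dt and the translation h is the vehicle speed integrated over dt; these discretization choices, and all names, are assumptions for illustration.

```python
import math

def update_positions(points, speed, yaw_rate, dt, k=0.0):
    """Re-express buffered (x, y) target positions in the host frame after the
    host has moved for dt seconds: translate by (-h, -k), then rotate by -alpha."""
    alpha = yaw_rate * dt              # host heading change over dt (rad)
    h = speed * dt                     # host longitudinal displacement over dt
    cos_a, sin_a = math.cos(alpha), math.sin(alpha)
    updated = []
    for x, y in points:
        xs, ys = x - h, y - k          # undo the host translation
        updated.append((cos_a * xs + sin_a * ys,   # undo the host rotation
                        -sin_a * xs + cos_a * ys))
    return updated
```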
In some embodiments, after the values of the sensors 106 (e.g., and the corresponding state or position of the target vehicle 200) are updated, the controller 100 may apply a linear regression to the updated values of the sensors 106 (e.g., and the corresponding state or position of the target vehicle 200). The controller 100 may use the result of applying linear regression to the updated values of the sensors 106 to calculate a relative heading angle of the target vehicle 200 (e.g., a heading angle relative to the vehicle 10). The calculation of the relative heading angle of the target vehicle 200 may be defined as:
headingAngle_TV = arctan(Δy / Δx)
where y indicates a component of the position of the vehicle 10 along the y-axis, as generally shown in FIG. 3A, and x indicates a component of the position of the vehicle 10 along the x-axis, as generally shown in FIG. 3A.
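One way to realize this step is an ordinary least-squares line fit through the updated position history, taking the arctangent of the fitted slope as the relative heading. The sketch below follows that assumption.

```python
import math

def relative_heading(points):
    """Fit y = m*x + b to the updated (x, y) history and return arctan(m),
    the target's heading relative to the host's x-axis, in radians.
    Assumes at least two points with distinct x values."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return math.atan(slope)
```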
In some embodiments, the controller 100 may be configured to: for each of a plurality of time periods between time t and time t + dt, the position of target vehicle 200 and the position of vehicle 10 are predicted or estimated. For example, the time period may correspond to a subdivision of the time between time t and time t + dt. The controller 100 may use the values of the sensors 106, the results of applying a linear regression to the values of the sensors 106, the heading angle of the target vehicle 200 relative to the vehicle 10, other suitable information, or a combination thereof to estimate the location of the target vehicle 200 for each time period and the location of the vehicle 10 for each time period.
In some embodiments, the controller 100 may avoid a potential collision between the target vehicle 200 and the vehicle 10 and/or mitigate the consequences of a potential collision between the target vehicle 200 and the vehicle 10 based on the location of the target vehicle 200 at each time period, the location of the vehicle 10 at each time period, the speed of the target vehicle 200 relative to the vehicle 10, the speed of the vehicle 10, other suitable information, or a combination thereof. The controller 100 may determine the speed of the target vehicle 200 relative to the vehicle 10. The speed of the target vehicle 200 relative to the vehicle 10 may be defined as:
VehSpd_TV = SQRT[ (RelSpd.x)^2 + (RelSpd.y)^2 ]
where VehSpd_TV corresponds to the speed of the target vehicle 200, RelSpd.x corresponds to the X component of the relative (e.g., with respect to the vehicle 10) speed of the target vehicle 200, and RelSpd.y corresponds to the Y component of the relative (e.g., with respect to the vehicle 10) speed of the target vehicle 200. The controller 100 may determine the speed of the vehicle 10 based on one or more values received from one or more various sensors of the vehicle 10. The controller 100 may calculate, for the vehicle 10:
the turning radius,

R = V / r

S'_y,EV = R - R·Cos(dΦ)
S'_x,EV = R·Sin(dΦ)
The controller 100 may further calculate the steady-state yaw rate (YawRate), r = [(V/L)·{1/(1 + K·V^2/(57.3·L·g))}]·δ, where δ corresponds to the steering angle (degrees), V corresponds to the speed of the vehicle 10, L corresponds to the wheelbase (feet), K corresponds to the understeer gradient (degrees/g), g corresponds to the gravitational acceleration constant (e.g., 32.2 feet/second^2), R corresponds to the radius of the associated circular motion, S'_y,EV corresponds to the Y component of the travel distance between t and t + dt, S'_x,EV corresponds to the X component of the travel distance between t and t + dt, and dΦ (e.g., the heading angle change) = r·dt.
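Combining the steady-state yaw-rate gain, the turning radius, and the displacement relations above gives the host-vehicle prediction step. The sketch below follows the units stated in the text (V in feet/second, L in feet, K in degrees/g, δ in degrees); treating the resulting yaw rate as degrees/second before converting is an assumption.

```python
import math

G = 32.2  # gravitational acceleration constant, feet/second^2

def host_displacement(V, L, K, delta_deg, dt):
    """Predict the host's travel (S'_x,EV, S'_y,EV) over dt seconds using the
    steady-state yaw rate and a constant turning radius, per the relations above."""
    # r = [(V/L) * 1/(1 + K*V^2/(57.3*L*g))] * delta
    r_deg = (V / L) * (1.0 / (1.0 + K * V ** 2 / (57.3 * L * G))) * delta_deg
    r = math.radians(r_deg)            # yaw rate in rad/s (conversion assumed)
    if abs(r) < 1e-9:                  # negligible yaw rate: straight-line travel
        return V * dt, 0.0
    R = V / r                          # turning radius of the circular motion
    d_phi = r * dt                     # heading change over dt
    return R * math.sin(d_phi), R - R * math.cos(d_phi)
```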
The controller 100 may determine the distance between a forward point 202 of the target vehicle 200 and various points 204 of the vehicle 10, as generally shown in fig. 3B. In some embodiments, the controller 100 may use a constant radius model to predict the position of the vehicle 10 (e.g., and/or the distance between the vehicle 10 and the target vehicle 200) at each time period. In some embodiments, the controller 100 may use a constant velocity model to determine the location of the target vehicle 200 (e.g., and/or the distance between the vehicle 10 and the target vehicle 200) for each time period. For example, the controller 100 may determine a distance between a front point 202 of the target vehicle 200 and a first upper corner point 204' of the vehicle 10. The distance between the front point 202 and the first upper corner point 204' of the vehicle 10 may be defined as:
S'_y,EV,firstupper = S'_y,EV + (d1)·sin(dΦ) + (w)·cos(dΦ)
S'_x,EV,firstupper = S'_x,EV + (d1)·cos(dΦ) + (w)·(-sin(dΦ))
which may define a new location of the first upper corner point 204' of the vehicle 10.
The controller 100 may determine the distance between the front point 202 and the second lower corner point 204 "of the vehicle 10. The distance between the front point 202 and the second lower corner point 204 "may be defined as:
S'_y,EV,secondlower = S'_y,EV + (-d2)·sin(dΦ) + (-w)·cos(dΦ)
S'_x,EV,secondlower = S'_x,EV + (-d2)·cos(dΦ) + (-w)·(-sin(dΦ))
which may define a new location of the second lower corner point 204 "of the vehicle 10.
In the expressions above, d1 corresponds to the distance between the center of the rear axle of the vehicle 10 and the front end of the vehicle 10, d2 corresponds to the distance between the center of the rear axle of the vehicle 10 and the rear end of the vehicle 10, and w corresponds to half the width of the vehicle 10.
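These corner-point relations translate directly into code. The sketch below reuses the predicted rear-axle displacement (S'_x,EV, S'_y,EV) and heading change dΦ from the prediction step; names are illustrative.

```python
import math

def corner_points(s_x, s_y, d_phi, d1, d2, w):
    """Return the predicted first upper and second lower corner points of the
    host, per the relations above. d1/d2 are the rear-axle-to-front/rear-end
    distances; w is half the vehicle width."""
    sin_p, cos_p = math.sin(d_phi), math.cos(d_phi)
    first_upper = (s_x + d1 * cos_p - w * sin_p,    # S'_x,EV,firstupper
                   s_y + d1 * sin_p + w * cos_p)    # S'_y,EV,firstupper
    second_lower = (s_x - d2 * cos_p + w * sin_p,   # S'_x,EV,secondlower
                    s_y - d2 * sin_p - w * cos_p)   # S'_y,EV,secondlower
    return first_upper, second_lower
```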
In some embodiments, the controller 100 may determine the path of the target vehicle 200 based on the vehicle speed of the target vehicle 200 relative to the vehicle 10 and the heading angle of the target vehicle 200 relative to the vehicle 10. The path of the target vehicle 200 may be defined as:
S'_x,TV = S'_0x + VehSpd_TV·Cos(headingAngle_TV)·dt
S'_y,TV = S'_0y + VehSpd_TV·Sin(headingAngle_TV)·dt
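A sketch of the constant-velocity propagation of the target along its relative heading follows; (s0_x, s0_y) is the target's last known position in the host frame.

```python
import math

def target_position(s0_x, s0_y, veh_spd_tv, heading_angle_tv, dt):
    """Propagate the target for dt seconds under the constant velocity model:
    S'_TV = S'_0 + VehSpd_TV * (cos, sin)(headingAngle_TV) * dt."""
    return (s0_x + veh_spd_tv * math.cos(heading_angle_tv) * dt,
            s0_y + veh_spd_tv * math.sin(heading_angle_tv) * dt)
```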
the controller 100 may determine whether the predicted distance between the target vehicle 200 and the vehicle 10 is less than a threshold distance. If the controller 100 determines that the predicted distance between the target vehicle 200 and the vehicle 10 is equal to or greater than the threshold distance, the controller 100 may allow the vehicle 10 to perform a steering maneuver.
Alternatively, if the controller 100 determines that the predicted distance between the target vehicle 200 and the vehicle 10 is less than the threshold distance, the controller 100 may initiate at least one steering control operation to avoid a potential collision indicated by the predicted distance between the target vehicle 200 and the vehicle 10 being less than the threshold distance or to mitigate the consequences of a potential collision indicated by the predicted distance between the target vehicle 200 and the vehicle 10 being less than the threshold distance. For example, the controller 100 may determine an appropriate amount of torque overlay to apply to at least one motor of a steering system of the vehicle 10 to steer the vehicle 10 away from the target vehicle 200. The controller 100 may apply a torque superposition to the motor of the steering system. The vehicle 10 may change paths to avoid collision with the target vehicle 200. Additionally or alternatively, the controller 100 may initiate other control steering operations in addition to or instead of applying the torque overlay, such as providing an indication to an operator of the vehicle 10 that a collision is possible (e.g., using one or more output devices of the vehicle 10).
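The resulting decision step can be summarized as follows. The threshold value, the overlay magnitude, and the steering interface are assumptions; `steering.apply_torque_overlay` is a hypothetical stand-in for the vehicle's actual actuator interface.

```python
def blind_zone_assist(predicted_distance, threshold_distance, steering):
    """Allow the maneuver when the predicted separation is adequate; otherwise
    request a counter-steering torque overlay to steer away from the target."""
    if predicted_distance >= threshold_distance:
        return "allow_maneuver"
    # Sign and magnitude of the overlay would come from a separate calculation;
    # a fixed placeholder value is used here.
    steering.apply_torque_overlay(newton_meters=2.0)
    return "intervene"
```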
In some embodiments, the controller 100 receives a plurality of sensor values from at least one sensor 106 disposed proximate a rear portion of the host vehicle 10 prior to the first time. The at least one sensor may comprise at least one radar sensor. For example, the host vehicle 10 may include a respective radar sensor disposed on each side of the rear of the host vehicle 10. The host vehicle 10 may include an electronic power steering system, a steer-by-wire steering system, or other suitable steering system.
The controller 100 may identify the target vehicle 200 in the blind zone of the host vehicle 10 based on the plurality of sensor values. The controller 100 may determine that the host vehicle 10 initiates a steering maneuver at a first time. The steering maneuver includes a lane change maneuver or other suitable steering maneuver.
The controller 100 may identify a plurality of time periods between the first time and the second time. The controller 100 may update the plurality of sensor values using at least one transformation function. In some embodiments, the at least one transformation function comprises a homogeneous transformation matrix or other suitable transformation function. In some embodiments, the controller 100 may apply a linear regression to the updated plurality of sensor values.
In some embodiments, the controller 100 may use a plurality of sensor values that are updated after applying the linear regression to determine the heading angle of the target vehicle 200 relative to the host vehicle 10. The controller 100 may estimate the position of the target vehicle 200 for each of the plurality of time periods based on the heading angle of the target vehicle 200 relative to the host vehicle 10. The controller 100 may estimate the position of the target vehicle 200 at the second time using each position of the target vehicle 200 at each corresponding time period of the plurality of time periods.
In some embodiments, the controller 100 may use the updated plurality of sensor values and the constant velocity model to estimate the velocity of the target vehicle 200 relative to the host vehicle 10. The controller 100 may determine the speed of the host vehicle 10. The controller 100 may estimate the position of the host vehicle 10 for each of the plurality of time periods using a constant turning radius model.
In some embodiments, the controller 100 may determine a time at which a collision is to occur between the host vehicle 10 and the target vehicle 200 for each of the plurality of time periods based on one or more of an estimated velocity of the target vehicle 200, a velocity of the host vehicle 10, a position of the target vehicle 200 for each of the plurality of time periods, a position of the host vehicle 10 for each of the plurality of time periods, other suitable information, or any combination thereof.
In some embodiments, the controller 100 may determine whether the time to collision for a respective time period of the plurality of time periods is less than a threshold. The controller 100 may initiate at least one steering control operation in response to determining that the time to collision for a respective time period of the plurality of time periods is less than a threshold. The at least one steering control operation may include applying a torque overlay to at least one motor of a steering system of the host vehicle 10, other suitable steering control operations, or a combination thereof. Steering control operations (e.g., including applying a torque overlay to at least one motor of the steering system) may be configured to direct the host vehicle 10 away from the target vehicle 200.
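The disclosure does not give a closed form for the time to collision; one common construction, shown here purely as an assumption, divides the predicted separation for a time period by the closing speed.

```python
def time_to_collision(separation, closing_speed):
    """Assumed TTC construction: separation / closing speed for one prediction
    period. Returns infinity when the vehicles are not closing."""
    if closing_speed <= 0.0:
        return float("inf")
    return separation / closing_speed

# Example: 6 m of separation closing at 2 m/s gives 3 s to collision.
assert time_to_collision(6.0, 2.0) == 3.0
```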
In some embodiments, the controller 100 may perform the methods described herein. However, the methods described herein as being performed by the controller 100 are not meant to be limiting, and any type of software executing on the controller or processor may perform the methods described herein without departing from the scope of the present disclosure. For example, a controller (such as a processor executing software within a computing device) may perform the methods described herein.
FIG. 4 is a flow chart generally illustrating an active blind spot assist method 300 in accordance with the principles of the present disclosure. At 302, method 300 receives a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time. For example, the controller 100 may receive a plurality of sensor values from the sensor 106.
At 304, the method 300 identifies a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values. For example, the controller 100 may identify the target vehicle 200 in the blind zone of the vehicle 10 based on the plurality of sensor values.
At 306, the method 300 determines that the host vehicle is initiating a steering maneuver at a first time. For example, the controller 100 may determine that the vehicle 10 is initiating a steering maneuver.
At 308, the method 300 identifies a plurality of time periods between the first time and the second time. For example, the controller 100 may identify a plurality of time periods between the first time and the second time.
At 310, method 300 updates the plurality of sensor values using at least one transformation function. For example, the controller 100 may update the plurality of sensor values using the at least one transformation function.
At 312, the method 300 determines a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values. For example, the controller 100 may use the updated plurality of sensor values to determine a heading angle of the target vehicle 200 relative to the vehicle 10.
At 314, the method 300 estimates the position of the target vehicle for each of the plurality of time periods based on the heading angle of the target vehicle relative to the host vehicle. For example, the controller 100 may estimate the position of the target vehicle 200 for each of the plurality of time periods based on a heading angle of the target vehicle 200 relative to the vehicle 10.
At 316, the method 300 estimates a position of the target vehicle at a second time using each position of the target vehicle at each respective time period of the plurality of time periods. For example, the controller 100 may estimate the position of the target vehicle at the second time using each position of the target vehicle 200 at each corresponding time period of the plurality of time periods.
In some embodiments, the method 300 may estimate the speed of the target vehicle relative to the host vehicle using the updated plurality of sensor values and the constant speed model. For example, the controller 100 may use the updated plurality of sensor values and the constant speed model to estimate the speed of the target vehicle 200 relative to the vehicle 10.
In some embodiments, the method 300 may determine a speed of the host vehicle and may estimate a location of the host vehicle at each of the plurality of time periods using a constant turn radius model. For example, the controller 100 may determine the speed of the vehicle 10 and may use a constant turning radius model to estimate the position of the vehicle 10 for each of the plurality of time periods.
In some embodiments, the method 300 may determine, for each of the plurality of time periods, a time at which a collision is to occur between the host vehicle and the target vehicle based on at least one of the estimated velocity of the target vehicle, the velocity of the host vehicle, the position of the target vehicle at each of the plurality of time periods, and the position of the host vehicle at each of the plurality of time periods. For example, the controller 100 may determine, for each of the plurality of time periods, a time at which a collision is to occur between the vehicle 10 and the target vehicle 200 based on at least one of the estimated speed of the target vehicle 200, the speed of the vehicle 10, the position of the target vehicle 200 at each of the plurality of time periods, and the position of the vehicle 10 at each of the plurality of time periods.
In some embodiments, the method 300 may determine whether the time to collision for a respective time period of the plurality of time periods is less than a threshold. For example, the controller 100 may determine whether the time to collision for a respective time period of the plurality of time periods is less than a threshold.
In some embodiments, the method 300 may initiate at least one steering control operation in response to determining that the time to collision for a respective time period of the plurality of time periods is less than a threshold. For example, the controller 100 may initiate at least one steering control operation in response to determining that the time to collision for a respective time period of the plurality of time periods is less than a threshold.
In some embodiments, a method for active blind zone assistance includes receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time, and identifying a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values. The method also includes determining that the host vehicle is initiating a steering maneuver at a first time and identifying a plurality of time periods between the first time and a second time. The method also includes updating the plurality of sensor values using the at least one transformation function, and determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values. The method also includes estimating a position of the target vehicle at each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle, and estimating a position of the target vehicle at the second time using each position of the target vehicle at each respective time period of the plurality of time periods.
In some embodiments, the at least one transformation function comprises a homogeneous transformation matrix. In some embodiments, determining the heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values includes applying a linear regression to the updated plurality of sensor values. In some embodiments, the method further comprises estimating a speed of the target vehicle relative to the host vehicle using the updated plurality of sensor values and the constant speed model. In some embodiments, the method further comprises determining a speed of the host vehicle, and estimating a location of the host vehicle for each of the plurality of time periods using a constant turn radius model. In some embodiments, the method further comprises: for each of the plurality of time periods, a time at which a collision is to occur between the host vehicle and the target vehicle is determined based on at least one of the estimated velocity of the target vehicle, the velocity of the host vehicle, the position of the target vehicle at each of the plurality of time periods, and the position of the host vehicle at each of the plurality of time periods. In some embodiments, the method further includes determining whether the time to collision for a respective time period of the plurality of time periods is less than a threshold, and initiating at least one steering control operation in response to determining that the time to collision for a respective time period of the plurality of time periods is less than the threshold. In some embodiments, the at least one steering control operation comprises applying a torque overlay to at least one motor of a steering system of the host vehicle, wherein the torque overlay is configured to direct the host vehicle away from the target vehicle. In some embodiments, the at least one sensor comprises at least one radio detection and ranging sensor. In some embodiments, the steering maneuver includes a lane change maneuver. In some embodiments, the host vehicle includes an electronic power steering system. In some embodiments, the host vehicle includes a steer-by-wire steering system.
In some embodiments, a system for active blind spot assistance without using an image capture device includes a processor and a memory. The memory includes instructions that, when executed by the processor, cause the processor to: receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time; identifying a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values; determining, at a first time, that a host vehicle initiated a steering maneuver; identifying a plurality of time periods between a first time and a second time; updating the plurality of sensor values using at least one transformation function; determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values; estimating a position of the target vehicle for each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle; and estimating the location of the target vehicle at the second time using each location of the target vehicle at each respective time period of the plurality of time periods.
In some embodiments, the at least one transformation function comprises a homogeneous transformation matrix. In some embodiments, the instructions further cause the processor to determine a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values by at least applying a linear regression to the updated plurality of sensor values. In some embodiments, the instructions further cause the processor to determine, for each of the plurality of time periods, a time at which a collision is to occur between the host vehicle and the target vehicle based on at least one of an estimated velocity of the target vehicle, a velocity of the host vehicle, a position of the target vehicle at each of the plurality of time periods, and a position of the host vehicle at each of the plurality of time periods. In some embodiments, the instructions further cause the processor to determine whether the time to collision for a respective time period of a plurality of time periods is less than a threshold, and initiate at least one steering control operation in response to determining that the time to collision for a respective time period of the plurality of time periods is less than the threshold. In some embodiments, the at least one steering control operation comprises applying a torque overlay to at least one motor of a steering system of the host vehicle, wherein the torque overlay is configured to direct the host vehicle away from the target vehicle. In some embodiments, the at least one sensor comprises at least one radio detection and ranging sensor.
In some embodiments, an apparatus for active blind spot assistance includes a processor and a memory. The memory includes instructions that, when executed by the processor, cause the processor to: receiving a plurality of sensor values from at least one radio detection and ranging sensor disposed proximate a rear portion of the host vehicle prior to a first time; identifying a target vehicle in a blind zone of a host vehicle based on the plurality of sensor values; determining that the host vehicle is initiating a steering maneuver at a first time; identifying a plurality of time periods between a first time and a second time; updating the plurality of sensor values using at least one transformation function; determining a heading angle of the target vehicle relative to the host vehicle by applying a linear regression to the updated plurality of sensor values; estimating a position of the target vehicle for each of the plurality of time periods based on a heading angle of the target vehicle relative to the host vehicle; estimating a location of the target vehicle at a second time using each location of the target vehicle at each respective time period of the plurality of time periods; determining a time at which a collision is to occur between the host vehicle and the target vehicle using at least the position of the target vehicle at the second time; and in response to determining that the time to collision is less than the threshold, apply a torque overlay to at least one motor of a steering system of the host vehicle to steer the host vehicle away from the target vehicle.
The word "example" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the word "example" is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless otherwise specified or clear from context, "X comprises a or B" is intended to mean any of the natural inclusive permutations. That is, if X comprises A; x comprises B; or X includes both A and B, then "X includes A or B" is satisfied under any of the foregoing circumstances. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form. Furthermore, unless described as such, use of the term "embodiment" or "one embodiment" throughout is not intended to refer to the same embodiment or implementation.
Embodiments of the systems, algorithms, methods, instructions, etc. described herein may be implemented in hardware, software, or any combination thereof. The hardware may include, for example, a computer, an Intellectual Property (IP) core, an Application Specific Integrated Circuit (ASIC), a programmable logic array, an optical processor, a programmable logic controller, microcode, a microcontroller, a server, a microprocessor, a digital signal processor, or any other suitable circuit. In the claims, the term "processor" should be understood to include any of the foregoing hardware, alone or in combination. The terms "signal" and "data" are used interchangeably.
As used herein, the term "module" may refer to a packaged functional hardware unit designed for use with other components, a set of instructions executable by a controller (e.g., a processor executing software or firmware), a processing circuit configured to perform a particular function, or a separate hardware or software component that interfaces with a larger system. For example, a module may include an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a circuit, digital logic, analog circuitry, a combination of discrete circuits, gates, or other types of hardware, or a combination thereof. In other embodiments, a module may include a memory that stores instructions executable by a controller to implement a feature of the module.
Further, in an aspect, for example, the systems described herein may be implemented using a general purpose computer or a general purpose processor with a computer program that, when executed, performs any of the respective methods, algorithms, and/or instructions described herein. Additionally or alternatively, for example, a special purpose computer/processor may be utilized which may contain other hardware for performing any of the methods, algorithms, or instructions described herein.
Furthermore, all or portions of embodiments of the invention may take the form of a computer program product accessible from, for example, a computer-usable or computer-readable medium. A computer-usable or computer-readable medium may be, for example, any apparatus that can tangibly contain, store, communicate, or transport the program for use by or in connection with any processor. The medium may be, for example, an electrical, magnetic, optical, electromagnetic or semiconductor device. Other suitable media may also be used.
The above-described embodiments, implementations, and aspects have been described in order to facilitate easy understanding of the present disclosure, and do not limit the present disclosure. On the contrary, the disclosure is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation permitted under the law so as to encompass all such modifications and equivalent structures.

Claims (20)

1. A method for active blind spot assistance, the method comprising:
receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time;
identifying a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values;
determining, at the first time, that the host vehicle initiated a steering maneuver;
identifying a plurality of time periods between the first time and a second time;
updating the plurality of sensor values using at least one transformation function;
determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values;
estimating a position of the target vehicle for each of the plurality of time periods based on the heading angle of the target vehicle relative to the host vehicle; and
estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods.
2. The method of claim 1, wherein the at least one transformation function comprises a homogeneous transformation matrix.
3. The method of claim 1, wherein determining the heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values comprises applying a linear regression to the updated plurality of sensor values.
4. The method of claim 1, further comprising estimating a speed of the target vehicle relative to the host vehicle using the updated plurality of sensor values and a constant speed model.
5. The method of claim 4, further comprising:
determining a speed of the host vehicle; and
estimating a position of the host vehicle for each of the plurality of time periods using a constant turning radius model.
6. The method of claim 5, further comprising:
for each of the plurality of time periods, determining a time at which a collision is to occur between the host vehicle and the target vehicle based on at least one of the estimated speed of the target vehicle, the speed of the host vehicle, the position of the target vehicle at each of the plurality of time periods, and the position of the host vehicle at each of the plurality of time periods.
7. The method of claim 6, further comprising:
determining whether a time to collision for a respective time period of the plurality of time periods is less than a threshold; and
initiating at least one steering control operation in response to determining that the time to collision for the respective time period of the plurality of time periods is less than the threshold.
8. The method of claim 7, wherein the at least one steering control operation includes applying a torque overlay to at least one motor of a steering system of the host vehicle, wherein the torque overlay is configured to direct the host vehicle away from the target vehicle.
9. The method of claim 1, wherein the at least one sensor comprises at least one radio detection and ranging sensor.
10. The method of claim 1, wherein the steering maneuver comprises a lane change maneuver.
11. The method of claim 1, wherein the host vehicle comprises an electronic power steering system.
12. The method of claim 1, wherein the host vehicle comprises a steer-by-wire steering system.
13. A system for active blind spot assistance without the use of an image capture device, the system comprising:
a processor; and
a memory comprising instructions that, when executed by the processor, cause the processor to:
receiving a plurality of sensor values from at least one sensor disposed proximate a rear portion of a host vehicle prior to a first time;
identifying a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values;
determining, at the first time, that the host vehicle initiated a steering maneuver;
identifying a plurality of time periods between the first time and a second time;
updating the plurality of sensor values using at least one transformation function;
determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values;
estimating a position of the target vehicle for each of the plurality of time periods based on the heading angle of the target vehicle relative to the host vehicle; and
estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods.
14. The system of claim 13, wherein the at least one transformation function comprises a homogeneous transformation matrix.
15. The system of claim 13, wherein the instructions further cause the processor to: determining a heading angle of the target vehicle relative to the host vehicle using the updated plurality of sensor values by at least applying a linear regression to the updated plurality of sensor values.
16. The system of claim 13, wherein the instructions further cause the processor to:
for each of the plurality of time periods, determining a time at which a collision is to occur between the host vehicle and the target vehicle based on at least one of an estimated speed of the target vehicle, a speed of the host vehicle, the position of the target vehicle at each of the plurality of time periods, and a position of the host vehicle at each of the plurality of time periods.
17. The system of claim 16, wherein the instructions further cause the processor to:
determining whether a time to collision for a respective time period of the plurality of time periods is less than a threshold; and
in response to determining that the time to collision for the respective time period of the plurality of time periods is less than the threshold, initiating at least one steering control operation.
18. The system of claim 17, wherein the at least one steering control operation includes applying a torque overlay to at least one motor of a steering system of the host vehicle, wherein the torque overlay is configured to direct the host vehicle away from the target vehicle.
19. The system of claim 13, wherein the at least one sensor comprises at least one radio detection and ranging sensor.
20. An apparatus for active blind spot assistance, the apparatus comprising:
a processor; and
a memory comprising instructions that, when executed by the processor, cause the processor to:
receiving a plurality of sensor values from at least one radio detection and ranging sensor disposed proximate a rear portion of a host vehicle prior to a first time;
identifying a target vehicle in a blind zone of the host vehicle based on the plurality of sensor values;
determining, at the first time, that the host vehicle initiated a steering maneuver;
identifying a plurality of time periods between the first time and a second time;
updating the plurality of sensor values using at least one transformation function;
determining a heading angle of the target vehicle relative to the host vehicle by applying a linear regression to the updated plurality of sensor values;
estimating a position of the target vehicle for each of the plurality of time periods based on the heading angle of the target vehicle relative to the host vehicle;
estimating a position of the target vehicle at the second time using each position of the target vehicle at each corresponding time period of the plurality of time periods;
determining a time at which a collision is to occur between the host vehicle and the target vehicle using at least the position of the target vehicle at the second time; and
in response to determining that the time to collision is less than a threshold, applying a torque overlay to at least one motor of a steering system of the host vehicle to steer the host vehicle away from the target vehicle.
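As promised earlier, the motion models recited in claims 4 through 8, together with the second-time position estimate of claims 1 and 13, might be read as in the sketch below. The circular-arc geometry, the least-squares extrapolation to the second time, and every numeric value are assumptions made for illustration, not the claimed method.

```python
# Hedged sketch: constant speed target model, constant turning radius
# host model, and a second-time position estimate from per-period data.
import numpy as np

def propagate_target(p0, speed, heading, periods):
    """Constant speed model: straight-line travel along the heading."""
    return [p0 + speed * t * np.array([np.cos(heading), np.sin(heading)])
            for t in periods]

def propagate_host(speed, turn_radius, periods):
    """Constant turning radius model: the host sweeps a circular arc,
    starting at the origin and initially heading along +x."""
    track = []
    for t in periods:
        theta = speed * t / turn_radius  # arc angle swept after time t
        track.append(np.array([turn_radius * np.sin(theta),
                               turn_radius * (1.0 - np.cos(theta))]))
    return track

def estimate_at_second_time(track, periods, t2):
    """One reading of 'estimating a position ... at the second time
    using each position at each corresponding time period': fit each
    coordinate over time and extrapolate the fit to t2."""
    track = np.asarray(track)
    fit_x = np.polyfit(periods, track[:, 0], 1)
    fit_y = np.polyfit(periods, track[:, 1], 1)
    return np.array([np.polyval(fit_x, t2), np.polyval(fit_y, t2)])

# Illustrative numbers only: 20 periods over 2 s; target starting behind
# and beside the host at 30 m/s; host at 28 m/s on a 400 m lane-change arc.
periods = np.linspace(0.1, 2.0, 20)
target_track = propagate_target(np.array([-3.0, -3.5]), 30.0, 0.02, periods)
host_track = propagate_host(28.0, 400.0, periods)
p_at_t2 = estimate_at_second_time(target_track, periods, t2=2.0)
```

Because target_track and host_track are aligned per time period, they can feed a per-period time-to-collision check like the blind_zone_assist sketch given earlier.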

Applications Claiming Priority (2)

Application Number  Priority Date  Filing Date  Title
US 17/347,687  2021-06-15
US 17/347,687 (US12054194B2)  2021-06-15  2021-06-15  Systems and methods for active blind zone assist

Publications (1)

Publication Number Publication Date
CN115476923A (en)  2022-12-16

Family

ID=84192397

Family Applications (1)

Application Number  Publication  Status  Priority Date  Filing Date  Title
CN202110801860.5A  CN115476923A (en)  Pending  2021-06-15  2021-07-15  System and method for active blind zone assistance

Country Status (3)

Country Link
US (1) US12054194B2 (en)
CN (1) CN115476923A (en)
DE (1) DE102021115708A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11851051B2 (en) * 2021-07-28 2023-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for detecting an object in a turning alert zone of a vehicle
DE102023203433B3 (en) 2023-04-17 2024-03-21 Volkswagen Aktiengesellschaft Method for controlling a steering system, blind spot assistance system and vehicle

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2615596A4 (en) * 2010-09-08 2017-12-06 Toyota Jidosha Kabushiki Kaisha Moving-object prediction device, virtual-mobile-object prediction device, program, mobile-object prediction method, and virtual-mobile-object prediction method
DE102016116963A1 (en) * 2016-09-09 2018-03-15 Knorr-Bremse Systeme für Nutzfahrzeuge GmbH Device for warning a driver of a vehicle in front of an object and vehicle with such a device
DE102016224061A1 (en) 2016-12-02 2018-06-07 Bayerische Motoren Werke Aktiengesellschaft Lane change assistance system with relatively speed-dependent reaction area
KR102646644B1 (en) * 2017-03-01 2024-03-13 모빌아이 비젼 테크놀로지스 엘티디. Systems and methods for navigation with detection uncertainty
JP6958001B2 (en) * 2017-06-09 2021-11-02 トヨタ自動車株式会社 Lane change support device
KR20200139443A (en) * 2019-06-04 2020-12-14 주식회사 만도 Apparatus and method for driver assistance
US11267482B2 (en) * 2019-10-11 2022-03-08 International Business Machines Corporation Mitigating risk behaviors

Also Published As

Publication number Publication date
US12054194B2 (en) 2024-08-06
US20220396313A1 (en) 2022-12-15
DE102021115708A1 (en) 2022-12-15

Legal Events

Date Code Title Description
PB01  Publication
SE01  Entry into force of request for substantive examination