CN107918758B - Vehicle capable of environmental scenario analysis


Info

Publication number
CN107918758B
Authority
CN
China
Prior art keywords
host vehicle
vehicle
lane
target
processor
Prior art date
Legal status
Active
Application number
CN201710924399.6A
Other languages
Chinese (zh)
Other versions
CN107918758A
Inventor
埃里克·L·里德
乔纳森·迪德里希
罗伯特·克洛斯克
阿迪尔·尼扎姆·西迪基
Current Assignee
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies LLC
Publication of CN107918758A
Application granted
Publication of CN107918758B
Status: Active
Anticipated expiration

Classifications

    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/0953 Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W30/18154 Approaching an intersection
    • B60W40/072 Curvature of the road
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B60W2555/60 Traffic rules, e.g. speed limits or right of way
    • G06T11/206 Drawing of charts or graphs
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Abstract

A host vehicle may include a motor, a brake, a sensor, and a processor configured to: (a) predict a target path of a target vehicle based on lane boundaries of a virtual map; (b) compare the target path with a predicted main path of the host vehicle; (c) apply the brake based on the comparison; (d) predict the target path and the main path as shapes each having a two-dimensional surface area; (e) determine whether the shapes intersect; and, based on the determination, (f) calculate a first time interval for the target vehicle to reach the intersection and (g) a second time interval for the host vehicle to reach the intersection.

Description

Vehicle capable of environmental scenario analysis
Technical Field
The present application relates to the environmental scenario (environmental context) of a vehicle, for example traffic lane markings.
Background
Existing vehicles are configured to predict a collision based on their own speed and the speed of a target vehicle. However, many of these predictions do not account for possible changes in the target vehicle's velocity arising from the environmental scenario. For example, if the target vehicle is traveling in a curved lane, its velocity (which includes heading or direction) will likely rotate to follow the curve. A solution is therefore needed that takes the environmental scenario into account in collision prediction.
Disclosure of Invention
A host vehicle may include a motor, a brake, a sensor, and a processor configured to: (a) predict a target path of a target vehicle based on lane boundaries of a virtual map; (b) compare the target path with a predicted main path of the host vehicle; (c) apply the brake based on the comparison; (d) predict the target path and the main path as shapes each having a two-dimensional surface area; (e) determine whether the shapes intersect; and, based on the determination, (f) calculate a first time interval for the target vehicle to reach the intersection and (g) a second time interval for the host vehicle to reach the intersection.
According to the present invention, there is provided a host vehicle including:
a motor, a brake, a sensor, a processor configured to:
predicting a target path of a target vehicle based on lane boundaries of the virtual map;
comparing the target path with a predicted main path of the host vehicle;
applying the brake based on the comparison.
According to one embodiment of the invention, the processor is configured to construct the virtual map based on (a) the received street map and (b) the measurements received from the sensor.
According to one embodiment of the invention, the sensor comprises a camera and the processor is configured to apply the lane boundaries to the virtual map based on images taken by the camera.
According to one embodiment of the invention, the processor is configured to identify a lane of the target vehicle, determine a radius of curvature of a lane boundary of the identified lane, and predict the target path based on the determined radius of curvature.
According to one embodiment of the invention, the processor is configured to:
determining (a) a first radius of curvature of a first lane boundary of the identified lane and (b) a second radius of curvature of a second lane boundary of the identified lane;
calculating an intermediate radius of curvature based on (a) and (b);
predicting the target path based on the calculated intermediate radius of curvature.
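
As an illustration of the preceding embodiment, the sketch below averages the two lane-boundary radii into an intermediate radius of curvature and projects the target vehicle along the resulting constant-curvature arc. This is a hypothetical Python sketch: the function names, the averaging rule, and the arc kinematics are illustrative assumptions, not taken from the patent.

```python
import math

def intermediate_radius(r_first: float, r_second: float) -> float:
    """Average the radii of the two lane boundaries to approximate
    the radius of curvature of the lane's centerline."""
    return (r_first + r_second) / 2.0

def predict_target_path(x, y, heading_rad, r_center, arc_len, n_points=20):
    """Project the target vehicle along a circular arc of radius r_center.

    Returns a list of (x, y) waypoints. A positive radius bends left;
    a negative radius bends right (an arbitrary sign convention)."""
    points = []
    for i in range(1, n_points + 1):
        s = arc_len * i / n_points   # distance traveled along the arc
        dtheta = s / r_center        # heading change after distance s
        # Standard constant-curvature (circular-arc) kinematics:
        px = x + r_center * (math.sin(heading_rad + dtheta) - math.sin(heading_rad))
        py = y - r_center * (math.cos(heading_rad + dtheta) - math.cos(heading_rad))
        points.append((px, py))
    return points

# Example: first boundary radius 28 m, second 32 m -> centerline ~30 m.
path = predict_target_path(0.0, 0.0, 0.0, intermediate_radius(28.0, 32.0), 15.0)
```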
According to one embodiment of the invention, the processor is configured to predict a range of positions of the target vehicle at a first future time based on the target path, and predict a range of positions of the target vehicle at a second future time.
According to one embodiment of the invention, the first future time is a predetermined time interval multiplied by a first number, and the second future time is the predetermined time interval multiplied by the first number plus one.
According to one embodiment of the invention, the processor is configured such that the total area of the predicted position range of the target vehicle at the second future time exceeds the total area of the predicted position range of the target vehicle at the first future time.
According to one embodiment of the invention, the processor is configured such that the total area of the predicted position range of the target vehicle at the first future time exceeds the total area of the target vehicle.
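
The widening-range property of the last two embodiments can be captured with a simple monotonic growth rule. The sketch below is a minimal illustration under an assumed linear growth model; the growth rate and all names are invented:

```python
def predicted_range_area(vehicle_area: float, step: int,
                         growth_per_step: float = 0.25) -> float:
    """Total area of the predicted position range at future time step `step`.

    Uncertainty about the target's exact trajectory grows the area with
    each successive interval, so area(k + 1) > area(k) > vehicle_area."""
    return vehicle_area * (1.0 + growth_per_step * step)

areas = [predicted_range_area(8.0, k) for k in range(1, 5)]
assert all(later > earlier for earlier, later in zip(areas, areas[1:]))
assert areas[0] > 8.0   # first range already exceeds the vehicle's own area
```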
According to one embodiment of the invention, the processor is configured to predict the target path and the main path as shapes each having a two-dimensional surface area.
According to one embodiment of the invention, the processor is configured to determine whether the shapes intersect and, based on the determination, calculate a first time interval for the target vehicle to reach the intersection and a second time interval for the host vehicle to reach the intersection.
According to an embodiment of the invention, the processor is configured to determine whether any portion of the first time interval overlaps any portion of the second time interval.
According to an embodiment of the invention, the processor is configured to apply the brake based on a positive overlap determination.
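
The chain of embodiments above (two-dimensional path shapes, intersection test, per-vehicle arrival intervals, overlap check, brake) can be sketched as follows. The shapely geometry library is used here as a stand-in for the patent's unspecified shape representation; all names and numbers are illustrative:

```python
from shapely.geometry import Polygon

def should_brake(target_shape: Polygon, host_shape: Polygon,
                 target_window: tuple, host_window: tuple) -> bool:
    if not target_shape.intersects(host_shape):
        return False                 # the path shapes never cross: no threat
    t0, t1 = target_window           # first/last arrival time of the target
    h0, h1 = host_window             # first/last arrival time of the host
    return t0 <= h1 and h0 <= t1     # any portion of the intervals overlaps

# Two overlapping rectangular path corridors, arrival windows 2-4 s and 3-6 s:
target = Polygon([(0, 0), (10, 0), (10, 3), (0, 3)])
host = Polygon([(4, -5), (7, -5), (7, 5), (4, 5)])
print(should_brake(target, host, (2.0, 4.0), (3.0, 6.0)))   # True -> brake
```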
According to the present invention, there is provided a host vehicle comprising:
a motor, a steering device, a sensor, a processor configured to:
predicting a target path of a target vehicle based on lane boundaries of the virtual map;
comparing the target path with a predicted main path of the host vehicle;
actuating the steering device based on the comparison.
According to one embodiment of the invention, the processor is configured to predict the target path and the main path as shapes each having a two-dimensional surface area.
According to one embodiment of the invention, the processor is configured to determine whether the shapes intersect and, based on the determination, calculate a first time interval for the target vehicle to reach the intersection and a second time interval for the host vehicle to reach the intersection.
According to an embodiment of the invention, the processor is configured to determine whether any portion of the first time interval overlaps any portion of the second time interval.
According to an embodiment of the invention, the processor is configured to actuate the steering device based on a positive overlap determination.
According to one embodiment of the invention, the processor is configured to construct the virtual map based on (a) the received street map and (b) the measurements received from the sensor.
According to the present invention, there is provided a host vehicle including:
a motor, a warning light, a sensor, a processor configured to:
predicting a target path of a target vehicle based on lane boundaries of the virtual map;
comparing the target path with a predicted main path of the host vehicle;
activating the warning light based on the comparison.
Drawings
For a better understanding of the invention, reference may be made to the embodiments illustrated in the following drawings. The components in the figures are not necessarily to scale and related elements may be omitted or the proportions may be exaggerated in some cases to emphasize and clearly illustrate the novel features described herein. Furthermore, the system components may be arranged differently, as is known in the art. Further, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 is a block diagram of a vehicle computing system;
FIG. 2 is a top plan view of a host vehicle including a vehicle computing system;
FIG. 3 is a block diagram corresponding to collision threat assessment;
FIG. 4 is a more specific embodiment of the block diagram of FIG. 3;
FIG. 5 is a first traffic scene and represents a virtual map presented in graphical form;
FIG. 6 illustrates possible operations associated with first, second, and third traffic scenarios;
FIG. 7 is a second traffic scene and represents a virtual map presented in graphical form;
FIG. 8 illustrates features of a cross traffic warning system;
FIG. 9 is a third traffic scene and represents a virtual map presented in graphical form;
FIG. 10 illustrates possible operations associated with first, second, and third traffic scenarios;
FIG. 11 illustrates possible operations associated with first, second, and third traffic scenarios;
FIG. 12 illustrates possible operations associated with first, second, and third traffic scenarios;
FIG. 13 illustrates possible operations associated with first, second, and third traffic scenarios; and
FIG. 14 illustrates possible operations associated with first, second, and third traffic scenarios.
Detailed Description
While the present invention may be embodied in various forms, there is shown in the drawings and will hereinafter be described some exemplary and non-limiting embodiments with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to "the" object or "a/an" object is intended to denote also one of a possible plurality of such objects. Further, the conjunction "or" may be used to convey features that are simultaneously present, rather than mutually exclusive alternatives. In other words, the conjunction "or" should be understood to include "and/or" as one alternative and "either/or" as another.
Fig. 1 illustrates a computing system 100 of a host vehicle 200. The host vehicle 200 is connected, meaning that the host vehicle 200 is configured to (a) receive wireless data from external entities (e.g., infrastructure, servers, other connected vehicles) and (b) transmit wireless data to external entities. The host vehicle 200 may be autonomous, semi-autonomous, or manually driven. The host vehicle 200 includes a motor, a battery, at least one wheel driven by the motor, and a steering system configured to turn the at least one wheel. The host vehicle 200 may be fossil-fuel powered (e.g., diesel, gasoline, natural gas), hybrid, all-electric, fuel-cell powered, and the like.
Vehicles are described, for example, in the following patents and patent applications: U.S. patent application Ser. No. 15/076,210 to Miller, U.S. Patent No. 8,180,547 to Prasad, U.S. patent application Ser. No. 15/186,850 to Lavoie, U.S. Patent Publication No. 2006/0117971 to D'Amato, and U.S. patent application Ser. No. 14/972,761 to Hu, all of which are incorporated herein by reference in their entirety. The host vehicle 200 may include any of the features described in Miller, Prasad, Lavoie, D'Amato, and Hu.
The computing system 100 resides in a host vehicle 200. The computing system 100 is capable of automatically controlling mechanical systems within the host vehicle 200 and facilitating communication between the host vehicle 200 and external entities (e.g., connected infrastructure, internet, other connected vehicles), among others. The computing system 100 includes a data bus 101, one or more processors 108, volatile memory 107, non-volatile memory 106, a user interface 105, a telematics unit 104, actuators and motors 103, and local sensors 102.
The data bus 101 carries electronic signals or data between the electronic components. The processor 108 performs operations on the electronic signals or data to produce modified electronic signals or data. The volatile memory 107 stores data for near-instant recall by the processor 108. The non-volatile memory 106 stores data for recall to the volatile memory 107 and/or the processor 108. The non-volatile memory 106 includes a range of non-volatile memories, such as hard disk drives, solid state disks (SSDs), digital video discs (DVDs), Blu-Ray discs, and the like. The user interface 105 includes displays, touch screen displays, keyboards, buttons, and other devices that enable a user to interact with the computing system. The telematics unit 104 enables wired and wireless communication with external entities via Bluetooth, cellular data (e.g., third generation mobile communication technology (3G), Long Term Evolution (LTE)), Universal Serial Bus (USB), and the like.
The actuator/motor 103 produces a tangible result. Examples of actuators/motors 103 include fuel injectors, windshield wipers, brake light circuits, transmissions, airbags, motors mounted to sensors (e.g., motors configured to rotate the local sensor 102), engines, driveline motors, steering devices, blind spot warning lights, and the like.
The local sensors 102 communicate digital readings or measurements to the processor 108. Examples of the local sensors 102 include temperature sensors, rotation sensors, seatbelt sensors, speed sensors, cameras, laser radar (lidar) sensors, radar sensors, infrared sensors, ultrasonic sensors, clocks, humidity sensors, rain sensors, light sensors, and the like. It should be understood that any of the various electronic components of fig. 1 may include separate or dedicated processors and memories. Further details of the structure and operation of the computing system 100 are described, for example, in Miller, Prasad, Lavoie, D'Amato, and Hu.
Fig. 2 generally illustrates and describes a host vehicle 200, the host vehicle 200 including the computing system 100. Some of the local sensors 102 are mounted outside the host vehicle 200 (others are located inside the vehicle 200). The local sensor 102a is configured to detect an object in front of the vehicle 200. The local sensor 102b is configured to detect an object behind the vehicle 200, as indicated by the rear sensing range 109 b. The left sensor 102c and the right sensor 102d are configured to perform similar functions for the left and right sides of the vehicle 200.
As previously described, the local sensors 102a to 102d may be ultrasonic sensors, lidar sensors, radar sensors, infrared sensors, cameras, microphones, any combination thereof, and the like. The host vehicle 200 includes a plurality of other local sensors 102 located in the vehicle interior or on the vehicle exterior. The local sensors 102 may include any or all of the sensors disclosed in Miller, Prasad, Lavoie, D'Amato, and Hu. The general arrangement of the components shown in figs. 1 and 2 is known in the prior art.
It should be appreciated that the host vehicle 200 (and more specifically, the processor 108 of the host vehicle 200) is configured to perform the methods and operations described herein. In some cases, host vehicle 200 is configured to perform these functions via computer programs stored on volatile memory 107 and/or non-volatile memory 106 of computing system 100.
One or more processors are "configured to" perform disclosed method steps, blocks, or operations at least when at least one of the one or more processors is in operative communication with memory storing a software program embodying code or instructions for the disclosed method steps or blocks. Further description of how processors, memory, and software cooperate appears in Prasad. According to some embodiments, a mobile phone or an external server in operative communication with the host vehicle 200 performs some or all of the methods and operations discussed below.
According to various embodiments, the host vehicle 200 includes some or all of the features of the vehicle 100a of Prasad. According to various embodiments, the computing system 100 includes some or all of the features of the Vehicle Computing and Communication System (VCCS) 102 of fig. 2 of Prasad. According to various embodiments, the host vehicle 200 communicates with some or all of the devices shown in fig. 1 of Prasad, including the nomadic or mobile device 110, the communication tower 116, the telecommunications network 118, the internet 120, and the data processing center (i.e., one or more servers) 122. Each entity described in this application (e.g., connected infrastructure, other vehicles, mobile phones, servers) may share any or all of the features described with reference to figs. 1 and 2.
When the term "loaded vehicle" is used in the claims, it is defined herein to mean: "a vehicle including a motor, a plurality of wheels, a power source, and a steering system; wherein the motor transmits torque to at least one of the plurality of wheels to thereby drive the at least one of the plurality of wheels; wherein the power supply supplies energy to the motor; and wherein the steering system is configured to steer at least one of the plurality of wheels. The host vehicle 200 may be a loaded vehicle.
When the term "equipped electric vehicle" is used in the claims, it is defined herein to mean "a vehicle comprising a battery, a plurality of wheels, a motor, a steering system; wherein the motor transmits torque to at least one of the plurality of wheels to thereby drive the at least one of the plurality of wheels; wherein the battery is rechargeable and configured to supply electrical energy to the motor to thereby drive the motor; and wherein the steering system is configured to steer at least one of the plurality of wheels. The host vehicle 200 may be an equipped electric vehicle.
Fig. 3 is a block diagram of generating a driving decision based on (a) sensed external entities and (b) a sensed environmental scenario (also referred to as context). A driving decision includes any instruction that effects a physical, tangible change of the host vehicle 200. Driving decisions include accelerating, decelerating (e.g., braking), re-planning a route, re-planning a path (i.e., adjusting heading), and issuing an alert (e.g., flashing a light, generating audio). A driving decision may also include any instruction that effects a physical, tangible change of an external vehicle. External entities are physical, tangible external objects and include external vehicles, pedestrians, and obstructions (e.g., buildings, walls, pits). The environmental scenario is typically non-physical and non-tangible (although the environmental scenario is sensed with reference to physical and tangible objects, such as street signs, painted road markings, and non-travelable areas (e.g., grass)). Thus, the environmental scenario represents artificially imposed rules related to driving. Examples of rules include speed limits, restricted zones, specified traffic flow directions, and specified stopping points (as indicated by stop signs and red lights, among other things).
The scenario and entities may be derived from the local sensors 102 and/or the telematics unit 104 (e.g., previously generated street maps received from servers, external vehicles, and external infrastructure). Although a camera may be configured to sense external entities, the processing software required to convert images into coordinates of external entities is inefficient and sometimes inaccurate. A camera is, however, effective at capturing contrast and color. The processing software associated with infrared, radar, lidar, and/or ultrasonic sensors is effective at converting the sensor measurements into coordinates of external entities; such processing software is inefficient at capturing color and contrast, and sometimes cannot capture them at all. In other words, cameras are well suited to capturing two-dimensional rather than three-dimensional information, while infrared, radar, lidar, and/or ultrasonic sensors are well suited to capturing three-dimensional rather than two-dimensional information.
When measuring the environment, images from the camera local sensor 102 may be insufficient due to limited ambient lighting or weather conditions. To increase the robustness of the scenario measurement system, vector- or raster-based graphics data may be passed to an image processing algorithm on the host vehicle 200 by a navigation system (comprising stored map data, a global positioning system (GPS) receiver, and a compass). These images may be used to refine the area in which the image processing subsystem performs the scenario analysis, or to increase the confidence of the environmental classification used to make braking/acceleration/path-planning decisions.
In practice, scenario indicia are typically two-dimensional or effectively two-dimensional. For example, letters printed on road signs are two-dimensional in nature. A computer can conveniently distinguish the printed letters from ambient noise by comparing the contrast and/or color of the printed letters with the surrounding environment. Similarly, painted lane lines are two-dimensional in nature. A computer can conveniently distinguish a painted lane line from ambient noise by comparing the contrast or color of the painted lane line with the surrounding environment. Thus, the host vehicle 200 may be configured to resolve the environmental scenario with camera local sensors 102, and external entities (static or dynamic) with non-camera local sensors 102 such as radar, lidar, and ultrasonic sensors.
Referring to fig. 3, the host vehicle 200 extracts a scenario 301 from data received from the sensors 102 and/or via the telematics unit 104 (e.g., a database including road speed limits and street maps). The host vehicle 200 extracts entities 302 from the local sensors 102 and/or data received via the telematics unit 104. The extracted entities include properties such as location, two- or three-dimensional size and shape, speed, acceleration, heading, and classification (e.g., animal, pedestrian, vehicle).
Once the external entities have been parsed, they are analyzed according to the scenario to produce a collision threat assessment 303. The collision threat assessment includes, for example, a time to collision and/or a distance to collision. The collision threat assessment takes into account predicted future properties of the external entities and/or the host vehicle 200. For example, a time-to-collision analysis may assume that an external entity traveling at a certain speed maintains that speed, and that the host vehicle 200 maintains its current speed. The host vehicle 200 generates a driving decision 304 based on the collision threat assessment 303.
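
As a concrete instance of the constant-speed assumption just described, a minimal time-to-collision estimate might look like the sketch below. The closest-approach formula is standard relative-motion kinematics; the names are hypothetical:

```python
def time_to_collision(host_pos, host_vel, target_pos, target_vel):
    """Return the time (s) at which the straight-line gap between the two
    vehicles is smallest, or None if they are already separating."""
    rx = target_pos[0] - host_pos[0]   # relative position
    ry = target_pos[1] - host_pos[1]
    vx = target_vel[0] - host_vel[0]   # relative velocity (assumed constant)
    vy = target_vel[1] - host_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        return None                    # no relative motion
    t = -(rx * vx + ry * vy) / v2      # time of closest approach
    return t if t > 0 else None        # non-positive: already separating

# Host closing at 10 m/s on a stopped target 50 m ahead -> 5.0 s.
print(time_to_collision((0, 0), (10, 0), (50, 0), (0, 0)))
```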
Methods of extracting entities and their properties from local sensors are known in the art, as are methods of determining collision threats. Some methods of determining collision threats are disclosed in U.S. patent application Ser. No. 15/183,355 to Bidner, which is incorporated herein by reference in its entirety.
Fig. 4 is a more specific embodiment of the methods and operations discussed with reference to fig. 3. External entities 401a to 401n transmit data to the host vehicle 200 via the telematics unit 104. Some external entities (e.g., external entity 401b) may act as a proxy for unconnected external entities (e.g., external entity 401a). The telematics unit 104 forwards the received data to the processor 108 and/or the memories 106, 107. The local sensors 102 forward sensed data (e.g., measurements) to the processor 108 and/or the memories 106, 107.
The processor 108 constructs a virtual map 402 of the area surrounding the host vehicle 200 based on the forwarded data. The virtual map 402 need not be graphically displayed or displayable. As one example, the virtual map 402 may be embodied as objects and their attributes stored in the memory 106, 107. Suitable programming or software for constructing a virtual map based on sensed (i.e., received) data (e.g., a street map) is known in the art. The virtual map may be two-dimensional or three-dimensional. The virtual map includes parsed, detected or received entities 403 placed in parsed, detected or received scenes 404. Scenarios include some or all of the following:
(A) Surface location and properties, including (i) non-travelable or out-of-boundary surfaces and (ii) travelable surfaces. The travelable surfaces may be separated or segmented by speed limits. The host vehicle 200 determines a drivable surface via map information received from an external source. The local context sensor 102 supplements this information via the contrast and/or color of the image. Image processing programming software parses the travelable surface based on contrast and/or color and separates the travelable surface from the non-travelable surface accordingly. For example, a surface that appears green and is outside the lane is marked as a non-travelable surface.
(B) A lane of a drivable surface comprising (i) a location of the lane and (ii) an attribute of the lane. The location of the lane includes some or all of the following: lane length, lane width, lane coordinates, lane curvature, and number of lanes. The attributes of the lanes correspond to rules of the lanes. Rules include direction of traffic flow and lane change legitimacy. The host vehicle 200 determines the lanes and their properties via map information received from an external source. The local sensor 102 supplements this information via the contrast and/or color of the image. The image processing programming software parses the lane lines based on contrast and/or color. The processor 108 determines any of the lane properties described above based on the analyzed lane lines.
(C) Parking spaces on the drivable surface, including (i) the location of the parking space and (ii) the attributes of the parking space. The location of the parking space includes the width and depth of the parking space. The attributes of the parking space include rules related to the parking space (e.g., parallel parking only, disabled parking only). The host vehicle 200 determines the parking spaces and their attributes via map information received from an external source. The local sensors 102 supplement this information via the contrast and/or color of images. Image processing software parses the boundaries of the parking space (e.g., painted parking lines) based on the contrast and/or color of the image. The processor 108 determines any of the parking space properties described above based on the parsed boundaries.
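
One possible in-memory shape for the virtual map 402 and scenario items (A) to (C) is sketched below. The field names are assumptions for illustration; the patent requires only that the map hold parsed entities 403 placed in parsed scenarios 404 and need not be displayable.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A parsed external object (vehicle, pedestrian, obstruction, ...)."""
    kind: str
    position: tuple    # (x, y) in map coordinates
    velocity: tuple    # (vx, vy); heading is implied by the direction
    footprint: tuple   # (length, width)

@dataclass
class Lane:
    """Scenario item (B): lane location plus artificially imposed rules."""
    boundaries: list       # polylines, one per lane boundary
    flow_direction: float  # legal direction of travel, radians
    speed_limit: float     # m/s

@dataclass
class ParkingSpace:
    """Scenario item (C): painted boundary plus parking rules."""
    boundary: list         # three painted lines forming an open rectangle
    rules: list            # e.g., ["parallel only"], ["disabled only"]

@dataclass
class VirtualMap:
    drivable: list = field(default_factory=list)   # item (A): surface polygons
    lanes: list = field(default_factory=list)
    parking_spaces: list = field(default_factory=list)
    entities: list = field(default_factory=list)
```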
As previously described, the host vehicle 200 applies the parsed scenario to the parsed entity. Applying the parsed context to the parsed entity includes predicting or estimating future properties (e.g., position, velocity, acceleration, heading) of the parsed entity based on the parsed context. Thus, the predicted or estimated future property of the parsed entity depends at least on (a) the current property of the parsed entity and (b) the scenario. Examples are provided below.
As previously described, the host vehicle 200 performs collision threat assessment 405 of the parsed entity based on predicted or estimated future properties of the parsed entity. As previously described, host vehicle 200 generates driving decision 406 based on collision threat assessment 405.
Fig. 5 shows an example virtual map of the host vehicle 200. A traffic circle 501 intersects roads 502, 503, and 506. Road 502 is a one-way road running from the host vehicle 200 toward the center 501a of the traffic circle. Road 506 intersects roads 505 and 504. The traffic circle includes lanes 501c and 501d separated by lane line 501b, and a non-travelable center 501a. Lanes 501c and 501d carry parallel traffic flows, as indicated by the dashed lane line 501b. Road 506 includes lanes 506b and 506c. Lane 506b carries traffic in the opposite direction from lane 506c, as indicated by the double lane line 506a. The host vehicle 200 is on road 502 with a velocity 200a, where velocity includes speed and heading. The second vehicle 201 is in lane 501c of the traffic circle 501 with velocity 201a. The third vehicle 202 is on road 504 with velocity 202a. The fourth vehicle 203 is in lane 506b of road 506 with velocity 203a. The fifth vehicle 204 is on road 505 with velocity 204a.
As shown in fig. 5, each vehicle is represented by a box. Each box includes triangles (not labeled). Each triangle is directed towards the front bumper of the respective vehicle. In other figures, the triangles may have different dimensions. Unless otherwise indicated, such dimensional changes are not intended to be indicative of potential significance, but are merely drawing techniques for increasing clarity.
The host velocity 200a is zero, such that the host vehicle 200 is stopped. The second velocity 201a includes a heading directed toward the host vehicle 200. Thus, if the host vehicle 200 were to remain stopped and the second vehicle 201 were to continue traveling along the second heading (velocity includes heading, as described above), the host vehicle 200 and the second vehicle 201 would collide.
However, lane 501c is curved. If the second vehicle 201 follows lane 501c, the second heading will rotate to remain parallel to lane 501c. The host vehicle 200 predicts that the second vehicle 201 will follow lane 501c based on the scenario (i.e., the curvature of lane 501c). The collision threat assessment between the host vehicle 200 and the second vehicle 201 therefore yields a zero or low collision probability. Accordingly, the host vehicle 200 does not make a driving decision (e.g., an evasive maneuver) based on an anticipated collision between the host vehicle 200 and the second vehicle 201. According to some embodiments, the driving decision is based on the magnitude of the collision probability.
Fig. 6 and 7 relate to a first embodiment of collision threat assessment. Fig. 8 to 14 relate to a second embodiment of collision threat assessment. Features of both the first and second embodiments may be combined.
Referring to fig. 6, the host vehicle 200 responds to the trigger by calculating a reference segment 601a extending from the second vehicle 201 to the host vehicle 200. The trigger is based on the position and speed of the host vehicle 200 and the position and speed of the second vehicle 201.
In fig. 6, the reference segment 601a is the shortest segment connecting the second vehicle 201 to the host vehicle 200. According to other embodiments, the reference segment 601a runs from the midpoint of the host vehicle 200 to the midpoint of the front surface of the second vehicle 201. The host vehicle 200 calculates a series of curved segments 601b to 601g intersecting both ends of the reference segment 601a. The curved segments 601b to 601g may be spaced apart at a predetermined interval. The curved segments 601b to 601g follow a predetermined geometric function; some or all may be parabolic. The total number of curved segments 601b to 601g calculated by the host vehicle 200 is based on (a) the predetermined interval and (b) the speed and/or acceleration of the host vehicle 200 and/or the second vehicle 201. The number of curved segments on each side of the reference segment 601a is based on the speed and/or acceleration of the host vehicle 200 and/or the second vehicle 201.
The outer curved sections 601f and 601g correspond to the extreme paths of the second vehicle 201. For example, based on the second speed 201a, the outer curved segments 601f and 601g represent the most extreme curved collision path between the second vehicle 201 and the host vehicle 200 that does not cause the second vehicle 201 to become uncontrollable (e.g., spin or roll). The host vehicle 200 is preloaded with one or more functions that determine the curvature of the outer curved segments 601f and 601g based on the properties (e.g., speed) of the second vehicle 201.
According to some embodiments, the host vehicle 200 determines the outer curved segments 601f and 601g after determining the reference segment 601a, then fills a first predetermined number of intermediate curved segments 601c and 601e between the outer curved segment 601g and the reference segment 601a, and fills a second predetermined number of intermediate curved segments between the outer curved segment 601f and the reference segment 601a. The first and second predetermined numbers may be (a) preset, (b) equal, or (c) based on the angle of the reference segment 601a relative to the second velocity 201a.
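
The family of curved segments can be constructed in many ways; the hedged sketch below joins the two ends of the reference segment 601a with quadratic (parabola-like) curves whose sideways bulge is varied out to the extreme values of segments 601f and 601g. The Bezier parameterization is an assumed implementation choice, not taken from the patent:

```python
def curved_segment(p_start, p_end, bulge, n=20):
    """Quadratic curve from p_start to p_end whose apex is offset from the
    chord midpoint by `bulge` meters, perpendicular to the chord."""
    (x0, y0), (x1, y1) = p_start, p_end
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5      # endpoints assumed distinct
    nx, ny = -dy / length, dx / length       # unit normal to the chord
    cx = (x0 + x1) / 2 + 2 * bulge * nx      # control point; the factor 2
    cy = (y0 + y1) / 2 + 2 * bulge * ny      # puts the curve's apex at bulge
    pts = []
    for i in range(n + 1):
        t = i / n                            # quadratic Bezier evaluation
        pts.append(((1 - t) ** 2 * x0 + 2 * (1 - t) * t * cx + t ** 2 * x1,
                    (1 - t) ** 2 * y0 + 2 * (1 - t) * t * cy + t ** 2 * y1))
    return pts

def segment_family(p_start, p_end, max_bulge, n_pairs):
    """The straight reference segment (bulge 0) plus n_pairs of curves on
    each side, out to the extreme bulge of the outermost segments."""
    step = max_bulge / n_pairs
    return [curved_segment(p_start, p_end, k * step)
            for k in range(-n_pairs, n_pairs + 1)]
```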
The host vehicle 200 evaluates each segment 601 according to the scenario. The host vehicle 200 may determine the number of rules (derived from the scenario, as described above) that the second vehicle 201 would break along each segment 601. The host vehicle 200 may also determine the degree to which each rule would be broken.
If the second vehicle 201 were to follow segment 601f, the second vehicle 201 would (a) illegally leave the traffic circle 501 at angle 602, (b) illegally traverse the distance defined between points 603 and 604 of the non-travelable surface 507, and (c) illegally enter road 502 at angle 605. Angle 602, the distance between points 603 and 604, and angle 605 correspond to the degrees of the broken rules.
Instead of identifying each illegal action, the host vehicle 200 may calculate the portion of each segment 601 that corresponds to one or more broken rules. For example, the reference segment 601a may be decomposed into a legal first portion extending from the second vehicle 201 to point 608, and an illegal second portion extending from point 608 to the host vehicle 200. While traversing the first portion of segment 601a, the second velocity 201a would sufficiently match (i.e., match within predetermined limits) the curvature of lane 501c, which is considered legal. While traversing the second portion of segment 601a, the second velocity 201a would sufficiently deviate from the curvature of lane 501c and, ultimately, of road 502, which is considered illegal. Thus, the degree of illegality of segment 601a relates at least to the length of the second portion. By contrast, the entire segment 601g would be considered illegal because, at each point along segment 601g, the second velocity 201a would sufficiently deviate from the curvature of lane 501c and would ultimately oppose the direction of travel of road 502 (as described above, road 502 is a one-way street toward the center 501a).
The host vehicle 200 performs a collision threat assessment for each segment 601 according to one or more of the following: whether the segment 601 includes illegal activity (i.e., activity that violates the scenario), and the magnitude or degree of the illegal activity. Segments 601 with a greater degree of illegal activity are discounted or considered unlikely. The host vehicle 200 sums the collision threat assessments across the segments 601. If the sum is above a predetermined probability threshold, the host vehicle 200 generates a driving decision (e.g., controlling steering, braking, and/or acceleration) corresponding to an evasive maneuver (i.e., a maneuver calculated to (a) reduce the probability of a collision and/or (b) reduce the likely speed differential between the host vehicle 200 and the second vehicle 201 at the time of a collision).
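
The per-segment weighting and summation might be implemented as sketched below, where segments requiring a larger illegal fraction are exponentially down-weighted as unlikely. The weighting function, decay constant, and threshold are invented for illustration:

```python
import math

def segment_likelihood(illegal_fraction: float, decay: float = 4.0) -> float:
    """Down-weight a candidate segment by the fraction of its length
    (0..1) over which the target would be breaking a rule."""
    return math.exp(-decay * illegal_fraction)

def collision_assessment(segments, threshold=0.5):
    """segments: list of (collides: bool, illegal_fraction: float) pairs.
    Returns (probability, evade?) for the evasive-maneuver decision."""
    weights = [segment_likelihood(f) for _, f in segments]
    p = sum(w for (hit, _), w in zip(segments, weights) if hit) / sum(weights)
    return p, p > threshold

# Fully legal colliding segment, mostly legal colliding segment, and two
# largely illegal non-colliding segments -> high collision probability.
segs = [(True, 0.0), (True, 0.3), (False, 0.6), (False, 1.0)]
print(collision_assessment(segs))   # (~0.92, True) -> evasive maneuver
```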
Turning to fig. 7, the host vehicle 200 travels along a two-lane one-way road 707 at velocity 200a. The second vehicle 201 enters parking space 703 of parking lot 702 at velocity 201a. The third, fourth, and fifth vehicles 202, 203, and 204 are already parked. Parking space 703 is defined by two side lines (not labeled) and an end line 704. A concrete barrier 705 separates the road 707 from the parking lot 702. The road 707 includes a painted line 706 that is adjacent to the road 707 and extends parallel to it.
The host vehicle 200 resolves the vehicles 201 to 204 via local entity sensors and assigns velocities to the vehicles 201 to 204. The host vehicle 200 determines the second velocity 201a (velocity includes speed and heading, as described above). The host vehicle 200 determines the host velocity 200a. The host vehicle 200 parses the scenario at least in part via the local scenario sensor 102a.
In one case, the host vehicle 200 identifies the painted lines of seven parking spaces via image processing software. The host vehicle 200 compares the identified painted lines with preloaded reference parking space geometries. The host vehicle 200 recognizes that the width between the painted lines (corresponding to the width of a parking space) falls within the predetermined range of widths in the preloaded reference parking space geometries. The host vehicle 200 recognizes that each parking space is defined by three painted lines, and is therefore an open rectangle matching at least some of the preloaded reference parking space geometries. The host vehicle 200 recognizes that the parking spaces are grouped together into a plurality of adjacent parking spaces. Given some or all of these identifications, the host vehicle applies a parking space scenario to parking space 703. The host vehicle 200 confirms the parking space scenario using map information received from an external server, which identifies the area associated with parking lot 702 as a parking lot.
In another case, the host vehicle 200 receives (or has previously received) information identifying the coordinates of the parking lot 702 from an external server. Based on this previously received information, the host vehicle 200 scans the coordinates of the parking lot 702 with the local sensor 102 and confirms that the received information is consistent with the characteristics of the image captured by the local sensor 102.
The concrete barrier 705 is three-dimensional and may be resolved using the local sensors 102. The host vehicle 200 marks the concrete barrier 705 as fixed infrastructure and recognizes that the height of the concrete barrier 705 exceeds the vertical coordinate of the parking lot 702.
The host vehicle 200 performs collision threat assessment between the second vehicle 201 and the host vehicle 200. If the host vehicle 200 continues at the host speed 200a and the second vehicle 201 continues at the second speed 201a, then the host vehicle 200 and the second vehicle 201 will collide at point 701.
However, the host vehicle 200 predicts the future position and future velocity of the second vehicle 201 based on the scenario. More specifically, the host vehicle 200 predicts that the second velocity 201a will decrease based on the end line 704 of parking space 703. The host vehicle 200 predicts that the future position of the second vehicle 201 will be immediately adjacent to (i.e., not crossing) the end line 704. The host vehicle 200 performs a similar analysis for the concrete barrier 705 and the painted line 706. Accordingly, it should be appreciated that when performing collision threat assessment, the host vehicle 200 predicts the future velocity of an entity based on the parsed scenario.
According to some embodiments, the trigger for extending the reference segment 601a between the host vehicle 200 and the second vehicle 201 is based on the absence of any entity (including infrastructure, such as the concrete barrier 705) blocking a collision path between the second vehicle 201 and the host vehicle 200. An entity blocks the collision path when the entity is a solid body having at least a predetermined thickness, intersects the ground plane within a predetermined angular range, and extends to at least a predetermined height above the ground on the side closest to the second vehicle 201. Here, the concrete barrier 705 exceeds the predetermined thickness, intersects the ground at 90 degrees (and thus falls within the predetermined angular range), and extends to the predetermined height above the ground between the end line 704 and the concrete barrier 705. Thus, the host vehicle 200 does not extend the reference segment 601a between the host vehicle 200 and the second vehicle 201.
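
The blocking-entity test above reduces to a simple predicate. The sketch below uses invented threshold values, since the patent leaves the predetermined thickness, angular range, and height unspecified:

```python
MIN_THICKNESS_M = 0.10            # assumed minimum solid thickness
ANGLE_RANGE_DEG = (60.0, 120.0)   # assumed window around 90 degrees
MIN_HEIGHT_M = 0.25               # assumed minimum height above ground

def blocks_collision_path(solid: bool, thickness_m: float,
                          ground_angle_deg: float, height_m: float) -> bool:
    """True when the entity suppresses the reference-segment trigger."""
    lo, hi = ANGLE_RANGE_DEG
    return (solid
            and thickness_m >= MIN_THICKNESS_M
            and lo <= ground_angle_deg <= hi
            and height_m >= MIN_HEIGHT_M)

# A concrete barrier: solid, 15 cm thick, vertical, 30 cm tall -> blocks,
# so no reference segment is extended to the vehicle behind it.
print(blocks_collision_path(True, 0.15, 90.0, 0.30))   # True
```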
Fig. 8-14 relate to a second embodiment of collision threat assessment. These embodiments may include any of the features previously discussed with reference to the first embodiment of collision threat assessment. The operations of fig. 8-14 may be applied to any collision threat assessment (not just cross traffic alerts) as well as any traffic condition (e.g., not just when the host vehicle is reversing into the road).
Referring to figs. 8 to 14, the host vehicle 200 may be configured to issue a cross-traffic (CT) alert, which is a form of collision threat assessment. The CT alert warns the host driver when the host vehicle 200 may be backing into cross traffic. Referring to fig. 8, CT alerts rely on the local sensors 102. The rear local sensor 102b may include CT local sensors 102e and 102f configured to transmit CT signal patterns 801 and 802, respectively. The CT local sensors 102e and 102f may be radar, lidar, ultrasonic, or any other type of sensor previously discussed with respect to the local sensors 102.
The host vehicle 200 (and more specifically, the processor 108) is configured to predict the path of an external entity detected using the CT local sensors, as previously described. The general concept of CT alerts is known in the art.
Referring to fig. 8, an intersection (i.e., collision) of the second vehicle 201 with the host vehicle 200 is predicted based on the position, speed, and/or acceleration of the second vehicle 201 as detected by the CT local sensors 102e, 102f, and based on the position, speed, and/or acceleration of the host vehicle 200 as detected by the other local sensors 102. Accordingly, the host vehicle 200 issues a CT alert. The CT alert may include a warning displayed on the user interface 105 (including flashing a light), automatic application of the brakes, and the like. With continued reference to fig. 8, the host vehicle 200 determines that the third vehicle 202 is stopped, even though the third vehicle 202 is detected. Thus, the host vehicle 200 may ignore the third vehicle 202.
Referring to fig. 9, the host vehicle 200 is configured to apply the virtual map 402 to enhance the accuracy of CT alerts. In fig. 9, the host vehicle 200 is in a traffic lane 802 and is reversing into lane 501c. Fig. 9 is otherwise similar to fig. 5. Under at least some existing CT alert systems, the host vehicle 200 would incorrectly predict the path of the second vehicle 201 as path 901. Path 901 represents the path of the second vehicle 201 if the instantaneous second velocity 201a were maintained into the future. However, the second vehicle 201 is more likely to follow the curvature of lane 501c, making path 902 more likely than path 901. Thus, under at least some existing CT alert systems, the host vehicle 200 would erroneously predict that stopping at its current location may result in a collision and that reversing into lane 501c would avoid a collision. The correct prediction (or at least the most likely accurate prediction) is the opposite: the host vehicle 200 would avoid a collision by immediately stopping in traffic lane 802, while reversing into lane 501c (and thus intersecting path 902) may result in a collision.
The host vehicle 200 is configured to reach the correct result by (a) detecting the current properties of the second vehicle 201 and (b) modifying those current properties according to the scenario. More specifically, and with reference to fig. 10, the host vehicle 200 performs some or all of the following operations: first, detecting the current properties of the second vehicle 201 (including its current lane); second, predicting an ideal path 1001 and/or an actual path 1301 of the second vehicle 201 based on virtual map data corresponding to the current lane of the second vehicle 201.
To find the ideal path 1001, the coordinates of a series of points 1002 can be found. The identification of coordinates is discussed with reference to fig. 11. The paths discussed above and below may represent the location of a fixed point on the second vehicle 201 (e.g., the location of the center of the front bumper of the second vehicle 201).
As shown in fig. 11, the current lane of the second vehicle 201 may be divided into segments 1111. Each segment 1111 may be defined to lie between an outer lane boundary 1112 and an inner lane boundary 1113 defining lane 501c (as identified in the virtual map 402). Each segment 1111 may be defined to have the same (e.g., substantially the same) surface area. Each segment 1111 may be defined to have a surface area less than a predetermined segment surface area.
Because road 502 intersects lane 501c, an additional boundary 1114 may be applied to separate road 502 from lane 501c. The additional boundary 1114 may be a straight line defined between opposite ends 1115 and 1116 of the outer boundary 1112. The additional boundary 1114 may already exist in the virtual map. The additional boundary 1114 may be curved, with curvature interpolated between segments of the outer boundary 1112.
Each segment may be defined between the outer boundary 1112, the inner boundary 1113, and lateral boundaries 1117. Each lateral boundary 1117 may be set such that it intersects both boundaries 1112 and 1113 at angles 1119, 1120 within a predetermined range of 90 degrees (e.g., ±10%). As shown in fig. 11, the lateral boundary 1117b of the first segment 1111a may serve as a lateral boundary of the adjacent segment 1111b. The above process may be iterative, continuing to resize the segments 1111 (and thus reposition the lateral boundaries 1117) until the above conditions are met. Thereafter, the midpoint 1122 of each lateral boundary 1117 is found. The ideal path 1001 is then interpolated from the midpoints 1122 (e.g., the ideal path 1001 may be a best-fit line intersecting each midpoint 1122).
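
A minimal sketch of the ideal-path construction follows: pair up sampled points on the outer and inner lane boundaries (standing in for the lateral boundaries 1117), take each pair's midpoint 1122, and treat the midpoints as the centerline. The equal-area segment sizing and the 90-degree iteration are omitted, and all names are illustrative:

```python
def ideal_path(outer_boundary, inner_boundary):
    """outer_boundary, inner_boundary: equal-length lists of (x, y) samples
    walking the same direction along the lane. Each index pairs two points
    on one lateral boundary; the midpoints approximate the ideal path."""
    assert len(outer_boundary) == len(inner_boundary)
    return [((xo + xi) / 2.0, (yo + yi) / 2.0)
            for (xo, yo), (xi, yi) in zip(outer_boundary, inner_boundary)]

outer = [(0.0, 4.0), (5.0, 4.2), (10.0, 5.0)]
inner = [(0.0, 0.0), (5.0, 0.2), (10.0, 1.0)]
print(ideal_path(outer, inner))   # [(0.0, 2.0), (5.0, 2.2), (10.0, 3.0)]
```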
To account for actual conditions (e.g., the second vehicle 201 deviating from the ideal path 1001), the ideal path 1001 may be widened to form the actual path 1301. To find the actual path 1301, a series of outer and inner actual points 1201, 1202 may be found. An outer actual point 1201 may be defined as lying on a lateral boundary 1117, a predetermined distance outside the boundary 1113 or outside the ideal path 1001. An inner actual point 1202 may be defined as lying on a lateral boundary 1117, within the boundary 1113 or a predetermined distance inside the ideal path 1001.
Returning to fig. 9, the path 902 of the second vehicle 201 may thus be set as the actual path 1301 or the ideal path 1001 (depending on the embodiment applied). The host vehicle 200 may apply any of the operations described above, or any other suitable operations, to predict the host path (and its associated times or ranges). The host vehicle 200 determines whether the host path (not shown) intersects the second vehicle path 902. If there is an intersection, the host vehicle 200 determines whether the intersection would occur simultaneously (e.g., at a single point in time, or within an intersection range of a single point in time). Thus, it should be appreciated that, with reference to fig. 9, if the second vehicle 201 were following lane 501d instead of lane 501c and the host vehicle 200 were predicted to occupy only traffic lane 802 and lane 501c, the host vehicle 200 could predict no collision, since the path of the host vehicle 200 would not intersect the path of the second vehicle 201.
To find a simultaneous intersection, the host vehicle 200 may step through a series of intersection time intervals. At each time interval, it may be determined whether the predicted footprint of the host vehicle 200 intersects (or falls within a predetermined distance of) the predicted footprint of the second vehicle 201. Each predicted footprint is at least the size of the respective vehicle, and may be larger because, as described below, a range of positions that each vehicle may occupy at any given point in time can be predicted.
To account for the timing of the second vehicle 201, each lateral boundary 1117 may be mapped to the center of the front bumper of the second vehicle 201 and paired with a point in time (or range of points in time) based on the current properties (e.g., speed, acceleration) of the second vehicle 201. If ranges are applied, the ranges may widen as distance from the current position of the second vehicle 201 increases.
As one example, the first lateral boundary 1117a may be associated with one second in the future and the second lateral boundary 1117b may be associated with three seconds in the future. As another example, the first lateral boundary 1117a may be associated with one to four seconds in the future (a range of three seconds), the second lateral boundary 1117b may be associated with three to seven seconds in the future (a range of four seconds), and a third lateral boundary (not labeled) may be associated with five to eleven seconds in the future (a range of six seconds). The times associated with the interior of a segment 1111 may be interpolated between its bounding lateral boundaries 1117. In parallel with these operations, the timing of the host vehicle path may be determined using the same or any other suitable technique.
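The pairing of lateral boundaries with future times (and with time ranges that widen with distance) can be sketched as follows, assuming constant acceleration along the path. The per-metre widening factor is an assumed stand-in for the growth of uncertainty with distance.

import numpy as np

def boundary_time_ranges(midpoints, speed, accel=0.0, widen_per_m=0.05):
    # Pair each lateral boundary 1117 (represented by its midpoint 1122)
    # with a (t_min, t_max) range. Nominal arrival times come from
    # constant-acceleration kinematics along the path; the range half-width
    # grows with distance travelled. Assumes the vehicle keeps moving
    # (does not stop) within the prediction horizon.
    midpoints = np.asarray(midpoints, dtype=float)
    dist = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(midpoints, axis=0), axis=1))]
    if abs(accel) < 1e-9:
        t = dist / max(speed, 1e-9)
    else:
        t = (-speed + np.sqrt(speed ** 2 + 2.0 * accel * dist)) / accel
    half = widen_per_m * dist
    return np.c_[t - half, t + half]  # one (t_min, t_max) row per boundary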
Each intersection time interval is mapped to the locations of the second vehicle 201 and the host vehicle 200 that correspond to that interval. For example, if the intersection time interval is 0.1 seconds, each position that the second vehicle may occupy at 0.1 seconds is identified as corresponding. Assume the time range of the first lateral boundary is 0.01 to 0.2 seconds and the time range of the second lateral boundary is 0.1 to 0.4 seconds. Under these assumptions, the entire first segment 1111a corresponds to the 0.1 second intersection time interval, and the footprint of the second vehicle 201 at the 0.1 second interval will therefore include the entire first segment 1111a.
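The worked example above reduces to a simple lookup: a segment 1111 corresponds to a given intersection time whenever that time falls between the earliest time of its first lateral boundary and the latest time of its second. A minimal sketch with hypothetical names:

import numpy as np

def segments_at_time(ranges, t):
    # ranges: (n_boundaries, 2) array of (t_min, t_max) per lateral boundary
    # (see boundary_time_ranges above). Segment i lies between boundaries
    # i and i + 1, so its time span is [ranges[i, 0], ranges[i + 1, 1]].
    r = np.asarray(ranges, dtype=float)
    return [i for i in range(len(r) - 1) if r[i, 0] <= t <= r[i + 1, 1]]

# The example from the text: first boundary 0.01-0.2 s, second 0.1-0.4 s.
print(segments_at_time([[0.01, 0.2], [0.1, 0.4]], 0.1))  # -> [0]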
To account for the body of the second vehicle 201 having a two-dimensional area, each footprint may be lengthened. For example, when the path is mapped to the center of the front bumper of the second vehicle 201, the footprint may be extended toward the current position of the second vehicle 201 to cover the body of the second vehicle 201. Referring to fig. 14, the corresponding range of positions 1401 at the third time interval of 0.3 seconds (0.1 × 3, where 3 is the interval index) may span portions of the first segment 1111a and the second segment 1111b. The body area 1402 is added to the corresponding range of positions 1401 to produce a total footprint 1403 equal to the area of 1401 plus the area of 1402. For the reasons discussed previously, the total footprint grows as the time interval increases. For example, at the 0.1 second interval the total footprint may be 100 square meters; at the 0.2 second interval, 150 square meters; and at the 0.3 second interval, 250 square meters.
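One way to sketch the lengthening is to union the bumper-point position range with a region at the vehicle's current location and take the convex hull, so the swept area between them is covered. This assumes the shapely library is available; the body dimensions below are assumed values, not taken from the patent.

from shapely.geometry import Point, Polygon
from shapely.ops import unary_union

def total_footprint(position_range, current_pos, body_len=4.5, body_w=1.9):
    # position_range: shapely Polygon for the range of positions 1401.
    # current_pos: (x, y) of the second vehicle 201 now. The convex hull
    # of the range plus a crude body disc approximates adding the body
    # area 1402 to produce the total footprint 1403.
    body = Point(current_pos).buffer(max(body_len, body_w) / 2.0)
    return unary_union([position_range, body]).convex_hull

range_1401 = Polygon([(30, 0), (40, 0), (40, 4), (30, 4)])
fp = total_footprint(range_1401, current_pos=(0.0, 2.0))
print(fp.area > range_1401.area)  # True: body extension grows the footprint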
The same or other suitable operations are performed for the host vehicle 200. If the footprints of the host vehicle 200 and the second vehicle 201 overlap (or fall within a predetermined distance of each other) at any given intersection time interval, a simultaneous intersection is found to exist. The advance through the time intervals continues only until a simultaneous intersection is determined to exist: if a simultaneous intersection exists at a given time interval, the computation ends and a CT alert is issued immediately; if not, the calculation proceeds to the next time interval.
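The early-exit scan can be sketched as below (again assuming shapely). The footprint generators are hypothetical callables standing in for the per-interval footprint construction described above; the interval length, horizon, and minimum gap are assumed parameters.

from shapely.geometry import Polygon

def first_simultaneous_intersection(host_fp, target_fp, dt=0.1,
                                    horizon_s=10.0, min_gap=0.0):
    # host_fp(t), target_fp(t): shapely Polygons for the predicted
    # footprints at time t. Returns the first interval time at which the
    # footprints overlap (distance 0) or come within min_gap, else None.
    for k in range(1, int(horizon_s / dt) + 1):
        t = k * dt
        if host_fp(t).distance(target_fp(t)) <= min_gap:
            return t  # simultaneous intersection -> issue the CT alert
    return None

# Hypothetical usage with constant footprints (for illustration only):
square = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
print(first_simultaneous_intersection(lambda t: square, lambda t: square))
# -> 0.1 (overlap found at the first interval, so the scan stops there)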
As described above, the operations discussed with reference to figs. 8-14 may be applied to non-CT alert systems (e.g., other forms of collision threat assessment) and to any of the embodiments previously discussed, including those discussed with reference to figs. 1-7.

Claims (15)

1. A host vehicle, comprising:
a motor, a brake, a sensor, and a processor configured to:
identifying a lane of a target vehicle;
determining a radius of curvature of a lane boundary of the identified lane;
predicting a target path of the target vehicle based on the lane boundaries of the virtual map and the determined radius of curvature of the lane boundaries;
comparing the target path with a predicted main path of the host vehicle;
applying the brake based on the comparison.
2. The host vehicle of claim 1, wherein the processor is configured to construct the virtual map based on (a) the received street map and (b) the measurements received from the sensor.
3. The host vehicle of claim 2, wherein the sensor comprises a camera and the processor is configured to apply the lane boundary to the virtual map based on an image captured by the camera.
4. The host vehicle of claim 1, wherein the processor is configured to:
determining (a) a first radius of curvature of a first lane boundary of the identified lane, (b) a second radius of curvature of a second lane boundary of the identified lane;
calculating an intermediate radius of curvature based on (a) and (b);
the target path is predicted based on the calculated intermediate radius of curvature.
5. The host vehicle of claim 4, wherein the processor is configured to predict a range of positions of the target vehicle at a first future time based on the target path and predict a range of positions of the target vehicle at a second future time.
6. The host vehicle of claim 5, wherein the first future time is a predetermined time interval multiplied by a first number, and the second future time is the predetermined time interval multiplied by the first number plus one.
7. The host vehicle of claim 6, wherein the processor is configured such that a total area of the predicted range of positions of the target vehicle at the second future time exceeds a total area of the predicted range of positions of the target vehicle at the first future time.
8. The host vehicle of claim 7, wherein the processor is configured such that a total area of the predicted range of positions of the target vehicle at the first future time exceeds a total area of the target vehicle.
9. The host vehicle of claim 1, wherein the processor is configured to predict the target path and the host path as shapes each having a two-dimensional surface area.
10. The host vehicle of claim 9, wherein the processor is configured to determine whether the shapes intersect and, based on the determination, calculate a first time interval for the target vehicle to reach an intersection and a second time interval for the host vehicle to reach the intersection.
11. The host vehicle of claim 10, wherein the processor is configured to determine whether any portion of the first time interval overlaps any portion of the second time interval.
12. The host vehicle of claim 11, wherein the processor is configured to apply the brake based on a positive overlap determination.
13. A host vehicle, comprising:
a motor, a steering device, a sensor, and a processor configured to:
identifying a lane of a target vehicle;
determining a radius of curvature of a lane boundary of the identified lane;
predicting a target path of the target vehicle based on the lane boundaries of the virtual map and the determined radius of curvature of the lane boundaries;
comparing the target path with a predicted main path of the host vehicle;
actuating the steering device based on the comparison.
14. The host vehicle of claim 13, wherein the processor is configured to:
predicting the target path and the main path as shapes each having a two-dimensional surface area;
determining whether the shapes intersect and, based on the determination, calculating a first time interval for the target vehicle to reach an intersection and a second time interval for the host vehicle to reach the intersection;
determining whether any portion of the first time interval overlaps any portion of the second time interval;
actuating the steering device based on a positive overlap determination;
constructing the virtual map based on (a) the received street map and (b) the measurements received from the sensor.
15. A host vehicle, comprising:
a motor, a warning light, a sensor, and a processor configured to:
identifying a lane of a target vehicle;
determining a radius of curvature of a lane boundary of the identified lane;
predicting a target path of the target vehicle based on the lane boundaries of the virtual map and the determined radius of curvature of the lane boundaries;
comparing the target path with a predicted main path of the host vehicle;
activating the warning light based on the comparison.
CN201710924399.6A 2016-10-06 2017-09-30 Vehicle capable of environmental scenario analysis Active CN107918758B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662405183P 2016-10-06 2016-10-06
US62/405,183 2016-10-06
US15/418,556 US10377376B2 (en) 2016-10-06 2017-01-27 Vehicle with environmental context analysis
US15/418,556 2017-01-27

Publications (2)

Publication Number Publication Date
CN107918758A CN107918758A (en) 2018-04-17
CN107918758B true CN107918758B (en) 2023-06-02

Family

ID=61829899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710924399.6A Active CN107918758B (en) 2016-10-06 2017-09-30 Vehicle capable of environmental scenario analysis

Country Status (4)

Country Link
US (1) US10377376B2 (en)
CN (1) CN107918758B (en)
MX (1) MX2017012907A (en)
RU (1) RU2017134361A (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112018001182T5 (en) * 2017-03-06 2019-12-24 Panasonic Intellectual Property Management Co., Ltd. Vehicle travel control system for a parking lot and control method of a vehicle travel control system for a parking lot
US11080537B2 (en) * 2017-11-15 2021-08-03 Uatc, Llc Autonomous vehicle lane boundary detection systems and methods
US11724691B2 (en) * 2018-09-15 2023-08-15 Toyota Research Institute, Inc. Systems and methods for estimating the risk associated with a vehicular maneuver
KR20200086764A (en) * 2019-01-09 2020-07-20 현대자동차주식회사 Vehicle and method for controlling thereof
US10719076B1 (en) * 2019-02-25 2020-07-21 Rockwell Collins, Inc. Lead and follower aircraft navigation system
FR3093690B1 (en) * 2019-03-14 2021-02-19 Renault Sas Selection process for a motor vehicle of a traffic lane of a roundabout
US11021148B2 (en) * 2019-03-25 2021-06-01 Zoox, Inc. Pedestrian prediction based on attributes
US11351991B2 (en) * 2019-03-25 2022-06-07 Zoox, Inc. Prediction based on attributes
JP2021043707A (en) * 2019-09-11 2021-03-18 本田技研工業株式会社 Vehicle controller, vehicle control method, and program
CN111062372B (en) * 2020-03-13 2020-07-03 北京三快在线科技有限公司 Method and device for predicting obstacle track
KR20210127267A (en) * 2020-04-13 2021-10-22 현대자동차주식회사 Vehicle and method for controlling thereof
US11823458B2 (en) 2020-06-18 2023-11-21 Embedtek, LLC Object detection and tracking system
US11673548B2 (en) 2020-09-10 2023-06-13 Ford Global Technologies, Llc Vehicle detection and response
US11618444B2 (en) * 2020-10-01 2023-04-04 Argo AI, LLC Methods and systems for autonomous vehicle inference of routes for actors exhibiting unrecognized behavior
US11731661B2 (en) 2020-10-01 2023-08-22 Argo AI, LLC Systems and methods for imminent collision avoidance
US11358598B2 (en) * 2020-10-01 2022-06-14 Argo AI, LLC Methods and systems for performing outlet inference by an autonomous vehicle to determine feasible paths through an intersection
US20220178700A1 (en) * 2020-12-03 2022-06-09 Motional Ad Llc Localization based on surrounding vehicles
CN112729320B (en) * 2020-12-22 2022-05-17 中国第一汽车股份有限公司 Method, device and equipment for constructing obstacle map and storage medium
US11880203B2 (en) * 2021-01-05 2024-01-23 Argo AI, LLC Methods and system for predicting trajectories of uncertain road users by semantic segmentation of drivable area boundaries
CN115311892B (en) * 2022-08-05 2023-11-24 合众新能源汽车股份有限公司 Parking space display method and device and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101497330A (en) * 2008-01-29 2009-08-05 福特全球技术公司 A system for collision course prediction
CN105339228A (en) * 2013-05-09 2016-02-17 罗伯特·博世有限公司 Adaptive cruise control with stationary object recognition

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4407757A1 (en) 1993-03-08 1994-09-15 Mazda Motor Device for detecting obstacles for a vehicle
JPH09142236A (en) 1995-11-17 1997-06-03 Mitsubishi Electric Corp Periphery monitoring method and device for vehicle, and trouble deciding method and device for periphery monitoring device
JP3223239B2 (en) * 1996-11-12 2001-10-29 本田技研工業株式会社 Vehicle control device
JP2001328451A (en) * 2000-05-18 2001-11-27 Denso Corp Travel route estimating device, preceding vehicle recognizing device and recording medium
US6862537B2 (en) 2002-03-21 2005-03-01 Ford Global Technologies Llc Sensor fusion system architecture
US7248151B2 (en) 2005-01-05 2007-07-24 General Motors Corporation Virtual keypad for vehicle entry control
US7729857B2 (en) * 2005-08-18 2010-06-01 Gm Global Technology Operations, Inc. System for and method of detecting a collision and predicting a vehicle path
CN101287634B (en) * 2005-10-13 2012-08-15 日产自动车株式会社 Vehicle driving assist system
US20100198428A1 (en) 2009-01-30 2010-08-05 Delphi Technologies, Inc. Multi-purpose fob system
GB0901906D0 (en) * 2009-02-05 2009-03-11 Trw Ltd Collision warning apparatus
CA2658963A1 (en) 2009-03-23 2010-09-23 Richard Harrington D.c. power generating automotive shock absorber
JP2010249613A (en) 2009-04-14 2010-11-04 Toyota Motor Corp Obstacle recognition device and vehicle control unit
US8315756B2 (en) * 2009-08-24 2012-11-20 Toyota Motor Engineering and Manufacturing N.A. (TEMA) Systems and methods of vehicular path prediction for cooperative driving applications through digital map and dynamic vehicle model fusion
US8340894B2 (en) 2009-10-08 2012-12-25 Honda Motor Co., Ltd. Method of dynamic intersection mapping
US8614622B2 (en) 2010-03-08 2013-12-24 Ford Global Technologies, Llc Method and system for enabling an authorized vehicle driveaway
DE102010032063A1 (en) 2010-06-09 2011-05-12 Daimler Ag Method for determining environment of vehicle, involves recognizing radar data from roadway surface elevated object by radar sensors and recognizing camera data from pavement marking by camera
JP5804676B2 (en) 2010-06-29 2015-11-04 ダイハツ工業株式会社 Driving assistance device
US8810431B2 (en) 2011-10-20 2014-08-19 GM Global Technology Operations LLC Highway merge assistant and control
US9098086B2 (en) * 2012-08-07 2015-08-04 Caterpillar Inc. Method and system for planning a turn path for a machine
US8798809B2 (en) 2012-08-21 2014-08-05 GM Global Technology Operations LLC System for passive entry and passive start using near field communication
US9110772B2 (en) 2012-11-08 2015-08-18 GM Global Technology Operations LLC Mobile device-activated vehicle functions
US20140293753A1 (en) 2013-04-02 2014-10-02 David Pearson Smartphone activated vehicle entry device
SE537621C2 (en) 2013-09-10 2015-08-11 Scania Cv Ab Detection of objects using a 3D camera and a radar
CN103699717A (en) * 2013-12-03 2014-04-02 重庆交通大学 Complex road automobile traveling track predication method based on foresight cross section point selection
JP6180968B2 (en) * 2014-03-10 2017-08-16 日立オートモティブシステムズ株式会社 Vehicle control device
US9248834B1 (en) * 2014-10-02 2016-02-02 Google Inc. Predicting trajectories of objects based on contextual information
CN104821080B (en) * 2015-03-02 2017-04-12 北京理工大学 Intelligent vehicle traveling speed and time predication method based on macro city traffic flow


Also Published As

Publication number Publication date
MX2017012907A (en) 2018-09-27
US10377376B2 (en) 2019-08-13
US20180099663A1 (en) 2018-04-12
CN107918758A (en) 2018-04-17
RU2017134361A (en) 2019-04-03

Similar Documents

Publication Publication Date Title
CN107918758B (en) Vehicle capable of environmental scenario analysis
CN109927719B (en) Auxiliary driving method and system based on obstacle trajectory prediction
US20220397402A1 (en) Systems and methods for determining road safety
US11288521B2 (en) Automated road edge boundary detection
US10507807B2 (en) Systems and methods for causing a vehicle response based on traffic light detection
EP4273837A2 (en) Systems and methods for predicting blind spot incursions
US11313976B2 (en) Host vehicle position estimation device
US20200049511A1 (en) Sensor fusion
US11898855B2 (en) Assistance control system that prioritizes route candidates based on unsuitable sections thereof
US11460851B2 (en) Eccentricity image fusion
CN110060467B (en) Vehicle control device
US11938926B2 (en) Polyline contour representations for autonomous vehicles
US20220027642A1 (en) Full image detection
GB2556427A (en) Vehicle with environmental context analysis
US9616886B2 (en) Size adjustment of forward objects for autonomous vehicles
US20200307692A1 (en) On-road localization methodologies and equipment utilizing road surface characteristics
US20220371583A1 (en) Systems and Methods for Selectively Decelerating a Vehicle
CN114126940A (en) Electronic control device
Valldorf et al. Advanced Microsystems for Automotive Applications 2007
RU2746026C1 (en) Method and system for generating the reference path of a self-driving car (sdc)
WO2019127076A1 (en) Automated driving vehicle control by collision risk map
CN114572243A (en) Target object detection device and vehicle equipped with the same
RU2757037C1 (en) Method and system for identifying the presence of a track on the current terrain
박성렬 Efficient Environment Perception based on Adaptive ROI for Vehicle Safety of Automated Driving Systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant