US10480948B2 - Method and system for estimating a boundary of a road - Google Patents

Method and system for estimating a boundary of a road Download PDF

Info

Publication number
US10480948B2
Authority
US
United States
Prior art keywords
lane marking
road
boundary
geometrical representation
positions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/448,991
Other languages
English (en)
Other versions
US20170261327A1 (en
Inventor
Claes Olsson
Anders DAHLBACK
Martin Anders KARLSSON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volvo Car Corp
Original Assignee
Volvo Car Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volvo Car Corp filed Critical Volvo Car Corp
Assigned to VOLVO CAR CORPORATION reassignment VOLVO CAR CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DAHLBACK, ANDERS, KARLSSON, MARTIN ANDERS, OLSSON, CLAES
Publication of US20170261327A1 publication Critical patent/US20170261327A1/en
Application granted granted Critical
Publication of US10480948B2 publication Critical patent/US10480948B2/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G06K9/00798
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Definitions

  • the present disclosure relates to a boundary estimation system adapted to be on-board a vehicle and a method performed therein, for estimating a boundary of a road on which the vehicle is positioned.
  • motor vehicles may be provided with driver assistance systems. These systems may monitor the surroundings of the vehicle, decide whether an imminent road departure is likely to occur, and may further warn and/or intervene with the steering system and/or the brake system of the vehicle in order to prevent the impending or probable road departure.
  • information about the vehicle's position in relation to a road boundary of said road, as well as the shape—i.e., the geometry—of the road boundary, may be desired to be determined.
  • one or more sensors on-board the vehicle may be utilized, such as one or more cameras.
  • the quality of the road boundary detection and/or estimation may be inherently limited, for instance due to insignificant contrast between the drivable surface and the area outside the drivable surface, and/or due to irregularities of the road boundary. Furthermore, darkness, specific road conditions, etc., could worsen the problem.
  • the effect may be noise and/or uncertainties in the information about the vehicle's position and/or orientation in relation to the road in the estimated geometry of the road boundary. These uncertainties may negatively impact the precision in the assessment of the risk of an imminent road departure. Moreover, the possibility of accurately and comfortably providing automatic control of the vehicle for preventing said vehicle from leaving the drivable road surface may be negatively affected as a consequence of these uncertainties. Accordingly, to provide an approach for preventing a vehicle from running into and/or crossing a road boundary, with high availability and satisfying performance, it is desired to enhance the quality of the road boundary information.
  • EP 2 012 211, for instance, sets out to, among other things, provide an improved surrounding monitoring system, and accordingly discloses a surrounding monitoring system which includes a radar arranged to detect a road boundary. Since the detected road boundary does not form a continuous line but rather a large set of detected objects forming an uneven, interrupted border, EP 2 012 211 suggests forming a continuous line that estimates the road boundary by determining a shape of a shoulder edge thereof, as commonly known in the art. It is further disclosed that a lateral position of the shoulder edge is determined, and subsequently, that an intervention is generated based on a relative lateral position of the vehicle to the shoulder edge.
  • Although EP 2 012 211 suggests utilizing a radar for improved quality of the road boundary information, and thereby supports preventing an impending or probable road departure in an improved manner, there is still room for alternative approaches to obtain enhanced quality of road boundary information.
  • the object is achieved by a method performed by a boundary estimation system on-board a vehicle for estimating a boundary of a road on which the vehicle is positioned.
  • the road comprises at least a first lane marking arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road.
  • the boundary estimation system monitors the surroundings of the vehicle.
  • the boundary estimation system furthermore detects one or more positions of the at least first lane marking, and approximates a geometrical representation of the at least first lane marking based on one or more of the positions of the at least first lane marking.
  • the boundary estimation system detects one or more positions of a road boundary of the road.
  • the boundary estimation system further approximates a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, and subsequently defines a fictive outer boundary of at least a section of the road, based on laterally shifting at least a section of the geometrical representation of the at least first lane marking by the relative lateral offset.
  • an approach is provided which enables approximating, in view of a vehicle, a boundary of a road constraining, e.g., laterally, a road surface considered drivable. That is, since the surroundings of the vehicle are monitored, the environment surrounding the vehicle, such as the road, road markings and/or road edges, is sensed, and information and/or an image thereof may be retrieved. Furthermore, since one or more positions of the at least first lane marking are detected, one or more longitudinal and corresponding lateral positions of one or more lane markings are located.
  • Since a geometrical representation of the at least first lane marking then is approximated based on one or more of the detected positions of the at least first lane marking, a mathematical function describing an approximation of the at least first lane marking is provided, derived from the detected one or more longitudinal and corresponding lateral positions thereof. Moreover, since, furthermore, one or more positions of a road boundary of the road are detected, one or more longitudinal and corresponding lateral positions of a road boundary along the road are located.
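The approximation of a geometrical representation from detected marker positions can be sketched as a polynomial fit. The disclosure does not prescribe a particular model; the function name, the third-order default and the example coordinates below are illustrative assumptions:

```python
import numpy as np

def fit_lane_marking(xs, ys, degree=3):
    """Fit a polynomial y_L(x) through detected lane-marking positions.

    xs -- longitudinal positions of detected lane markers
    ys -- corresponding lateral positions
    Returns polynomial coefficients, highest order first.
    """
    # Never request a higher degree than the data can support.
    return np.polyfit(xs, ys, deg=min(degree, len(xs) - 1))

# Example: markers detected at three longitudinal positions (illustrative values)
xs = np.array([5.0, 20.0, 35.0])
ys = np.array([1.80, 1.95, 2.25])
coeffs = fit_lane_marking(xs, ys)
lane_y = np.poly1d(coeffs)  # the geometrical representation y_L(x)
```

With three detected positions the fit reduces to exact interpolation; with more positions it becomes a least-squares smoothing of detection noise.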
  • Since a fictive outer boundary of at least a section of the road then is defined based on laterally shifting the geometrical representation of the at least first lane marking by the relative lateral offset, there is provided an approximation of an outer boundary of the road which is represented by a replica of the geometrical representation of the at least first lane marking, positioned offset in a lateral direction by, and/or based on, the value of the relative lateral offset.
  • a shape of the geometrical representation of the at least first lane marking is utilized to estimate a—laterally shifted—shape of the road boundary. For that reason, an alternative approach of estimating a road boundary is provided.
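A minimal sketch of this lateral shift: for a polynomial representation y_L(x), adding a constant offset only changes the constant coefficient, so the fictive outer boundary inherits the lane marking's shape exactly. Names and numeric values are assumed for illustration:

```python
import numpy as np

def fictive_outer_boundary(lane_coeffs, lateral_offset):
    """Return a replica of the lane-marking polynomial, shifted laterally.

    Only the constant (last) coefficient changes, so the curvature and
    heading of the representation are preserved.
    """
    shifted = np.array(lane_coeffs, dtype=float)
    shifted[-1] += lateral_offset
    return shifted

lane = np.array([0.001, 0.02, 1.8])   # y_L(x) = 0.001*x**2 + 0.02*x + 1.8
offset = 1.5                          # approximated relative lateral offset (m)
outer = fictive_outer_boundary(lane, offset)
```

Strictly, this is a constant lateral (vertical) offset rather than a true normal-offset parallel curve; for the small road curvatures involved the difference is negligible.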
  • a boundary estimation system on-board a vehicle for estimating a boundary of a road on which the vehicle is positioned
  • an approach is provided which enables approximating, in view of a vehicle, a boundary of a road constraining, e.g., laterally, a road surface considered drivable.
  • the “road” may be any road intended for vehicle driving, and may be of any shape, width and length, and comprise any arbitrary number of lanes, intersections, cross sections etc.
  • the expression “approximating a boundary constraining a road surface considered drivable”, may refer to “approximating a boundary representing a limit beyond which driving is—and/or is deemed—inappropriate and/or not possible”.
  • the expression “estimating” a boundary of a road may refer to “determining”, “approximating”, “calculating” and/or “providing” a boundary of the road.
  • the expression “road on which the vehicle is positioned”, may refer to “road along which the vehicle is driving”.
  • “Vehicle” may refer to any arbitrary vehicle, for instance an engine-propelled vehicle such as a car, truck, lorry, van, bus, tractor, military vehicle, etc.
  • the vehicle may furthermore be represented by a vehicle supporting partially autonomous, semi-autonomous and/or fully autonomous driving.
  • the concept of autonomous driving relates to the vehicle being driven, at least to some extent, without human interaction.
  • the vehicle may autonomously perform some actions, such as keeping a suitable distance to the vehicle ahead, while the driver may perform other actions, such as overtaking another vehicle when appropriate.
  • the “boundary estimation system” may at least partly be comprised in and/or be integrated with the vehicle, and further for instance be distributed between different nodes, such as distributed between one or more ECUs (“Electronic Control Units”). Additionally or alternatively, the boundary estimation system may at least partly be comprised in a mobile device, which mobile device for instance may refer to a multi-functional smart phone, mobile phone, mobile terminal or wireless terminal, portable computer such as a laptop, PDA or tablet computer, tablet such as an iPad, Pocket PC, and/or mobile navigation device. Said optional mobile device, which may be carried on-board the vehicle and/or be attached thereto, may then further be adapted to communicate with the vehicle on which it may be carried and/or mounted.
  • the expression of the mobile device being adapted to “communicate with the vehicle” may refer to the mobile device being configured to be in communication with the vehicle, such that information and/or data may be transferred therebetween. Such communication may for instance be accomplished physically, such as via USB connection, and/or wirelessly, such as via Bluetooth, WiFi, or the like. Said expression may further refer to the mobile device being adapted to be “paired” and/or adapted to “be in connection” with the vehicle. Possibly, in order to be paired, identification of the mobile device may be necessary, and/or authentication of the vehicle occupant, e.g., the driver.
  • the road comprises at least a first lane marking arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road
  • one or more commonly known road markings are distributed along the road.
  • the expression “lane marking” is throughout this disclosure intended to refer to a set of one or more lane markers which together form a straight and/or curved intermittent or continuous line on the ground of the road.
  • a first lane marking may for instance be represented by a plurality of lane markers arranged to form a straight and/or curved intermittent or continuous line on the road surface on the right-hand side of the road.
  • a second lane marking may for instance be represented by a plurality of lane markers arranged to form a straight and/or curved intermittent or continuous line on the road surface between two adjacent lanes.
  • a third lane marking may for instance represent a center line of the road.
  • the lane marking being arranged “on” a road surface may according to an example refer to the lane marking being arranged “essentially on” a road surface, thus including a lane marking being comprised in—and/or even below—the road surface.
  • Such a lane marking may for instance refer to one or more magnets, e.g., ferrite magnets, arranged to form a straight and/or curved intermittent or continuous line; a vehicle equipped with magnetic sensors may then be able to sense said magnets. More common, however, is a visual two-dimensional and/or essentially two-dimensional lane marking, e.g., represented by paint, such as white and/or yellow paint, provided, marked and/or painted on the road surface. Length, width, shape, etc., of a lane marking may vary, as well as a distance to a second, e.g., parallel, lane marking. Similarly, length, width, shape, etc., of a lane marker may vary, as well as distances and relations between lane markers. “Lane marking” may refer to “road marking”, whereas “lane marker” may refer to “road marker”.
  • Since the boundary estimation system monitors the surroundings of the vehicle, the environment surrounding the vehicle, such as the road, road markings and/or road edges, is sensed, and information and/or an image thereof may be retrieved, as commonly known in the art.
  • the expression “monitoring the surroundings of the vehicle” may refer to “monitoring the surroundings in front of, and/or essentially in front of, the vehicle”, and moreover, “monitoring” may refer to “sensing” and/or “observing”. Monitoring the surroundings may be accomplished for instance by means of one or more sensors, such as one or more vision sensors, for instance lasers and/or lidars on-board the vehicle—and/or, according to an alternative example—via computer vision.
  • monitoring the surroundings of the vehicle may comprise monitoring the surroundings of the vehicle by means of one or more cameras on-board the vehicle.
  • the sensor(s), for instance the camera(s), may be arbitrarily arranged throughout the vehicle, for instance in a protected position supporting a substantially clear view of the surroundings, such as behind the windscreen in the vicinity of, or embedded with, a rear-view mirror.
  • the sensor(s) for instance the camera(s)—may monitor the surroundings of the vehicle within a field of view, i.e., within a vision zone.
  • Since one or more positions of the at least first lane marking are detected, one or more longitudinal and corresponding lateral positions of one or more lane markings are located, as commonly known in the art. That is, one or more longitudinal and corresponding lateral positions, for instance three such positions, of the at least first lane marking, are detected within the field of view and/or within an image of the field of view. For instance, for determining the position(s) of the at least first lane marking on the road surface, a commonly known Hough transform may be utilized.
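The Hough transform mentioned above can be sketched in a few lines: each candidate point votes for every line, in normal form rho = x*cos(theta) + y*sin(theta), that could pass through it, and the best-supported (theta, rho) bin wins. The accumulator resolutions and the test line below are assumptions, and a production system would use an optimized implementation from an image-processing library; this numpy-only toy only illustrates the voting scheme:

```python
import numpy as np

def hough_line_peak(points, n_theta=180, n_rho=201, rho_max=50.0):
    """Vote each point into a (theta, rho) accumulator and return the
    dominant line in normal form: rho = x*cos(theta) + y*sin(theta)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        # Each point votes once per theta, at the rho bin it implies.
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        valid = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[valid], idx[valid]] += 1
    t, r = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[t], r / (n_rho - 1) * 2 * rho_max - rho_max

# Candidate marker points along the straight line y = 2 (vehicle coordinates)
pts = [(float(x), 2.0) for x in range(0, 40, 5)]
theta, rho = hough_line_peak(pts)
```

For the horizontal example line, the dominant bin recovers theta near pi/2 and rho near 2, i.e., the line y = 2.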
  • positions (x_n, y_Ln) may be arbitrarily selected as known in the art; merely as an example, positions (x_1, y_L1), (x_4, y_L4), (x_6, y_L6) may for instance be selected.
  • An x-axis, which may refer to a longitudinal direction, may, as commonly known, originate in a current position of the vehicle, and may furthermore extend in a running direction thereof.
  • The y-axis, which may refer to a lateral direction, is naturally perpendicular to the x-axis. Additionally or alternatively, the x-axis may run in a direction of the road and/or the lane markings.
  • “Detecting” may throughout this disclosure refer to “locating”, “determining”, “identifying”, “interpreting” and/or “deriving”.
  • “Positions” may throughout this disclosure refer to “geographical positions” and/or “locations”, and furthermore to “longitudinal positions and corresponding lateral positions”.
  • geometrical “representation” may throughout this disclosure refer to geometrical “estimate, function and/or polynomial”, whereas “geometrical” may refer to “mathematical” and/or “polynomial”.
  • “Approximating” may throughout this disclosure refer to “determining”, “calculating” and/or “estimating”, whereas the expression “based on” one or more of the detected positions may refer to “by calculating based on and/or derived from” one or more of the detected positions.
  • Since the boundary estimation system furthermore detects one or more positions of a road boundary of the road, one or more longitudinal and corresponding lateral positions of a road boundary along the road are located, as commonly known in the art. That is, one or more longitudinal and corresponding lateral positions, for instance three such positions, of the road boundary, are detected within the field of view and/or within an image of the field of view. For instance, for determining the position(s) of the road boundary, a commonly known Hough transform may be utilized.
  • positions (x_n, y_Rn) may be arbitrarily selected as known in the art; merely as an example, positions (x_2, y_R2), (x_3, y_R3), (x_5, y_R5) may for instance be selected.
  • the “road boundary” may be represented by any arbitrary boundary or border constraining, delimiting and/or restricting the drivable surface, commonly in a lateral direction of the road and/or road marking(s).
  • the road boundary may thus be represented by one or more of e.g., a road edge, a transition between e.g., asphalt and grass and/or gravel, a barrier, a delimiter, a line of parked vehicles, a curb, a non-drivable road surface or area, etc.
  • Since the boundary estimation system further approximates a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, there is estimated a difference in a lateral direction between the approximated geometrical representation of the at least first lane marking and the detected road boundary, valid at, derived from and/or based on a lateral difference at at least one longitudinal position. That is, the approximated relative lateral offset is valid at, derived from and/or based on a lateral delta between one or more lateral positions of the geometrical representation of the at least first lane marking and one or more lateral positions of the detected road boundary at the corresponding longitudinal positions. Said at least one longitudinal position may be selected as suitable for the situation and/or conditions at hand.
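At a single longitudinal position, the lateral delta is simply the detected boundary's lateral position minus the lane-marking representation evaluated at the same x. A sketch, with names and values assumed for illustration:

```python
import numpy as np

def lateral_offset_at(lane_coeffs, x, y_boundary):
    """Lateral delta at longitudinal position x between a detected
    road-boundary position (x, y_R) and the lane-marking
    representation y_L(x)."""
    return y_boundary - np.polyval(lane_coeffs, x)

lane = np.array([0.02, 1.8])              # y_L(x) = 0.02*x + 1.8
dy = lateral_offset_at(lane, 15.0, 3.6)   # boundary detected at (15.0, 3.6)
```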
  • “Relative lateral offset” may refer to “single value relative lateral offset” and/or “fixed relative lateral offset”.
  • “Lateral” may in this context be considered in view of a longitudinal extension of the geometrical representation of the at least first lane marking, and additionally or alternatively, in view of the lane marking(s) and/or the detected road boundary. According to another example, “lateral” may be considered in view of a running direction of the vehicle.
  • Since the boundary estimation system furthermore defines a fictive outer boundary of at least a section of the road based on laterally shifting the geometrical representation of the at least first lane marking by the relative lateral offset, there is provided an approximation of an outer boundary of the road which is represented by a replica of the geometrical representation of the at least first lane marking, positioned offset in a lateral direction by, and/or based on, the value of the relative lateral offset.
  • a shape of the geometrical representation of the at least first lane marking is utilized to estimate a—laterally shifted—shape of the road boundary.
  • the geometrical representation of the at least first lane marking and the fictive outer boundary may be represented by parallel curves, offset from each other by the magnitude and/or value of the approximated relative lateral offset.
  • a lane marking is arranged alongside a road boundary, whereby a smoothed shape of the road boundary—i.e., a shape disregarding small irregularities—essentially may coincide with the shape of the lane marking.
  • Detection and estimation of a lane marking is, however, commonly a simpler task than detection and estimation of a road boundary, and may hence be carried out with higher accuracy, i.e., lower noise levels and/or lower uncertainty.
  • Since the fictive outer boundary, in addition to being based on the approximated relative lateral offset, is defined in consideration of the geometrical representation of the at least first lane marking, which thus is likely to have a shape similar to the road boundary, an approach is introduced which ameliorates uncertainties associated with estimating a geometry of the road boundary. Subsequently, the quality of road boundary information may be enhanced. “Defining” a fictive outer boundary may refer to “determining”, “providing”, “estimating”, “approximating” and/or “deriving” a fictive outer boundary, whereas “based on” laterally shifting may refer to “derived from” laterally shifting.
  • “Laterally shifting” may refer to “laterally shifting at least a section of”, “hypothetically laterally shifting”, “moving laterally”, and/or “shifting or moving laterally toward the detected road boundary”.
  • “Lateral” may in this context be considered in view of a longitudinal extension of the geometrical representation of the at least first lane marking. According to an alternative example, “lateral” may be considered in view of a running direction of the vehicle.
  • the expression “fictive outer boundary” may refer to “fictive lateral outer boundary”, “fictive outer boundary representing a road boundary”, and/or “fictive curved and/or straight line representing a road boundary”.
  • the expression “laterally shifting the geometrical representation of the at least first lane marking, the relative lateral offset”, may refer to “laterally shifting the geometrical representation of the at least first lane marking, essentially the relative lateral offset”.
  • the magnitude of the shift may be equal to, or essentially equal to, the relative lateral offset.
  • “defining a fictive outer boundary of at least a section of the road, based on laterally shifting the geometrical representation of the at least first lane marking, the relative lateral offset” may comprise “defining a fictive outer boundary of at least a section of the road, such that the fictive outer boundary is a replica of the geometrical representation of the at least first lane marking, positioned offset in a lateral direction based on the relative lateral offset”.
  • approximating the relative lateral offset may comprise approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, derived from one or more detected lateral positions of the road boundary and respective approximated or detected one or more lateral positions of the geometrical representation of the first lane marking having corresponding longitudinal positions.
  • a relative lateral offset Δy_LR may be valid at, derived from and/or based on one or more offsets at the exemplifying detected longitudinal positions x_2, x_3, x_5.
  • the expression “derived from” may in this context refer to “based on”.
  • the boundary estimation system may furthermore approximate a geometrical representation of the road boundary, based on one or more of the detected positions of the road boundary. Approximating the relative lateral offset then comprises approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary represented by the geometrical representation of the road boundary.
  • Since there is approximated a geometrical representation of the road boundary, such as an initial, rough and/or pre-estimate thereof, a mathematical function describing an approximation of the road boundary is provided, derived from the detected one or more longitudinal and corresponding lateral positions of the road boundary.
  • the geometrical representation of the road boundary may thus be represented by e.g., a curved and/or straight line.
  • Thereby, there is estimated a difference in a lateral direction between the approximated geometrical representation of the at least first lane marking and the approximated geometrical representation of the road boundary, valid at, derived from and/or based on a lateral difference at at least one longitudinal position.
  • Said at least one longitudinal position may be selected as suitable for the situation and/or conditions at hand.
  • the at least one longitudinal position may be positioned within the field of view as well as outside the field of view.
  • the geometrical representation of the at least first lane marking and/or the geometrical representation of the road boundary may be evaluated to derive, for instance, a local offset, i.e., an offset at x_0, which may be accomplished, e.g., by extrapolation outside the polynomial region of the field of view.
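Because both representations are polynomials, they can be evaluated outside the fitted field-of-view region as well; extrapolating both to the same longitudinal position x_0 yields a local offset there. The coefficients below are assumed for illustration:

```python
import numpy as np

# Representations fitted within the field of view (assumed coefficients)
lane = np.poly1d([0.001, 0.02, 1.8])       # y_L(x)
boundary = np.poly1d([0.001, 0.02, 3.3])   # rough representation y_R(x)

x0 = 0.0  # e.g., alongside the vehicle, outside the camera's field of view
local_offset = boundary(x0) - lane(x0)     # extrapolated local offset at x0
```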
  • approximating the relative lateral offset may comprise approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, wherein the relative lateral offset is derived at least partly based on a first offset value, at a first longitudinal position, between a first lateral position of the detected road boundary and a corresponding first lateral position of the geometrical representation of the first lane marking; and further derived at least partly based on at least a second offset value, at at least a second longitudinal position, between at least a second lateral position of the detected road boundary and a corresponding at least second lateral position of the geometrical representation of the first lane marking.
  • Thereby, the approximated relative lateral offset is valid at, derived from and/or based on two or more lateral deltas between two or more lateral positions of the geometrical representation of the at least first lane marking and one or more lateral positions of the detected road boundary at the corresponding longitudinal positions.
  • the relative lateral offset may then e.g., be derived from a mean value of the first and the at least second offset value. Additionally or alternatively, the relative lateral offset may then be derived from weighted values of the first and/or the at least second offset value.
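Combining several per-position offset values into one relative lateral offset can be sketched as a plain mean or a weighted mean; the weighting scheme below, favoring nearby detections, is an assumed example:

```python
import numpy as np

def combine_offsets(offsets, weights=None):
    """Fuse per-position offset values into a single relative lateral
    offset: a plain mean, or a weighted mean if weights are given."""
    offsets = np.asarray(offsets, dtype=float)
    if weights is None:
        return float(offsets.mean())
    weights = np.asarray(weights, dtype=float)
    return float(np.dot(weights, offsets) / weights.sum())

deltas = [1.45, 1.52, 1.60]                        # offsets at x_2, x_3, x_5
plain = combine_offsets(deltas)                    # unweighted mean
near = combine_offsets(deltas, weights=[3, 2, 1])  # trust nearer detections more
```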
  • detecting the one or more positions of the at least first lane marking may comprise detecting one or more positions of the at least first lane marking, at a first and at at least a second time instant. Approximating the geometrical representation of the at least first lane marking then comprises approximating a first geometrical representation of the at least first lane marking based on one or more of the positions of the at least first lane marking derived from the first time instant, and approximating a second geometrical representation of the at least first lane marking based on one or more of the positions of the at least first lane marking derived from the second time instant.
  • detecting the one or more positions of the road boundary then comprises detecting one or more positions of a road boundary, at the first and at the at least second time instant.
  • approximating the relative lateral offset then comprises approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, wherein the relative lateral offset is derived at least partly based on a first relative lateral offset between the first geometrical representation of the at least first lane marking and the detected road boundary derived from the first time instant, and furthermore derived at least partly based on a second relative lateral offset between the second geometrical representation of the at least first lane marking and the detected road boundary derived from the second time instant.
  • time instants may be arbitrarily selected and determined as commonly known in the art, for instance separated in time by approximately 25 ms.
  • the relative lateral offset being derived at least partly based on a first relative lateral offset between the first geometrical representation of the at least first lane marking and the detected road boundary derived from the first time instant, and furthermore derived at least partly based on a second relative lateral offset between the second geometrical representation of the at least first lane marking and the detected road boundary derived from the second time instant
  • the resulting relative lateral offset may be derived from plural offsets derived from plural time instances. Accordingly, yet further enhanced road boundary information quality may be provided, in that data detected at plural time instances may be utilized.
  • the resulting relative lateral offset may be derived from a mean value of the first relative lateral offset derived from the first time instant and the at least second relative lateral offset derived from the at least second time instant. Additionally or alternatively, the relative lateral offset may be derived from weighted values of the first relative lateral offset derived from the first time instant and the at least second relative lateral offset derived from the at least second time instant.
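The mean-value and weighted-value combinations above can be sketched as follows. This is a minimal illustration under assumed names and numbers; the function `fuse_offsets`, the offsets and the weights are not taken from the disclosure:

```python
# Hypothetical sketch: fusing relative lateral offsets derived at plural
# time instants into one resulting relative lateral offset, either as a
# plain mean or as a weighted mean (e.g., trusting the newer instant more).

def fuse_offsets(offsets, weights=None):
    """Combine per-time-instant relative lateral offsets (metres)."""
    if weights is None:
        return sum(offsets) / len(offsets)          # plain mean value
    total = sum(weights)
    return sum(o * w for o, w in zip(offsets, weights)) / total

dy_t1, dy_t2 = 1.2, 1.4                 # offsets from time instants t1, t2
mean_offset = fuse_offsets([dy_t1, dy_t2])              # (1.2 + 1.4) / 2
weighted_offset = fuse_offsets([dy_t1, dy_t2], [1, 3])  # favours t2
```

A weight profile favouring the most recent instant is one plausible choice; the disclosure only requires that mean or weighted values may be used.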
  • the boundary estimation system may further define at least a section of an intervention path based on the fictive outer boundary.
  • an intervention path is provided which—rather than as commonly known being defined based on e.g., a geometrical representation of a lane marking—is derived from the defined fictive outer boundary.
  • intervention path—which commonly may be referred to as an "avoiding path"—is throughout this disclosure intended to refer to an intervention path as commonly known in the art, with the exception that the intervention path is defined based on the introduced fictive outer boundary.
  • the boundary estimation system may further assess a risk of an imminent road departure of the vehicle based on comparison of a relative lateral position of the vehicle to the fictive outer boundary.
  • a risk assessment is performed which—rather than, as commonly known, being based on comparison of a relative lateral position of the vehicle to e.g., a geometrical representation of a lane marking—is based on a relative lateral position of the vehicle to the defined fictive outer boundary. Determining a lateral position of the vehicle relative to the fictive outer boundary may be accomplished as commonly known in the art. Moreover, assessing the risk of an imminent road departure, e.g., estimated to occur within 1 s, is throughout this disclosure intended to refer to commonly known manners of doing so, with the exception that the risk assessment is based on the introduced fictive outer boundary. According to an example, a warning may be initiated to the vehicle driver, based on said assessed risk.
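One way to picture the comparison is a time-to-crossing check: if the lateral margin to the fictive outer boundary would be consumed within, e.g., 1 s at the current lateral velocity, the departure is imminent. The sketch below is a hedged illustration only; `departure_risk` and all values are assumptions, not the claimed method:

```python
# Hypothetical sketch: flag an imminent road departure when the vehicle is
# predicted to reach the fictive outer boundary within a 1 s horizon.

def departure_risk(y_vehicle, y_boundary, v_lateral, horizon_s=1.0):
    """True if the fictive outer boundary would be crossed within horizon_s.

    y_vehicle  -- lateral position of the vehicle (m)
    y_boundary -- lateral position of the fictive outer boundary (m)
    v_lateral  -- lateral velocity towards the boundary (m/s); <= 0 means
                  the vehicle is not drifting towards the boundary
    """
    margin = y_boundary - y_vehicle     # remaining lateral margin (m)
    if v_lateral <= 0.0:
        return False
    return margin / v_lateral < horizon_s

risk_near = departure_risk(y_vehicle=1.0, y_boundary=1.5, v_lateral=0.8)
risk_far = departure_risk(y_vehicle=1.0, y_boundary=3.0, v_lateral=0.8)
```

The first call predicts a crossing in 0.625 s, within the horizon; the second predicts 2.5 s, outside it.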
  • the boundary estimation system may optionally intervene in steering and/or braking of the vehicle based on a proximity of a relative lateral position of the vehicle to the fictive outer boundary.
  • intervention is provided which—rather than, as commonly known, being based on a proximity of a relative lateral position of the vehicle to e.g., a geometrical representation of a lane marking—is based on a proximity of a relative lateral position of the vehicle to the introduced fictive outer boundary. Determining a proximity of a relative lateral position of the vehicle to the fictive outer boundary may be accomplished in any known manner.
  • intervention may be accomplished as commonly known, for instance by means of steering and/or braking.
  • the object is achieved by a boundary estimation system adapted for on-board a vehicle estimating a boundary of a road on which the vehicle is positioned.
  • the road comprises at least a first lane marking arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road.
  • the boundary estimation system comprises a monitoring unit adapted for monitoring the surroundings of the vehicle, a lane marking detecting unit adapted for detecting one or more positions of the at least first lane marking, and a geometrical representation approximating unit adapted for approximating a geometrical representation of the at least first lane marking based on one or more of the detected positions of the at least first lane marking.
  • the boundary estimation system further comprises a road boundary detecting unit adapted for detecting one or more positions of a road boundary of the road, an offset approximating unit adapted for approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, and a fictive boundary defining unit adapted for defining a fictive outer boundary of at least a section of the road, based on laterally shifting at least a section of the geometrical representation of the at least first lane marking by the relative lateral offset.
  • the offset approximating unit may further be adapted for approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, derived from one or more detected lateral positions of the road boundary and respective one or more lateral positions of the geometrical representation of the first lane marking having corresponding longitudinal positions.
  • the geometrical representation approximating unit may further be adapted for approximating a geometrical representation of the road boundary, based on one or more of the detected positions of the road boundary. The offset approximating unit is then further adapted for approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary represented by the geometrical representation of the road boundary.
  • the offset approximating unit may further be adapted for approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, wherein the relative lateral offset is derived at least partly based on a first offset value, at a first longitudinal position, between a first lateral position of the detected road boundary and a corresponding first lateral position of the geometrical representation of the first lane marking; and at least partly based on at least a second offset value, at at least a second longitudinal position, between at least a second lateral position of the detected road boundary and a corresponding at least second lateral position of the geometrical representation of the first lane marking.
  • the lane marking detecting unit may further be adapted for detecting one or more positions of the at least first lane marking, at a first and at at least a second time instant.
  • the geometrical representation unit is then further adapted for approximating a first geometrical representation of the at least first lane marking based on one or more of the positions of the at least first lane marking derived from the first time instant; and approximating a second geometrical representation of the at least first lane marking based on one or more of the positions of the at least first lane marking derived from the second time instant.
  • the road boundary detecting unit is then further adapted for detecting one or more positions of a road boundary, at the first and at the at least second time instant.
  • the offset approximating unit is then further adapted for approximating a relative lateral offset between the geometrical representation of the at least first lane marking and the detected road boundary, wherein the relative lateral offset is derived at least partly based on a first relative lateral offset between the first geometrical representation of the at least first lane marking and the detected road boundary derived from the first time instant; and at least partly based on a second relative lateral offset between the second geometrical representation of the at least first lane marking and the detected road boundary derived from the second time instant.
  • the boundary estimation system may comprise an intervention path defining unit adapted for defining at least a section of an intervention path based on the fictive outer boundary. Additionally or alternatively, optionally, the boundary estimation system may comprise a risk assessment unit adapted for assessing a risk of an imminent road departure of the vehicle based on comparison of a relative lateral position of the vehicle to the fictive outer boundary. Furthermore, additionally or alternatively, the boundary estimation system may optionally comprise an intervening unit adapted for intervening in steering and/or braking of the vehicle based on a proximity of a relative lateral position of the vehicle to the fictive outer boundary.
  • the object is achieved by a vehicle at least partly comprising the monitoring unit, the lane marking detecting unit, the geometrical representation approximating unit, the road boundary detecting unit, the offset approximating unit, the fictive boundary defining unit, the optional intervention path defining unit, the optional risk assessment unit and/or the optional intervening unit discussed above.
  • a computer program product comprising a computer program containing computer program code means arranged to cause a computer or a processor to execute the steps of the boundary estimation system discussed above, stored on a computer-readable medium or a carrier wave.
  • FIG. 1 illustrates a schematic overview of an exemplifying boundary estimation system on-board a vehicle according to embodiments of the disclosure, and surroundings of a road on which the vehicle is positioned;
  • FIGS. 2A-B illustrate schematic overviews according to exemplifying embodiments of the disclosure, which may result from the conditions of FIG. 1 ;
  • FIG. 3 illustrates a schematic overview of a relative lateral offset and a fictive outer boundary according to exemplifying embodiments of the disclosure, which may result from the conditions of FIGS. 2A-B ;
  • FIGS. 4A-D illustrate schematic overviews of varying road surroundings, and of relative lateral offsets and fictive outer boundaries according to exemplifying alternative embodiments of the disclosure;
  • FIG. 5 is a schematic block diagram illustrating an exemplifying boundary estimation system according to embodiments of the disclosure.
  • FIG. 6 is a flowchart depicting an exemplifying method for estimating a boundary of a road according to embodiments of the disclosure.
  • in FIG. 1 there is illustrated a schematic overview of an exemplifying boundary estimation system 1 on-board a vehicle 2 according to embodiments of the disclosure, and surroundings of a road 3 on which the vehicle 2 is positioned.
  • the boundary estimation system 1 , which will be described in greater detail further on, is adapted for on-board the vehicle 2 estimating a boundary of the road 3 on which the vehicle 2 is positioned.
  • the vehicle 2 shown in the exemplifying embodiment comprises the boundary estimation system 1 , and is a passenger car supporting—at least to some extent—autonomous driving.
  • the vehicle 2 is in the shown embodiment driven on the right-hand side of the road 3 , although this is merely exemplifying; according to alternative embodiments, the vehicle 2 may likewise be driven on the left-hand side of the road 3 .
  • the road 3 comprises at least a first lane marking L 1 arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road 3 .
  • the road 3 further comprises an exemplifying second lane marking L 2 similarly arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road 3 .
  • the first lane marking L 1 is in the shown exemplifying embodiment arranged alongside a road boundary R of the road 3 , here represented by a road edge beyond which there is e.g., grass and/or gravel.
  • the second lane marking L 2 is in the shown exemplifying embodiment arranged in parallel to the first lane marking L 1 . Further shown in FIG. 1 is how the boundary estimation system 1 monitors the surroundings of the vehicle 2 , for instance with support from at least a first camera 21 on-board the vehicle 2 .
  • FIG. 2A illustrates a schematic overview according to an exemplifying embodiment of the disclosure, which may result from the conditions of FIG. 1 . Shown is a geometrical representation G 1 of the first lane marking L 1 based on the detected positions (x 1 ,y L1 ), (x 4 ,y L4 ), (x 6 ,y L6 ) of the first lane marking L 1 .
  • offset values, namely an offset value Δ y LR (x 2 ) at the longitudinal position x 2 between the lateral position y R2 of the detected road boundary R and a corresponding lateral position y L (x 2 ) of the geometrical representation G 1 ; an offset value Δ y LR (x 3 ) at the longitudinal position x 3 between the lateral position y R3 of the detected road boundary R and a corresponding lateral position y L (x 3 ) of the geometrical representation G 1 ; and an offset value Δ y LR (x 5 ) at the longitudinal position x 5 between the lateral position y R5 of the detected road boundary R and a corresponding lateral position y L (x 5 ) of the geometrical representation G 1 .
  • FIG. 2B illustrates a schematic overview according to an exemplifying alternative embodiment of the disclosure, which also may result from the conditions of FIG. 1 .
  • offset values, namely the offset value Δ y LR (x 2 ) at the longitudinal position x 2 between the lateral position y R (x 2 ) of the geometrical representation 4 of the detected road boundary R and the corresponding lateral position y L (x 2 ) of the geometrical representation G 1 of the first lane marking L 1 ; an offset value Δ y LR (x 7 ) at a longitudinal position x 7 between the lateral position y R (x 7 ) of the geometrical representation 4 of the detected road boundary R and the corresponding lateral position y L (x 7 ) of the geometrical representation G 1 of the first lane marking L 1 ; and an offset value Δ y LR (x 8 ) at a longitudinal position x 8 between the lateral position y R (x 8 ) of the geometrical representation 4 of the detected road boundary R and the corresponding lateral position y L (x 8 ) of the geometrical representation G 1 of the first lane marking L 1 .
  • FIG. 3 illustrates a schematic overview of a relative lateral offset Δ y LR and a fictive outer boundary 5 according to an exemplifying embodiment of the disclosure, which may result from the conditions of FIGS. 2A and/or 2B .
  • the relative lateral offset Δ y LR , which has a constant magnitude, i.e., a fixed value, along the geometrical representation G 1 of the first lane marking L 1 , will be described in greater detail further on in this description.
  • the fictive outer boundary 5 , which is "parallel" to the geometrical representation G 1 , distanced the relative lateral offset Δ y LR therefrom—i.e., positioned a fixed distance, namely the relative lateral offset Δ y LR , from the geometrical representation G 1 —will in a similar manner be described in greater detail further on in this description. Further shown is an intervention path 6 , which is based on the fictive outer boundary 5 .
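The relation between G 1, the constant offset and the fictive outer boundary 5 can be sketched as a lateral shift of the fitted curve. The polynomial below is a hypothetical stand-in for G 1; its form and coefficients are illustrative assumptions:

```python
# Hypothetical sketch: the fictive outer boundary as the geometrical
# representation G1 shifted laterally by the constant relative offset dy_lr.

def lane_marking_g1(x):
    """Illustrative geometrical representation G1: y_L(x), a polynomial."""
    return 0.001 * x * x + 0.02 * x + 1.0

def fictive_outer_boundary(x, dy_lr):
    """Fictive outer boundary: G1 shifted laterally by dy_lr at every x."""
    return lane_marking_g1(x) + dy_lr

dy_lr = 0.6                      # approximated relative lateral offset (m)
xs = [0.0, 10.0, 20.0]           # longitudinal sample positions (m)
gaps = [fictive_outer_boundary(x, dy_lr) - lane_marking_g1(x) for x in xs]
# gaps stays constant at dy_lr: the boundary is "parallel" to G1 in the
# lateral sense used by the disclosure.
```

An intervention path could then be planned against this shifted curve instead of against the lane marking itself.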
  • FIG. 4A illustrates a schematic overview of alternative road surroundings, and a relative lateral offset Δ y LR ′ and a fictive outer boundary 5 ′ according to an exemplifying alternative embodiment of the disclosure.
  • the road boundary R′ here comprises a gap 7 in the form of a road exit 7 ′, due to which there is a disruption in the first lane marking L 1 ′ and subsequently in the geometrical representation G 1 ′ of the first lane marking L 1 ′. Further shown is thus a geometrical representation G 2 of the second lane marking L 2 , with which the fictive outer boundary 5 ′ instead is parallel, distanced the relative lateral offset Δ y LR ′ therefrom.
  • FIG. 4B illustrates a schematic overview of another alternative road surroundings, and a relative lateral offset Δ y LR ′′ and a fictive outer boundary 5 ′′ according to an exemplifying another alternative embodiment of the disclosure.
  • the fictive outer boundary 5 ′′ is parallel to the geometrical representation G 1 ′′ of the first lane marking L 1 ′′, distanced the relative lateral offset Δ y LR ′′ therefrom.
  • the road boundary R′′ is here represented by a barrier having a gap 7 ′′, which gap 7 ′′ the fictive outer boundary 5 ′′ may be independent of.
  • FIG. 4C illustrates a schematic overview of yet another alternative road surroundings, and a relative lateral offset Δ y LR ′′′ and a fictive outer boundary 5 ′′′ according to an exemplifying yet another alternative embodiment of the disclosure.
  • the fictive outer boundary 5 ′′′ is parallel to the geometrical representation G 2 ′′′ of the second lane marking L 2 ′′′, distanced the relative lateral offset Δ y LR ′′′ therefrom.
  • the exemplifying road boundary R′′′ is here situated on the left-hand side of the vehicle 2 .
  • FIG. 4D illustrates a schematic overview of still another alternative road surroundings, and a relative lateral offset Δ y LR ′′′′ and a fictive outer boundary 5 ′′′′ according to an exemplifying still another alternative embodiment of the disclosure.
  • the fictive outer boundary 5 ′′′′ is parallel to the geometrical representation G 1 ′′′′ of the first lane marking L 1 ′′′′, distanced the relative lateral offset Δ y LR ′′′′ therefrom.
  • the exemplifying road boundary R′′′′ is here represented by a line of parked vehicles, between which there naturally are gaps 7 ′′′′; the fictive outer boundary 5 ′′′′ may, however, be independent of said gaps 7 ′′′′.
  • the boundary estimation system 1 comprises a monitoring unit 101 , a lane marking detecting unit 102 , a geometrical representation approximating unit 103 , a road boundary detecting unit 104 , an offset approximating unit 105 , and a fictive boundary defining unit 106 , all of which will be described in greater detail further on.
  • the boundary estimation system 1 may furthermore comprise an optional intervention path defining unit 107 , an optional risk assessment unit 108 and/or an optional intervening unit 109 , which similarly will be described in greater detail further on in the description.
  • the embodiments herein for enabling the boundary estimation system 1 to estimate, on-board the vehicle 2 , a boundary of the road 3 on which the vehicle 2 is positioned may be implemented through one or more processors, such as a processor 110 , here denoted CPU, together with computer program code for performing the functions and actions of the embodiments herein.
  • Said program code may also be provided as a computer program product, for instance in the form of a data carrier carrying computer program code for performing the embodiments herein when being loaded into the boundary estimation system 1 .
  • One such carrier may be in the form of a CD ROM disc. It is however feasible with other data carriers such as a memory stick.
  • the computer program code may furthermore be provided as pure program code on a server and downloaded to the boundary estimation system 1 .
  • the boundary estimation system 1 may further comprise a memory 111 comprising one or more memory units.
  • the memory 111 may be arranged to be used to store e.g., information, and further to store data, configurations, schedulings, and applications, to perform the methods herein when being executed in the boundary estimation system 1 .
  • one or more of said units 101 , 102 , 103 , 104 , 105 , 106 , 107 , 108 , 109 , and/or the processor 110 and/or the memory 111 may for instance be implemented in one or several arbitrary nodes 112 , and/or in one or more mobile units which may be carried on-board, be mounted to and/or be integrated with the vehicle 2 .
  • a node 112 may be an electronic control unit (ECU) or any suitable generic electronic device throughout the vehicle 2 .
  • the boundary estimation system 1 may be represented by a plug-in solution, such that said boundary estimation system 1 at least partly is implemented on for instance a dongle. In that manner, an aftermarket solution may be provided to any suitable vehicle 2 and/or mobile device.
  • one or more of the units 101 , 102 , 103 , 104 , 105 , 106 , 107 , 108 , 109 described above, and which will be described in more detail later on in this description, may refer to a combination of analog and digital circuits, and/or one or more processors configured with software and/or firmware, e.g., stored in a memory such as the memory 111 , that when executed by the one or more processors such as the processor 110 perform as will be described in more detail later on.
  • processors may be included in a single ASIC (Application-Specific Integrated Circuit), or several processors and various digital hardware may be distributed among several separate components, whether individually packaged or assembled into a SoC (System-on-a-Chip). Further shown in FIG. 5 is the exemplifying optional at least first camera 21 , an optional steering assisting system 22 , and an optional braking assisting system 23 .
  • FIG. 6 is a flowchart depicting an exemplifying method for estimating a boundary of the road 3 according to embodiments of the disclosure.
  • the method is performed by the boundary estimation system 1 on-board the vehicle 2 .
  • the road 3 comprises at least a first lane marking L 1 arranged to form a straight and/or curved intermittent or continuous line on a road surface along the road 3 .
  • the exemplifying method, which may be continuously repeated, comprises the following actions discussed with support from FIGS. 1-5 .
  • the actions may be taken in any suitable order; e.g., Actions 1002 and 1004 may be performed simultaneously and/or in an alternate order.
  • the boundary estimation system 1 monitors the surroundings of the vehicle 2 .
  • the monitoring unit 101 is adapted for monitoring the surroundings of the vehicle 2 .
  • the environment surrounding the vehicle 2 —such as the road 3 , road markings L 1 , L 2 and/or road edges R—is sensed, for instance with support from the one or more cameras 21 .
  • the boundary estimation system 1 detects one or more positions (x n ,y Ln ) of the at least first lane marking L 1 .
  • the lane marking detecting unit 102 is adapted for detecting one or more positions (x n ,y Ln ) of the at least first lane marking L 1 .
  • one or more longitudinal and corresponding lateral positions (x n ,y Ln ) of one or more lane markings L 1 are located, for instance the positions (x 1 ,y L1 ), (x 4 ,y L4 ), (x 6 ,y L6 ).
  • detecting one or more positions (x n ,y Ln ) of the at least first lane marking L 1 may comprise detecting one or more positions (x n ,y Ln ) of the at least first lane marking L 1 , at a first time instant t 1 (not shown) and at at least a second time instant t 2 (not shown).
  • the lane marking detecting unit may be adapted for detecting one or more positions (x n ,y Ln ) of the at least first lane marking L 1 , at a first time instant t 1 and at at least a second time instant t 2 .
  • longitudinal and corresponding lateral positions (x n ,y Ln ) of the at least first lane marking L 1 are then detected at plural time instances t 1 , t 2 .
  • the boundary estimation system 1 approximates the geometrical representation G 1 of the at least first lane marking L 1 based on one or more of the detected positions (x n ,y Ln ) of the at least first lane marking L 1 .
  • the geometrical representation approximating unit 103 is adapted for approximating the geometrical representation G 1 of the at least first lane marking L 1 based on one or more of the detected positions (x n ,y Ln ) of the at least first lane marking L 1 .
  • a mathematical function G 1 describing an approximation of the at least first lane marking L 1 is derived from the detected one or more longitudinal and corresponding lateral positions (x 1 ,y L1 ), (x 4 ,y L4 ), (x 6 ,y L6 ) thereof.
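A minimal sketch of such a derivation, assuming a least-squares straight line as the curve model (the disclosure leaves the form of G 1 open) and illustrative detection values:

```python
# Hypothetical sketch: fit a straight line y = a*x + b through detected
# lane-marking positions (x_n, y_Ln) by ordinary least squares.

def fit_line(points):
    """Least-squares line through (x, y) detections; returns (a, b)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Illustrative detected positions (x1,yL1), (x4,yL4), (x6,yL6) in metres:
detections = [(1.0, 1.02), (4.0, 1.32), (6.0, 1.52)]
a, b = fit_line(detections)

def g1(x):
    """Geometrical representation G1 of the first lane marking."""
    return a * x + b
```

In practice a higher-order polynomial or clothoid model would typically replace the straight line on curved roads.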
  • approximating the geometrical representation G 1 of the at least first lane marking L 1 may comprise: approximating a first geometrical representation G 11 (not shown) of the at least first lane marking L 1 based on one or more of the positions (x n ,y Ln ) of the at least first lane marking L 1 derived from the first time instant t 1 ; and approximating a second geometrical representation G 12 (not shown) of the at least first lane marking L 1 based on one or more of the positions (x n ,y Ln ) of the at least first lane marking L 1 derived from the second time instant t 2 .
  • the geometrical representation approximating unit 103 may be adapted for approximating a first geometrical representation G 11 of the at least first lane marking L 1 based on one or more of the positions (x n ,y Ln ) of the at least first lane marking L 1 derived from the first time instant t 1 ; and further adapted for approximating a second geometrical representation G 12 of the at least first lane marking L 1 based on one or more of the positions (x n ,y Ln ) of the at least first lane marking L 1 derived from the second time instant t 2 .
  • plural geometrical representations G 11 ,G 12 of the at least first lane marking L 1 are then approximated derived from plural time instances t 1 , t 2 .
  • the boundary estimation system 1 detects one or more positions (x n ,y Rn ) of the road boundary R of the road 3 .
  • the road boundary detecting unit 104 is adapted for detecting one or more positions (x n ,y Rn ) of the road boundary R of the road 3 .
  • one or more longitudinal and corresponding lateral positions (x n ,y Rn ) of the road boundary R along the road 3 are located, for instance the positions (x 2 ,y R2 ), (x 3 ,y R3 ), (x 5 ,y R5 ).
  • detecting one or more positions (x n ,y Rn ) of the road boundary R may comprise detecting one or more positions (x n ,y Rn ) of the road boundary R, at the first time instant t 1 and at the at least second time instant t 2 .
  • the road boundary detecting unit 104 may be adapted for detecting one or more positions (x n ,y Rn ) of a road boundary R at the first time instant t 1 and at the at least second time instant t 2 .
  • longitudinal and corresponding lateral positions (x n ,y Rn ) of the road boundary R are then detected at plural time instances t 1 , t 2 .
  • the boundary estimation system 1 may approximate the geometrical representation 4 of the road boundary R, based on one or more of the detected positions (x n ,y Rn ) of the road boundary R.
  • the geometrical representation approximating unit 103 may be adapted for approximating the geometrical representation 4 of the road boundary R, based on one or more of the detected positions (x n ,y Rn ) of the road boundary R.
  • a mathematical function 4 describing an approximation 4 of the road boundary R is provided, derived from the detected one or more longitudinal and corresponding lateral positions (x n ,y Rn ) of the road boundary R, for instance (x 2 ,y R2 ), (x 3 ,y R3 ), (x 5 ,y R5 ).
  • the boundary estimation system 1 approximates the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R.
  • the offset approximating unit 105 is adapted for approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R.
  • a difference Δ y LR in a lateral direction between the approximated geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R valid at, derived from and/or based on a lateral difference Δ y LR (x n ) at at least one longitudinal position x n , such as based on for instance Δ y LR (x 2 ), Δ y LR (x 3 ), Δ y LR (x 5 ), Δ y LR (x 7 ), Δ y LR (x 8 ), and/or Δ y LR (x 0 ) at respective longitudinal positions x 2 , x 3 , x 5 , x 7 , x 8 , and x 0 .
  • approximating the relative lateral offset Δ y LR may comprise approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, derived from one or more detected lateral positions y Rn of the road boundary R and respective one or more lateral positions y L (x n ) of the geometrical representation G 1 of the first lane marking L 1 having corresponding longitudinal positions x n .
  • the offset approximating unit 105 may be adapted for approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, derived from one or more detected lateral positions y Rn of the road boundary R and respective one or more lateral positions y L (x n ) of the geometrical representation G 1 of the first lane marking L 1 having corresponding longitudinal positions x n .
  • a difference Δ y LR in a lateral direction between the approximated geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R valid at, derived from and/or based on a lateral difference Δ y LR (x n ) at at least one longitudinal position x n coinciding with the detected one or more longitudinal road boundary positions x n ; for instance Δ y LR (x 2 ), Δ y LR (x 3 ) and/or Δ y LR (x 5 ) at respective longitudinal positions x 2 , x 3 , x 5 .
  • approximating the relative lateral offset Δ y LR may comprise approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R represented by the geometrical representation 4 of the road boundary R.
  • the offset approximating unit 105 may be adapted for approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R represented by the geometrical representation 4 of the road boundary R.
  • a difference Δ y LR in a lateral direction between the approximated geometrical representation G 1 of the at least first lane marking L 1 and the approximated geometrical representation 4 of the road boundary R valid at, derived from and/or based on a lateral difference Δ y LR (x n ) at at least one longitudinal position x n ; for instance Δ y LR (x 2 ), Δ y LR (x 7 ), Δ y LR (x 8 ) and/or Δ y LR (x 0 ) at respective longitudinal positions x 2 , x 7 , x 8 , x 0 .
  • approximating the relative lateral offset Δ y LR may comprise approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, wherein the relative lateral offset Δ y LR is derived: at least partly based on a first offset value Δ y LR (x n1 ), at a first longitudinal position x n1 , between a first lateral position y Rn1 of the detected road boundary R and a corresponding first lateral position y L (x n1 ) of the geometrical representation G 1 of the first lane marking L 1 ; and at least partly based on at least a second offset value Δ y LR (x n2 ), at at least a second longitudinal position x n2 , between at least a second lateral position y Rn2 of the detected road boundary R and a corresponding at least second lateral position y L (x n2 ) of the geometrical representation G 1 of the first lane marking L 1 .
  • the offset approximating unit 105 may be adapted for approximating the relative lateral offset Δ y LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, wherein the relative lateral offset Δ y LR is derived: at least partly based on a first offset value Δ y LR (x n1 ), at a first longitudinal position x n1 , between a first lateral position y Rn1 of the detected road boundary R and a corresponding first lateral position y L (x n1 ) of the geometrical representation G 1 of the first lane marking L 1 ; and at least partly based on at least a second offset value Δ y LR (x n2 ), at at least a second longitudinal position x n2 , between at least a second lateral position y Rn2 of the detected road boundary R and a corresponding at least second lateral position y L (x n2 ) of the geometrical representation G 1 of the first lane marking L 1 .
  • Δy LR in a lateral direction between the approximated geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R—and/or the approximated geometrical representation 4 of the detected road boundary R—valid at, derived from and/or based on a first offset value Δy LR (x n1 ) e.g., Δy LR (x 2 ) at a first longitudinal position x n1 e.g., x 2 and at least a second offset Δy LR (x n2 ) e.g., Δy LR (x 3 ) at at least a second longitudinal position x n2 e.g., x 3 .
  • the approximated relative lateral offset Δy LR is valid at, derived from and/or based on two or more lateral deltas e.g., Δy LR (x 2 ), Δy LR (x 3 ), Δy LR (x 5 ), Δy LR (x 7 ), Δy LR (x 8 ) and/or Δy LR (x 0 ) between two or more lateral positions e.g., y L (x 2 ), y L (x 3 ), y L (x 8 ) and/or y L (x 0 ) of the geometrical representation G 1 of the at least first lane marking L 1 and one or more lateral positions y R (x 2 ), y R (x 3 ), y R (x 5 ), y R (x 7 ), y R (x 8 ) and/or y R (x 0 ) of the detected road boundary R of the corresponding longitudinal positions x 2 , x 3 , x 5 , x 7 , x 8 and/or x 0 .
  • the relative lateral offset Δy LR may then for instance be derived from a mean value of the first offset value Δy LR (x n1 ) and the at least second offset value Δy LR (x n2 ). Additionally or alternatively, the relative lateral offset Δy LR may then be derived from weighted values of the first offset value Δy LR (x n1 ) and/or the at least second offset value Δy LR (x n2 ).
  • approximating the relative lateral offset Δy LR may comprise approximating the relative lateral offset Δy LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, wherein the relative lateral offset Δy LR is derived: at least partly based on a first relative lateral offset Δy LR (x n1 ) between the first geometrical representation G 11 of the at least first lane marking L 1 and the detected road boundary R derived from the first time instant t 1 ; and at least partly based on a second relative lateral offset Δy LR (x n2 ) between the second geometrical representation G 12 of the at least first lane marking L 1 and the detected road boundary R derived from the second time instant t 2 .
  • the offset approximating unit 105 may be adapted for approximating the relative lateral offset Δy LR between the geometrical representation G 1 of the at least first lane marking L 1 and the detected road boundary R, wherein the relative lateral offset Δy LR is derived: at least partly based on a first relative lateral offset Δy LR (x n1 ) between the first geometrical representation G 11 of the at least first lane marking L 1 and the detected road boundary R derived from the first time instant t 1 ; and at least partly based on a second relative lateral offset Δy LR (x n2 ) between the second geometrical representation G 12 of the at least first lane marking L 1 and the detected road boundary R derived from the second time instant t 2 .
  • the resulting relative lateral offset Δy LR may then be derived from plural offsets Δy LR (x n1 ), Δy LR (x n2 ) derived from plural time instants t 1 , t 2 .
  • the resulting relative lateral offset Δy LR may be derived from a mean value of the first relative lateral offset Δy LR (x n1 ) derived from the first time instant t 1 and the at least second relative lateral offset Δy LR (x n2 ) derived from the at least second time instant t 2 .
  • the relative lateral offset Δy LR may be derived from weighted values of the first relative lateral offset Δy LR (x n1 ) derived from the first time instant t 1 and the at least second relative lateral offset Δy LR (x n2 ) derived from the at least second time instant t 2 .
  • the boundary estimation system 1 defines the fictive outer boundary 5 of at least a section of the road 3 , based on laterally shifting at least a section of the geometrical representation G 1 of the at least first lane marking L 1 by the relative lateral offset Δy LR .
  • the fictive boundary defining unit 106 is adapted for defining the fictive outer boundary 5 of at least a section of the road 3 , based on laterally shifting at least a section of the geometrical representation G 1 of the at least first lane marking L 1 by the relative lateral offset Δy LR .
  • an approximation 5 of an outer boundary of the road 3 which is represented by a replica of the geometrical representation G 1 of the first lane marking L 1 and/or the geometrical representation G 2 of the second lane marking L 2 , positioned offset in a lateral direction by—and/or based on—the value of the relative lateral offset Δy LR .
  • a shape of a geometrical representation(s) G 1 ,G 2 of a lane marking(s) L 1 ,L 2 is utilized to estimate a—laterally shifted—shape of the road boundary R.
  • the fictive outer boundary 5 and the geometrical representation(s) G 1 ,G 2 of the lane marking(s) L 1 ,L 2 may be represented by parallel curved and/or straight lines offset from each other by the fixed value of the approximated relative lateral offset Δy LR .
  • the boundary estimation system 1 may define at least a section of the intervention path 6 based on the fictive outer boundary 5 .
  • the intervention path defining unit 107 may be adapted for defining at least a section of the intervention path 6 based on the fictive outer boundary 5 .
  • an intervention path 6 is provided which—rather than, as commonly known, being defined based on e.g., a geometrical representation G 1 , G 2 of a lane marking L 1 , L 2 —is derived from the defined fictive outer boundary 5 .
  • the boundary estimation system 1 may assess a risk of an imminent road departure of the vehicle 2 based on comparison of a relative lateral position of the vehicle 2 to the fictive outer boundary 5 .
  • the risk assessment unit may be adapted for assessing a risk of an imminent road departure of the vehicle 2 based on comparison of a relative lateral position of the vehicle 2 to the fictive outer boundary 5 .
  • a risk assessment is performed which—rather than, as commonly known, being based on comparison of a relative lateral position of the vehicle 2 to e.g., a geometrical representation G 1 , G 2 of a lane marking L 1 , L 2 —is based on a relative lateral position of the vehicle 2 to the defined fictive outer boundary 5 .
  • the boundary estimation system 1 may intervene in steering and/or braking of the vehicle 2 based on a proximity of a relative lateral position of the vehicle 2 to the fictive outer boundary 5 .
  • the intervening unit 109 may be adapted for intervening in steering and/or braking of the vehicle 2 based on a proximity of a relative lateral position of the vehicle 2 to the fictive outer boundary 5 .
  • intervention is provided which—rather than, as commonly known, being based on a proximity of a relative lateral position of the vehicle 2 to e.g., a geometrical representation G 1 , G 2 of a lane marking L 1 , L 2 —is based on a proximity of a relative lateral position of the vehicle 2 to the introduced fictive outer boundary 5 . Determining a proximity of a relative lateral position of the vehicle 2 to the fictive outer boundary 5 may be accomplished in any known manner. Moreover, intervention may be accomplished as commonly known in the art, for instance with support from the steering assisting system 22 and/or the braking assisting system 23 .
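The offset-averaging and boundary-shifting steps described above can be sketched as follows. This is only an illustrative sketch, not the patented implementation: the lane-marking representation G 1 is assumed here to be a polynomial y L (x) in the longitudinal coordinate x, and the detected road boundary R a set of sampled points (x n , y R (x n )); the function names and the polynomial model are assumptions made for the example.

```python
import numpy as np

def lane_marking_y(g1_coeffs, x):
    """Lateral position y_L(x) of the lane-marking representation G1,
    modelled (illustratively) as a polynomial in the longitudinal coordinate x."""
    return np.polyval(g1_coeffs, x)

def approximate_lateral_offset(g1_coeffs, boundary_x, boundary_y, weights=None):
    """Relative lateral offset dy_LR: the (optionally weighted) mean of
    per-position deltas dy_LR(x_n) = y_R(x_n) - y_L(x_n)."""
    deltas = np.asarray(boundary_y) - lane_marking_y(g1_coeffs, np.asarray(boundary_x))
    return float(np.average(deltas, weights=weights))

def fictive_outer_boundary(g1_coeffs, x, dy_lr):
    """Fictive outer boundary: a replica of G1 shifted laterally by dy_LR,
    so the boundary inherits the shape of the lane marking."""
    return lane_marking_y(g1_coeffs, x) + dy_lr
```

For a straight marking y L (x) = 0.01x + 1 and boundary samples lying a constant 2.5 m further out, the approximated offset is 2.5 and the fictive boundary is the marking shifted laterally by that amount.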
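Similarly, fusing offsets derived at several time instants and the boundary-based departure check might look as below. Again a hedged sketch under stated assumptions: the margin-based threshold and all names are illustrative, not taken from the patent, which leaves the risk-assessment and intervention details to any known manner.

```python
def fuse_offsets(offsets, weights=None):
    """Resulting dy_LR from offsets derived at plural time instants t1, t2, ...:
    a plain mean, or a weighted mean when weights (e.g. favouring fresher
    measurements) are supplied."""
    if weights is None:
        return sum(offsets) / len(offsets)
    return sum(o * w for o, w in zip(offsets, weights)) / sum(weights)

def imminent_departure(vehicle_y, boundary_y, margin):
    """Risk check: compare the vehicle's relative lateral position to the
    fictive outer boundary (rather than to a lane marking) and flag a risk
    when the lateral gap falls below the chosen margin."""
    return abs(boundary_y - vehicle_y) <= margin
```

The same fused offset would then laterally shift the lane-marking representation, and steering or braking intervention would be triggered by the proximity check.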

US15/448,991 2016-03-10 2017-03-03 Method and system for estimating a boundary of a road Active 2037-09-29 US10480948B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP16159536 2016-03-10
EP16159536.8 2016-03-10
EP16159536.8A EP3217374A1 (en) 2016-03-10 2016-03-10 Method and system for estimating a boundary of a road

Publications (2)

Publication Number Publication Date
US20170261327A1 US20170261327A1 (en) 2017-09-14
US10480948B2 true US10480948B2 (en) 2019-11-19

Family

ID=55527357

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/448,991 Active 2037-09-29 US10480948B2 (en) 2016-03-10 2017-03-03 Method and system for estimating a boundary of a road

Country Status (3)

Country Link
US (1) US10480948B2 (zh)
EP (1) EP3217374A1 (zh)
CN (1) CN107176167B (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3570262A4 (en) * 2017-01-10 2019-12-18 Mitsubishi Electric Corporation VEHICLE DETECTION DEVICE AND METHOD DETECTION METHOD
WO2019151110A1 (ja) * 2018-01-31 2019-08-08 パイオニア株式会社 路面情報取得方法
US10967851B2 (en) * 2018-09-24 2021-04-06 Ford Global Technologies, Llc Vehicle system and method for setting variable virtual boundary
JP7183729B2 (ja) * 2018-11-26 2022-12-06 トヨタ自動車株式会社 撮影異常診断装置
EP3716138A1 (en) * 2019-03-29 2020-09-30 Zenuity AB Road boundary determination
CN110174113B (zh) * 2019-04-28 2023-05-16 福瑞泰克智能系统有限公司 一种车辆行驶车道的定位方法、装置及终端
US20220266825A1 (en) * 2019-07-01 2022-08-25 Zenuity Ab Sourced lateral offset for adas or ad features
CN111027423B (zh) * 2019-11-28 2023-10-17 北京百度网讯科技有限公司 一种自动驾驶车道线检测方法、装置和电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005058130A1 (de) * 2005-11-30 2007-06-06 Valeo Schalter Und Sensoren Gmbh Warnsystem für ein Kraftfahrzeug
CN102209658B (zh) * 2008-11-06 2014-01-15 沃尔沃技术公司 用于确定道路数据的方法和系统
CN102156977A (zh) * 2010-12-22 2011-08-17 浙江大学 一种基于视觉的道路检测方法
US9063548B1 (en) * 2012-12-19 2015-06-23 Google Inc. Use of previous detections for lane marker detection
KR20150059489A (ko) * 2013-11-22 2015-06-01 현대자동차주식회사 협로 검출 방법과 협로 검출 장치 및 협로 검출 시스템

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6038496A (en) * 1995-03-07 2000-03-14 Daimlerchrysler Ag Vehicle with optical scanning device for a lateral road area
US6937165B2 (en) * 2002-09-23 2005-08-30 Honeywell International, Inc. Virtual rumble strip
EP1674826A1 (en) 2004-12-24 2006-06-28 Aisin Aw Co., Ltd. Systems, methods, and programs for determining whether a vehicle is on-road or off-road
US20070274566A1 (en) * 2006-05-24 2007-11-29 Nissan Motor Co., Ltd. Pedestrian detector and pedestrian detecting method
EP2012211A1 (en) 2007-07-03 2009-01-07 Ford Global Technologies, LLC A system for monitoring the surroundings of a vehicle
US8411900B2 (en) * 2008-09-17 2013-04-02 Hitachi Automotive Systems, Ltd. Device for detecting/judging road boundary
US20100259617A1 (en) * 2009-04-09 2010-10-14 Nippon Soken, Inc. Boundary line recognition apparatus
US20120288154A1 (en) * 2009-12-28 2012-11-15 Hitachi Automotive Systems, Ltd. Road-Shoulder Detecting Device and Vehicle Using Road-Shoulder Detecting Device
US20140161323A1 (en) * 2010-09-21 2014-06-12 Mobileye Technologies Limited Dense structure from motion
US20180315163A1 (en) * 2010-09-21 2018-11-01 Mobileye Vision Technologies Ltd. Dense structure from motion
US20120069185A1 (en) * 2010-09-21 2012-03-22 Mobileye Technologies Limited Barrier and guardrail detection using a single camera
US20140160244A1 (en) * 2010-09-21 2014-06-12 Mobileye Technologies Limited Monocular cued detection of three-dimensional structures from depth images
US20130274959A1 (en) * 2010-12-15 2013-10-17 Toyota Jidosha Kabushiki Kaisha Driving support apparatus, driving support method, and vehicle
WO2013169153A1 (en) 2012-05-08 2013-11-14 Autoliv Development Ab A lane-marking crossing warning system
US20150165972A1 (en) * 2012-06-19 2015-06-18 Toyota Jidosha Kabushiki Kaisha Roadside object detection apparatus
US9283958B2 (en) * 2012-10-01 2016-03-15 Conti Temic Microelectronic Gmbh Method and device for assisting in returning a vehicle after leaving a roadway
US20140257659A1 (en) * 2013-03-11 2014-09-11 Honda Motor Co., Ltd. Real time risk assessments using risk functions
US20160042236A1 (en) * 2013-04-08 2016-02-11 Denso Corporation Boundary line recognizer device
US20140379166A1 (en) * 2013-06-21 2014-12-25 Fuji Jukogyo Kabushiki Kaisha Vehicle drive assist system
US20150332114A1 (en) * 2014-05-14 2015-11-19 Mobileye Vision Technologies Ltd. Systems and methods for curb detection and pedestrian hazard assessment
US20150363668A1 (en) * 2014-06-13 2015-12-17 Fujitsu Limited Traffic lane boundary line extraction apparatus and method of extracting traffic lane boundary line
US20160012300A1 (en) * 2014-07-11 2016-01-14 Denso Corporation Lane boundary line recognition device
US20170274898A1 (en) * 2014-08-29 2017-09-28 Nissan Motor Co., Ltd. Travel Control Device and Travel Control Method
US20160091325A1 (en) * 2014-09-25 2016-03-31 Nissan North America, Inc. Method and system of assisting a driver of a vehicle
US20160131762A1 (en) * 2014-11-07 2016-05-12 Hyundai Mobis Co., Ltd. Apparatus and method for determining available driving space
US20160152266A1 (en) * 2014-12-02 2016-06-02 Hyundai Mobis Co., Ltd. Apparatus for controlling start-up of lane keeping assistance system and method of controlling the same
US20170008521A1 (en) * 2015-02-10 2017-01-12 Mobileye Vision Technologies Ltd. Autonomous vehicle speed calibration
US20180170429A1 (en) * 2015-06-30 2018-06-21 Denso Corporation Deviation avoidance apparatus
US20180197414A1 (en) * 2015-07-01 2018-07-12 Denso Corporation Intra-lane travel control apparatus and intra-lane travel control method
US20180257647A1 (en) * 2015-09-10 2018-09-13 Continental Automotive Gmbh Automated detection of hazardous drifting vehicles by vehicle sensors
US20180286247A1 (en) * 2015-09-30 2018-10-04 Nissan Motor Co., Ltd. Travel Control Method and Travel Control Apparatus
US20170101094A1 (en) * 2015-10-13 2017-04-13 Magna Electronics Inc. Vehicle vision system with lateral control algorithm for lane keeping
US20180322787A1 (en) * 2015-10-16 2018-11-08 Denso Corporation Display control apparatus and vehicle control apparatus
US20170177951A1 (en) * 2015-12-22 2017-06-22 Omnivision Technologies, Inc. Lane Detection System And Method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
European Office Action Dated Jun. 1, 2018, Application No. 16 159 536.8-1203, Applicant Volvo Car Corporation, 6 Pages.
Extended European Search Report Dated Sep. 15, 2016, Application No. 16159536.8-1803, Applicant Volvo Car Corporation, 5 Pages.

Also Published As

Publication number Publication date
CN107176167A (zh) 2017-09-19
US20170261327A1 (en) 2017-09-14
CN107176167B (zh) 2021-11-12
EP3217374A1 (en) 2017-09-13

Legal Events

Date Code Title Description
AS Assignment

Owner name: VOLVO CAR CORPORATION, SWEDEN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OLSSON, CLAES;DAHLBACK, ANDERS;KARLSSON, MARTIN ANDERS;SIGNING DATES FROM 20170221 TO 20170227;REEL/FRAME:041460/0329

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4