WO2021241834A1 - Apparatus and method for generating virtual lanes based on perception of traffic-flow information for autonomous driving in adverse weather conditions - Google Patents

Apparatus and method for generating virtual lanes based on perception of traffic-flow information for autonomous driving in adverse weather conditions

Info

Publication number
WO2021241834A1
Authority
WO
WIPO (PCT)
Prior art keywords
driving
history
lane
virtual lane
attention
Prior art date
Application number
PCT/KR2021/000335
Other languages
English (en)
Korean (ko)
Inventor
이경수
고영일
이주현
Original Assignee
서울대학교산학협력단
(주)스마트모빌리티랩
Priority date
Filing date
Publication date
Application filed by 서울대학교산학협력단 and (주)스마트모빌리티랩
Publication of WO2021241834A1

Classifications

    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements in case of vehicle accidents or impending vehicle accidents, responsive to imminent contact with an obstacle, e.g. using radar systems
    • B60W30/10 Path keeping
    • B60W30/14 Adaptive cruise control
    • B60W30/143 Speed control
    • B60W30/16 Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • B60W30/18163 Lane change; Overtaking manoeuvres
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/09 Driving style or behaviour
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • B60W2050/0052 Filtering, filters
    • B60W2420/408 Radar; Laser, e.g. lidar
    • B60W2540/30 Driving style
    • B60W2552/50 Barriers
    • B60W2556/10 Historical data
    • B60Y2300/10 Path keeping
    • B60Y2300/143 Speed control
    • B60Y2300/16 Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • B60Y2300/18166 Overtaking, changing lanes

Definitions

  • the present invention relates to a method and apparatus for generating a virtual lane. More particularly, the present invention relates to a method and apparatus for determining a driving trajectory of a surrounding vehicle by utilizing a self-attention mechanism of a convolutional neural network (CNN) and generating a virtual lane therefrom.
  • CNN convolutional neural network
  • Driver assistance systems and intelligent vehicle systems are being developed.
  • In autonomous driving technology, recognition and positioning of the surrounding environment, decision-making and planning with respect to changes in the surrounding environment, and control of vehicle driving are performed accordingly, so that the vehicle can drive autonomously without driver intervention.
  • The states of objects constituting the traffic environment, such as surrounding vehicles, pedestrians, and traffic signals, may be detected through sensors such as radar and cameras.
  • Information on lanes displayed on the road surface may be mainly utilized to plan the behavior of the own vehicle and to predict the behavior of surrounding vehicles.
  • For lane detection, visual information such as a camera image may be the main acquisition means.
  • Alternatively, the painted portion of a lane on the road surface may be distinguished from the asphalt by using the difference in reflectivity for laser scanning through LiDAR or the like.
  • The technical problem to be solved by the present invention is to provide technology for generating virtual lanes around the own vehicle so that behavior planning and driving control for autonomous driving can be performed even when the lanes around the own vehicle are not detected due to a weather environment such as heavy snow or heavy rain.
  • The method of generating a virtual lane for assisting autonomous driving includes: deriving a driving history by accumulating, over time, driving data of at least one surrounding vehicle measured by a complex sensor; dividing the driving history into at least one driving section according to a driving pattern based on self-attention implemented by a convolutional neural network (CNN); deriving a weighted driving history of the at least one surrounding vehicle by assigning a weight to the at least one driving section based on the self-attention; determining a driving trajectory of the at least one surrounding vehicle by performing curve fitting on the weighted driving history through the CNN; and generating the virtual lane on an HD map for assisting the autonomous driving based on the driving trajectory.
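The claimed steps can be sketched as a minimal pipeline. Everything below is an illustrative assumption rather than the patent's implementation: the function name and array shapes are invented, and equal per-sample weights stand in for the CNN's self-attention output.

```python
import numpy as np

def generate_virtual_lane(driving_data_per_step):
    """Illustrative sketch of the claimed pipeline.

    driving_data_per_step: list of (x, y) relative positions of one
    surrounding vehicle, one sample per sensor time step.
    """
    # Step 1: derive a driving history by accumulating driving data over time.
    history = np.asarray(driving_data_per_step, dtype=float)

    # Steps 2-3: divide the history into sections and weight them. A real
    # system would use CNN self-attention; here every sample gets equal weight.
    weights = np.ones(len(history)) / len(history)

    # Step 4: determine the driving trajectory by weighted curve fitting,
    # y = c2*x^2 + c1*x + c0, on the plane corresponding to the road surface.
    coeffs = np.polyfit(history[:, 0], history[:, 1], deg=2, w=weights)

    # Step 5: the virtual lane would be generated on the HD map from this
    # trajectory; here we simply return the fitted polynomial.
    return np.poly1d(coeffs)

trajectory = generate_virtual_lane([(0, 0.1), (1, 0.9), (2, 2.1), (3, 2.9)])
```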
  • An apparatus for generating a virtual lane for assisting autonomous driving includes: a memory storing at least one program; and a processor for generating the virtual lane by executing the at least one program. The processor is configured to derive a driving history by accumulating, over time, driving data of at least one surrounding vehicle measured by a complex sensor; to divide the driving history into at least one driving section according to a driving pattern based on self-attention implemented by a convolutional neural network (CNN); to derive a weighted driving history of the at least one surrounding vehicle by assigning a weight to the at least one driving section based on the self-attention; to determine a driving trajectory of the at least one surrounding vehicle by performing curve fitting on the weighted driving history through the CNN; and to generate the virtual lane on an HD map for assisting the autonomous driving based on the driving trajectory.
  • A virtual lane can be created by referring to the driving flow of surrounding vehicles, so that the autonomous driving process based on the virtual lane can be performed smoothly.
  • The driving patterns of surrounding vehicles are classified according to the self-attention mechanism, and behaviors that obscure lane generation, such as lane changes, can be given low weight, so the accuracy of the generated virtual lane can be improved.
  • FIG. 1 is a block diagram illustrating elements constituting an apparatus for generating a virtual lane according to some exemplary embodiments.
  • FIG. 2 is a diagram for explaining how an autonomous driving system including an apparatus for generating a virtual lane according to some exemplary embodiments operates.
  • FIG. 3 is a diagram for describing a detailed process of generating a virtual lane according to some exemplary embodiments.
  • FIG. 4 is a flowchart illustrating steps of configuring a method for generating a virtual lane according to some embodiments.
  • FIG. 1 is a block diagram illustrating elements constituting an apparatus for generating a virtual lane according to some exemplary embodiments.
  • an apparatus 100 for generating a virtual lane may include a memory 110 and a processor 120 .
  • the present invention is not limited thereto, and other general-purpose components other than the components shown in FIG. 1 may be further included in the device 100 .
  • the apparatus 100 may be a computing device mounted on a vehicle so as to create a virtual lane.
  • The device 100 may include the memory 110 as a means for storing various data, instructions, and at least one program or software, and the processor 120 as a means for processing various data by executing the instructions or the at least one program.
  • the device 100 may refer to an electric component mounted in a vehicle in the form of a smart mobility device.
  • the device 100 is not limited to a type mounted on a vehicle, and the device 100 may be implemented as a mobile device that wirelessly exchanges data with the vehicle.
  • the memory 110 may store various commands related to generation of a virtual lane in the form of at least one program.
  • the memory 110 may store instructions constituting software such as a computer program or mobile application.
  • the memory 110 may store various data necessary for the execution of at least one program.
  • The memory 110 may be implemented as a non-volatile memory such as a read-only memory (ROM), a programmable ROM (PROM), an electrically programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a flash memory, a phase-change RAM (PRAM), a magnetic RAM (MRAM), a resistive RAM (RRAM), or a ferroelectric RAM (FRAM), or may be implemented as a volatile memory such as a dynamic RAM (DRAM), a static RAM (SRAM), a synchronous DRAM (SDRAM), a phase-change RAM (PRAM), a resistive RAM (RRAM), or a ferroelectric RAM (FeRAM).
  • In addition, the memory 110 may be implemented as a hard disk drive (HDD), a solid state drive (SSD), or the like.
  • the processor 120 may create a virtual lane by executing at least one program stored in the memory 110 .
  • the processor 120 may perform a series of processing processes for implementing the creation of the virtual lane.
  • the processor 120 may perform general functions for controlling the apparatus 100 , and may process various operations inside the apparatus 100 .
  • the processor 120 may be implemented as an array of a plurality of logic gates or a general-purpose microprocessor.
  • The processor 120 may be configured as a single processor or as a plurality of processors.
  • the processor 120 may be integrally configured with the memory 110 instead of being separate from the memory 110 for storing at least one program.
  • The processor 120 may be at least one of a central processing unit (CPU), a graphics processing unit (GPU), and an application processor (AP) provided in the device 100, but this is only an example, and the processor 120 may be implemented in various other forms.
  • the device 100 for generating a virtual lane may be for assisting autonomous driving.
  • the processor 120 of the device 100 may create a virtual lane by performing a series of processing steps.
  • the processor 120 may derive a driving history by accumulating driving data of at least one surrounding vehicle measured by the complex sensor over time.
  • the processor 120 may acquire driving data such as positions and speeds of surrounding vehicles based on data measured every time unit by the complex sensor.
  • The unit of time at which data is measured by the complex sensor may be 100 ms, or may be set to another suitable value.
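Accumulating per-time-unit measurements into a driving history can be sketched as follows. The 100 ms period comes from the text; the buffer length and the record fields are assumptions for illustration.

```python
from collections import deque

TIME_STEP_S = 0.1     # 100 ms sensor period, as stated in the text
HISTORY_LEN = 50      # keep 5 s of history (assumed window length)

# Driving history of one surrounding vehicle; old samples fall off the back.
history = deque(maxlen=HISTORY_LEN)

def on_sensor_measurement(rel_x, rel_y, speed):
    """Accumulate one per-time-unit measurement into the driving history."""
    history.append({"rel_x": rel_x, "rel_y": rel_y, "speed": speed})

# Simulate 6 s of data; the oldest 1 s is discarded by the bounded deque.
for k in range(60):
    on_sensor_measurement(rel_x=0.0, rel_y=20.0 + k * 2.0, speed=20.0)
```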
  • the complex sensor may detect data on the surrounding environment of the vehicle by means of a camera, radar, LiDAR, and GPS.
  • complex sensors can utilize HD maps to access information such as roads, terrain, and buildings.
  • Driving data of a surrounding vehicle measured by the complex sensor may be expressed as physical quantities such as a two-dimensional relative position of the surrounding vehicle with respect to the own vehicle, a longitudinal speed, and a lateral speed.
  • the driving data of the surrounding vehicle measured by the complex sensor may be generated for each time unit, and the processor 120 may derive the driving history of the surrounding vehicle by accumulating the driving data generated for each time unit.
  • driving histories for each of the surrounding vehicles may be derived.
  • the processor 120 may derive the driving history by generating driving data by applying Kalman filtering for data purification to data measured by the complex sensor.
  • The purpose of Kalman filtering is to derive the driving history from more refined data.
  • For example, when driving data is expressed as two-dimensional relative positions and relative speeds of surrounding vehicles, the driving data of the surrounding vehicles may be estimated and corrected through a model that assumes constant velocity.
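A constant-velocity Kalman filter of the kind described can be illustrated over one coordinate of a surrounding vehicle's relative position. The noise covariances and measurement sequence below are assumed values, not taken from the patent.

```python
import numpy as np

# State: [position, velocity]; only position is measured.
dt = 0.1                                   # 100 ms sensor period
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
H = np.array([[1.0, 0.0]])                 # measurement model
Q = 0.01 * np.eye(2)                       # process noise (assumed)
R = np.array([[0.5]])                      # measurement noise (assumed)

x = np.array([[0.0], [0.0]])               # initial state estimate
P = np.eye(2)                              # initial covariance

def kalman_step(x, P, z):
    """One predict/update cycle refining a noisy position measurement z."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Noisy position measurements of a vehicle moving away at constant speed.
for z in [0.2, 1.9, 4.1, 6.0, 8.1, 9.9]:
    x, P = kalman_step(x, P, np.array([[z]]))

# x[0] is the refined position, x[1] the estimated velocity.
```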
  • the processor 120 may divide the driving history into at least one driving section according to a driving pattern based on self-attention implemented by a convolutional neural network (CNN).
  • the memory 110 may store a CNN including a self-attention layer.
  • the memory 110 may store the learned values of parameters constituting the CNN to store the learned CNN in a completed state.
  • the processor 120 may divide the driving history of the surrounding vehicle into at least one driving section according to the driving pattern by utilizing the CNN stored in the memory 110 .
  • CNN is one of deep neural networks including a convolutional layer, and can be variously used for machine learning that classifies patterns of inputs.
  • a CNN may include a self-attention layer to classify patterns in driving data of surrounding vehicles measured by complex sensors. That is, the processor 120 may divide the driving history into at least one driving section by classifying driving data by patterns based on CNN.
  • the CNN stored in the memory 110 and utilized by the processor 120 may include a self-attention layer to utilize the self-attention mechanism.
  • Through self-attention, data sections in successive time-series data that are highly correlated with one another and form a pattern may be classified, and the classified data sections may be expressed as weights.
  • the self-attention layer may calculate an attention score in order to divide the driving history into at least one driving section according to a driving pattern.
  • the attention score may be calculated in various ways.
  • the self-attention of CNN may calculate an attention score by using a softmax function.
  • the softmax function is a function for estimating the probability of each component of a vector having an arbitrary dimension, and may refer to one of the main activation functions used in the field of machine learning.
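The softmax function referred to here can be written in a few lines; the example logits for three driving sections are illustrative.

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax: maps arbitrary scores to probabilities."""
    shifted = scores - np.max(scores)  # subtract max to avoid overflow in exp
    e = np.exp(shifted)
    return e / e.sum()

# Raw attention logits for three driving sections: the higher the logit,
# the more that section contributes to the attention distribution.
logits = np.array([2.0, 0.5, -1.0])
attention = softmax(logits)
```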
  • the processor 120 may derive a weighted driving history of at least one surrounding vehicle by assigning a weight to at least one driving section based on self-attention.
  • A weight may be assigned to the at least one driving section by the self-attention of the CNN, and the processor 120 may derive the weighted driving history by taking the corresponding weights into account, that is, by considering a driving section with a high weight more strongly than a driving section with a low weight. In other words, the processor 120 may derive the weighted driving history by excluding unnecessary sections from the driving history.
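Applying section weights so that low-weight samples barely contribute can be sketched as below. The history values and the weights (which in the patent would come from self-attention, not be hand-set) are assumptions.

```python
import numpy as np

# Driving history of one surrounding vehicle: lateral offset per time step.
history = np.array([0.1, 0.1, 0.2, 1.5, 2.8, 3.5, 3.5, 3.6])

# Section weights (assumed): samples 3-5 form a lane-change section and get
# low weight; the remaining samples are lane-following and keep full weight.
weights = np.array([1.0, 1.0, 1.0, 0.1, 0.1, 0.1, 1.0, 1.0])

# In the weighted history, the transition samples are suppressed whenever a
# later stage computes statistics or fits a trajectory.
weighted_mean = np.average(history, weights=weights)
plain_mean = history.mean()
```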
  • the processor 120 may determine the driving trajectory of at least one surrounding vehicle by performing curve fitting on the weighted driving history through CNN.
  • Curve fitting is a process of determining a trajectory that matches the driving data constituting the weighted driving history. For example, it may refer to a process of deriving a curve or a straight line fitted to the weighted driving history on a two-dimensional plane corresponding to the road surface.
  • the CNN may be trained to receive driving data of surrounding vehicles and determine a driving trajectory.
  • the processor 120 may determine the driving trajectory by performing curve fitting by inferring coefficients of a polynomial function for representing a trajectory fitted to the weighted driving history. That is, the driving trajectory may be determined in the form of a polynomial function on a two-dimensional plane corresponding to the road surface. For example, the processor 120 may determine the coefficients of the quadratic function or the linear function by setting the driving trajectory as a curve of the quadratic function or a straight line of the linear function so that the driving trajectory is fitted to the weighted driving history.
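A weighted quadratic fit of the kind described (inferring the coefficients of a polynomial on the road plane) can be sketched with `numpy.polyfit`; the sample positions and the unit weights are assumptions standing in for real sensor data and attention weights.

```python
import numpy as np

# Longitudinal (x) and lateral (y) positions from a weighted driving
# history, on the two-dimensional plane corresponding to the road surface.
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
y = np.array([0.0, 0.3, 0.9, 2.1, 3.6, 5.8])
w = np.ones_like(x)          # per-sample weights (would come from attention)

# Fit y = c2*x^2 + c1*x + c0; the coefficients define the driving trajectory.
c2, c1, c0 = np.polyfit(x, y, deg=2, w=w)
trajectory = np.poly1d([c2, c1, c0])
```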
  • the processor 120 may generate a virtual lane on the HD map for assisting autonomous driving based on the driving trajectory.
  • the HD map may display objects, such as surrounding vehicles or pedestrians, which are the basis for determination in the planning and determination stage of autonomous driving.
  • the processor 120 may generate virtual lanes based on driving trajectories of surrounding vehicles on the HD map, for example, assuming that each driving trajectory is formed between the virtual lanes.
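One simple way to realize "each driving trajectory is formed between the virtual lanes" is to treat the fitted trajectory as the lane center and offset it laterally by half a lane width. The 3.5 m lane width and the offset scheme are assumptions for illustration.

```python
import numpy as np

LANE_WIDTH_M = 3.5  # typical highway lane width (assumption)

def virtual_lane_boundaries(trajectory, xs):
    """Generate left/right virtual lane boundaries from a driving trajectory.

    Assumes the fitted trajectory y(x) runs along the lane center, so each
    boundary is the trajectory offset laterally by half a lane width.
    """
    ys = trajectory(xs)
    left = ys + LANE_WIDTH_M / 2.0
    right = ys - LANE_WIDTH_M / 2.0
    return left, right

center = np.poly1d([0.002, 0.05, 0.0])  # example fitted driving trajectory
xs = np.linspace(0.0, 30.0, 4)
left, right = virtual_lane_boundaries(center, xs)
```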
  • The memory 110 may store the CNN, which includes a self-attention layer and can be trained to divide the driving history into at least one driving section, to derive a weighted driving history through weighting, and to determine a driving trajectory by performing curve fitting. Since the processor 120 can generate a virtual lane based on this CNN, a virtual lane can be created from the driving data measured by the complex sensor alone, even in bad weather conditions, that is, even when the actual lane on the road surface is difficult to identify from visual information such as a camera image. Accordingly, robustness of autonomous driving to the weather environment and road surface conditions can be secured.
  • FIG. 2 is a diagram for explaining how an autonomous driving system including an apparatus for generating a virtual lane according to some exemplary embodiments operates.
  • the autonomous driving system 200 may be configured with various modules.
  • the autonomous driving system 200 may include a complex sensor 210 , a localization module 220 , a virtual lane creation module 230 , a vehicle control module 240 , and a vehicle driving module 250 .
  • the apparatus 100 described with reference to FIG. 1 may be for implementing the function of the virtual lane generating module 230 .
  • The memory 110 and the processor 120 of the device 100 may generate a virtual lane from the inputs provided by the complex sensor 210 and the localization module 220, and information on the virtual lane may be provided to the vehicle control module 240, which requires it to plan and control the behavior of the vehicle.
  • the complex sensor 210 may detect data on the surrounding environment of the vehicle using a camera, radar, LiDAR, GPS, or the like.
  • complex sensors can utilize HD maps to access information such as roads, terrain, and buildings.
  • The complex sensor 210 may include a LiDAR that detects surrounding objects by utilizing the reflection of laser light. Through the LiDAR, even when it is difficult to visually identify the actual lane on the road surface due to bad weather or road conditions, driving data of surrounding vehicles, which serves as the basis for generating a virtual lane, may be obtained and provided to the virtual lane generating module 230.
  • The localization module 220 may match the location of the own vehicle on the HD map. In addition, when the relative position of a surrounding vehicle is detected by the complex sensor 210, the approximate absolute positions of the surrounding vehicle and the own vehicle may be matched on the HD map. Meanwhile, the approximate absolute position of the own vehicle and the lane information on the HD map provided by the localization module 220 may be utilized in the process of determining the driving trajectory of the surrounding vehicle.
  • the virtual lane generating module 230 may include a driving trajectory determining module 231 , a virtual lane generating module 232 , and a target selection module 233 .
  • the driving trajectory determining module 231 and the virtual lane generating module 232 may be understood to perform respective processing steps for generating a virtual lane performed in the device 100 .
  • The target selection module 233 may select a target vehicle that serves as the basis for speed control and inter-vehicle distance control of the own vehicle based on the virtual lane. That is, the autonomous driving system 200 may control the speed of the own vehicle so that an appropriate inter-vehicle distance to the target vehicle is maintained, selecting the target vehicle used for speed control and inter-vehicle distance control according to autonomous driving based on the virtual lane.
  • the vehicle control module 240 may perform steering control, speed control, and inter-vehicle distance control of the own vehicle.
  • For example, the vehicle control module 240 may control a variable such as the yaw rate, which determines the steering of the own vehicle, that is, the lateral behavior of the vehicle, based on the virtual lane provided from the virtual lane generating module 230, and may control the speed of the own vehicle and the inter-vehicle distance to the target vehicle based on the target vehicle provided from the virtual lane generating module 230.
  • the vehicle driving module 250 may drive the own vehicle according to a control command determined by the vehicle control module 240 , for example, according to a steering command or an acceleration/deceleration command.
  • FIG. 3 is a diagram for describing a detailed process of generating a virtual lane according to some exemplary embodiments.
  • a first example 310 and a second example 320 of the creation of a virtual lane are shown.
  • the first example 310 illustrates a case in which the CNN and self-attention of the present invention are not utilized, and the second example 320 illustrates a case in which the CNN and self-attention of the present invention are utilized.
  • the first example 310 may represent a case in which the surrounding vehicle 312 and the surrounding vehicle 313 drive together with the own vehicle 311 , and the second example 320 may likewise represent a case in which the own vehicle 321 , the surrounding vehicle 322 , and the surrounding vehicle 323 drive together.
  • the surrounding vehicle 312 may drive while following its lane without changing lanes, whereas the surrounding vehicle 313 may change its driving lane, passing from the state before the change 313A through the lane change 313B to the state after the change 313C.
  • since the autonomous driving system of the own vehicle 311 cannot interpret the behavior of the surrounding vehicle 313 as a lane change, the virtual lane 314 may be generated according to the driving trajectory of the surrounding vehicle 313 , and the accuracy of virtual lane generation may be deteriorated.
  • in contrast, the device 100 divides the driving history of a surrounding vehicle into at least one driving section based on CNN and self-attention, and derives a weighted driving history by assigning a weight to each driving section, so that the influence of a lane change by the surrounding vehicle can be taken into account.
  • the processor 120 may divide the driving history into at least one driving section by dividing the driving history into a lane following section and a lane change section according to the driving pattern. For example, in the second example 320 , based on the CNN and self-attention, the processor 120 may classify the driving data corresponding to before the change 323A and after the change 323C of the surrounding vehicle 323 as lane following sections, and the driving data corresponding to the lane change 323B as a lane change section.
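In the disclosure the division into lane following and lane change sections is performed by the CNN with self-attention. Purely to illustrate the expected input and output of that step, a toy threshold heuristic on lateral speed could look as follows; the function name, threshold, and label strings are illustrative assumptions, not the claimed classifier:

```python
def label_sections(lateral_pos_m, dt_s=0.1, thresh_mps=0.3):
    """Toy stand-in for the CNN + self-attention section classifier:
    tag each driving sample as 'follow' or 'change' from its lateral
    speed. Threshold and labels are illustrative assumptions."""
    labels = ["follow"]  # the first sample has no velocity estimate
    for prev, cur in zip(lateral_pos_m, lateral_pos_m[1:]):
        lat_speed = abs(cur - prev) / dt_s
        labels.append("change" if lat_speed > thresh_mps else "follow")
    return labels
```

Samples whose lateral position drifts quickly (the lane change 323B in the example) receive the "change" label, while samples before and after the maneuver receive "follow".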
  • the processor 120 may derive the weighted driving history by setting the weight of the lane following section to be higher than the weight of the lane change section. For example, in the second example 320 , the processor 120 may assign a high weight to the lane following sections before the change 323A and after the change 323C, and a low weight to the lane change section of the lane change 323B.
  • accordingly, the virtual lane 324 can be generated to more accurately match the lane on the actual road surface that cannot be identified from visual information such as a camera image, so that the accuracy of virtual lane generation can be prevented from being deteriorated.
  • FIG. 4 is a flowchart illustrating steps of configuring a method for generating a virtual lane according to some embodiments.
  • the method of generating a virtual lane may include steps 410 to 450 .
  • the present invention is not limited thereto, and general steps other than the steps shown in FIG. 4 may be further included in the method of generating a virtual lane.
  • the method of generating a virtual lane for assisting autonomous driving of FIG. 4 may include steps processed in time series by the apparatus 100 for generating a virtual lane described with reference to FIGS. 1 to 3 . Accordingly, even if the contents of the method of FIG. 4 are omitted below, the contents described above for the apparatus 100 may be equally applied to the method of FIG. 4 .
  • the device 100 may derive a driving history by accumulating driving data of at least one surrounding vehicle measured by the complex sensor over time.
  • the apparatus 100 may derive the driving history by generating driving data by applying Kalman filtering for data purification to data measured by the complex sensor.
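As a minimal sketch of the Kalman filtering step, assuming a constant-velocity model over noisy one-dimensional lateral position measurements (the state model and the noise variances are illustrative choices, not values specified in the disclosure):

```python
import numpy as np

def kalman_smooth(measurements, dt=0.1, q=0.05, r=0.5):
    """Constant-velocity Kalman filter over noisy 1-D lateral positions.

    q: process noise variance, r: measurement noise variance
    (both hypothetical tuning values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])               # only position is measured
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])  # initial state estimate
    P = np.eye(2)                             # initial covariance
    purified = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        purified.append(float(x[0, 0]))
    return purified
```

The purified position sequence, accumulated over time, would then play the role of the driving history.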
  • the device 100 may divide the driving history into at least one driving section according to a driving pattern based on self-attention implemented by a convolutional neural network (CNN).
  • the apparatus 100 may divide the driving history into at least one driving section by dividing the driving history into a lane following section and a lane change section according to the driving pattern.
  • the apparatus 100 may derive the weighted driving history by setting the weight of the lane following section to be higher than the weight of the lane change section.
  • the apparatus 100 may derive a weighted driving history of at least one surrounding vehicle by giving weights to at least one driving section based on self-attention.
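The weighting rule above can be illustrated by pairing each driving sample with a scalar weight. The 1.0 and 0.2 values below are placeholder weights chosen for the sketch; in the disclosure the weights are produced by self-attention:

```python
def weight_history(history, labels, w_follow=1.0, w_change=0.2):
    """Attach a weight to each driving sample: lane-following samples
    count more than lane-change samples when the trajectory is fitted.
    The specific weight values are illustrative assumptions."""
    if len(history) != len(labels):
        raise ValueError("history and labels must align sample-by-sample")
    return [w_follow if lab == "follow" else w_change for lab in labels]
```

The resulting weight vector accompanies the driving history into the curve-fitting step, so lane-change samples pull the fitted trajectory far less than lane-following samples.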
  • the device 100 may determine the driving trajectory of at least one surrounding vehicle by performing curve fitting on the weighted driving history through CNN.
  • the apparatus 100 may determine the driving trajectory by performing curve fitting by inferring coefficients of a polynomial function for representing the trajectory fitted to the weighted driving history.
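In the disclosure the polynomial coefficients are inferred by the CNN; the same weighted curve-fitting idea can be reproduced in closed form with an ordinary weighted least-squares fit, in which the lane-change samples receive a low weight. All of the numbers below are illustrative:

```python
import numpy as np

# Longitudinal (x) and lateral (y) positions of a surrounding vehicle's
# driving history, with a lane change mid-way (values illustrative).
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
y = np.array([0.0, 0.02, 0.05, 0.9, 1.8, 3.5, 3.55])
w = np.array([1.0, 1.0, 1.0, 0.2, 0.2, 1.0, 1.0])  # down-weight the change

coeffs = np.polyfit(x, y, deg=3, w=w)  # cubic trajectory model
trajectory = np.poly1d(coeffs)         # callable y(x) for the virtual lane
```

`np.polyfit` with the `w` argument minimizes the weighted squared residuals, so the fitted curve hugs the heavily weighted lane-following samples and treats the lane-change samples as soft evidence only.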
  • the device 100 may generate a virtual lane on the HD map for assisting autonomous driving based on the driving trajectory.
  • the self-attention may calculate an attention score by using a softmax function.
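The disclosure only states that the attention score is computed with a softmax function; a standard scaled dot-product formulation consistent with that statement is sketched below:

```python
import numpy as np

def attention_scores(queries, keys):
    """Scaled dot-product self-attention scores via softmax (the common
    formulation; the patent specifies only that softmax yields the score)."""
    d_k = queries.shape[-1]
    logits = queries @ keys.T / np.sqrt(d_k)
    # Row-wise softmax with max subtraction for numerical stability.
    shifted = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return shifted / shifted.sum(axis=-1, keepdims=True)
```

Each row of the result is a probability distribution over the driving samples, which is how the weighting of driving sections can emerge from attention.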
  • the composite sensor may include a lidar (LiDAR) that detects surrounding objects by utilizing the reflection of laser light.
  • the device 100 may select a target vehicle used for speed control and inter-vehicle distance control according to autonomous driving based on the virtual lane.
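One simple way to realize the target selection described above is to pick the nearest preceding vehicle whose lateral offset from the virtual lane center lies within the lane half-width. The function name and the 1.75 m half-width are assumptions made for this sketch:

```python
def select_target(lane_center_m, vehicles, lane_half_width_m=1.75):
    """Pick the nearest preceding vehicle inside the virtual lane.

    vehicles: list of (longitudinal_gap_m, lateral_offset_m) tuples,
    where a positive gap means the vehicle is ahead of the ego vehicle.
    Returns None when no preceding vehicle occupies the virtual lane.
    """
    in_lane = [v for v in vehicles
               if v[0] > 0.0
               and abs(v[1] - lane_center_m) <= lane_half_width_m]
    return min(in_lane, key=lambda v: v[0]) if in_lane else None
```

The selected vehicle then serves as the basis for the speed and inter-vehicle distance control performed by the vehicle control module.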
  • the method of generating a virtual lane for assisting autonomous driving of FIG. 4 may be recorded in a computer-readable recording medium in which at least one program or software including instructions for executing the method is recorded.
  • Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine language code such as that generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.


Abstract

The disclosure relates to a method of generating a virtual lane for assisting autonomous driving, the method comprising the steps of: deriving a driving history by accumulating driving data of at least one surrounding vehicle measured by a composite sensor over time; dividing the driving history into at least one driving section according to a driving pattern on the basis of self-attention implemented by a convolutional neural network (CNN); deriving a weighted driving history of the at least one surrounding vehicle by assigning a weight to the at least one driving section on the basis of the self-attention; determining a driving trajectory of the at least one surrounding vehicle by performing curve fitting on the weighted driving history through the CNN; and generating a virtual lane on an HD map for assisting autonomous driving on the basis of the driving trajectory.
PCT/KR2021/000335 2020-05-29 2021-01-11 Apparatus and method for generating a virtual lane based on traffic flow information perception for autonomous driving in adverse weather conditions WO2021241834A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0064709 2020-05-29
KR1020200064709A KR102342414B1 (ko) Apparatus and method for generating a virtual lane based on traffic flow information perception for autonomous driving in adverse weather conditions

Publications (1)

Publication Number Publication Date
WO2021241834A1 true WO2021241834A1 (fr) 2021-12-02

Family

ID=78744981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/000335 WO2021241834A1 (fr) Apparatus and method for generating a virtual lane based on traffic flow information perception for autonomous driving in adverse weather conditions

Country Status (2)

Country Link
KR (1) KR102342414B1 (fr)
WO (1) WO2021241834A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101526816B1 (ko) * 2014-09-12 2015-06-05 Hyundai Motor Company Lane estimation system and method thereof
KR20180051836A (ko) * 2016-11-09 2018-05-17 Samsung Electronics Co., Ltd. Method and apparatus for generating a virtual driving lane for a traveling vehicle
US20200026282A1 (en) * 2018-07-23 2020-01-23 Baidu Usa Llc Lane/object detection and tracking perception system for autonomous vehicles
KR20200058272A (ko) * 2018-11-19 2020-05-27 Konkuk University Industry-Academic Cooperation Foundation Method and system for providing road driving conditions through preprocessing of road driving images

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101782423B1 (ko) Vehicle and method of controlling the vehicle
KR102456626B1 (ko) * Method and apparatus for lane recognition of an autonomous vehicle
KR20180124713A (ko) * Electronic device for estimating road shape and operating method thereof
KR102421855B1 (ko) Method and apparatus for identifying a driving lane
KR102553247B1 (ko) Lane keeping assist system and method capable of improving safety during preceding-vehicle following control


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HOU YUENAN; MA ZHENG; LIU CHUNXIAO; LOY CHEN CHANGE: "Learning Lightweight Lane Detection CNNs by Self Attention Distillation", 2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 27 October 2019 (2019-10-27), pages 1013 - 1021, XP033723536, DOI: 10.1109/ICCV.2019.00110 *
KOH YOUNGIL: "Online Vehicle Motion Learning based Steering Control for an Automated Driving System using Incremental Sparse Spectrum Gaussian Process Regression", THESES, 11 February 2020 (2020-02-11), pages 1 - 134, XP055871589 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114312840A (zh) * 2021-12-30 2022-04-12 Chongqing Changan Automobile Co., Ltd. Autonomous driving obstacle target trajectory fitting method, system, vehicle, and storage medium
CN114312840B (zh) * 2021-12-30 2023-09-22 Chongqing Changan Automobile Co., Ltd. Autonomous driving obstacle target trajectory fitting method, system, vehicle, and storage medium
CN114707630A (zh) * 2022-02-16 2022-07-05 Dalian University of Technology Multi-modal trajectory prediction method attending to scene and state
CN114842440A (zh) * 2022-06-30 2022-08-02 Xiaomi Automobile Technology Co., Ltd. Autonomous driving environment perception method and apparatus, vehicle, and readable storage medium
CN114842440B (zh) * 2022-06-30 2022-09-09 Xiaomi Automobile Technology Co., Ltd. Autonomous driving environment perception method and apparatus, vehicle, and readable storage medium

Also Published As

Publication number Publication date
KR20210148518A (ko) 2021-12-08
KR102342414B1 (ko) 2021-12-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21813674

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21813674

Country of ref document: EP

Kind code of ref document: A1