WO2018101526A1 - Method for detecting road area and lane using lidar data, and system therefor - Google Patents

Method for detecting road area and lane using lidar data, and system therefor

Info

Publication number
WO2018101526A1
Authority
WO
WIPO (PCT)
Prior art keywords
lidar
lane
vehicle
points
point
Prior art date
Application number
PCT/KR2016/014565
Other languages
English (en)
Korean (ko)
Inventor
정지영
김남일
민재식
Original Assignee
네이버 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 네이버 주식회사 filed Critical 네이버 주식회사
Publication of WO2018101526A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06: Road conditions
    • B60W40/076: Slope angle of the road
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10: Path keeping
    • B60W30/12: Lane keeping
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06: Road conditions
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001: Details of the control system
    • B60W2050/0002: Automatic control, details of type of controller or control system architecture
    • B60W2050/0004: In digital systems, e.g. discrete-time systems involving sampling
    • B60W2050/0005: Processor details or data handling, e.g. memory registers or chip architecture
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00: Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40: Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408: Radar; Laser, e.g. lidar
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00: Input parameters relating to infrastructure
    • B60W2552/15: Road slope, i.e. the inclination of a road segment in the longitudinal direction
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00: Input parameters relating to infrastructure
    • B60W2552/35: Road bumpiness, e.g. potholes

Definitions

  • The following description relates to technology for detecting a road area and a lane in which a vehicle can travel.
  • Lane detection methods may be classified into model-based, feature-point-based, and area-based methods.
  • Korean Patent Laid-Open Publication No. 10-2011-0046607 (published May 6, 2011) discloses a technique that uses an HSV color model and an edge model to extract lane features from a camera image and then detects a lane from those features through a decision rule.
  • Provided are a method and system that can accurately detect lanes in a drivable area of a road using lidar data.
  • A computer-implemented method comprises the steps of: acquiring point-type lidar data from a multichannel lidar sensor mounted on a vehicle; and detecting a road area in which the vehicle can travel using the inclination of lidar points between neighboring channels among the lidar points acquired from the multichannel lidar sensor.
  • The acquiring may include acquiring lidar points in a radial form according to an omnidirectional scan through the multichannel lidar sensor.
  • The detecting of the road area may include: calculating the inclination of lidar points between neighboring channels for each scan angle of the multichannel lidar sensor; and detecting the road area by classifying the lidar points whose inclination is less than or equal to a predetermined threshold value.
  • In calculating the inclination, the inclination of lidar points between neighboring channels may be calculated in a direction away from the vehicle, starting from the lidar point scanned closest to the vehicle.
  • The threshold value may be set according to road surface conditions, including the shape of the road and typical surface irregularities.
  • The method may further include detecting a lane in the road area by using the reflection intensity of the lidar points included in the road area.
  • The detecting of the lane may detect the lane by classifying, among the lidar points included in the road area, the lidar points having a reflection intensity of a predetermined value or more.
  • The detecting of the lane may include: classifying, among the lidar points included in the road area, the lidar points having a reflection intensity of a predetermined value or more; setting straight line parameters based on the position and driving direction of the vehicle; and classifying, among the classified lidar points, the lidar points lying on the lines of the parameters or located within a set range of them into a lane area.
  • The setting of the straight line parameters may include: setting straight line parameters corresponding to the driving direction of the vehicle as initial values on both sides of the vehicle, spaced apart by a standard lane width from the position of the vehicle; and updating the straight line parameters using the lidar points classified into the lane area.
  • The method may further include calculating the lidar points detected as the lane and providing them as coordinate values to be displayed on a map.
  • A road area and lane detection method comprises: acquiring point-type lidar data from a multichannel lidar sensor mounted on a vehicle; detecting a road area in which the vehicle can travel using the inclination of lidar points between neighboring channels among the lidar points acquired from the multichannel lidar sensor; and detecting a lane on the road area by classifying, among the lidar points included in the road area, the lidar points corresponding to straight line parameters set based on the position and driving direction of the vehicle.
  • A computer-implemented system comprises at least one processor configured to execute computer-readable instructions, the at least one processor including: a lidar data acquisition unit for acquiring point-type lidar data according to an omnidirectional scan from a multichannel lidar sensor mounted on a vehicle; and a road area detection unit configured to detect a road area in which the vehicle can travel using the inclination of lidar points between neighboring channels among the lidar points acquired from the multichannel lidar sensor.
  • By detecting the road region based on the inclination between points using lidar data, instead of simply detecting a road region that satisfies a plane condition, the drivable area on the road can be detected robustly even when the road itself is sloped or partially damaged.
  • By classifying the lidar points on the road that lie along the straight line parameters as lanes, lanes in the drivable area can be accurately distinguished from other road surface markings such as direction arrows, crosswalks, crosswalk notices, and stop lines.
  • FIG. 1 is a block diagram illustrating an example of an internal configuration of a computer system according to an embodiment of the present invention.
  • FIG. 2 illustrates an example of components that may be included in a processor of a computer system according to an exemplary embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating an example of a road area and a lane detection method which may be performed by a computer system according to an exemplary embodiment of the present invention.
  • FIG. 4 illustrates an example of a multichannel lidar sensor in one embodiment of the present invention.
  • FIG. 5 is an exemplary diagram for describing a process of classifying a road area that may be driven according to an embodiment of the present invention.
  • FIG. 6 is an exemplary diagram for describing a process of classifying road markings according to an exemplary embodiment of the present invention.
  • FIG. 7 is an exemplary diagram for describing a process of classifying a lane among road markings according to an exemplary embodiment of the present invention.
  • Embodiments of the present invention relate to a technique for detecting a road area and a lane capable of driving a vehicle using lidar data.
  • Embodiments can achieve accurate road area and lane detection using lidar data, thereby providing significant advantages in terms of accuracy, efficiency, speed, and cost savings.
  • FIG. 1 is a block diagram illustrating an example of an internal configuration of a computer system according to an embodiment of the present invention.
  • a road area and lane detection system may be implemented through the computer system 100 of FIG. 1.
  • The computer system 100 may include a processor 110, a memory 120, a permanent storage device 130, a bus 140, an input/output interface 150, and a network interface 160 as components for executing the road area and lane detection method.
  • The processor 110, as a component for detecting a drivable road area and lane using lidar data, may include or be part of any device capable of processing a sequence of instructions.
  • Processor 110 may include, for example, a processor within a computer, mobile device, or other electronic device, and/or a digital processor.
  • the processor 110 may be included in, for example, a server computing device, a server computer, a series of server computers, a server farm, a cloud computer, a content platform, and the like.
  • the processor 110 may be connected to the memory 120 through the bus 140.
  • Memory 120 may include volatile, permanent, virtual, or other memory for storing information used by or output from computer system 100.
  • the memory 120 may include, for example, random access memory (RAM) and / or dynamic RAM (DRAM).
  • Memory 120 may be used to store any information, such as status information of computer system 100.
  • the memory 120 may also be used to store instructions of the computer system 100, including, for example, instructions for detecting a driveable area of a road and a lane.
  • Computer system 100 may include one or more processors 110 as needed or where appropriate.
  • Bus 140 may include a communication infrastructure that enables interaction between various components of computer system 100.
  • Bus 140 may carry data, for example, between components of computer system 100, for example, between processor 110 and memory 120.
  • Bus 140 may include wireless and / or wired communication media between components of computer system 100 and may include parallel, serial, or other topology arrangements.
  • Persistent storage 130 may include components, such as memory or other persistent storage devices, used by computer system 100 to store data for an extended period of time (e.g., relative to memory 120). Persistent storage 130 may include non-volatile main memory as used by processor 110 in computer system 100, and may include, for example, flash memory, a hard disk, an optical disk, or other computer-readable media.
  • the input / output interface 150 may include interfaces for a keyboard, mouse, voice command input, display, or other input or output device. Configuration commands and / or inputs for road area and lane detection may be received via the input / output interface 150.
  • Network interface 160 may include one or more interfaces to networks such as a local area network or the Internet.
  • Network interface 160 may include interfaces for wired or wireless connections. Configuration commands and / or input for road area and lane detection may be received via network interface 160.
  • Computer system 100 may include more components than those shown in FIG. 1. However, most prior-art components need not be explicitly illustrated.
  • The computer system 100 may be implemented to include at least some of the input/output devices connected to the input/output interface 150 described above, or may further include other components such as a transceiver, a global positioning system (GPS) module, a camera, various sensors, and a database.
  • Embodiments of the present invention relate to technology for detecting a drivable road area and lane using lidar data, which may be applied to vehicle driving assistance systems such as autonomous driving or unmanned driving, but is not limited thereto.
  • FIG. 2 is a diagram illustrating an example of components that may be included in a processor of a computer system according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating an example of a road area and lane detection method that may be performed by the computer system according to an embodiment of the present invention.
  • the processor 110 may include a LiDAR data acquisition unit 210, a road area detection unit 220, a lane detection unit 230, and a lane coordinate providing unit 240.
  • the components of such a processor 110 may be representations of different functions performed by the processor 110 in accordance with a control instruction provided by at least one program code.
  • For example, the lidar data acquisition unit 210 may be used as a functional representation by which the processor 110 controls the computer system 100 to acquire lidar data.
  • the processor 110 and the components of the processor 110 may perform steps S310 to S350 included in the road area and lane detection method of FIG. 3.
  • the processor 110 and the components of the processor 110 may be implemented to execute instructions according to the code of the operating system included in the memory 120 and the at least one program code described above.
  • the at least one program code may correspond to a code of a program implemented to process the road area and the lane detection method.
  • The steps of the road area and lane detection method may not occur in the order shown, and some of the steps may be omitted or additional steps may be included.
  • the processor 110 may load program code stored in a program file for a road area and a lane detection method, into the memory 120.
  • The program file for the road area and lane detection method may be stored in the persistent storage 130 described with reference to FIG. 1, and the processor 110 may control the computer system 100 through the bus so that the program code from the program file stored in the persistent storage 130 is loaded into the memory 120.
  • Each of the lidar data acquisition unit 210, the road area detection unit 220, the lane detection unit 230, and the lane coordinate providing unit 240 included in the processor 110 may be a different functional representation of the processor 110 for executing the instructions of the corresponding portion of the program code loaded into the memory 120 and performing the subsequent steps S320 to S350.
  • the processor 110 and the components of the processor 110 may directly process an operation according to a control command or control the computer system 100.
  • The lidar data acquisition unit 210 may acquire point-type lidar data (hereinafter referred to as 'lidar points') according to an omnidirectional scan around the vehicle from the multichannel lidar sensor, through a wired or wireless connection with the multichannel lidar sensor mounted on the vehicle.
  • the lidar data acquisition unit 210 may control a multi-channel lidar sensor having a scan angle of 360 degrees to acquire a lidar point according to the omnidirectional scan.
  • the lidar point may include a value indicating a distance from the lidar sensor and a value indicating the reflection intensity.
  • In the multichannel lidar sensor used in the present invention, laser scanners equal in number to the channels are arranged in a vertical line; each channel emits light and measures distance and reflection intensity from the returned light.
  • The multichannel lidar sensor has an internal mirror that reflects the light for omnidirectional scanning; as the mirror rotates 360 degrees, distance and reflection intensity are measured in all directions around the vehicle.
  • The lidar data acquisition unit 210 may acquire, at the same time, as many lidar points having the same scan angle as there are channels. As the mirror rotates within the sensor, that is, as the scan angle changes, lidar points are rapidly accumulated.
  • The density of scan angles differs for each lidar sensor; for example, the lidar data acquisition unit 210 may acquire 32 lidar points at every 1.33-degree step through the multichannel lidar sensor 40.
  • The lidar points acquired through the 32 channels while the mirror rotates 360 degrees in the multichannel lidar sensor 40 form a radial pattern around the sensor position according to the omnidirectional scan.
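The radial, per-channel structure of one such scan can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the 32-channel count and 1.33-degree step come from the example sensor above, while the function name, tuple layout, and NaN fill for missing returns are assumptions.

```python
import numpy as np

# Illustrative sketch: organize one omnidirectional scan from a 32-channel
# lidar into (channel x azimuth-step) grids of distance and reflection
# intensity, the two values each lidar point carries.
N_CHANNELS = 32
AZIMUTH_STEP_DEG = 1.33                         # example resolution from the text
N_STEPS = int(round(360.0 / AZIMUTH_STEP_DEG))  # scan angles per revolution

def make_scan_grid(points):
    """points: iterable of (channel, azimuth_deg, distance_m, intensity)."""
    dist = np.full((N_CHANNELS, N_STEPS), np.nan)
    inten = np.full((N_CHANNELS, N_STEPS), np.nan)
    for ch, az, d, i in points:
        col = int(az / AZIMUTH_STEP_DEG) % N_STEPS  # which scan angle
        dist[ch, col] = d
        inten[ch, col] = i
    return dist, inten

# One return from channel 3 at azimuth 10 degrees, 7.5 m away.
dist, inten = make_scan_grid([(3, 10.0, 7.5, 0.42)])
```

Indexing the grid by (channel, scan angle) makes the neighboring-channel slope computation described below a simple walk down each column.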
  • The road area detection unit 220 may detect the road area by classifying the lidar points whose inclination between neighboring channels is less than or equal to a threshold value, among the lidar points acquired from the multichannel lidar sensor.
  • To calculate the inclination between the radially distributed lidar points, the road area detection unit 220 may calculate the inclination between lidar points scanned in neighboring channels at the same time and the same scan angle.
  • The road area detection unit 220 calculates the inclination of lidar points between neighboring channels starting from the lidar point nearest to the lidar sensor and moving outward, and may classify as the road area the points up to just before the inclination between neighboring channels exceeds a predetermined threshold value.
  • the road area detection unit 220 may classify the road area for 360 degrees by repeating the above method for each scan angle.
  • FIG. 5 is an exemplary diagram for describing a process of classifying a road area according to an embodiment of the present invention. Assuming that the vehicle 50 equipped with the multichannel lidar sensor is currently located on a road, radial lidar points according to a 360-degree omnidirectional scan around the vehicle 50 can be obtained through the multichannel lidar sensor 501.
  • a threshold value which is a criterion for classifying road areas, may be set in consideration of road requirements such as the shape of a road and general unevenness.
  • The road area detection unit 220 calculates the inclination of the lidar points X between neighboring channels at the same scan angle 51. When the inclination exceeds a specific threshold at the scan angle 51, the points from the position of the vehicle 50 up to the lidar point just before the threshold is exceeded can be classified into the road area 503, and this process is repeated for each scan angle to classify the road area for the full 360 degrees.
  • In other words, the inclination between points is calculated radially around the sensor position (that is, the vehicle), and as long as the distribution of points stays below a specific inclination, those points may be classified as a drivable road area.
  • This method still classifies into the road area the minor irregularities caused by the convex crown of a typical road or by partial breakage and repair, while effectively classifying as non-drivable the region beyond low but abrupt slope changes such as road curbs, medians, or guide posts, or beyond obstacles such as parked vehicles.
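The per-angle walk described above can be sketched in a few lines. This is a minimal reconstruction under assumed names and an assumed 0.15 slope threshold; the text only requires that the threshold reflect the road's shape and typical unevenness.

```python
def classify_road_along_angle(points, slope_threshold=0.15):
    """points: (range_m, height_m) returns for one scan angle, ordered from
    the channel that hits nearest the vehicle outward.

    Walks away from the vehicle and keeps points as road until the slope
    between two neighboring-channel points exceeds the threshold, as in the
    text; the 0.15 threshold is an assumed illustration.
    """
    n_road = 1 if points else 0
    for (r0, z0), (r1, z1) in zip(points, points[1:]):
        dr = r1 - r0
        if dr <= 0:                      # range not increasing: treat as obstacle
            break
        if abs(z1 - z0) / dr > slope_threshold:
            break                        # curb, median, or obstacle reached
        n_road += 1
    return n_road

# Nearly flat road for four returns, then a curb-like 12 cm rise over 20 cm.
scanline = [(2.0, 0.00), (3.0, 0.01), (4.0, 0.01), (5.0, 0.02), (5.2, 0.14)]
n_road = classify_road_along_angle(scanline)   # -> 4
```

Running the same function once per scan angle yields the 360-degree road region; a gentle crown or small pothole passes the slope test, while the curb's abrupt rise stops the walk.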
  • The lane detection unit 230 may detect lanes by classifying, among the lidar points classified as the road area in step S340, the lidar points corresponding to the straight line parameters set based on the position and driving direction of the vehicle.
  • the lidar point may include a value indicating a reflection intensity, together with a value indicating a distance to the multichannel lidar sensor.
  • Various road surface markings in the road area, including lanes as well as direction arrows, crosswalks, crosswalk notices, and stop lines, are painted, and the paint of these markings yields a reflection intensity above a certain level in the lidar data.
  • FIGS. 6 and 7 are exemplary diagrams for describing a process of detecting a lane according to an embodiment of the present invention.
  • The lane detection unit 230 may classify, among the lidar points classified into the road area 503, the lidar points 605 having a reflection intensity of a predetermined value or more as the points corresponding to road surface markings.
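As a minimal sketch of this intensity test (the function name and the 0.5 threshold are assumptions; usable thresholds depend on the sensor and the paint):

```python
def road_marking_points(road_points, intensity_threshold=0.5):
    """road_points: (x, y, intensity) lidar points already inside the road
    area. Keeps only points bright enough to be painted markings; the 0.5
    threshold is illustrative, not a value from the source."""
    return [p for p in road_points if p[2] >= intensity_threshold]

# Two painted-marking returns among three road-area points.
road = [(1.0, 0.0, 0.10), (2.0, 1.7, 0.80), (3.0, -1.7, 0.90)]
marks = road_marking_points(road)   # -> the two high-intensity points
```

The surviving points still mix lanes with arrows and crosswalks; the straight line parameters described next separate the lanes out.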
  • To classify the lidar points corresponding to lanes among the lidar points 605 corresponding to road surface markings, the lane detection unit 230 may first set, as shown in FIG. 7, straight line parameters 707 extending in the driving direction of the vehicle 50 as initial values on both sides of the current position of the lidar sensor, that is, the vehicle 50, spaced apart by a standard lane width.
  • The lane detection unit 230 may classify, among the lidar points 605 corresponding to road surface markings, the lidar points lying on the line of the straight line parameter 707 or located within a predetermined range of it into the lane area.
  • The lane detection unit 230 may update the straight line parameter 707 using the lidar points classified into the lane area.
  • The lane detection unit 230 may repeatedly apply an expectation-maximization style algorithm that updates the straight line parameter 707 with the lidar points newly found in the lane area.
  • As the vehicle 50 travels, the surrounding environment changes until the omnidirectional scan of the next cycle is made; since the position of the lane is likely to be similar to its previous position, classification of the lidar data included in the previous straight line parameters and the update of the straight line parameters are repeated. With this method, even if the vehicle 50 changes lanes while driving, lanes may be continuously detected based on the straight line parameters while the parameters serving as the reference for lane detection are continuously updated. Depending on the performance of the lidar sensor, four lane lines corresponding to three lanes can be stably classified at a time, and since the parameter update occurs frequently in the scan area close to the lidar sensor, no separate processing for curved lanes is necessary.
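The classify-then-update loop can be sketched as an EM-style refit of each line. This is an assumed reconstruction, not the patent's exact procedure: it works in the vehicle frame (y forward, x to the right), models each lane line as x = a*y + b, and uses an assumed 0.5 m assignment gate.

```python
import numpy as np

def update_lane_lines(marking_pts, lines, gate=0.5, iters=5):
    """marking_pts: (N, 2) array-like of (x, y) road-marking points in the
    vehicle frame. lines: list of (a, b) per lane line, x = a*y + b.
    Alternates point assignment (E-step) and least-squares refit (M-step)."""
    pts = np.asarray(marking_pts, dtype=float)
    lines = [tuple(l) for l in lines]
    for _ in range(iters):
        new_lines = []
        for a, b in lines:
            # E-step: points within `gate` meters of this line belong to it.
            resid = np.abs(pts[:, 0] - (a * pts[:, 1] + b))
            mine = pts[resid < gate]
            if len(mine) >= 2:
                # M-step: least-squares refit of x = a*y + b.
                a, b = np.polyfit(mine[:, 1], mine[:, 0], 1)
            new_lines.append((a, b))
        lines = new_lines
    return lines

# Marking points from a lane line 1.80 m to the right; the initial guess
# sits at half an assumed 3.5 m lane width (1.75 m) and gets pulled over.
pts = [(1.80, float(y)) for y in range(10)]
lines = update_lane_lines(pts, [(0.0, 1.75)])
```

Because each refit only moves a line by the residual of nearby points, the parameters track gradual lane drift scan after scan, which is the behavior the paragraph above relies on for lane changes and curves near the sensor.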
  • The lane coordinate providing unit 240 may calculate coordinates corresponding to the lidar points classified as lanes and provide them as coordinate values for lane marking on a map.
  • The lane coordinate providing unit 240 may store the lidar points classified into the lane area based on the straight line parameters, compute points connecting them at regular intervals, and designate GPS coordinates corresponding to the computed points as coordinate values for lane marking on the map.
  • In other words, the lane coordinate providing unit 240 may calculate and provide the points classified as lanes as a set of GPS coordinate points to be expressed as lanes on the map.
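A sketch of this thin-and-convert step follows. All names, the 5 m spacing, and the flat-earth approximation are assumptions; a production system would use the vehicle's GPS/INS pose and a proper map projection.

```python
import math

M_PER_DEG_LAT = 111_320.0  # rough meters per degree of latitude (flat-earth)

def lane_points_to_gps(lane_pts, lat0, lon0, heading_deg, step_m=5.0):
    """lane_pts: vehicle-frame (x right, y forward) points along one lane
    line, ordered by y. Thins them to roughly `step_m` spacing and converts
    each offset to (lat, lon) around the vehicle position (lat0, lon0)."""
    h = math.radians(heading_deg)          # 0 = north, clockwise positive
    out, last = [], None
    for x, y in lane_pts:
        if last is not None and math.hypot(x - last[0], y - last[1]) < step_m:
            continue                       # keep roughly one point per step_m
        last = (x, y)
        east = x * math.cos(h) + y * math.sin(h)
        north = -x * math.sin(h) + y * math.cos(h)
        lat = lat0 + north / M_PER_DEG_LAT
        lon = lon0 + east / (M_PER_DEG_LAT * math.cos(math.radians(lat0)))
        out.append((lat, lon))
    return out

# A lane line 1.75 m to the right, sampled every 0.5 m out to 20 m,
# thinned to ~5 m spacing and anchored at an arbitrary Seoul coordinate.
pts = [(1.75, y * 0.5) for y in range(41)]
coords = lane_points_to_gps(pts, 37.5665, 126.9780, 0.0)
```

The resulting coordinate list is exactly the kind of regularly spaced GPS point set the providing unit hands to the map layer.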
  • As described above, by detecting the road area based on the inclination between points using lidar data, instead of simply detecting a road area that satisfies a plane condition, the drivable road area can be detected more robustly.
  • Also, by classifying the lidar points on the road that lie along the straight line parameters as lanes, lanes in the drivable area can be accurately distinguished from other road surface markings such as direction arrows, crosswalks, crosswalk notices, and stop lines.
  • the apparatus described above may be implemented as a hardware component, a software component, and / or a combination of hardware components and software components.
  • The devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
  • A processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • other processing configurations are possible, such as parallel processors.
  • The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or instruct the processing device independently or collectively.
  • Software and/or data may be embodied in any type of machine, component, physical device, virtual equipment, or computer storage medium or device, in order to be interpreted by the processing device or to provide instructions or data to the processing device.
  • The software may be distributed over networked computer systems so that it is stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer readable recording media.
  • the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • The medium may continuously store a computer-executable program, or temporarily store it for execution or download.
  • The medium may be various recording means or storage means in the form of single or combined hardware; it is not limited to a medium directly connected to a computer system and may be distributed over a network. Examples of the medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and ROM, RAM, flash memory, and the like, configured to store program instructions.
  • Examples of other media include recording media or storage media managed by an app store that distributes applications, or by a site or server that supplies or distributes various other software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

Disclosed are a method for detecting a road area and a lane using lidar data, and a system therefor. The method is computer-implemented and comprises the steps of: obtaining point-type lidar data from a multi-channel lidar sensor mounted on a vehicle; and detecting a road area in which the vehicle can travel, using the gradient between lidar points of neighboring channels among the lidar points obtained through the multi-channel lidar sensor.
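The inter-channel gradient test described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented method itself: the ring layout, the `slope_threshold_deg` parameter, and the `detect_road_points` helper are all introduced here for the example. Returns from neighboring lidar channels at the same azimuth index are compared; where the segment joining them is nearly horizontal, the surface is treated as drivable road.

```python
import math

def detect_road_points(rings, slope_threshold_deg=5.0):
    """Classify lidar returns as road using the inter-channel gradient.

    rings: list of channels (inner to outer); each channel is a list of
    (x, y, z) points, index-aligned across channels by azimuth
    (a hypothetical layout chosen for this sketch).
    Returns a set of (channel_index, azimuth_index) pairs judged drivable.
    """
    road = set()
    threshold = math.radians(slope_threshold_deg)
    for c in range(len(rings) - 1):
        inner, outer = rings[c], rings[c + 1]
        for i in range(min(len(inner), len(outer))):
            x1, y1, z1 = inner[i]
            x2, y2, z2 = outer[i]
            # horizontal distance between the two neighboring-channel returns
            horiz = math.hypot(x2 - x1, y2 - y1)
            if horiz == 0.0:
                continue  # degenerate pair; no gradient defined
            # inclination of the segment joining the two returns
            slope = math.atan2(abs(z2 - z1), horiz)
            if slope < threshold:
                # nearly level between channels: treat both returns as road
                road.add((c, i))
                road.add((c + 1, i))
    return road
```

A near-zero gradient between consecutive channels indicates a locally flat surface, while a curb, wall, or vehicle produces a steep jump in height over a short horizontal distance and is excluded.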
PCT/KR2016/014565 2016-11-30 2016-12-13 Method for detecting road area and lane using lidar data, and system therefor WO2018101526A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2016-0161070 2016-11-30
KR1020160161070A KR101843866B1 (ko) Method and system for detecting road areas and lanes using lidar data

Publications (1)

Publication Number Publication Date
WO2018101526A1 (fr) 2018-06-07

Family

ID=62188055

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/014565 WO2018101526A1 (fr) 2016-11-30 2016-12-13 Method for detecting road area and lane using lidar data, and system therefor

Country Status (2)

Country Link
KR (1) KR101843866B1 (fr)
WO (1) WO2018101526A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917418A (zh) * 2019-03-28 2019-06-21 Anhui University of Science and Technology Method for measuring the non-reflective area of a lidar
CN110609268A (zh) * 2018-11-01 2019-12-24 UISEE Technology (Beijing) Co., Ltd. Lidar calibration method, apparatus, system, and storage medium
TWI690439B (zh) * 2018-11-01 2020-04-11 Automotive Research & Testing Center Lidar detection method for road markings and system thereof
CN113085877A (zh) * 2019-12-23 2021-07-09 Shenzhen Tatfook Technology Co., Ltd. Method for detecting a positional relationship, and driver-assistance system for a vehicle
CN113581184A (zh) * 2021-08-25 2021-11-02 JD Kunpeng (Jiangsu) Technology Co., Ltd. Method, apparatus, device, and medium for determining a maximum passable area
KR102333828B1 (ko) * 2021-07-21 2021-12-03 Vueron Technology Co., Ltd. Lidar-data-based lane detection method and system
US12062241B2 (en) 2021-05-12 2024-08-13 Hyundai Motor Company Lane recognition device and method based on LIDAR for vehicle

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909577B (zh) * 2018-09-18 2023-11-17 SAIC General Motors Co., Ltd. Road surface feature classification and recognition method based on signal similarity distance
KR102083909B1 (ko) 2018-10-23 2020-03-04 Mobiltech Method for automatically extracting lane data for autonomous vehicles based on a point cloud map
KR102103941B1 (ko) 2018-11-14 2020-04-23 Mobiltech Method for updating road and lane data for autonomous vehicles in real time based on a point cloud map
KR102069666B1 (ko) 2018-11-14 2020-01-23 Mobiltech Method for setting a driving route for autonomous vehicles in real time based on a point cloud map
KR102504229B1 (ko) * 2018-12-18 2023-02-28 Hyundai Motor Company Driving control system and method for an autonomous vehicle
CN113177427A (zh) * 2020-01-23 2021-07-27 Bayerische Motoren Werke AG Road prediction method, autonomous driving method, vehicle, and device
CN113740355B (zh) * 2020-05-29 2023-06-20 Tsinghua University Boundary protection method and system for a radiographic inspection robot
KR102371849B1 (ko) 2020-10-29 2022-03-11 Dream T&S Lane extraction method
KR102414647B1 (ko) * 2020-11-02 2022-06-30 Mappers Co., Ltd. System and method for automatically extracting lane data using a mobile mapping system
KR102367138B1 (ко) * 2021-10-13 2022-02-25 Vueron Technology Co., Ltd. Method for detecting a crosswalk using a lidar sensor, and crosswalk detection apparatus performing the method
CN117111089B (zh) * 2023-10-24 2024-02-02 Qingdao Huituo Intelligent Machine Co., Ltd. Method, system, device, and storage medium for recognizing the availability status of mining-truck unloading points

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070181810A1 (en) * 2006-02-06 2007-08-09 Tan Michael R T Vertical cavity surface emitting laser (VCSEL) array laser scanner
KR20130096012A (ko) * 2012-02-21 2013-08-29 현대엠엔소프트 주식회사 라이다 데이터를 이용한 도로의 곡선반경, 종단 및 횡단 경사도 산출 방법
WO2014073869A1 (fr) * 2012-11-07 2014-05-15 자동차부품연구원 Procédé d'avertissement de vitesse en virage
US20140336842A1 (en) * 2013-05-09 2014-11-13 Hyundai Motor Company System and method for detecting road surface conditions
US20160178802A1 (en) * 2014-12-22 2016-06-23 GM Global Technology Operations LLC Road surface reflectivity detection by lidar sensor

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110609268A (zh) * 2018-11-01 2019-12-24 UISEE Technology (Beijing) Co., Ltd. Lidar calibration method, apparatus, system, and storage medium
TWI690439B (zh) * 2018-11-01 2020-04-11 Automotive Research & Testing Center Lidar detection method for road markings and system thereof
CN109917418A (zh) * 2019-03-28 2019-06-21 Anhui University of Science and Technology Method for measuring the non-reflective area of a lidar
CN109917418B (zh) * 2019-03-28 2023-04-07 Anhui University of Science and Technology Method for measuring the non-reflective area of a lidar
CN113085877A (zh) * 2019-12-23 2021-07-09 Shenzhen Tatfook Technology Co., Ltd. Method for detecting a positional relationship, and driver-assistance system for a vehicle
US12062241B2 (en) 2021-05-12 2024-08-13 Hyundai Motor Company Lane recognition device and method based on LIDAR for vehicle
KR102333828B1 (ко) * 2021-07-21 2021-12-03 Vueron Technology Co., Ltd. Lidar-data-based lane detection method and system
CN113581184A (зh) * 2021-08-25 2021-11-02 JD Kunpeng (Jiangsu) Technology Co., Ltd. Method, apparatus, device, and medium for determining a maximum passable area

Also Published As

Publication number Publication date
KR101843866B1 (ko) 2018-05-14

Similar Documents

Publication Publication Date Title
WO2018101526A1 Method for detecting road area and lane using lidar data, and system therefor
CN107103272B Distinguishing lane markings for a vehicle to follow
EP3759562B1 Camera-based localization for autonomous vehicles
KR102265703B1 Vehicle environment modeling with a camera
CN111874006B Route planning processing method and apparatus
US20200282929A1 Sensor validation using semantic segmentation information
CN111753797B Vehicle speed measurement method based on video analysis
EP4016457A1 Positioning method and apparatus
US20200125862A1 Method and apparatus for auto calibration
US20230102802A1 Map change detection
WO2021241834A1 Apparatus and method for generating a virtual lane based on perceived traffic-flow information, for autonomous driving in adverse weather conditions
CN114394088B Parking-tracking trajectory generation method and apparatus, electronic device, and storage medium
CN110008891A Pedestrian detection and localization method and apparatus, on-board computing device, and storage medium
JP6097533B2 In-vehicle image processing apparatus
WO2012011715A2 Vehicle collision warning system and method thereof
JP7259309B2 Image processing apparatus and image processing method
Qian et al. A self-driving solution for resource-constrained autonomous vehicles in parked areas
KR102622578B1 Apparatus and method for building a high-definition map
JP2018206071A Lane marking recognition apparatus
Qian et al. Building and climbing based visual navigation framework for self-driving cars
CN115909235A Method and apparatus for recognizing road gaps, computer device, and storage medium
KR102019594B1 Method and apparatus for estimating inter-vehicle distance using a correction technique for low-resolution cameras
KR20210041304A Apparatus and method for extracting road boundaries
US20240246570A1 Path planning system and path planning method thereof
CN115331421B Roadside multi-sensor environment perception method, apparatus, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922972

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16922972

Country of ref document: EP

Kind code of ref document: A1