WO2024025063A1 - System and method for labeling scenarios using sensor data measurements of autonomous vehicle - Google Patents

System and method for labeling scenarios using sensor data measurements of autonomous vehicle

Info

Publication number
WO2024025063A1
WO2024025063A1 (application PCT/KR2023/002913)
Authority
WO
WIPO (PCT)
Prior art keywords
data
measurement
environment
information
autonomous vehicle
Prior art date
Application number
PCT/KR2023/002913
Other languages
French (fr)
Korean (ko)
Inventor
김태형
허준호
김봉섭
윤경수
Original Assignee
재단법인 지능형자동차부품진흥원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 재단법인 지능형자동차부품진흥원
Publication of WO2024025063A1 publication Critical patent/WO2024025063A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/0219Electrical interface; User interface
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P15/00Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/14Receivers specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/33Multimode operation in different systems which transmit time stamped messages, e.g. GPS/GLONASS
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01WMETEOROLOGY
    • G01W1/00Meteorology
    • G01W1/02Instruments for indicating weather conditions by measuring two or more variables, e.g. humidity, pressure, temperature, cloud cover or wind speed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/70Labelling scene content, e.g. deriving syntactic or semantic representations
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/20Static objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Definitions

  • the present invention relates to a scenario labeling system and method using sensor data measurement of an autonomous vehicle.
  • for labeling sensor data, object-level labeling is important, but from a big-data management perspective it is also necessary to infer the environment in which the data was measured and to manage the data by labeling it with that environment as a scenario.
  • Existing technology related to this includes an offline labeling method in which the user manually labels each data frame after data acquisition in order to label objects or environmental conditions (weather, lighting); however, this has the problem of requiring additional time to label the measured data.
  • the present invention was created to solve the above-described conventional problems.
  • the purpose of the present invention is to infer the data measurement environment based on precision road maps, precipitation/illuminance sensors, and GPS, and automatically label it when measuring data.
  • the present invention may include the following embodiments to achieve the above object.
  • An embodiment of the present invention collects the weather and illuminance conditions around an autonomous vehicle in real time, infers the overall environment from the collected information, and automatically labels the overall measurement environment data in which the resulting dynamic environment measurement data is combined with static environment measurement data including road information.
  • in another embodiment, the present invention includes a scenario labeling system using sensor data measurement of an autonomous vehicle, which comprises: an autonomous driving measurement unit that outputs sensor measurement data in real time from at least one of a camera, a lidar, a radar, and an IMU; a dynamic environment measurement unit that measures and analyzes at least one of the weather, time, and lighting conditions around the autonomous vehicle; a static environment measurement unit that analyzes road and terrain information at the location of the autonomous vehicle based on GPS location information and a precision road map; an environment fusion unit that fuses the static and dynamic environment measurement data; an inference unit that infers the measurement environment from the entire fused measurement data; a scenario labeling unit that labels the inferred measurement environment data; and a final measurement data derivation unit that calculates and outputs final measurement data in which the labeled data is attached to each frame of the real-time video data from the autonomous driving measurement unit.
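Read as a data model, the outputs of these units can be pictured as a few small records. The sketch below is a Python illustration only; the class and field names are assumptions, not terms taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass
class DynamicEnvironment:
    """Per-moment conditions from the dynamic environment measurement unit."""
    weather: str          # e.g. "light_rain", "heavy_snow", "cloudy"
    time_of_day: str      # e.g. "day" or "night", from GPS time + illuminance
    lighting: str         # e.g. "open_road", "tunnel"

@dataclass
class StaticEnvironment:
    """Map-derived context from the static environment measurement unit."""
    road_type: str                       # e.g. "general_road", "highway"
    lane_count: int
    road_surface: Optional[str] = None   # user-added layer, e.g. "asphalt"
    extras: Dict[str, Any] = field(default_factory=dict)  # nodes/links, signs, ...

@dataclass
class ScenarioLabel:
    """Inferred measurement environment, used as the per-frame scenario label."""
    dynamic: DynamicEnvironment
    static: StaticEnvironment

@dataclass
class LabeledFrame:
    """Final measurement data: one raw sensor frame plus its scenario label."""
    frame_index: int
    raw: Dict[str, Any]   # camera image, lidar point cloud, ...
    label: ScenarioLabel
```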
  • another embodiment of the present invention is a scenario labeling method using sensor data measurement of an autonomous vehicle, comprising the steps of: a) detecting precipitation and illuminance at the autonomous vehicle and analyzing GPS information, including location and time, together with the precision road map layers; b) outputting raw data measured in real time from at least one of a camera, a lidar sensor, and a radar sensor installed in the autonomous vehicle; c) analyzing the weather condition around the autonomous vehicle using the precipitation measurement data, and the lighting condition using the illuminance measurement data; d) fusing the analysis result of the weather condition with the analysis result of the lighting condition; e) inferring the entire measurement data around the autonomous vehicle by fusing the dynamic environment data, in which the weather and lighting analysis results are combined, with the road information from the road map analysis module; and f) automatically labeling the inferred entire measurement data and combining the labeled data with each video frame of the raw data to calculate the final data.
  • therefore, the present invention increases the amount of information included in scenario labeling by utilizing precision road maps for measurement environment inference, and can respond to environments that change in real time by reflecting real-time sensing results, such as those from the precipitation/illuminance sensors, in the scenario labeling.
  • the present invention infers the data measurement environment based on precision road maps, precipitation/illuminance sensors, and GPS, and automatically labels the data with it at measurement time, thereby reducing the time and manpower costs of data labeling and increasing user convenience.
  • Figure 1 is a block diagram for explaining the outline of the present invention.
  • Figure 2 is a block diagram showing a scenario labeling system using sensor data measurement of an autonomous vehicle according to the present invention.
  • Figure 3 is a block diagram showing a static environment measurement unit.
  • Figure 4 is a block diagram showing a dynamic environment measurement unit.
  • Figures 5 and 6 are diagrams showing an example of a day and night environmental data measurement process based on a precision road map.
  • Figure 7 is a diagram illustrating an example of a convergence environment determination process.
  • Figure 8 is a diagram showing an example of a data measurement environment inference process.
  • Figure 9 is a diagram showing an example of final measurement data.
  • Figure 10 is a flowchart showing a scenario labeling method using sensor data measurement of an autonomous vehicle according to the present invention.
  • Figure 1 is a block diagram for explaining the outline of the present invention
  • Figure 2 is a block diagram showing a scenario labeling system using sensor data measurement of an autonomous vehicle according to the present invention
  • Figure 3 is a block diagram showing a static environment measurement unit.
  • Figure 4 is a block diagram showing the dynamic environment measurement unit.
  • the present invention measures the dynamic environment and the static environment around an autonomous vehicle, infers the measurement environment from the measured static and dynamic data, automatically labels the entire inferred measurement data, and combines the labels with the raw data sensed in real time before outputting the result.
  • the static environment refers to information that remains constant regardless of conditions, such as GPS measurement information and the precision road map.
  • the dynamic environment refers to conditions that change over time, such as the weather, time, lighting, illuminance, and information about the vehicle's surroundings (e.g., surrounding vehicle information, traffic information, etc.).
  • specifically, the present invention is a system that performs labeling automatically by combining the measurement data of the dynamic and static environments with the raw data sensed in real time.
  • to this end, the present invention includes a static environment measurement unit 100 that measures the static environment, a dynamic environment measurement unit 200 that measures the dynamic environment, an autonomous driving measurement unit 300 that measures the road environment before or while the autonomous vehicle is driving, an environment fusion unit 400 that fuses the measurement data of the static and dynamic environments, an inference unit 500 that infers the entire measurement data from the fused measurement information, a scenario labeling unit 600 that labels the inferred measurement data as a scenario, and a final measurement data derivation unit 700 that calculates the final measurement data.
  • the static environment measurement unit 100 includes a GPS measurement module 110 that measures GPS data and a road map analysis module 120 that analyzes layers of a precision road map.
  • the GPS measurement module 110 is a sensor that measures the location and time of the vehicle: it receives signals transmitted from two or more GPS satellites, calculates the distances to them to determine the position, and provides time information derived from the atomic clocks inside the satellites.
  • the road map analysis module 120 extracts information about the road and surrounding facilities from the precision road map, based on the autonomous vehicle's current position on the map and the designated route.
  • A precision road map is infrastructure for determining the position of autonomous vehicles, setting/changing routes, and recognizing road/traffic regulations; it may refer to a precise electronic map expressing at least one of network information (nodes, links), road section information (tunnels, bridges, etc.), sign information (traffic safety signs, lanes, crosswalks, etc.), and facility information (traffic lights, vehicle protection and safety facilities, etc.).
  • the precision road map includes driving route nodes/links, roadway/auxiliary sections, parking surfaces, safety signs, road surface and lane-line markings, traffic lights, kiloposts, vehicle protection and safety facilities, speed bumps, height obstacles, supports, and the like, and each of these can be organized as a separate layer.
  • the road map analysis module 120 can add layers to the layer structure of an existing precision road map.
  • for example, the road map analysis module 120 adds a layer to the precision road map that holds data the user wants to label, such as the road surface material (e.g., asphalt, concrete, unpaved), referenced to the position of the autonomous vehicle measured by the GPS measurement module 110.
  • additional layers can be selected arbitrarily by the user.
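As a rough illustration of the layered map and a user-added layer, the following Python sketch keys each layer by a road-link ID; the layer names, link IDs, and attribute values are assumptions made for the example, not data from the patent.

```python
# A toy precision-road-map fragment: each layer maps a road link ID to attributes.
precision_map = {
    "links":      {"L101": {"road_type": "general_road", "lane_count": 2}},
    "sections":   {"L101": {"tunnel": False, "bridge": False}},
    "signs":      {"L101": ["crosswalk_ahead"]},
    "facilities": {"L101": ["traffic_light"]},
}

def add_layer(road_map, name, data):
    """Add a user-selected layer (e.g. road surface material) to the map."""
    road_map[name] = data

def describe_location(road_map, link_id):
    """Collect every layer's attributes for the link the vehicle is currently on."""
    return {layer: entries.get(link_id) for layer, entries in road_map.items()}

add_layer(precision_map, "road_surface", {"L101": "asphalt"})
print(describe_location(precision_map, "L101"))
```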
  • the dynamic environment measurement unit 200 includes a precipitation detection module 210 that detects precipitation, an illuminance detection module 220 that detects illuminance, a weather analysis module 230 that analyzes the weather condition, and a lighting state analysis module 240 that analyzes the lighting condition.
  • the precipitation detection module 210 is, for example, a precipitation sensor that emits infrared light and senses the infrared signal returning to the sensor in order to measure precipitation. Additionally, the precipitation detection module 210 can estimate the amount of precipitation from the intensity of the reflected light, which varies with the amount of rain. The module is mounted on the exterior of the autonomous vehicle and detects the amount of precipitation.
  • the precipitation detection module 210 may further include a device that detects the presence or absence of snow and the amount of snow.
  • the illuminance detection module 220 is a sensor that measures the surrounding brightness. When it receives optical energy (light), moving electrons are generated inside it, and the conductivity changes, thereby changing the intensity of the output voltage.
  • the illuminance detection module 220 is mounted on the outside of the vehicle so that it can detect the intensity of external light; preferably, sensors are installed facing four or more directions so that the characteristics of the light source (for example, its direction of incidence) can be detected and measurement error reduced.
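A minimal sketch of combining readings from illuminance sensors facing several directions might look like the following; the aggregation and the way the incidence direction is picked are assumptions, since the patent only requires sensors in four or more directions.

```python
def summarize_illuminance(readings):
    """Combine lux readings from sensors facing different directions.

    `readings` maps a facing direction (degrees clockwise from the vehicle front)
    to a lux value; four or more directions reduce the effect of a single shadowed
    or glared sensor. The direction estimate is an illustrative assumption.
    """
    if not readings:
        raise ValueError("at least one illuminance reading is required")
    mean_lux = sum(readings.values()) / len(readings)
    brightest_dir = max(readings, key=readings.get)  # rough incidence direction
    return {"mean_lux": mean_lux, "incidence_deg": brightest_dir}

print(summarize_illuminance({0: 12000.0, 90: 9000.0, 180: 5000.0, 270: 8000.0}))
```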
  • the weather analysis module 230 receives data measured by the precipitation detection module 210, analyzes weather conditions, and outputs a result. For example, the weather analysis module 230 can calculate rainfall and snowfall amounts. Alternatively, the weather analysis module 230 may calculate and output the analysis results of weather conditions as simple information such as little rainfall, heavy rainfall, little snowfall, and heavy snowfall.
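A toy version of that classification could map a precipitation estimate to the coarse classes mentioned above; the numeric thresholds are assumptions, as the patent does not specify them.

```python
def classify_weather(precip_mm_per_h, is_snow=False):
    """Map a precipitation-sensor estimate to coarse weather classes.

    The thresholds are illustrative assumptions; the patent only distinguishes
    little vs. heavy rainfall or snowfall.
    """
    if precip_mm_per_h <= 0.0:
        return "no_precipitation"
    kind = "snowfall" if is_snow else "rainfall"
    amount = "heavy" if precip_mm_per_h >= 4.0 else "light"
    return f"{amount}_{kind}"

print(classify_weather(0.5))          # light_rainfall
print(classify_weather(7.2, True))    # heavy_snowfall
```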
  • the lighting condition analysis module 240 uses the illuminance measurement data from the illuminance detection module 220, the time measurement data from the GPS measurement module 110, and the road information from the road map analysis module 120 to analyze the current time and the conditions around the autonomous vehicle (for example, whether it is day or night, or whether the vehicle is in a tunnel). This is explained with reference to FIGS. 5 and 6.
  • FIG. 5 is a diagram illustrating an example of a daytime environmental data measurement process
  • FIG. 6 is a diagram illustrating an example of a nighttime environment data measurement process.
  • the lighting condition analysis module 240 fuses the current location information on the precision road map obtained from the road map analysis module's measurement data (for example, whether the vehicle is on an open road or in a tunnel), the illuminance data from the illuminance detection module 220, and the GPS time information; it analyzes the current time (day or night) and the lighting condition and outputs the result (for example, passing through a tunnel during the day, or driving on an open road at night).
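The decision just described can be sketched as a small rule that fuses illuminance, GPS time, and the map-derived location; the lux threshold, hour range, and label strings below are assumptions.

```python
def classify_lighting(mean_lux, gps_hour, in_tunnel):
    """Fuse illuminance, GPS time and map location into a lighting label.

    A minimal sketch of the decision described above; thresholds and labels
    are assumptions, not values from the patent.
    """
    is_daytime = 6 <= gps_hour < 18          # coarse day/night from GPS time
    place = "tunnel" if in_tunnel else "open_road"
    # Low measured light during daytime outside a tunnel suggests heavy overcast.
    if is_daytime and not in_tunnel and mean_lux < 1000:
        place = "open_road_dim"
    return {"time_of_day": "day" if is_daytime else "night", "location": place}

print(classify_lighting(mean_lux=200.0, gps_hour=14, in_tunnel=True))
# -> {'time_of_day': 'day', 'location': 'tunnel'}
```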
  • the autonomous driving measurement unit 300 includes at least one of a lidar sensor, radar sensor, camera, GPS, and IMU installed in the autonomous vehicle and measures the surroundings of the autonomous vehicle.
  • the data measured here consists of two-dimensional and/or three-dimensional data such as camera images and lidar point clouds.
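One simple way to hold such a synchronized two- and three-dimensional sample is a small container like the following sketch; the array shapes and field names are assumptions.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class RawFrame:
    """One synchronized raw-data sample from the autonomous driving measurement unit."""
    timestamp: float
    image: np.ndarray        # H x W x 3 camera image
    point_cloud: np.ndarray  # N x 4 lidar points (x, y, z, intensity)

frame = RawFrame(
    timestamp=1690000000.0,
    image=np.zeros((720, 1280, 3), dtype=np.uint8),
    point_cloud=np.zeros((100_000, 4), dtype=np.float32),
)
print(frame.image.shape, frame.point_cloud.shape)
```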
  • the environment fusion unit 400 analyzes the surrounding environment by fusing weather, time, and lighting state analysis results.
  • the environment fusion unit 400 will be described with reference to FIG. 7 .
  • Figure 7 is a diagram illustrating an example of a convergence environment determination process.
  • the environment fusion unit 400 fuses the weather condition information from the weather analysis module 230 with the time and lighting information from the lighting condition analysis module 240, and can calculate result data that includes the current time (e.g., daytime), the road being driven (e.g., general road), and the weather condition (e.g., cloudy).
  • the inference unit 500 infers the overall data measurement environment including data in the static environment and data in the dynamic environment. This is explained with reference to FIG. 8.
  • Figure 8 is a diagram showing an example of a data measurement environment inference process.
  • the inference unit 500 adds static environment data to the dynamic environment data calculated by the environment fusion unit 400 to infer the entire data measurement environment.
  • here, the static environment data includes at least one of the road map analysis module's network information (nodes, links), road section information (tunnels, bridges, etc.), sign information (traffic safety signs, lanes, crosswalks, etc.), facility information (traffic lights, vehicle protection and safety facilities, etc.), the number of lanes, the lane number, the road type, the road grade, the lane type, the roadway section, and the road surface marking type.
  • the inference unit 500 may add measurement data (eg, camera image) around the autonomous vehicle measured by the autonomous driving measurement unit 300.
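The inference step can be pictured as merging the two sets of environment attributes, optionally keeping a reference to a camera frame; the key names here are assumptions for a Python sketch.

```python
def infer_measurement_environment(dynamic_env, static_env, frame_ref=None):
    """Merge dynamic and static environment data into one measurement-environment record."""
    environment = {**static_env, **dynamic_env}      # static context + current conditions
    if frame_ref is not None:
        environment["reference_frame"] = frame_ref   # e.g. path of a camera image
    return environment

dynamic_env = {"time_of_day": "day", "weather": "cloudy", "location": "general_road"}
static_env = {"road_type": "general_road", "lane_count": 2, "road_surface": "asphalt"}
print(infer_measurement_environment(dynamic_env, static_env, "cam_000123.png"))
```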
  • the scenario labeling unit 600 defines the inferred measurement environment data that is a fusion of static environment data and dynamic environment data as a scenario, configures it as labeling data, and fuses it with the raw data to generate learning data.
  • the raw data is the data from the autonomous driving measurement unit 300 (e.g., camera images, video, and lidar point cloud data), and the labeling data is the data that constitutes the scenario defined in the scenario labeling unit 600.
  • the final measurement data deriving unit 700 derives the final measurement data as learning data in which the raw data and the labeling data are fused. This final measurement data is described with reference to FIG. 9.
  • Figure 9 is a diagram showing an example of final measurement data.
  • the final measurement data deriving unit 700 fuses the raw data from the autonomous driving measurement unit and the labeling data from the scenario labeling unit 600 and outputs the result. At this time, labeling data may be output for each frame of raw data, as shown in FIG. 9.
  • the final data is output including labeling data for each frame of the raw data.
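A minimal sketch of emitting labeling data for each frame of raw data might look like the following; the record layout and the use of JSON are assumptions, as the patent only requires a per-frame label.

```python
import json

def label_frames(raw_frames, scenario):
    """Attach the inferred scenario label to every raw-data frame."""
    labeled = []
    for index, frame in enumerate(raw_frames):
        labeled.append({"frame": index, "raw": frame, "scenario": scenario})
    return labeled

scenario = {"time_of_day": "day", "weather": "cloudy", "road_type": "general_road"}
frames = [{"image": "cam_000000.png"}, {"image": "cam_000001.png"}]
print(json.dumps(label_frames(frames, scenario), indent=2))
```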
  • the present invention includes a scenario labeling method using sensor data measurement of an autonomous vehicle using the above configuration. This is explained with reference to FIG. 10.
  • Figure 10 is a flowchart showing a scenario labeling method using sensor data measurement of an autonomous vehicle according to the present invention.
  • to this end, the present invention includes a step S110 of calculating sensor measurement data, a step S120 of measuring autonomous driving sensor data, a step S130 of analyzing the weather and lighting conditions, a step S140 of determining the fused environment, a step S150 of inferring the data measurement environment, a scenario labeling step S160 of calculating the learning data, and a step S170 of deriving the final measurement data. A sketch of how these steps chain together follows.
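The sketch below chains these steps into a single pass with trivial placeholder logic, purely to illustrate the order of operations; the function name, input fields, and decision rules are assumptions, not details from the patent.

```python
def run_labeling_cycle(sensors, road_map_info, raw_frame):
    """One pass through steps S110 to S170 with placeholder logic.

    `sensors` carries the precipitation, illuminance and GPS readings (S110),
    `road_map_info` the precision-map layer data at the current position (S114),
    and `raw_frame` the raw sensor data (S120). All names are assumptions.
    """
    # S131: weather state from precipitation
    weather = "rain" if sensors["precip_mm_per_h"] > 0 else "clear"
    # S132: time/lighting state from illuminance, GPS time and map (tunnel or not)
    time_of_day = "day" if 6 <= sensors["gps_hour"] < 18 else "night"
    location = "tunnel" if road_map_info.get("tunnel") else "open_road"
    # S140: fuse into dynamic environment data
    dynamic_env = {"weather": weather, "time_of_day": time_of_day, "location": location}
    # S150: add static map data to infer the full measurement environment
    environment = {**road_map_info, **dynamic_env}
    # S160-S170: label the raw frame with the inferred environment
    return {"raw": raw_frame, "scenario": environment}

sensors = {"precip_mm_per_h": 0.0, "gps_hour": 21}
road_map_info = {"road_type": "general_road", "lane_count": 2, "tunnel": False}
print(run_labeling_cycle(sensors, road_map_info, {"image": "cam_000042.png"}))
```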
  • Step S110 is a step of outputting sensor measurement data.
  • step S110 includes step S111 of measuring precipitation data, step S112 of measuring illuminance data, step S113 of measuring GPS data, and step S114 of analyzing the precision road map layer.
  • Step S111 is a step in which the precipitation detection module 210 detects the presence or absence of precipitation and/or the amount of precipitation.
  • the precipitation detection module 210 detects the presence or absence of precipitation and/or the amount of precipitation around the autonomous vehicle. At this time, the precipitation detection module 210 can output the measured amount of precipitation by classifying it as high or low.
  • Step S112 is a step in which the illuminance detection module 220 detects the illuminance around the autonomous vehicle.
  • the illuminance detection module 220 measures the illuminance around the autonomous vehicle and outputs illuminance measurement data.
  • Step S113 is a step in which the GPS measurement module 110 calculates location information and time information.
  • Step S114 is a step in which the road map analysis module 120 analyzes a layer at the current location of the autonomous vehicle on a precise road map using GPS measurement data.
  • the road map analysis module 120 can add a new layer, such as the color or material of the road surface, to the current location of the autonomous vehicle on the precision road map according to set conditions or input commands.
  • that is, the road map analysis module 120 outputs precision-map analysis data for the current location, such as the number of lanes, the lane number, the road type, whether a tunnel is present, the road surface material, the road grade, the lane type, the roadway section type, and the road surface marking type.
  • Step S120 is a step in which raw data is output from the autonomous driving measurement unit 300.
  • Raw data is information measured by cameras, lidar sensors, and radar sensors installed in autonomous vehicles.
  • Step S130 includes step S131 for analyzing weather conditions and step S132 for analyzing time/lighting conditions.
  • in step S131, the weather analysis module 230 analyzes the data measured by the precipitation detection module 210 to determine whether it is raining and/or snowing at and around the autonomous vehicle's current location, and how much rain and/or snow is falling.
  • in step S132, the lighting condition analysis module 240 analyzes the current time and the lighting conditions around the autonomous vehicle using the illuminance measurement data around the vehicle, the time measurement data from the GPS measurement module 110, and the surrounding terrain data from the road map analysis module 120.
  • Step S140 is a step in which the environmental fusion unit 400 fuses the weather state analysis results and the time/lighting state analysis results.
  • the environment fusion unit 400 fuses weather, time, and lighting state analysis results to produce dynamic environmental data.
  • the dynamic environmental data may include the current time (eg, daytime), driving road and/or terrain information (eg, general road, tunnel), and weather conditions (eg, cloudy).
  • Step S150 is a step in which the inference unit 500 infers the data measurement environment.
  • the inference unit 500 infers the data measurement environment by fusing the static environment data of the road map analysis module with the dynamic environment data calculated in step S140 and outputs the result.
  • Step S160 is a step of labeling the entire measurement data inferred by the scenario labeling unit 600 and generating learning data combined with the raw data.
  • the scenario labeling unit 600 labels the entire inferred measurement data and defines it as a scenario.
  • Step S170 is a step of deriving final measurement data from the final measurement data derivation unit 700.
  • the final measurement data deriving unit 700 fuses the labeling data, that is, the scenario defined by the scenario labeling unit 600, with each frame of the raw data to produce the final measurement data.
  • the scenario labeling method using sensor data measurement of autonomous vehicles reflects real-time sensor measurement information of autonomous vehicles in scenario labeling and increases the amount of information by using precise road maps for measurement environment inference. As all of these processes are automated, the time and/or manpower required for scenario labeling can be significantly reduced compared to the past.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Ecology (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Environmental Sciences (AREA)
  • Electromagnetism (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Atmospheric Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

An embodiment of the present invention provides a system for labeling scenarios using sensor data measurements of an autonomous vehicle. The system collects the weather and illuminance conditions around the autonomous vehicle in real time, infers the overall environment from the collected information, and automatically labels the overall measurement environment data obtained by combining the resulting dynamic environment measurement data with static environment measurement data including road information.

Description

Scenario labeling system and method using sensor data measurement of an autonomous vehicle
The present invention relates to a scenario labeling system and method using sensor data measurement of an autonomous vehicle.
Recently, research on intelligent vehicle technologies such as autonomous driving and driver assistance systems (DAS) has been actively conducted. In particular, research on technologies that can use sensor data from autonomous vehicles as artificial intelligence training data is also being actively pursued.
In order to build and use such autonomous driving sensor data as artificial intelligence training data, the raw sensor data must be labeled.
For labeling sensor data, object-level labeling is important, but from a big-data management perspective it is also necessary to infer the environment in which the data was measured and to manage the data by labeling it with that environment as a scenario.
Most conventional measurement/labeling systems focus on objects (vehicles, pedestrians, etc.), and open data sets are mostly distributed with object-level labels.
Existing technology related to this includes an offline labeling method in which the user manually labels each data frame after data acquisition in order to label objects or environmental conditions (weather, lighting); however, this requires additional time to label the measured data.
There is also a prior-art method in which the user labels the data by entering trigger inputs while the data is being acquired, but because this requires operating the trigger while driving, it raises safety concerns, requires additional personnel, and makes it difficult to respond to sudden environmental changes.
The present invention was devised to solve the conventional problems described above, and its purpose is to provide a scenario labeling system and method using sensor data measurement of an autonomous vehicle that infers the data measurement environment based on a precision road map, precipitation/illuminance sensors, and GPS, and automatically applies the corresponding labels at the time the data is measured.
The present invention may include the following embodiments to achieve the above object.
An embodiment of the present invention provides a scenario labeling system using sensor data measurement of an autonomous vehicle that collects the weather and illuminance conditions around the autonomous vehicle in real time, infers the overall environment from the collected information, and automatically labels the overall measurement environment data in which the resulting dynamic environment measurement data is combined with static environment measurement data including road information.
In another embodiment, the present invention provides a scenario labeling system using sensor data measurement of an autonomous vehicle that includes: an autonomous driving measurement unit that outputs sensor measurement data in real time from at least one of a camera, a lidar, a radar, and an IMU; a dynamic environment measurement unit that measures and analyzes at least one of the weather, time, and lighting conditions around the autonomous vehicle; a static environment measurement unit that analyzes road and terrain information at the location of the autonomous vehicle based on GPS location information and a precision road map; an environment fusion unit that fuses the static and dynamic environment measurement data; an inference unit that infers the measurement environment from the entire fused measurement data; a scenario labeling unit that labels the inferred measurement environment data; and a final measurement data derivation unit that calculates and outputs final measurement data in which the labeled data is attached to each frame of the real-time video data from the autonomous driving measurement unit.
Yet another embodiment of the present invention provides a scenario labeling method using sensor data measurement of an autonomous vehicle, comprising: a) detecting precipitation and illuminance at the autonomous vehicle and analyzing GPS information, including location and time, together with the precision road map layers; b) outputting raw data measured in real time from at least one of a camera, a lidar sensor, and a radar sensor installed in the autonomous vehicle; c) analyzing the weather condition around the autonomous vehicle using the precipitation measurement data, and the lighting condition using the illuminance measurement data; d) fusing the analysis result of the weather condition with the analysis result of the lighting condition; e) inferring the entire measurement data around the autonomous vehicle by fusing the dynamic environment data, in which the weather and lighting analysis results are combined, with the road information from the road map analysis module; and f) automatically labeling the inferred entire measurement data and combining the labeled data with each video frame of the raw data to calculate the final data.
Therefore, the present invention increases the amount of information included in scenario labeling by utilizing the precision road map for measurement environment inference, and can respond to environments that change in real time by reflecting real-time sensing results, such as those from the precipitation/illuminance sensors, in the scenario labeling.
In addition, the present invention infers the data measurement environment based on the precision road map, the precipitation/illuminance sensors, and GPS, and automatically labels the data with it at measurement time, thereby reducing the time and manpower costs of data labeling and increasing user convenience.
Figure 1 is a block diagram for explaining the outline of the present invention.
Figure 2 is a block diagram showing a scenario labeling system using sensor data measurement of an autonomous vehicle according to the present invention.
Figure 3 is a block diagram showing the static environment measurement unit.
Figure 4 is a block diagram showing the dynamic environment measurement unit.
Figures 5 and 6 are diagrams showing examples of day and night environmental data measurement based on the precision road map.
Figure 7 is a diagram illustrating an example of the fused environment determination process.
Figure 8 is a diagram showing an example of the data measurement environment inference process.
Figure 9 is a diagram showing an example of final measurement data.
Figure 10 is a flowchart showing a scenario labeling method using sensor data measurement of an autonomous vehicle according to the present invention.
Although the present invention may be modified in various ways and may have several embodiments, specific embodiments are illustrated in the drawings and described in detail. This is not intended to limit the present invention to particular embodiments, and the invention should be understood to cover all changes, equivalents, and substitutes falling within its spirit and technical scope.
The terms used herein are only used to describe specific embodiments and are not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise.
Additionally, in describing the present invention, detailed descriptions of related known technologies are omitted when they could unnecessarily obscure the gist of the present invention.
Hereinafter, embodiments of the present invention are described with reference to the attached drawings.
Figure 1 is a block diagram for explaining the outline of the present invention, Figure 2 is a block diagram showing a scenario labeling system using sensor data measurement of an autonomous vehicle according to the present invention, Figure 3 is a block diagram showing the static environment measurement unit, and Figure 4 is a block diagram showing the dynamic environment measurement unit.
Referring to Figures 1 and 2, the present invention measures the dynamic environment and the static environment around an autonomous vehicle, infers the measurement environment from the measured static and dynamic data, automatically labels the entire inferred measurement data, and combines the labels with the raw data sensed in real time before outputting the result.
Here, the static environment refers to information that remains constant regardless of conditions, such as GPS measurement information and the precision road map, while the dynamic environment refers to conditions that change over time, such as the weather, time, lighting, illuminance, and information about the vehicle's surroundings (e.g., surrounding vehicle information, traffic information, etc.).
Specifically, the present invention is a system that performs labeling automatically by combining the measurement data of the dynamic and static environments with the raw data sensed in real time.
To this end, the present invention includes a static environment measurement unit 100 that measures the static environment, a dynamic environment measurement unit 200 that measures the dynamic environment, an autonomous driving measurement unit 300 that measures the road environment before or while the autonomous vehicle is driving, an environment fusion unit 400 that fuses the measurement data of the static and dynamic environments, an inference unit 500 that infers the entire measurement data from the fused measurement information, a scenario labeling unit 600 that labels the inferred measurement data as a scenario, and a final measurement data derivation unit 700 that calculates the final measurement data.
Referring to FIG. 3, the static environment measurement unit 100 includes a GPS measurement module 110 that measures GPS data and a road map analysis module 120 that analyzes the layers of the precision road map.
The GPS measurement module 110 is a sensor that measures the location and time of the vehicle: it receives signals transmitted from two or more GPS satellites, calculates the distances to them to determine the position, and provides time information derived from the atomic clocks inside the satellites.
The road map analysis module 120 extracts information about the road and surrounding facilities from the precision road map, based on the autonomous vehicle's current position on the map and the designated route.
A precision road map is infrastructure for determining the position of autonomous vehicles, setting/changing routes, and recognizing road/traffic regulations; it may refer to a precise electronic map expressing at least one of network information (nodes, links), road section information (tunnels, bridges, etc.), sign information (traffic safety signs, lanes, crosswalks, etc.), and facility information (traffic lights, vehicle protection and safety facilities, etc.).
In addition, the precision road map includes driving route nodes/links, roadway/auxiliary sections, parking surfaces, safety signs, road surface and lane-line markings, traffic lights, kiloposts, vehicle protection and safety facilities, speed bumps, height obstacles, supports, and the like, and each of these can be organized as a separate layer.
The road map analysis module 120 can add layers to the layer structure of an existing precision road map. For example, the road map analysis module 120 adds a layer to the precision road map that holds data the user wants to label, such as the road surface material (e.g., asphalt, concrete, unpaved), referenced to the position of the autonomous vehicle measured by the GPS measurement module 110. The additional layers can be chosen freely by the user.
동적 환경 계측부(200)는, 도 4를 참조하면, 강수량을 감지하는 강수 감지 모듈(210)과, 조도를 감지하는 조도 감지 모듈(220)과, 기상 상태를 감지하는 기상 분석 모듈(230)과, 조명 상태를 분석하는 조명 상태 분석 모듈(240)을 포함한다. Referring to FIG. 4, the dynamic environment measurement unit 200 includes a precipitation detection module 210 for detecting the amount of precipitation, an illuminance detection module 220 for detecting the illuminance, and a weather analysis module 230 for detecting weather conditions. , includes a lighting state analysis module 240 that analyzes the lighting state.
강수 감지 모듈(210)은, 예를 들면, 강수량을 측정하는 강수 센서로서 적외선을 발사한 뒤 적외선이 다시 센서로 돌아오는 사이의 파장을 감지할 수 있다. 또한, 강수 감지 모듈(210)은 비의 양에 따라 달라지는 반사광의 세기를 이용하여 강수량을 추정함도 가능하다. 이와 같은 강수 감지 모듈(210)은 자율 주행 차량의 외부에 장착되어 강수량을 감지한다. The precipitation detection module 210, for example, is a precipitation sensor that measures the amount of precipitation and can detect the wavelength between the infrared rays being emitted and the infrared rays returning to the sensor. Additionally, the precipitation detection module 210 can estimate the amount of precipitation using the intensity of reflected light that varies depending on the amount of rain. This precipitation detection module 210 is mounted on the exterior of an autonomous vehicle and detects the amount of precipitation.
또는 강수 감지 모듈(210)은 강설 유무 및 강설량을 감지하는 장치를 더 포함할 수 있다. Alternatively, the precipitation detection module 210 may further include a device that detects the presence or absence of snow and the amount of snow.
조도 감지 모듈(220)은 주변의 밝기를 측정하는 센서로서 광에너지(빛)를 받으면 내부에 움직이는 전자가 발생하여 전도율이 변화하여 출력 전압의 세기가 변화되는 장치이다. 조도 감지 모듈(220)은 차량 외부에 장착하는 형태로서 외부에서 빛의 세기를 감지할 수 있도록 설치하며, 4 방향 이상으로 센서를설치하여 광원의 특성(예를 들면, 입사 방향)을 감지하고, 게측 오차를 줄일 수 있도록 하는 것이 바람직하다. The illuminance detection module 220 is a sensor that measures the surrounding brightness. When it receives optical energy (light), moving electrons are generated inside it, and the conductivity changes, thereby changing the intensity of the output voltage. The illumination detection module 220 is installed on the outside of the vehicle to detect the intensity of light from the outside, and detects the characteristics of the light source (for example, the direction of incidence) by installing sensors in four or more directions. It is desirable to reduce measurement errors.
기상 분석 모듈(230)은 강수 감지 모듈(210)에서 계측된 데이터를 수신하여 기상 상태를 분석하여 결과를 출력한다. 예를 들면, 기상 분석 모듈(230)은 강우량과 강설량을 산출할 수 있다. 또는 기상 분석 모듈(230)은 기상 상태의 분석 결과를 강우 적음, 강우 많음, 강설 적음 강설 많음과 같이 간략한 정보로서 산출 및 출력함도 가능하다. The weather analysis module 230 receives data measured by the precipitation detection module 210, analyzes weather conditions, and outputs a result. For example, the weather analysis module 230 can calculate rainfall and snowfall amounts. Alternatively, the weather analysis module 230 may calculate and output the analysis results of weather conditions as simple information such as little rainfall, heavy rainfall, little snowfall, and heavy snowfall.
조명 상태 분석 모듈(240)은 조도 감지 모듈(220)의 조도 계측 데이터와, GPS 계측 모듈(110)의 시간 계측 데이터와, 도로 지도 분석 모듈(120)의 도로 정보를 통하여 현재 시간과 자율 주행 차량의 주변 상태(예를 들면, 주간, 야간 또는 터널 여부 등)를 분석한다. 이는 도 5과 도 6을 참조하여 설명한다. The lighting condition analysis module 240 uses the illuminance measurement data from the illuminance detection module 220, the time measurement data from the GPS measurement module 110, and the road information from the road map analysis module 120 to determine the current time and the autonomous vehicle. Analyze the surrounding conditions (e.g., whether it is day, night, or in a tunnel, etc.). This is explained with reference to FIGS. 5 and 6.
도 5은 주간 환경 데이터 계측 과정의 일예를 도시한 도면이고, 도 6은 야간 환경 데이터 계측 과정의 일예를 도시한 도면이다. FIG. 5 is a diagram illustrating an example of a daytime environmental data measurement process, and FIG. 6 is a diagram illustrating an example of a nighttime environment data measurement process.
도 5 및 도 6을 참조하면, 조명 상태 분석 모듈(240)은 도로 지도 분석모듈의 계측 데이터를 통하여 정밀 도로 지도 상의 현재 위치 정보(예를 들면, 도로 위 또는 터널 여부와, 조도 감지 모듈(220)의 조도 감지 데이터와, GPS 시간 정보를 융합하여 현재 시간(야간 또는 주간)과, 조명 상태를 분석하여 그 결과(예를 들면, 주간에 터널 통과 중 또는 야간에 도로 주행 중)를 출력한다. Referring to FIGS. 5 and 6, the lighting condition analysis module 240 provides current location information (e.g., whether on a road or in a tunnel) on a precise road map through measurement data of the road map analysis module, and the illuminance detection module 220. )'s illuminance detection data and GPS time information are fused to analyze the current time (night or day) and lighting conditions and output the results (for example, while passing through a tunnel during the day or driving on the road at night).
The autonomous driving measurement unit 300 includes at least one of a lidar sensor, a radar sensor, a camera, a GPS, and an IMU installed in the autonomous vehicle and measures the surroundings of the autonomous vehicle. The measured data consists of two-dimensional and/or three-dimensional data such as camera images and lidar point clouds.
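A compact way to represent one frame of such raw data is sketched below; the field names and layout are assumptions chosen for readability rather than a format defined by the disclosure.

```python
from dataclasses import dataclass, field


@dataclass
class RawFrame:
    """One synchronized measurement frame from the on-vehicle sensors (assumed layout)."""
    frame_id: int
    timestamp_s: float
    camera_image_path: str                               # 2-D data: path to the camera image
    lidar_points: list = field(default_factory=list)     # 3-D data: [(x, y, z, intensity), ...]
    gps_position: tuple = (0.0, 0.0)                     # (latitude, longitude)
    imu_yaw_rate_dps: float = 0.0


if __name__ == "__main__":
    frame = RawFrame(frame_id=0, timestamp_s=1677820800.0,
                     camera_image_path="frames/000000.jpg",
                     lidar_points=[(1.2, -0.4, 0.1, 0.8)],
                     gps_position=(37.56, 126.97))
    print(frame.frame_id, frame.gps_position)
```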
The environment fusion unit 400 analyzes the surrounding environment by fusing the weather, time, and lighting condition analysis results. The environment fusion unit 400 is described with reference to FIG. 7.
FIG. 7 is a diagram illustrating an example of a fused environment determination process.
Referring to FIG. 7, the environment fusion unit 400 fuses the weather condition information from the weather analysis module 230 with the time and lighting information from the lighting condition analysis module 240, and can produce result data that includes the current time (for example, daytime), the road being driven (for example, an ordinary road), and the weather condition (for example, cloudy).
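The fusion itself can be pictured as merging the two analysis results into a single dynamic-environment record, as in the hypothetical sketch below; the key names and merge policy are assumptions for illustration only.

```python
def fuse_dynamic_environment(weather_result: dict, lighting_result: dict) -> dict:
    """Combine the weather analysis and the time/lighting analysis into one record.

    Assumed merge policy: keep the lighting keys and overlay the weather keys
    so that both analyses appear in the fused dynamic environment data.
    """
    fused = dict(lighting_result)
    fused.update(weather_result)
    return fused


if __name__ == "__main__":
    weather = {"weather": "cloudy", "precipitation": "none"}
    lighting = {"time_of_day": "day", "location": "open_road"}
    print(fuse_dynamic_environment(weather, lighting))
```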
The inference unit 500 infers the overall data measurement environment, which includes both static environment data and dynamic environment data. This is described with reference to FIG. 8.
FIG. 8 is a diagram illustrating an example of a data measurement environment inference process.
Referring to FIG. 8, the weather, time, and lighting data from the environment fusion unit 400 are, for example, dynamic environment data. The inference unit 500 therefore adds static environment data to the dynamic environment data produced by the environment fusion unit 400 to infer the overall data measurement environment.
Here, the static environment data includes at least one of the following from the road map analysis module: network information (nodes, links), road section information (tunnels, bridges, etc.), sign information (traffic safety signs, lanes, crosswalks, etc.), facility information (traffic lights, vehicle protection safety facilities, etc.), number of lanes, lane number, road type, road grade, lane type, roadway section, and road surface line marking type.
In addition, the inference unit 500 may add the measurement data around the autonomous vehicle (for example, camera images) measured by the autonomous driving measurement unit 300.
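A minimal sketch of this inference step, under the assumption that both the static map attributes and the dynamic conditions are held as dictionaries, is given below; the attribute names and the optional sensor reference are illustrative, not part of the disclosed implementation.

```python
from typing import Optional


def infer_measurement_environment(dynamic_env: dict, static_env: dict,
                                  sensor_refs: Optional[dict] = None) -> dict:
    """Build the overall data-measurement-environment description (illustrative)."""
    environment = {**static_env, **dynamic_env}        # static map attributes + dynamic conditions
    if sensor_refs:                                     # e.g. {"camera_image": "frames/000123.jpg"}
        environment["sensor_refs"] = sensor_refs
    return environment


if __name__ == "__main__":
    static_env = {"road_type": "general_road", "lane_count": 3, "lane_number": 2, "tunnel": False}
    dynamic_env = {"time_of_day": "day", "weather": "cloudy"}
    print(infer_measurement_environment(dynamic_env, static_env,
                                        sensor_refs={"camera_image": "frames/000123.jpg"}))
```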
The scenario labeling unit 600 defines the inferred measurement environment data, in which the static environment data and the dynamic environment data are fused, as a scenario, organizes it as labeling data, and fuses it with the raw data to generate learning data.
Here, the raw data is the data from the autonomous driving measurement unit 300 (for example, camera images, video images, and lidar point cloud data), and the labeling data is the data constituting the scenario defined by the scenario labeling unit.
The final measurement data derivation unit 700 derives the final measurement data as learning data in which the raw data and the labeling data are fused. The final measurement data is described with reference to FIG. 9.
FIG. 9 is a diagram illustrating an example of the final measurement data.
Referring to FIG. 9, the final measurement data derivation unit 700 fuses the raw data from the autonomous driving measurement unit with the labeling data from the scenario labeling unit 600 and outputs the result. In this case, the labeling data may be output for each frame of the raw data.
In other words, the final data is output with the labeling data included for each frame of the raw data.
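This per-frame pairing of raw data and labeling data can be sketched as below, where each raw frame is written out together with the scenario label that applies to it; the JSON-lines layout is an assumption made for the sketch, since the disclosure does not fix a file format.

```python
import json


def build_final_data(raw_frames: list, scenario_label: dict) -> list:
    """Attach the scenario labeling data to every frame of the raw data (illustrative)."""
    return [{"frame": frame, "label": scenario_label} for frame in raw_frames]


if __name__ == "__main__":
    frames = [{"frame_id": i, "camera_image": f"frames/{i:06d}.jpg"} for i in range(3)]
    label = {"time_of_day": "day", "road_type": "general_road", "weather": "cloudy"}
    for record in build_final_data(frames, label):
        print(json.dumps(record))   # one labeled record per raw-data frame
```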
The present invention also includes a scenario labeling method using sensor data measurement of an autonomous vehicle that uses the configuration described above. This is described with reference to FIG. 10.
FIG. 10 is a flowchart illustrating a scenario labeling method using sensor data measurement of an autonomous vehicle according to the present invention.
Referring to FIG. 10, the present invention includes step S110 of producing sensor measurement data, step S120 of measuring autonomous driving sensor data, step S130 of analyzing the dynamic environment data, step S140 of determining the fused environment, step S150 of inferring the data measurement environment, a scenario labeling step S160 of generating learning data, and step S170 of deriving the final measurement data.
Step S110 is a step of outputting the sensor measurement data. Step S110 includes step S111 of measuring precipitation data, step S112 of measuring illuminance data, step S113 of measuring GPS data, and step S114 of analyzing the precision road map layers.
Step S111 is a step in which the precipitation detection module 210 detects the presence of precipitation and/or the amount of precipitation. The precipitation detection module 210 detects whether precipitation is present around the autonomous vehicle and/or the amount of precipitation. In this case, the precipitation detection module 210 may output the measured amount of precipitation in categories such as heavy or light.
Step S112 is a step in which the illuminance detection module 220 detects the illuminance around the autonomous vehicle. The illuminance detection module 220 measures the illuminance around the autonomous vehicle and outputs illuminance measurement data.
Step S113 is a step in which the GPS measurement module 110 produces location information and time information.
Step S114 is a step in which the road map analysis module 120 analyzes the layers of the precision road map at the current location of the autonomous vehicle using the GPS measurement data. Here, the road map analysis module 120 may add a new layer, such as the color or material of the road surface, at the current location of the autonomous vehicle on the precision road map according to preset conditions or input commands.
In addition, the road map analysis module 120 outputs precision road analysis data for the current location from the precision road map, such as the number of lanes, lane number, road type, presence of a tunnel, road surface material, road grade, lane type, roadway section type, and road surface line marking type.
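The layer lookup in steps S113-S114 can be pictured as indexing a set of map layers by position, as in the hypothetical sketch below; the layer names, the coarse grid-cell lookup, and the helper names are assumptions, since real precision road maps are queried through dedicated map formats and APIs.

```python
def query_map_layers(hd_map: dict, latitude: float, longitude: float) -> dict:
    """Return the static road attributes stored for the cell containing the position.

    `hd_map` is assumed to be {(lat_cell, lon_cell): {layer_name: value, ...}}
    keyed at 3-decimal-degree resolution (illustration only).
    """
    cell = (round(latitude, 3), round(longitude, 3))
    return hd_map.get(cell, {})


def add_layer(hd_map: dict, latitude: float, longitude: float, name: str, value) -> None:
    """Add a new layer (e.g. road-surface color or material) at the current position."""
    cell = (round(latitude, 3), round(longitude, 3))
    hd_map.setdefault(cell, {})[name] = value


if __name__ == "__main__":
    hd_map = {(37.560, 126.970): {"lane_count": 3, "road_type": "general_road", "tunnel": False}}
    add_layer(hd_map, 37.560, 126.970, "road_surface_material", "asphalt")
    print(query_map_layers(hd_map, 37.5601, 126.9702))
```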
Step S120 is a step in which the autonomous driving measurement unit 300 outputs raw data. The raw data is information measured by the camera, lidar sensor, radar sensor, and other sensors installed in the autonomous vehicle.
Step S130 includes step S131 of analyzing the weather conditions and step S132 of analyzing the time/lighting conditions.
In step S131, the weather analysis module 230 analyzes the data measured by the precipitation detection module 210 as weather measurement data and determines whether rain and/or snow is falling at the current location of the autonomous vehicle and in its surroundings, as well as the amount of rainfall and/or snowfall.
Step S132 is a step in which the lighting condition analysis module 240 analyzes the current time and the lighting conditions around the autonomous vehicle using the illuminance measurement data measured around the autonomous vehicle, the time measurement data from the GPS measurement module 110, and the surrounding terrain data from the road map analysis module 120.
Step S140 is a step in which the environment fusion unit 400 fuses the weather condition analysis result with the time/lighting condition analysis result. The environment fusion unit 400 fuses the weather, time, and lighting condition analysis results to produce dynamic environment data. The dynamic environment data may include the current time (for example, daytime), the road being driven and/or terrain information (for example, an ordinary road or a tunnel), and the weather condition (for example, cloudy).
Step S150 is a step in which the inference unit 500 infers the data measurement environment. The inference unit 500 infers the data measurement environment by fusing the static environment data from the road map analysis module with the dynamic environment data produced in step S140 and outputs the result.
Step S160 is a step in which the scenario labeling unit 600 labels the inferred overall measurement data and generates learning data in which it is combined with the raw data. The scenario labeling unit 600 assigns labels to the inferred overall measurement data and defines it as a scenario.
Step S170 is a step in which the final measurement data derivation unit 700 derives the final measurement data. The final measurement data derivation unit 700 fuses the scenario defined by the scenario labeling unit 600, that is, the labeled data, with the raw data for each frame of the raw data to produce the final measurement data.
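Pulling steps S110 through S170 together, the overall flow can be summarized in a single hypothetical driver function; every helper it calls is one of the illustrative sketches above, defined in the same module, and none of them is part of the disclosed implementation.

```python
from datetime import datetime


def label_frames(raw_frames, precip_mm_per_h, temperature_c, lux_by_direction,
                 gps_time: datetime, in_tunnel: bool, static_env: dict):
    """Chain the illustrative helpers from the earlier sketches (steps S110-S170)."""
    weather = classify_precipitation(precip_mm_per_h, temperature_c)      # S131
    lighting = analyze_lighting(gps_time, in_tunnel, lux_by_direction)    # S132
    dynamic_env = fuse_dynamic_environment(weather, lighting)             # S140
    scenario = infer_measurement_environment(dynamic_env, static_env)     # S150-S160
    return build_final_data(raw_frames, scenario)                         # S170


# Example call (assuming the earlier sketches are defined alongside this function):
# final = label_frames([{"frame_id": 0}], 0.0, 15.0, [30000.0] * 4,
#                      datetime(2023, 3, 3, 14, 0), False,
#                      {"road_type": "general_road", "lane_count": 3})
```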
As described above, the scenario labeling method using sensor data measurement of an autonomous vehicle according to the present invention reflects real-time sensor measurement information from the autonomous vehicle in the scenario labeling and uses the precision road map for measurement environment inference, which increases the amount of information. Because this entire process is automated, the time and/or manpower required for scenario labeling can be greatly reduced compared with conventional approaches.

Claims (7)

  1. A scenario labeling system using sensor data measurement of an autonomous vehicle, characterized by: collecting the weather and illuminance conditions around the autonomous vehicle in real time; and automatically labeling overall measurement environment data in which dynamic environment measurement data, produced by inferring the overall environment from the collected information, is combined with static environment measurement data including road information.
  2. A scenario labeling system using sensor data measurement of an autonomous vehicle, comprising:
    an autonomous driving measurement unit 300 that outputs sensor measurement data in real time from any one of a camera, a lidar, a radar, and an IMU;
    a dynamic environment measurement unit 200 that measures and analyzes at least one of the weather, time, and lighting conditions around the autonomous vehicle;
    a static environment measurement unit 100 that analyzes road and terrain information according to the location of the autonomous vehicle based on GPS location information and a precision road map;
    an environment fusion unit 400 that fuses the static environment measurement data and the dynamic environment measurement data;
    an inference unit 500 that infers the measurement environment from the fused overall measurement data;
    a scenario labeling unit 600 that labels the inferred measurement environment data; and
    a final measurement data derivation unit 700 that calculates and outputs final measurement data including the labeled data for each frame of the real-time image data from the autonomous driving measurement unit 300.
  3. The system of claim 1, wherein the dynamic environment measurement unit 200 comprises:
    a precipitation detection module 210 that detects the amount of precipitation;
    an illuminance detection module 220 that detects illuminance;
    a weather analysis module 230 that detects the weather conditions from the measurement data of the precipitation detection module 210; and
    a lighting condition analysis module 240 that detects the lighting conditions around the autonomous vehicle from the measurement data of the illuminance detection module 220.
  4. The system of claim 1, wherein the static environment measurement unit 100 comprises:
    a GPS measurement module 110 that receives signals transmitted from GPS satellites, calculates the location, and provides time information; and
    a road map analysis module 120 that extracts road information from a precision road map based on the current location of the autonomous vehicle.
  5. The system of claim 4, wherein the precision road map is an infrastructure for determining the location of the autonomous vehicle, setting/changing routes, and recognizing at least one of road/traffic regulations, and includes at least one of network information, road section information, sign information, and facility information.
  6. The system of claim 5, wherein the road map analysis module 120 further includes at least one of driving path nodes/links, roadway/ancillary sections, parking surfaces, safety signs, road surface/road surface line markings, traffic lights, kiloposts, vehicle protection safety facilities, speed bumps, height obstacles, and supports, each organized in the form of a layer.
  7. A scenario labeling method using sensor data measurement of an autonomous vehicle, comprising:
    a) detecting precipitation and illuminance in an autonomous vehicle and analyzing GPS information, including location and time, and the layers of a precision road map;
    b) outputting raw data measured in real time by at least one of a camera, a lidar sensor, and a radar sensor installed in the autonomous vehicle;
    c) analyzing the weather conditions around the autonomous vehicle using the precipitation measurement data and the lighting conditions using the illuminance measurement data;
    d) fusing the weather condition analysis result with the lighting condition analysis result;
    e) inferring the overall measurement data around the autonomous vehicle by fusing the dynamic environment data, in which the weather condition analysis result and the lighting conditions are fused, with the road information from the road map analysis module; and
    f) automatically labeling the inferred overall measurement data and combining the labeled data with each video frame of the raw data to produce the final data.
PCT/KR2023/002913 2022-07-27 2023-03-03 System and method for labeling scenarios using sensor data measurements of autonomous vehicle WO2024025063A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220093233A KR20240015792A (en) 2022-07-27 2022-07-27 Scenario labeling system and method using sensor data measurement of autonomous vehicle
KR10-2022-0093233 2022-07-27

Publications (1)

Publication Number Publication Date
WO2024025063A1 true WO2024025063A1 (en) 2024-02-01

Family

ID=89706828

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/002913 WO2024025063A1 (en) 2022-07-27 2023-03-03 System and method for labeling scenarios using sensor data measurements of autonomous vehicle

Country Status (2)

Country Link
KR (1) KR20240015792A (en)
WO (1) WO2024025063A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200144166A (en) * 2019-06-17 2020-12-29 연세대학교 산학협력단 Data-based voice service system and method using machine learning algorithm
CN112818910A (en) * 2021-02-23 2021-05-18 腾讯科技(深圳)有限公司 Vehicle gear control method and device, computer equipment and storage medium
KR102339011B1 (en) * 2020-04-27 2021-12-14 계명대학교 산학협력단 Adaptive switcher for day and night pedestrian detection in autonomous vehicle and pedestrian detection apparatus using thereof
JP2022514891A (en) * 2018-12-21 2022-02-16 コンチネンタル オートモーティブ システムズ インコーポレイテッド Systems and methods for automatic image labeling for supervised machine learning
KR20220080975A (en) * 2020-12-08 2022-06-15 한국교통대학교산학협력단 Driving environment static object recognition AI data processing method and device therefor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102337070B1 (en) 2021-07-12 2021-12-08 (주)에이아이매틱스 Method and system for building training database using automatic anomaly detection and automatic labeling technology

Also Published As

Publication number Publication date
KR20240015792A (en) 2024-02-06

Similar Documents

Publication Publication Date Title
CN108961790B (en) Bad weather early warning management system and method based on four-dimensional live-action traffic simulation
CN111076731B (en) Automatic driving high-precision positioning and path planning method
JP6065107B2 (en) Driving environment evaluation system, driving environment evaluation method, driving support device, and driving environment display device
CN110097762B (en) Road video image low visibility scale estimation method and system
WO2021006441A1 (en) Road sign information collection method using mobile mapping system
KR20130127822A (en) Apparatus and method of processing heterogeneous sensor fusion for classifying and positioning object on road
WO2018143589A1 (en) Method and device for outputting lane information
US10190909B2 (en) Path detection system based on solar blind ultraviolet light signals
CN115410403B (en) Road vehicle positioning tracking method and device based on passive perception and readable medium
WO2019124668A1 (en) Artificial intelligence system for providing road surface danger information and method therefor
KR102227649B1 (en) Device and Method for verifying function of Automatic Driving
WO2024025063A1 (en) System and method for labeling scenarios using sensor data measurements of autonomous vehicle
KR102599558B1 (en) Automotive sensor integration module
US20230004764A1 (en) Automotive sensor integration module
CN108885112A (en) Method for determining the attitude of an at least partially autonomous vehicle by means of landmarks which are specially selected and transmitted by a back-end server
US11854221B2 (en) Positioning system and calibration method of object location
WO2023096037A1 (en) Device for generating real-time lidar data in virtual environment and control method thereof
US11682298B2 (en) Practical method to collect and measure real-time traffic data with high accuracy through the 5G network and accessing these data by cloud computing
KR102693851B1 (en) Automotive sensor integration module
Sauter et al. High performance pavement markings enhancing camera and LiDAR detection
CN114299715A (en) Expressway information detection system based on videos, laser radar and DSRC
KR20210095602A (en) Exit display apparatus of driving guide line for improving visibility at bad weather and bad view and night
WO2023282571A1 (en) Vehicle ar display device and ar service platform
Versavel et al. Camera and computer-aided traffic sensor
WO2024143641A1 (en) Local server and method for controlling road traffic

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23846724

Country of ref document: EP

Kind code of ref document: A1