KR101828255B1 - Driving path updating apparatus based on driving information and environment recognition and method thereof - Google Patents


Info

Publication number
KR101828255B1
Authority
KR
South Korea
Prior art keywords
environment recognition
recognition result
unmanned vehicle
information
area
Prior art date
Application number
KR1020160019981A
Other languages
Korean (ko)
Other versions
KR20170098070A (en)
Inventor
김종희
최지훈
이영일
Original Assignee
국방과학연구소
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 국방과학연구소 filed Critical 국방과학연구소
Priority to KR1020160019981A priority Critical patent/KR101828255B1/en
Publication of KR20170098070A publication Critical patent/KR20170098070A/en
Application granted granted Critical
Publication of KR101828255B1 publication Critical patent/KR101828255B1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A travel area updating apparatus based on travel information and environment recognition, and a control method thereof, are disclosed. The travel area updating apparatus according to an embodiment of the present invention includes: an environment recognition result baseline conversion unit that performs environment recognition through a sensor mounted on an unmanned vehicle and rotates the environment recognition result based on a preset reference line; a fusion function parameter calculation unit that calculates a parameter used to fuse the first environment recognition result accumulated up to the present time with the second environment recognition result obtained at the present time; and a travel area updating unit that fuses the accumulated first environment recognition result and the second environment recognition result using the calculated parameter and updates the current travel area using the fusion result.

Description

[0001] The present invention relates to a travel area updating apparatus based on travel information and environment recognition, and a control method thereof.

More particularly, the present invention relates to an apparatus that, when planning a route for the autonomous operation of an unmanned vehicle in a limited space, updates the travel area based on travel information and environment recognition, and to a control method thereof.

Due to recent technological advances, development of unmanned vehicles is actively underway. An unmanned vehicle may mean a vehicle that can ascertain the road situation by itself and arrive at its destination without the driver's operation. The unmanned vehicle may be meant to encompass a vehicle capable of self-running without any human operation, regardless of whether or not a person is present inside the vehicle.

In the autonomous navigation of unmanned vehicles, driving information (or navigation information) and travel area information based on environment recognition are very important in selecting the route on which an unmanned vehicle will travel.

However, even if the unmanned vehicle travels repeatedly along the same route (or travel area), uncertainty exists both in the travel information and in the environment recognition result, which makes route selection for the unmanned vehicle difficult.

In addition, pre-built driving information such as navigation information depends strongly on the driving conditions and environment at the time it was constructed, and because this uncertainty is carried into the autonomous driving of the unmanned vehicle, such information has been difficult to use.

It is an object of the present invention to provide a travel area updating apparatus, and a control method therefor, which can acquire travel area information with improved reliability by accumulating pre-built travel information and the environment recognition results of an unmanned vehicle.

The travel area updating apparatus according to an embodiment of the present invention includes: an environment recognition result baseline conversion unit that performs environment recognition through a sensor mounted on an unmanned vehicle and rotates the environment recognition result based on a preset reference line; a fusion function parameter calculation unit that calculates a parameter used to fuse the first environment recognition result accumulated up to the present time with the second environment recognition result obtained at the present time; and a travel area updating unit that fuses the accumulated first environment recognition result and the second environment recognition result using the calculated parameter and updates the current travel area using the fusion result.

In an embodiment, the environment recognition result is information that expresses, as a probability computed from data acquired through the sensor, whether the unmanned vehicle can travel, and is stored in a grid (lattice) form.

In an embodiment, the accumulated first environment recognition result is information stored in grid form based on the preset reference line. The environment recognition result baseline conversion unit generates the environment recognition result in grid form based on a line corresponding to the running direction, using running information that includes the running direction of the unmanned vehicle, and acquires the second environment recognition result by rotating the generated environment recognition result so that it corresponds to the preset reference line, based on the angle difference between the preset reference line and the line corresponding to the running direction.

In an embodiment, the running information may further include position information of the unmanned vehicle, and the fusion function parameter calculation unit may calculate, as the parameters, a first width value between one side of the travel area and the unmanned vehicle and a second width value between the other side of the travel area and the unmanned vehicle, using the position information of the unmanned vehicle, the coordinates of the overlapped grids in the first and second environment recognition results, the preset reference line, and the width value of the travel area.

In an embodiment, the travel area updating unit may calculate a weight using the first width value and the second width value, apply the weight to the first environment recognition result and the second environment recognition result, respectively, to calculate the accumulated third environment recognition result at the current time point, and update the travel area at the current time point using the third environment recognition result.

A control method of a travel area updating apparatus according to an embodiment of the present invention includes: performing environment recognition through a sensor mounted on an unmanned vehicle and rotating the environment recognition result based on a preset reference line; calculating a parameter used to fuse the first environment recognition result accumulated up to the current time point with the second environment recognition result obtained at the current time point; and fusing the accumulated first environment recognition result and the second environment recognition result using the calculated parameter and updating the current travel area using the fusion result.

According to the present invention, as the unmanned vehicle repeatedly travels through the space in which it operates, it is possible to reduce the uncertainty caused by time variation of the travelable area, which arises from differences in travel information on each autonomous run. That is, by utilizing the accumulated environment recognition results to reduce the uncertainty due to time variation of the travel information, the present invention can obtain path planning results with improved reliability.

Accordingly, the present invention can provide updated travel area information that can improve route planning performance in autonomous navigation of an unmanned vehicle.

FIG. 1 is a block diagram showing a travel area updating apparatus according to the present invention.
FIG. 2 is a conceptual diagram for explaining a method of calculating a parameter of a fusion function related to the present invention.
FIG. 3 is a graph showing the fusion function related to the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or mixed in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to cover the invention as claimed, together with its equivalents and alternatives.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are used to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

FIG. 1 is a block diagram showing a travel area updating apparatus according to the present invention.

Referring to FIG. 1, the travel area updating apparatus according to the present invention includes: an environment recognition result baseline conversion unit 100 for rotating an environment recognition result of an unmanned vehicle based on a predetermined reference line; a fusion function parameter calculation unit 200 for calculating a parameter used to fuse the first environment recognition result accumulated up to the present time with the second environment recognition result obtained at the present time; and a travel area updating unit 300 for fusing the two results using the calculated parameter and updating the travel area at the present time.

Although not shown, the travel area updating apparatus related to the present invention may further include a memory for storing information (or data) such as the environment recognition result and the accumulated environment recognition result, and a sensor capable of sensing information about the surroundings of the unmanned vehicle.

The sensor may be, for example, a proximity sensor, an illumination sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an infrared sensor, an ultrasonic sensor, an optical sensor (for example, a camera), or an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat detection sensor, or a gas detection sensor). Meanwhile, the travel area updating apparatus disclosed in this specification can combine and utilize information sensed by at least two of these sensors.

The travel area updating apparatus described in this specification may be a single apparatus provided (or mounted) on the unmanned vehicle, or may mean the unmanned vehicle itself. However, the present invention is not limited thereto, and the travel area updating apparatus may be a separate device (for example, a server) provided outside the unmanned vehicle and capable of communicating with (or controlling) the unmanned vehicle.

Further, the sensor may be mounted on the unmanned vehicle and formed so as to be able to communicate with the travel area updating apparatus. The sensor senses the surrounding information of the unmanned vehicle and can transmit the sensed information (or data) to the travel area updating apparatus (or the environment recognition result baseline conversion unit 100) through wired/wireless communication.

The sensor can be understood as a component included in the travel area update apparatus described in this specification.

The environment recognition result base line conversion unit 100 can perform environment recognition through the sensor mounted on the unmanned vehicle and rotate the environment recognition result based on a predetermined reference line.

Specifically, the environment recognition result baseline conversion unit 100 can generate an environment recognition result in which the drivability of the unmanned vehicle is expressed as a probability, by processing the data acquired through the sensor mounted on the unmanned vehicle. The environment recognition result described in this specification is information indicating the possibility of the unmanned vehicle traveling, obtained by processing the data acquired through the sensor, and may be stored, for example, as a grid. That is, the environment recognition result baseline conversion unit 100 can perform environment recognition using the data obtained through the sensor and store (generate) the environment recognition result in grid form.
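The grid-of-probabilities idea above can be sketched in a few lines of Python. This is an illustrative sketch only, not the patent's implementation; the grid size, cell size, function name, and probability values (`p_free`, `p_occupied`) are assumptions chosen for the example.

```python
import numpy as np

def build_recognition_grid(obstacle_points, size=20, cell=1.0,
                           p_free=0.9, p_occupied=0.1):
    """Store traversability as a probability per grid cell (hypothetical).

    obstacle_points: (N, 2) array of sensor returns in the vehicle frame,
    in metres. Cells default to p_free (likely drivable); any cell that
    contains an obstacle return is set to p_occupied (unlikely drivable).
    """
    grid = np.full((size, size), p_free)
    half = size * cell / 2.0  # grid is centred on the vehicle
    for x, y in obstacle_points:
        i = int((x + half) // cell)
        j = int((y + half) // cell)
        if 0 <= i < size and 0 <= j < size:
            grid[i, j] = p_occupied
        # returns outside the grid extent are simply ignored
    return grid

grid = build_recognition_grid(np.array([[2.0, 3.0], [-4.0, 1.0]]))
```

A real system would accumulate evidence per cell (e.g. log-odds) rather than overwrite it; the overwrite here keeps the example minimal.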

At this time, the environment recognition result base line conversion unit 100 can use the running information (navigation information) that estimates the position and running direction of the unmanned vehicle.

The environment recognition result base line conversion unit 100 can generate the environment recognition result in a lattice form on the basis of the line corresponding to the running direction, using the running information including the running direction of the unmanned vehicle.

The environment recognition result baseline conversion unit 100 can rotate the environment recognition result, generated based on the line corresponding to the running direction, so that it corresponds to the preset reference line, based on the angle difference between the preset reference line and the line corresponding to the running direction. Hereinafter, the environment recognition result rotated to correspond to the preset reference line is referred to as the "second environment recognition result" or the "baseline-converted environment recognition result".

The second environment recognition result may be information obtained by rotating the environment recognition result acquired at the current time point so that it corresponds to the preset reference line.
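The baseline conversion step, rotating a result generated along the running direction by the angle difference so it lines up with the preset reference line, amounts to a plane rotation of the grid-cell coordinates. A minimal Python sketch, assuming headings are measured in degrees from the reference line (true north at 0 degrees); the angle convention and function name are assumptions for illustration:

```python
import numpy as np

def rotate_to_reference(cell_centers, heading_deg, reference_deg=0.0):
    """Rotate grid-cell centre coordinates (N, 2) by the angle difference
    between the running direction and the preset reference line, so the
    rotated grid is expressed in the reference-line frame."""
    theta = np.deg2rad(reference_deg - heading_deg)  # angle difference
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # apply the rotation to every cell centre at once
    return cell_centers @ rot.T

# Rotating a single cell centre by a -90 degree angle difference.
aligned = rotate_to_reference(np.array([[0.0, 1.0]]), heading_deg=90.0)
```

After this conversion, the rotated (second) result and the accumulated (first) result share the same reference line, so their cells can be compared directly.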

Meanwhile, in the travel area updating apparatus according to the present invention, the environment recognition results accumulated up to the present time point may be stored in the memory. In this specification, the environment recognition result accumulated up to the present time is referred to as the "first environment recognition result".

The accumulated first environment recognition result up to the current time point may be information stored in a grid form based on the preset reference line.

The preset reference line described in this specification may be, for example, a virtual line corresponding to the true north direction.

The travel area updating apparatus according to the present invention can fuse the accumulated first environment recognition result with the environment recognition result obtained at the present time (the second environment recognition result). To this end, the environment recognition result baseline conversion unit 100 may rotate the environment recognition result generated based on the line corresponding to the running direction so that the line corresponding to the running direction corresponds to the preset reference line. The reference line of the second (rotated) environment recognition result generated in this way and the reference line of the first environment recognition result may then be the same predetermined reference line (for example, a virtual line directed toward true north).

The fusion function parameter calculation unit 200 may calculate a parameter used to fuse the first environment recognition result accumulated up to the current time point and the second environment recognition result obtained at the current time.

More specifically, the fusion function parameter calculation unit 200 can calculate the parameters of the fusion function for fusing the overlapping regions of the accumulated first environment recognition result and the second environment recognition result, which is obtained at the current point in time and rotated to correspond to the preset reference line.

FIG. 2 is a conceptual diagram for explaining a method of calculating a parameter of a fusion function related to the present invention.

Referring to FIG. 2, the travel information may include location information of the unmanned vehicle. As shown in FIG. 2, the position information of the unmanned vehicle may be, for example, coordinate information corresponding to the current position 203 of the unmanned vehicle.

In addition, the travel information may include a width value W of a travel region (or travelable region) 201 which is already known.

The fusion function parameter calculation unit 200 can calculate, as parameters, a first width value W_L between one side of the travel area and the unmanned vehicle and a second width value W_R between the other side of the travel area and the unmanned vehicle, using the position information of the unmanned vehicle, the coordinate information of the grids overlapped between the first environment recognition result and the second environment recognition result, the preset reference line 202, and the width value W of the travel area 201. That is, the first width value W_L and the second width value W_R may be the parameters of the fusion function.

Here, the coordinate information of the overlapped grid in the first and second environment recognition results may mean the coordinate (or position) information of the grid cell that is the fusion target among the plurality of grid cells overlapped between the first and second environment recognition results.

The first width value W_L may mean a width value between the unmanned vehicle 203 and the left side surface of the running area 201 in the running area 201, for example.

The second width value W_R may mean, for example, a width value between the unmanned vehicle 203 and the right side surface of the travel area 201.

The travel area updating unit 300 can calculate a weight using the calculated parameters (the first width value and the second width value). The weight can be calculated, for example, by the fusion function F(P_M) of Equation 1.

[Equation 1: the fusion function F(P_M), expressed in terms of the first and second width values W_L and W_R]

At this time, the fusion function F(P_M) may be represented, as shown in FIG. 3, by a fusion function that is symmetric with respect to the current position 203 of the unmanned vehicle within the travel area, referenced to the predetermined reference line (the true-north reference line) 202.
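Equation 1 itself appears only as an image in the source, so the exact form of F(P_M) is not recoverable here. The sketch below uses one simple symmetric function with the stated behaviour: maximal when W_L = W_R (vehicle centred) and approaching zero as either width value shrinks toward an edge. Treat this particular formula as an assumption:

```python
def fusion_weight(w_l, w_r):
    """One symmetric weight consistent with the description of F(P_M):
    equals 1.0 when w_l == w_r and tends to 0 near either edge of the
    travel area. The exact Equation 1 is not recoverable from the source,
    so this form is an illustrative assumption."""
    width = w_l + w_r
    if width <= 0:
        raise ValueError("corridor width must be positive")
    return 4.0 * w_l * w_r / (width * width)

centre_weight = fusion_weight(4.0, 4.0)  # vehicle centred in the corridor
```

Any smooth function that is symmetric in (W_L, W_R), peaks at the centre, and vanishes at the edges would exhibit the behaviour the text describes.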

The travel area updating unit 300 may calculate a weight (e.g., F(P_M)) using the first width value W_L and the second width value W_R. Further, the travel area updating unit may apply the calculated weight to the first environment recognition result and the second environment recognition result, respectively, to calculate the accumulated third environment recognition result.

The third environment recognition result can be calculated, for example, as shown in Equation 2.

[Equation 2: the accumulated third environment recognition result, expressed in terms of the first and second environment recognition results and the weight F(P_M)]

That is, the third environment recognition result can be calculated based on the first environment recognition result and the weight F(P_M), and on the second environment recognition result and the weight F(P_M).

As the current position of the unmanned vehicle approaches the center of the travel path 201 (in other words, as the overlapped grids in the first and second environment recognition results lie nearer the center of the travel path), the weight increases, and the probability that the vehicle can travel autonomously (the environment recognition result) becomes larger.

Conversely, as the current position of the unmanned vehicle approaches an edge (i.e., one side or the other) of the travel path 201, the weight becomes smaller, and the probability that the unmanned vehicle can travel autonomously (the environment recognition result) becomes smaller.
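Equation 2 is likewise only an image in the source. The following sketch fuses the overlapped grids cell by cell in a way that matches the described behaviour: the weight is applied to both the accumulated first result and the current second result, and a larger weight yields a larger fused traversability probability. The exact functional form and the function name are assumptions:

```python
import numpy as np

def fuse_grids(p_first, p_second, weight):
    """Per-cell fusion over the overlapped grids (hypothetical form).

    p_first:  accumulated first environment recognition result (2-D grid).
    p_second: baseline-converted second result over the same cells.
    weight:   scalar fusion weight F(P_M) in [0, 1].
    The weighted average of both results grows with the weight, matching
    the stated behaviour; the patent's actual Equation 2 may differ.
    """
    p_first = np.asarray(p_first, dtype=float)
    p_second = np.asarray(p_second, dtype=float)
    return weight * (p_first + p_second) / 2.0

# Fusing a two-cell overlap for a centred vehicle (weight 1.0).
p3 = fuse_grids([[0.8, 0.4]], [[0.6, 0.2]], weight=1.0)
```

Running this over every overlapped cell yields the accumulated third result used to update the current travel area.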

The travel area updating unit 300 performs the operations of Equations 1 and 2 on all the grids overlapped between the first and second environment recognition results, and can update the travel area (or travelable area) at the present time point using the resulting third environment recognition result.

Through this update, the width value of the travel area at the current time point, or the previously known width value of the travel area, can be changed in whole or in part.

With this configuration, as the unmanned vehicle repeatedly travels through the space in which it operates, the present invention reduces the uncertainty caused by time variation of the travelable area, which arises from differences in travel information on each autonomous run. That is, by utilizing the accumulated environment recognition results to reduce this uncertainty, the present invention can obtain path planning results with improved reliability.

Accordingly, the present invention can provide updated travel area information that can improve route planning performance in autonomous navigation of an unmanned vehicle.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may include at least one of the environment recognition result baseline conversion unit 100, the fusion function parameter calculation unit 200, and the travel area updating unit 300 of the travel area updating apparatus. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

100: environment recognition result base line conversion unit
200: fusion function parameter calculation unit
300: traveling area updating unit

Claims (6)

An environment recognition result baseline converting unit that performs environment recognition through a sensor mounted on an unmanned vehicle and rotates the environment recognition result based on a predetermined reference line;
A fusion function parameter calculation unit for calculating a parameter used to fuse the first environment recognition result accumulated up to the present time point and the second environment recognition result obtained at the current time point; And
And a traveling area updating unit that fuses the accumulated first environment recognition result and the second environment recognition result using the calculated parameters and updates a current traveling area using the fusion result,
The accumulated first environment recognition result is information stored in a grid form on the basis of the preset reference line,
The environment recognition result baseline conversion unit,
Generates the environment recognition result in a lattice form on the basis of a line corresponding to the running direction using the running information including the running direction of the unmanned vehicle,
Obtains the second environment recognition result by rotating the environment recognition result generated based on the line corresponding to the running direction to correspond to the preset reference line, based on an angle difference between the predetermined reference line and the line corresponding to the running direction,
Wherein the traveling information further includes position information of the unmanned vehicle,
Wherein the fusion function parameter calculation unit calculates, as the parameters, a first width value between one side of the traveling area and the unmanned vehicle and a second width value between the other side of the traveling area and the unmanned vehicle, using the position information of the unmanned vehicle, the coordinate information of the overlapped grids in the first and second environment recognition results, the preset reference line, and the width value of the traveling area.
The method according to claim 1,
The environment recognition result includes:
Wherein the environment recognition result is information indicating, as a probability, the possibility of the unmanned vehicle traveling, obtained using the data acquired through the sensor, and is stored in the form of a lattice.
delete

delete

The method according to claim 1,
The traveling region update unit may update,
Calculates weights using the first width value and the second width value, applies the weights to the first environment recognition result and the second environment recognition result, respectively, to calculate an accumulated third environment recognition result at the current time point, and updates the travel area at the current time point using the third environment recognition result.
Performing environmental recognition through a sensor mounted on an unmanned vehicle, and rotating an environment recognition result based on a predetermined reference line;
Calculating parameters to be used for fusing the first environment recognition result accumulated up to the current time point and the second environment recognition result obtained at the current time point; And
And fusing the accumulated first environment recognition result and the second environment recognition result using the calculated parameters, and updating the travel area at the current time point using the fusion result,
The accumulated first environment recognition result is information stored in a grid form on the basis of the preset reference line,
Wherein the rotating comprises:
Generating the environment recognition result in a lattice form on the basis of a line corresponding to the running direction using the running information including the running direction of the unmanned vehicle; And
Obtaining the second environment recognition result by rotating the environment recognition result generated based on the line corresponding to the running direction to correspond to the preset reference line, based on an angle difference between the predetermined reference line and the line corresponding to the running direction,
Wherein the traveling information further includes position information of the unmanned vehicle,
Wherein the calculating step calculates, as the parameters, a first width value between one side of the traveling area and the unmanned vehicle and a second width value between the other side of the traveling area and the unmanned vehicle, using the position information of the unmanned vehicle, the coordinate information of the overlapped grids in the first and second environment recognition results, the preset reference line, and the width value of the traveling area.
KR1020160019981A 2016-02-19 2016-02-19 Driving path updating apparatus based on driving information and environment recognition and method thereof KR101828255B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160019981A KR101828255B1 (en) 2016-02-19 2016-02-19 Driving path updating apparatus based on driving information and environment recognition and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160019981A KR101828255B1 (en) 2016-02-19 2016-02-19 Driving path updating apparatus based on driving information and environment recognition and method thereof

Publications (2)

Publication Number Publication Date
KR20170098070A KR20170098070A (en) 2017-08-29
KR101828255B1 true KR101828255B1 (en) 2018-02-13

Family

ID=59760013

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160019981A KR101828255B1 (en) 2016-02-19 2016-02-19 Driving path updating apparatus based on driving information and environment recognition and method thereof

Country Status (1)

Country Link
KR (1) KR101828255B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100745975B1 (en) * 2004-12-30 2007-08-06 삼성전자주식회사 Method and apparatus for moving minimum movement cost path using grid map
JP2009157430A (en) * 2007-12-25 2009-07-16 Toyota Motor Corp Coordinate correction method, coordinate correction program and autonomous mobile robot


Also Published As

Publication number Publication date
KR20170098070A (en) 2017-08-29

Similar Documents

Publication Publication Date Title
KR102518532B1 (en) Apparatus for determining route of autonomous vehicle and method thereof
EP3335006B1 (en) Controlling error corrected planning methods for operating autonomous vehicles
US10019008B2 (en) Sideslip compensated control method for autonomous vehicles
KR101581286B1 (en) System and method for path planning for autonomous navigation of driverless ground vehicle
JP6032163B2 (en) 3D object recognition apparatus, 3D object recognition method, and moving body
KR101439921B1 (en) Slam system for mobile robot based on vision sensor data and motion sensor data fusion
US8903644B2 (en) Digest for localization or fingerprinted overlay
US11402842B2 (en) Method to define safe drivable area for automated driving system
CN111033422B (en) Drift correction between planning and control phases of operating an autonomous vehicle
JP6543373B2 (en) Control type planning and control system used for autonomous driving vehicles
TW201728876A (en) Autonomous visual navigation
CN110390240B (en) Lane post-processing in an autonomous vehicle
KR20180052673A (en) System delay estimation method for autonomous vehicle control
US11167751B2 (en) Fail-operational architecture with functional safety monitors for automated driving system
US20180050694A1 (en) Method and device for monitoring a setpoint trajectory to be traveled by a vehicle for being collision free
KR20180051571A (en) Vehicle location point delivery method for autonomous vehicles
JP2019501809A (en) Method and system for following the speed of an autonomous vehicle
KR102561263B1 (en) Electronic apparatus and operating method for generating a map data
US11016489B2 (en) Method to dynamically determine vehicle effective sensor coverage for autonomous driving application
EP3659884B1 (en) Predetermined calibration table-based method for operating an autonomous driving vehicle
EP3901662A1 (en) Systems and methods to determine risk distribution based on sensor coverages of a sensor system for an autonomous driving vehicle
JPWO2020184013A1 (en) Vehicle control device
US20220019224A1 (en) Mobile body, method of controlling mobile body, and program
EP3896490B1 (en) Systems and methods to enhance early detection of performance induced risks for an autonomous driving vehicle
KR101828255B1 (en) Driving path updating apparatus based on driving information and environment recognition and method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant