WO2023042791A1 - Lane estimation device and lane estimation method - Google Patents

Lane estimation device and lane estimation method

Info

Publication number
WO2023042791A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
lane
vehicle
vehicle speed
driving
Prior art date
Application number
PCT/JP2022/034053
Other languages
English (en)
Japanese (ja)
Inventor
寛之 鬼丸
武雄 徳永
篤樹 柿沼
康夫 大石
明 飯星
Original Assignee
本田技研工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社
Priority to JP2023548457A (published as JPWO2023042791A1)
Publication of WO2023042791A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969: Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Definitions

  • The present invention relates to a lane estimation device and a lane estimation method for estimating the lane in which a vehicle is traveling.
  • As a conventional device of this type, a device is known that compares the road surface profile of each of a plurality of lanes registered in advance with the road surface profile measured while the vehicle is traveling, calculates the similarity of the road surface profile for each of the plurality of lanes, and estimates the lane with a high degree of similarity as the lane in which the vehicle is traveling (see, for example, Patent Document 1).
  • A lane estimation device according to one aspect of the present invention includes: a position information acquisition unit that acquires position information obtained by a positioning sensor that receives signals transmitted from positioning satellites and measures the position of a vehicle; a travel information acquisition unit that acquires travel information of the vehicle including vehicle speed information of the vehicle and information on the detection value of a detector that changes according to the road surface profile of the road surface on which the vehicle travels; a reference information acquisition unit that acquires vehicle speed reference information serving as a reference for vehicle speed changes in each of a plurality of driving lanes extending substantially parallel to each other; a road map information acquisition unit that acquires road map information including driving lane information and road surface profile information; and a driving lane identifying unit that identifies, from among the plurality of driving lanes, the driving lane corresponding to the position of the vehicle determined by the position information, based on the travel information acquired by the travel information acquisition unit, the vehicle speed reference information acquired by the reference information acquisition unit, and the road map information acquired by the road map information acquisition unit.
  • A lane estimation method according to another aspect of the present invention comprises: a step of acquiring position information obtained by a positioning sensor that receives signals transmitted from positioning satellites and measures the position of a vehicle; a step of acquiring travel information of the vehicle including vehicle speed information and information on the detection value of a detector that changes according to the road surface profile of the road surface on which the vehicle travels; a step of acquiring vehicle speed reference information serving as a reference for vehicle speed changes in each of a plurality of driving lanes extending substantially parallel to each other; a step of acquiring road map information including driving lane information and road surface profile information; and a step of identifying, from among the plurality of driving lanes, the driving lane corresponding to the position of the vehicle determined by the position information, based on the acquired travel information, vehicle speed reference information, and road map information; these steps being executed by a computer.
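  • As a rough structural sketch of how these units could be organized in software (a non-authoritative illustration; class and method names are assumptions, not the claimed implementation), see the following Python outline:

```python
from dataclasses import dataclass


@dataclass
class Position:
    """Current position measured by the positioning sensor."""
    latitude: float
    longitude: float
    altitude: float


@dataclass
class TravelInfo:
    """Travel information of the vehicle."""
    vehicle_speed_kmh: float      # vehicle speed information
    lateral_acceleration: float   # detector value that changes with the road surface profile


class LaneEstimationDevice:
    """Maps the claimed units to methods; bodies are intentionally left abstract."""

    def acquire_position(self) -> Position:
        """Position information acquisition unit."""
        raise NotImplementedError

    def acquire_travel_info(self) -> TravelInfo:
        """Travel information acquisition unit."""
        raise NotImplementedError

    def acquire_speed_reference(self) -> dict:
        """Reference information acquisition unit: per-lane vehicle speed reference info."""
        raise NotImplementedError

    def acquire_road_map(self, position: Position) -> dict:
        """Road map information acquisition unit: lane info and road surface profiles."""
        raise NotImplementedError

    def identify_driving_lane(self) -> str:
        """Driving lane identifying unit: combines the acquired information."""
        raise NotImplementedError
```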
  • FIG. 1 is a diagram schematically showing an example of a road to which a lane estimation device according to an embodiment of the present invention is applied;
  • FIG. 2 is a diagram showing the overall configuration of a lane estimation system including the lane estimation device according to the embodiment of the present invention;
  • FIG. 3 is a diagram showing an example of a road surface profile obtained by the server device of FIG. 2;
  • FIGS. 4A and 4B are diagrams showing examples of vehicle speed change models;
  • FIG. 5 is a block diagram showing the functional configuration of the lane estimation device according to the embodiment of the present invention;
  • FIG. 6 is a flowchart showing an example of processing executed by the controller in FIG. 5.
  • A lane estimation device according to an embodiment of the present invention is configured to estimate the lane in which a vehicle is traveling on a road having a plurality of driving lanes (sometimes simply referred to as lanes).
  • When the driving lane is estimated, it becomes possible, for example, to create a road surface profile showing the unevenness of the road surface, to estimate the position where a disabled vehicle is parked, and to detect a wrong-way vehicle.
  • FIG. 1 is a diagram schematically showing an example of a road to which a lane estimation device according to an embodiment of the invention is applied.
  • FIG. 1 shows a road group RD including an elevated expressway RD1 installed on a pier P and a general road RD2 provided on the ground along the pier P.
  • The road group RD is located in an area where tall buildings BL stand.
  • The road group RD includes a plurality of driving lanes extending parallel to each other, for example, lanes LN1 and LN2 on the expressway RD1 and lane LN3 on the general road RD2.
  • Here, "extending parallel to each other" does not mean parallel in a strict sense, but refers to lanes that extend in the same or substantially the same direction (substantially parallel), including roads at different heights and roads that partially overlap in plan view.
  • In this embodiment, it is estimated in which of the plurality of driving lanes LN1 to LN3 shown in FIG. 1, that is, the plurality of lanes LN1 to LN3 that are adjacent to each other in plan view, the vehicle 1 is traveling.
  • In FIG. 1, a vehicle (target vehicle) 1a whose driving lane is to be estimated is traveling in lane LN3, and vehicles 1b other than the target vehicle 1a are traveling in lanes LN1 and LN2 in the directions of the arrows.
  • In general, the driving lane can be estimated by measuring the position of the vehicle with a positioning sensor mounted on the vehicle, such as a GPS receiver (GPS sensor) that receives signals from positioning satellites such as GPS (Global Positioning System) satellites, and by comparing the measured vehicle position with the lane positions included in map information. That is, if the positioning accuracy obtained when measuring the position of the vehicle with the positioning sensor is high enough to specify the lane, the driving lane can be estimated using the positioning sensor.
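  • As a minimal sketch of this positioning-only approach (assuming positions have already been projected into a local metric frame; function and variable names are illustrative):

```python
import math


def nearest_lane(vehicle_xy: tuple[float, float],
                 lane_centerlines: dict[str, list[tuple[float, float]]]) -> str | None:
    """Return the lane whose centerline sample point lies closest to the measured position."""
    best_lane, best_dist = None, float("inf")
    for lane_id, points in lane_centerlines.items():
        for x, y in points:
            d = math.hypot(x - vehicle_xy[0], y - vehicle_xy[1])
            if d < best_dist:
                best_lane, best_dist = lane_id, d
    return best_lane
```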
  • However, in an area where tall buildings stand, such as the area shown in FIG. 1, the positioning accuracy of the positioning sensor may be degraded, and it may become difficult to identify the driving lane from the position information alone. In view of this, in the present embodiment, the lane estimation device is configured as follows.
  • FIG. 2 is a diagram showing the overall configuration of the lane estimation system including the lane estimation device according to the embodiment of the present invention.
  • the lane estimation system has an in-vehicle device 100 mounted on a vehicle 1 and a server device 3 capable of communicating with the in-vehicle device 100 via a network 200 .
  • the vehicle 1 is, for example, a manually operated vehicle manually operated by a driver.
  • the in-vehicle device 100 has a positioning sensor 10 that receives positioning signals transmitted from the positioning satellites 2 and a communication unit 11 that communicates with the server device 3 via the network 200 .
  • The positioning satellites 2 are artificial satellites such as GPS satellites and quasi-zenith satellites. The current position (latitude, longitude, altitude) of the vehicle 1 can be calculated using the positioning information from the positioning satellites 2 received by the positioning sensor 10.
  • The network 200 includes not only public wireless communication networks such as the Internet and mobile phone networks, but also closed communication networks provided for each predetermined management area, such as wireless LAN, Wi-Fi (registered trademark), and Bluetooth (registered trademark).
  • the server device 3 is configured, for example, as a single server or as a distributed server composed of separate servers for each function.
  • the server device 3 can also be configured as a distributed virtual server created in a cloud environment called a cloud server.
  • the server device 3 includes an arithmetic processing unit having a CPU, ROM, RAM, and other peripheral circuits.
  • the server device 3 has a communication unit 31, a storage unit 32, a road profile generation unit 33, and a vehicle speed change model generation unit 34 as functional configurations.
  • the communication unit 31 is configured to be able to wirelessly communicate with the in-vehicle device 100 via the network 200, and acquires the position information of the vehicle 1 and the travel information of the vehicle 1 via the communication unit 11 of the vehicle 1 respectively.
  • the position information is information indicating the current position of the vehicle 1 calculated from the signal received by the positioning sensor 10 of the vehicle 1 .
  • the traveling information is information indicating the traveling state of the vehicle 1 acquired by various sensors mounted on the vehicle 1 .
  • the travel information includes vehicle speed information of the vehicle 1 and information of values detected by an acceleration sensor (lateral acceleration sensor) that detects acceleration (lateral acceleration) of the vehicle 1 in the left-right direction.
  • the communication unit 31 constantly acquires position information and travel information not only for the target vehicle 1a (FIG. 1) whose driving lane is to be estimated, but also for a plurality of vehicles 1b (FIG. 1) other than the target vehicle 1a.
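  • For illustration only, one per-vehicle record collected by the communication unit 31 might look like the following (all field names are assumptions based on the items listed in this description):

```python
from dataclasses import dataclass


@dataclass
class ProbeRecord:
    """One upload from a vehicle's in-vehicle device (hypothetical schema)."""
    vehicle_id: str
    timestamp_s: float            # time of the sample
    latitude: float               # position information from the positioning sensor
    longitude: float
    altitude: float
    num_satellites: int           # number of satellites captured by the positioning sensor
    signal_strength_dbm: float    # received signal strength
    dop: float                    # Dilution of Precision (positioning accuracy information)
    vehicle_speed_kmh: float      # travel information: detected vehicle speed
    lateral_acceleration: float   # travel information: lateral acceleration sensor value
```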
  • the storage unit 32 stores road map information.
  • Road map information includes road location information, road shape information (curvature, etc.), road gradient information, intersection and branch point location information, number of lanes, lane width, and location information for each lane.
  • The positional information for each lane includes, for example, the position of the lane center and the positions of the lane boundaries.
  • the road surface profile generation unit 33 generates a road surface profile indicating road surface properties based on the position information and travel information of a plurality of vehicles 1b other than the target vehicle 1a acquired via the communication unit 31.
  • FIG. 3 is a diagram showing an example of a road surface profile.
  • the horizontal axis in the figure is the position in the traveling direction of the vehicle 1 along the driving lane, that is, the distance, and the vertical axis is the amount of unevenness (depth or height) of the road surface, that is, the road surface roughness.
  • the lateral acceleration of the vehicle 1 increases as the amount of unevenness of the road surface increases. Therefore, road surface properties and lateral acceleration have a predetermined correlation. This predetermined correlation is stored in the storage unit 32 in advance.
  • The road surface profile generation unit 33 calculates the amount of unevenness of the road surface corresponding to the vehicle position on the road from the lateral acceleration, and generates a road surface profile along the traveling direction of the vehicle 1 as shown in FIG. 3. The road surface profile information at each position of the road thus generated is stored in the storage unit 32 as part of the road map information.
  • The road surface profile detected via the lateral acceleration sensor may differ from vehicle to vehicle because the positions of the tires on the road surface differ.
  • the road surface profile generator 33 averages the road surface profiles detected by the lateral acceleration sensors of the vehicles 1, for example, to generate a representative road surface profile of each road surface.
  • the road surface profile generation unit 33 can also generate a road surface profile from data obtained by running a dedicated vehicle for measuring road surface properties. For example, it is possible to generate a road profile without using a lateral acceleration sensor by running a dedicated vehicle equipped with a laser profiler and acquiring the measurement data at that time together with the position data of the dedicated vehicle.
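  • A minimal sketch of the lateral-acceleration-based generation described above, assuming the stored correlation can be approximated by a linear gain and that probe traces are binned by distance along the lane (the gain and bin size are illustrative assumptions):

```python
import numpy as np


def unevenness_from_lateral_accel(lat_accel_ms2: np.ndarray,
                                  gain_m_per_ms2: float = 0.005) -> np.ndarray:
    """Convert lateral-acceleration samples to road-surface unevenness using the
    pre-stored correlation, approximated here by a simple linear gain (assumption)."""
    return gain_m_per_ms2 * np.abs(lat_accel_ms2)


def build_reference_profile(traces: list[tuple[np.ndarray, np.ndarray]],
                            lane_length_m: float,
                            bin_m: float = 5.0) -> np.ndarray:
    """Average per-vehicle profiles into one representative profile for a lane.

    Each trace is a pair (distance_m, lateral_accel_ms2) sampled along the lane.
    """
    edges = np.arange(0.0, lane_length_m + bin_m, bin_m)
    sums = np.zeros(len(edges) - 1)
    counts = np.zeros(len(edges) - 1)
    for distance_m, lat_accel in traces:
        unevenness = unevenness_from_lateral_accel(lat_accel)
        idx = np.clip(np.digitize(distance_m, edges) - 1, 0, len(edges) - 2)
        np.add.at(sums, idx, unevenness)
        np.add.at(counts, idx, 1)
    counts[counts == 0] = 1          # leave empty bins at zero instead of dividing by zero
    return sums / counts             # representative road surface profile per distance bin
```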
  • the road surface profile information is updated each time the road surface profile generation unit 33 generates a road surface profile.
  • Other road map information is updated at predetermined intervals or at arbitrary timing.
  • In this embodiment, it is assumed that the road surface profile (reference road surface profile) of each of the lanes LN1 to LN3 at the driving position of the vehicle 1 is already stored in the storage unit 32.
  • The vehicle speed change model generation unit 34 generates, based on vehicle data acquired via the communication unit 31, a vehicle speed change model that serves as a reference for changes in the vehicle speed of vehicles traveling in each of the lanes LN1 to LN3.
  • Vehicle data is unique data of each vehicle 1 including position information and vehicle speed information of the vehicle 1 .
  • the position information includes the number of satellites captured by the positioning sensor 10 , signal strength received by the positioning sensor 10 , and positioning accuracy information by the positioning sensor 10 .
  • The positioning accuracy information is, for example, DOP (Dilution of Precision) information, which indicates the degree of degradation of the positioning accuracy.
  • the vehicle speed change model generation unit 34 can construct a vehicle speed change model by machine learning using vehicle data for each lane obtained via the communication unit 31 .
  • a vehicle speed change model may be constructed by adding road surface profile information for each lane to vehicle data for each lane.
  • FIGS. 4A and 4B are diagrams showing examples of vehicle speed change models generated by the vehicle speed change model generation unit 34.
  • FIG. 4A shows a vehicle speed change model when there is no traffic congestion (during smooth traffic flow), and FIG. 4B shows a vehicle speed change model during traffic congestion.
  • FIGS. 4A and 4B plot the position of the vehicle 1 in the traveling direction along the road on the horizontal axis and the vehicle speed on the vertical axis, and the plotted characteristics serve as a reference for the vehicle speed of a vehicle traveling in each lane.
  • Characteristics f1 and f3 in the figure are vehicle speed change models of expressway RD1 (lane LN1 or LN2), and characteristics f2 and f4 are vehicle speed change models of general road RD2 (lane LN3). These characteristics are obtained by machine learning using a large number of vehicle data for each lane, and correspond to characteristics obtained by averaging the vehicle data.
  • a vehicle speed change model may be obtained by statistically processing vehicle data.
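  • As one concrete reading of the machine-learning and statistical-processing options mentioned above (the regressor type, features, and grouping are assumptions, not the patented method), a per-lane vehicle speed change model could be fitted as follows:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


def fit_speed_change_model(distance_m: np.ndarray, speed_kmh: np.ndarray):
    """Fit position -> expected vehicle speed for one lane and one traffic condition.

    The inputs pool probe data from many vehicles on that lane; a simple binned
    average of the same data would be the purely statistical alternative.
    """
    model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
    model.fit(distance_m.reshape(-1, 1), speed_kmh)
    return model


# Usage sketch: one model per (lane, congested or non-congested) combination,
# e.g. reference speed at 1200 m along lane LN3 under free-flowing traffic:
# v = models[("LN3", "free_flow")].predict(np.array([[1200.0]]))[0]
```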
  • the vehicle speed on expressways is faster than the vehicle speed on general roads.
  • the change in vehicle speed on expressways is smaller than the change in vehicle speed on general roads.
  • The position S in the figure represents the position of an intersection on the general road. Vehicles frequently stop at intersections, so the vehicle speed there is lower than at other locations.
  • the vehicle speed change model (characteristics f1, f3) on expressways is significantly different from the vehicle speed change model (characteristics f2, f4) on general roads.
  • the vehicle speed change model generated by the vehicle speed change model generation unit 34 is stored in the storage unit 32 .
  • This vehicle speed change model is updated each time a vehicle speed change model is generated by the vehicle speed change model generator 34 .
  • the vehicle speed change models of the lanes LN1 to LN3 at the driving position of the vehicle 1 are already stored in the storage unit 32.
  • FIG. 5 is a block diagram showing the functional configuration of the lane estimation device 101 according to this embodiment.
  • the lane estimation device 101 constitutes a part of the in-vehicle device 100 in FIG. 2 .
  • lane estimation device 101 includes positioning sensor 10 , communication unit 11 , sensor group 13 , switch group 14 , and controller 20 .
  • the positioning sensor 10, the communication unit 11, the sensor group 13, and the switch group 14 are each connected to the controller 20 so as to be communicable.
  • the sensor group 13 is a general term for a plurality of sensors that detect the running state of the vehicle 1.
  • the sensor group 13 includes a lateral acceleration sensor 131 that detects lateral acceleration of the vehicle 1, a vehicle speed sensor 132 that detects vehicle speed, and a steering angle sensor 133 that detects the steering angle of the steering wheel.
  • the switch group 14 is a general term for a plurality of switches that detect the running state of the vehicle 1 .
  • The switch group 14 includes a turn signal switch 141 (also referred to as a winker or blinker switch) that detects the driver's operation of the direction indicator.
  • the direction indicator is a device for indicating the direction to the surroundings when the vehicle 1 turns left or right or changes course, and is composed of a turn signal lever or the like.
  • the controller 20 is an electronic control unit including a computer having an arithmetic unit such as a CPU, a storage unit such as ROM and RAM, and other peripheral circuits.
  • the calculation unit of the controller 20 has an information acquisition unit 21 and a driving lane identification unit 25 as functional configurations.
  • the information acquisition section 21 has a position information acquisition section 211 , a travel information acquisition section 212 , a reference information acquisition section 213 and a road map information acquisition section 214 .
  • The storage unit of the controller 20 stores the predetermined correlation between road surface properties and lateral acceleration used when generating the road surface profile, threshold values for various determinations, and the like.
  • the position information acquisition unit 211 acquires current position information of the vehicle 1 detected by the positioning sensor 10 .
  • the travel information acquisition unit 212 acquires travel information of the vehicle 1 including various detection values detected by the sensor group 13 and the switch group 14 .
  • The reference information acquisition unit 213 acquires a vehicle speed change model, which indicates reference information on the vehicle speed, from the server device 3 via the communication unit 11. More specifically, the reference information acquisition unit 213 acquires the vehicle speed change models (FIGS. 4A and 4B) of the lanes LN1 to LN3 at the current position of the vehicle 1.
  • the road map information acquisition unit 214 acquires road map information from the server device 3 via the communication unit 11 .
  • More specifically, the road map information acquisition unit 214 acquires road map information including the lane information (driving lane information) of the road at the current position of the vehicle 1 detected by the positioning sensor 10 and the road surface profile information of each of the lanes LN1 to LN3.
  • The driving lane identification unit 25 identifies, from among the plurality of driving lanes LN1 to LN3, the driving lane corresponding to the current position of the vehicle 1 acquired by the position information acquisition unit 211, based on the travel information of the vehicle 1 acquired by the travel information acquisition unit 212, the vehicle speed change model acquired by the reference information acquisition unit 213, and the road map information of the road on which the vehicle 1 is traveling acquired by the road map information acquisition unit 214.
  • First, the driving lane is identified based on the detection value of the lateral acceleration sensor 131 and the road surface profile information included in the road map information. More specifically, the amount of unevenness of the road surface is calculated from the lateral acceleration detected by the lateral acceleration sensor 131 using the pre-stored correlation between road surface properties and lateral acceleration. When the vehicle 1 is turning or the like and lateral acceleration due to the turn is generated in the vehicle 1, the amount of unevenness of the road surface is calculated from the detection value of the lateral acceleration sensor 131 after correcting for that acceleration.
  • Then, a road surface profile representing the change in the amount of unevenness of the road surface along the traveling direction of the vehicle 1, that is, a measured road surface profile obtained as a measured value, is compared with the road surface profile of each lane included in the road map information, that is, the reference road surface profile, and the degree of matching between the measured road surface profile and the reference road surface profile is calculated for each lane. It is then determined whether or not the degree of matching is equal to or greater than a predetermined value, and if there is a reference road surface profile determined to be equal to or greater than the predetermined value, the lane having that reference road surface profile is identified as the current driving lane.
  • the driving lane identification unit 25 calculates the degree of matching of the road surface profile over a predetermined distance, averages the degrees of matching within the predetermined distance, and determines whether the degree of matching is equal to or greater than a predetermined value.
  • the degree of matching can be calculated using a correlation coefficient or the like. The degree of matching is sometimes called the degree of similarity.
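  • A minimal sketch of this matching step, using the Pearson correlation coefficient as the matching degree over profiles resampled onto a common distance grid (the threshold value is an assumption):

```python
import numpy as np


def profile_matching_degree(measured: np.ndarray, reference: np.ndarray) -> float:
    """Matching degree between measured and reference road surface profiles,
    computed here as the Pearson correlation coefficient (same-length arrays)."""
    if np.std(measured) == 0 or np.std(reference) == 0:
        return 0.0
    return float(np.corrcoef(measured, reference)[0, 1])


def identify_lane_by_profile(measured: np.ndarray,
                             reference_profiles: dict[str, np.ndarray],
                             threshold: float = 0.8) -> str | None:
    """Return the lane whose reference profile matches best, if the degree
    of matching reaches the predetermined value; otherwise return None."""
    if not reference_profiles:
        return None
    degrees = {lane: profile_matching_degree(measured, ref)
               for lane, ref in reference_profiles.items()}
    best_lane = max(degrees, key=degrees.get)
    return best_lane if degrees[best_lane] >= threshold else None
```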
  • When the road surface profile matching degree is not equal to or greater than the predetermined value for any lane, the driving lane identification unit 25 identifies the driving lane based on the detection value of the vehicle speed sensor 132 (the detected vehicle speed) and the vehicle speed change model. Specifically, it is first determined whether or not the road on which the vehicle 1 is traveling is congested, based on the degree of change in the detected vehicle speed. If it is determined that the road is not congested, the degree of matching between the detected vehicle speed and the vehicle speed change model for non-congested traffic (FIG. 4A) is calculated for each lane.
  • It is then determined whether or not the degree of matching is equal to or greater than a predetermined value, and if there is a vehicle speed change model determined to be equal to or greater than the predetermined value, the lane corresponding to that vehicle speed change model is identified as the current driving lane.
  • If it is determined that the road is congested, the degree of matching between the detected vehicle speed and the vehicle speed change model for congested traffic (FIG. 4B) is calculated for each lane in the same manner, and the lane corresponding to the vehicle speed change model whose degree of matching is equal to or greater than the predetermined value is identified as the current driving lane.
  • the driving lane identification unit 25 may calculate the degree of matching over a predetermined distance, average the degrees of matching within the predetermined distance, and determine whether the degree of matching is equal to or greater than a predetermined value.
  • the degree of matching can be calculated using a correlation coefficient or the like. The degree of matching is sometimes called the degree of similarity.
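  • A corresponding sketch for the vehicle-speed-based identification; the congestion test below (short-term fluctuation of the detected speed) is only one possible reading of "the degree of change in the vehicle speed detection value", and the thresholds are assumptions:

```python
import numpy as np


def is_congested(recent_speeds_kmh: np.ndarray, fluctuation_threshold: float = 15.0) -> bool:
    """Hypothetical congestion test based on the degree of change of the detected speed."""
    return float(np.std(recent_speeds_kmh)) > fluctuation_threshold


def identify_lane_by_speed(detected_speeds: np.ndarray,
                           models_free_flow: dict[str, np.ndarray],
                           models_congested: dict[str, np.ndarray],
                           threshold: float = 0.8) -> str | None:
    """Compare the detected speed series with each lane's vehicle speed change model
    (FIG. 4A for smooth traffic, FIG. 4B for congestion), sampled at the same positions."""
    models = models_congested if is_congested(detected_speeds) else models_free_flow
    best_lane, best_degree = None, -1.0
    for lane, model in models.items():
        if np.std(detected_speeds) == 0 or np.std(model) == 0:
            continue
        degree = float(np.corrcoef(detected_speeds, model)[0, 1])
        if degree > best_degree:
            best_lane, best_degree = lane, degree
    return best_lane if best_degree >= threshold else None
```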
  • Alternatively, the driving lane identification unit 25 may first estimate the driving lane based on the position information of the vehicle 1 obtained by the positioning sensor 10, and then compare the measured road surface profile with the reference road surface profile, or the detected vehicle speed with the vehicle speed change model, to determine whether or not the position-based estimate is correct, thereby identifying the driving lane.
  • In that case, the weighting of the estimation result based on the detection value of the positioning sensor 10 may be changed according to the positioning accuracy (for example, the DOP value). For example, when the positioning accuracy is high, the lane estimation based on the detection value of the positioning sensor 10 is weighted more heavily than the lane estimation based on the road surface profile matching degree and the lane estimation based on the vehicle speed matching degree.
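  • A sketch of this weighting idea; the 1/DOP weight mapping and the simple weighted vote are illustrative assumptions rather than the described implementation:

```python
def fuse_lane_estimates(lane_by_gnss: str | None,
                        lane_by_profile: str | None,
                        lane_by_speed: str | None,
                        dop: float) -> str | None:
    """Weighted vote over the three estimates; the positioning-based estimate is
    weighted more heavily when positioning accuracy is high (low DOP)."""
    gnss_weight = 1.0 / max(dop, 0.1)   # assumption: weight grows as DOP shrinks
    votes: dict[str, float] = {}
    for lane, weight in ((lane_by_gnss, gnss_weight),
                         (lane_by_profile, 1.0),
                         (lane_by_speed, 1.0)):
        if lane is not None:
            votes[lane] = votes.get(lane, 0.0) + weight
    return max(votes, key=votes.get) if votes else None
```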
  • When the road surface profile matching degree is not equal to or greater than the predetermined value and the vehicle speed matching degree is not equal to or greater than the predetermined value, the driving lane identification unit 25 determines whether or not there is a lane change based on the signal from the turn signal switch 141. That is, since the direction indicator is generally operated when changing lanes, a lane change of the vehicle 1 to the left or right can be determined based on the signal from the turn signal switch 141. For example, when the vehicle 1 is traveling in the center lane (second lane) of three lanes on one side (first, second, and third lanes), the signal from the turn signal switch 141 indicates whether the vehicle 1 has changed lanes to the right lane or to the left lane.
  • The operation of the direction indicator to the left and to the right is detected by, for example, separate switches. As a result, it can easily be determined, based on the signal from the turn signal switch 141, to which lane, left or right, the vehicle 1 has changed.
  • the driving lane identification unit can also determine whether or not there is a lane change based on the signal from the steering angle sensor 133. That is, since the steering wheel is operated when changing lanes, it is possible to determine whether or not there is a lane change by determining whether or not the detection value of steering angle sensor 133 is equal to or greater than a predetermined value.
  • the direction indicator is not always operated when changing lanes, or the direction indicator may be erroneously operated. Therefore, by using the detection value of the steering angle sensor 133, it is possible to determine whether or not there is a lane change with higher accuracy than when using the turn signal switch 141.
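  • A minimal sketch of the lane change determination from the turn signal switch and the steering angle sensor; the numeric threshold and the sign convention are assumptions (the description only requires the steering angle to reach a predetermined value):

```python
def detect_lane_change(turn_signal: str | None,
                       steering_angle_deg: float,
                       steering_threshold_deg: float = 10.0) -> str | None:
    """Return 'left', 'right', or None for this cycle."""
    if turn_signal in ("left", "right"):        # direction indicator operated
        return turn_signal
    if steering_angle_deg >= steering_threshold_deg:
        return "right"                          # assumed convention: positive angle = rightward
    if steering_angle_deg <= -steering_threshold_deg:
        return "left"
    return None
```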
  • FIG. 6 is a flowchart showing an example of processing executed by the controller 20 (CPU) according to a predetermined program.
  • the processing shown in this flowchart is executed when the vehicle 1 is traveling in any one of a plurality of traveling lanes extending substantially parallel to each other. That is, it is executed when it is necessary to estimate the driving lane, and is repeated at a predetermined cycle.
  • In step S1, the controller acquires the current position information of the vehicle 1 detected by the positioning sensor 10, the travel information of the vehicle 1 based on the signals from the sensor group 13 and the switch group 14, the road map information of the road being traveled obtained via the communication unit 11, and the vehicle speed change model of the road being traveled obtained via the communication unit 11.
  • In step S2, a measured road surface profile is obtained based on the detection value of the lateral acceleration sensor 131, and the degree of matching between the measured road surface profile and the reference road surface profile of each lane included in the road map information (the road surface profile matching degree) is calculated.
  • In step S3, it is determined whether or not there is a reference road surface profile whose matching degree with the measured road surface profile is equal to or greater than a predetermined value, that is, whether or not there is a lane whose road surface profile matching degree is equal to or greater than the predetermined value. If the result in step S3 is affirmative, the process proceeds to step S4; if the result is negative, the process proceeds to step S5.
  • In step S4, the lane having the reference road surface profile whose matching degree is equal to or greater than the predetermined value is estimated as the driving lane in which the vehicle 1 is traveling, and the process ends.
  • the estimated driving lane is temporarily stored in the storage unit of controller 20 .
  • In step S5, the degree of matching between the detection value of the vehicle speed sensor 132 and the vehicle speed change model of each lane (the vehicle speed matching degree) is calculated.
  • In step S6, it is determined whether or not there is a vehicle speed change model whose degree of matching with the detected vehicle speed is equal to or greater than a predetermined value, that is, whether or not there is a lane whose vehicle speed matching degree is equal to or greater than the predetermined value. If the result in step S6 is affirmative, the process proceeds to step S7; if the result is negative, the process proceeds to step S8.
  • In step S7, the lane corresponding to the vehicle speed change model whose degree of matching is equal to or greater than the predetermined value is estimated as the driving lane in which the vehicle 1 is traveling, and the process ends.
  • the estimated driving lane is temporarily stored in the storage unit of controller 20 .
  • In step S8, it is determined, based on the signal from the turn signal switch 141, whether or not the direction indicator has been operated. If the result in step S8 is affirmative, the process proceeds to step S9; if the result is negative, the process proceeds to step S10. In the affirmative case, in step S9 the driving lane is estimated based on the signal from the turn signal switch 141, assuming that there has been a lane change from the lane previously estimated in step S4 or step S7, and the process ends. In step S10, it is determined, based on the signal from the steering angle sensor 133, whether or not the steering wheel has been operated by a predetermined amount or more.
  • If the result in step S10 is affirmative, the process proceeds to step S9; if the result is negative, the process proceeds to step S11.
  • In this case, in step S9 the driving lane is estimated based on the signal from the steering angle sensor 133, assuming that there has been a lane change from the lane previously estimated in step S4 or step S7, and the process ends.
  • In step S11, the driving lane is estimated assuming that there has been no lane change from the lane previously estimated in step S4 or step S7, and the process ends.
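  • Pulling the flowchart of FIG. 6 together, one cycle of the controller processing could be sketched as follows, reusing the helper sketches above; the context object `ctx` and its attribute names (including the adjacent_lane helper) are assumptions:

```python
def estimate_driving_lane(ctx) -> str | None:
    """One cycle corresponding to steps S1 to S11 of FIG. 6 (illustrative only).

    `ctx` is assumed to carry the data gathered in step S1: the measured road
    surface profile, per-lane reference profiles, the detected speed series,
    per-lane vehicle speed change models, turn-signal and steering-angle
    readings, and the previously estimated lane.
    """
    # S2-S4: road surface profile matching degree
    lane = identify_lane_by_profile(ctx.measured_profile, ctx.reference_profiles)
    if lane is not None:
        return lane

    # S5-S7: vehicle speed matching degree against the per-lane speed change models
    lane = identify_lane_by_speed(ctx.detected_speeds,
                                  ctx.models_free_flow, ctx.models_congested)
    if lane is not None:
        return lane

    # S8-S10: neither matching degree reached the threshold; check for a lane change
    change = detect_lane_change(ctx.turn_signal, ctx.steering_angle_deg)
    if change is not None:
        # S9: shift from the previously estimated lane in the detected direction
        return ctx.adjacent_lane(ctx.previous_lane, change)   # hypothetical helper

    # S11: no lane change detected; keep the previously estimated lane
    return ctx.previous_lane
```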
  • Note that an algorithm such as a support vector machine may be used to classify the vehicle speed change models into two groups, such as expressways and general roads.
  • A model such as an RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) may also be used to estimate the lane from the detected vehicle speed.
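  • A hedged sketch of the support-vector-machine alternative: classifying a window of detected speeds as expressway or general road (the features and kernel choice are assumptions; the description only names the classifier):

```python
import numpy as np
from sklearn.svm import SVC


def speed_features(speed_window_kmh: np.ndarray) -> np.ndarray:
    """Simple features for one window of detected speeds: mean level and fluctuation."""
    return np.array([speed_window_kmh.mean(), speed_window_kmh.std()])


def train_road_type_classifier(windows: list[np.ndarray], labels: list[int]) -> SVC:
    """Train an SVM separating expressway (label 1) from general road (label 0) windows."""
    X = np.vstack([speed_features(w) for w in windows])
    clf = SVC(kernel="rbf")
    clf.fit(X, np.array(labels))
    return clf


# Usage sketch:
# road_type = clf.predict(speed_features(recent_speeds).reshape(1, -1))[0]
```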
  • In the above embodiment, the road surface profile matching degree determination processing is performed before the vehicle speed matching degree determination processing, but the vehicle speed matching degree determination processing may instead be given priority over the road surface profile matching degree determination processing. Alternatively, the vehicle speed matching degree may be weighted more heavily than the road surface profile matching degree when estimating the driving lane.
  • The operation of the lane estimation device 101 can be summarized as follows. As shown in FIG. 1, attention is paid to the target vehicle 1a in an area surrounded by a group of buildings and having a plurality of lanes LN1 to LN3 extending parallel to each other. Since the target vehicle 1a is traveling on the general road RD2, its vehicle speed is lower than that of the vehicles 1b traveling on the expressway RD1. Therefore, when the vehicle speed of the target vehicle 1a detected by the vehicle speed sensor 132 (the detected vehicle speed) is compared with the vehicle speed change models of the lanes LN1 to LN3 acquired from the server device 3, the degree of matching is highest between the detected vehicle speed and the vehicle speed change model of lane LN3. Accordingly, it can be estimated that the target vehicle 1a is traveling in lane LN3 (step S7).
  • By estimating the driving lane based on the detected vehicle speed in this way, the driving lane can be estimated accurately even in situations where positioning accuracy is degraded, for example because the road is surrounded by buildings. Furthermore, in this embodiment, if the degree of matching between the measured road surface profile based on the signal from the lateral acceleration sensor 131 and a reference road surface profile obtained from the server device 3 is equal to or greater than a predetermined value, the lane having that reference road surface profile (for example, lane LN3) is estimated as the driving lane (step S4). Therefore, even if the degree of matching between the detected vehicle speed and the vehicle speed change model is less than the predetermined value, the driving lane can be accurately estimated based on the road surface profile matching degree.
  • In addition, the direction indicator and the steering wheel are operated when changing lanes. Therefore, based on the signals from the turn signal switch 141 and the steering angle sensor 133, it can be determined whether or not a vehicle, such as the vehicle 1b traveling in lane LN1, has changed lanes (step S9). If it is determined that there has been a lane change, it can be estimated that the vehicle is now traveling in lane LN2.
  • As described above, the lane estimation device 101 according to this embodiment includes: the position information acquisition unit 211 that acquires position information of the current position of the vehicle 1 obtained by the positioning sensor 10, which receives signals transmitted from the positioning satellites 2 and measures the position of the vehicle 1; the travel information acquisition unit 212 that acquires travel information of the vehicle 1 including vehicle speed information of the vehicle 1 and information on the detection value of the lateral acceleration sensor 131, which changes according to the road surface profile of the road surface on which the vehicle 1 is traveling; the reference information acquisition unit 213 that acquires a vehicle speed change model serving as a reference for vehicle speed changes in each of the plurality of driving lanes LN1 to LN3 extending substantially parallel to each other; the road map information acquisition unit 214 that acquires road map information including driving lane information and road surface profile information; and the driving lane identification unit 25 that identifies, from among the driving lanes LN1 to LN3, the driving lane corresponding to the position of the vehicle 1 determined by the position information, based on the acquired travel information, vehicle speed change model, and road map information (FIG. 5).
  • With this configuration, since the road surface profile information and the vehicle speed information are used to estimate the driving lane, the driving lane can be estimated accurately even in a situation where the positioning accuracy of the positioning sensor 10 is degraded, such as in an area where high-rise buildings stand or in a tunnel. That is, even in a region where, for example, an expressway and a general road extend substantially parallel to each other and a plurality of driving lanes LN1 to LN3 with different vehicle speed change characteristics (vehicle speed change models) extend side by side, the driving lane can be estimated well.
  • the positions of the tires in the width direction in the lane in which the vehicle 1 travels may differ from vehicle to vehicle, so it is difficult to accurately estimate the driving lane only based on the road surface profile.
  • a vehicle speed change model corresponding to each of the plurality of driving lanes LN1 to LN3 is constructed by machine learning. As a result, it is possible to use a vehicle speed change model that satisfactorily reflects the characteristic of vehicle speed change for each lane, and to perform lane estimation satisfactorily.
  • the vehicle speed change model includes a vehicle speed change model when traffic congestion occurs and a vehicle speed change model when traffic congestion does not occur (Figs. 4A and 4B). Although the characteristics of vehicle speed change are significantly different between congested traffic and non-congested traffic, a vehicle speed change model is prepared in consideration of this point, so lane estimation can be performed with higher accuracy.
  • the travel information acquired by the travel information acquisition unit 212 further includes operation information of the direction indicator.
  • The driving lane identification unit 25 further identifies the driving lane based on the operation information of the direction indicator (the signal from the turn signal switch 141) (FIG. 5). As a result, it is possible to determine whether or not there is a lane change and to perform lane estimation with higher accuracy.
  • the travel information acquired by the travel information acquisition unit 212 further includes steering angle information indicating the steering angle of the steering wheel.
  • the driving lane identification unit 25 further identifies the driving lane based on steering angle information of the steering wheel (signal from the steering angle sensor 133) (FIG. 5). As a result, it is possible to determine whether or not there is a lane change, and to further improve the accuracy of lane estimation.
  • The operation of the lane estimation device 101 of this embodiment can also be expressed as a lane estimation method. That is, the method includes acquiring the position information, the travel information, the vehicle speed reference information, and the road map information (step S1), and causing a computer (the controller 20) to execute the step of identifying, from among the driving lanes LN1 to LN3, the driving lane corresponding to the position of the vehicle 1 determined by the position information (steps S2 to S7) (FIG. 6). As a result, the lane in which the vehicle 1 is traveling can be accurately estimated.
  • In the above embodiment, the travel information acquisition unit 212 acquires the travel information of the vehicle 1 including the information on the detection value (sensor value) of the lateral acceleration sensor 131, but the travel information may instead include the detection value of another detector whose output changes according to the road surface profile.
  • In the above embodiment, the reference information acquisition unit 213 acquires, as the vehicle speed reference information, the vehicle speed change models (FIGS. 4A and 4B) that serve as a reference for vehicle speed changes in each of the plurality of driving lanes LN1 to LN3 extending substantially parallel to each other in the road group RD including the elevated expressway RD1 and the general road RD2. However, the vehicle speed reference information is not limited to the above. For example, different vehicle speed reference information may be acquired according to time zones such as daytime and nighttime. Therefore, the vehicle speed change model is not limited to a model for congested traffic and a model for non-congested traffic.
  • the vehicle speed change model may be constructed by techniques other than machine learning (for example, statistical processing).
  • In the above embodiment, the road group RD having a plurality of driving lanes extending substantially parallel to each other is composed of the roads RD1 and RD2 having different heights, but the road group is not limited to this and may be composed of a plurality of roads having the same height. For example, when a general road extends alongside an expressway, the expressway and the general road may form a road group.
  • In the above embodiment, the presence or absence of a lane change is determined based on the operation information of the direction indicator and the operation information of the steering wheel, but it may instead be determined based on signals from other sensors or switches.
  • Although the lane estimation device 101 is installed in the vehicle 1 in the above embodiment, some or all of the functions of the lane estimation device 101 may be provided in the server device 3.
  • the lane estimation device 101 is applied to a manually driven vehicle, but the lane estimation device 101 of the present invention can also be applied to an automatically driven vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a lane estimation device comprising: a position information acquisition unit that acquires position information obtained by a positioning sensor; a travel information acquisition unit that acquires vehicle travel information including vehicle speed information and information concerning detected values of the road surface profile of the road surface on which the vehicle is traveling; a reference information acquisition unit that acquires vehicle speed reference information, which serves as the reference for vehicle speed changes in each of multiple travel lanes extending in mutually parallel directions; a road map information acquisition unit that acquires road map information including travel lane information for the road and road surface profile information; and a travel lane identification unit that, on the basis of the acquired travel information, the vehicle speed reference information, and the road map information, identifies, from among the multiple travel lanes, the travel lane corresponding to the position of the vehicle as determined on the basis of the position information.
PCT/JP2022/034053 2021-09-14 2022-09-12 Lane estimation device and lane estimation method WO2023042791A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023548457A JPWO2023042791A1 (fr) 2021-09-14 2022-09-12

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021149227 2021-09-14
JP2021-149227 2021-09-14

Publications (1)

Publication Number Publication Date
WO2023042791A1 true WO2023042791A1 (fr) 2023-03-23

Family

ID=85602881

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/034053 WO2023042791A1 (fr) 2021-09-14 2022-09-12 Lane estimation device and lane estimation method

Country Status (2)

Country Link
JP (1) JPWO2023042791A1 (fr)
WO (1) WO2023042791A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010152648A * 2008-12-25 2010-07-08 Nissan Motor Co Ltd Vehicle control device
JP2011113547A * 2009-11-30 2011-06-09 Sumitomo Electric Ind Ltd Traffic information estimation device, computer program for traffic information estimation, and traffic information estimation method
JP2016045063A * 2014-08-22 2016-04-04 株式会社日立製作所 Road surface property measuring device and road surface property measuring method
JP2017096963A * 2016-12-22 2017-06-01 トムトム ベルギー ネムローゼ フエンノートシャップTomTom Belgium N.V. Navigation method and system
JP2019040378A * 2017-08-25 2019-03-14 住友電気工業株式会社 Computer program, traveling road determination method, traveling road determination device, in-vehicle device, and data structure

Also Published As

Publication number Publication date
JPWO2023042791A1 (fr) 2023-03-23

Similar Documents

Publication Publication Date Title
US11846522B2 (en) Warning polygons for weather from vehicle sensor data
US11685431B2 (en) Steering angle calibration
JP6985203B2 (ja) Behavior prediction device
US9140792B2 (en) System and method for sensor based environmental model construction
JP7020348B2 (ja) Own vehicle position estimation device
US11613253B2 (en) Method of monitoring localization functions in an autonomous driving vehicle
JP6936658B2 (ja) Driving support device for vehicle
CN110871796A (zh) Lane keeping control device
WO2015052577A1 (fr) Traffic lane guidance system for vehicle and traffic lane guidance method for vehicle
JP7151187B2 (ja) Road sign recognition device
JP7059817B2 (ja) Driving support device
US20200250864A1 (en) Hazard warning polygons constrained based on end-use device
US11420632B2 (en) Output device, control method, program and storage medium
CN109425861B (zh) Own vehicle position reliability calculation device
JP6790951B2 (ja) Map information learning method and map information learning device
CN111619577B (zh) Server and vehicle control system
CN113753072A (zh) Automatic comfort scoring system based on human driving reference data
JP2023168399A (ja) Map data generation method
WO2021235209A1 (fr) Travel lane estimation device and travel lane estimation method
WO2023042791A1 (fr) Lane estimation device and lane estimation method
US20190295283A1 (en) Other vehicle position estimation apparatus
JP7037736B2 (ja) Road surface information collection system
US20210261116A1 (en) Information processing device and driving assistance device
JP7276653B2 (ja) Vehicle traffic management device, in-vehicle device, vehicle traffic management system, vehicle traffic management method, and vehicle traffic management program
JP7358447B2 (ja) Road management device and road management method

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22869938

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023548457

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE