WO2015087502A1 - Dispositif d'autolocalisation de véhicule - Google Patents

Dispositif d'autolocalisation de véhicule

Info

Publication number
WO2015087502A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
calculated
road
vehicle position
Prior art date
Application number
PCT/JP2014/005900
Other languages
English (en)
Japanese (ja)
Inventor
林 和美
Original Assignee
株式会社デンソー (DENSO Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO Corporation)
Publication of WO2015087502A1

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3602Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera

Definitions

  • This disclosure relates to a vehicle position detection device used in a vehicle.
  • The own vehicle position detection device mounted on a vehicle detects the current position of the host vehicle (the own vehicle position) and displays to the driver a map image of the area around the own vehicle position, with the vehicle position superimposed on the map image.
  • A self-contained (dead-reckoning) navigation method is known in which the vehicle position is updated based on detection values from a self-supporting sensor that detects changes in the behavior of the vehicle.
  • Examples of the self-supporting sensor include a vehicle speed sensor for detecting the moving distance as the behavior change of the host vehicle, and a gyroscope for detecting the traveling direction of the vehicle.
  • Patent Document 1 discloses a technique for correcting the position of a vehicle calculated by self-contained navigation based on a GPS positioning signal when the travel distance becomes a predetermined value or more.
  • However, the detection accuracy of the vehicle position based on the GPS positioning signal may be low depending on the situation.
  • In the technique of Patent Document 1, the vehicle position is corrected based on the GPS positioning signal, and thus there is a problem that the vehicle position cannot be detected accurately depending on the situation.
  • the present disclosure has been made in view of such problems, and an object of the present disclosure is to provide a vehicle position detection device that accurately detects the vehicle position.
  • According to one aspect of the present disclosure, an own vehicle position detection device used in a vehicle includes an image generation unit, a marking recognition unit, a map data acquisition unit, and a position correction unit.
  • The image generation unit generates a captured image of the road around the vehicle as viewed from the vertical direction, based on imaging information from imaging devices mounted on the vehicle.
  • The marking recognition unit recognizes the shape of a road marking drawn on the road in the captured image.
  • the map data acquisition unit acquires map data representing a map image including a road marking shape.
  • The position correction unit corrects the absolute position of the host vehicle by comparing the shape of the road marking recognized by the marking recognition unit with the shape of the road marking that is included in the map image and exists around the host vehicle.
  • Since the vehicle position is specified based on a comparison between the map image and the captured image, the vehicle position can be detected more accurately than when it is detected based on a GPS positioning signal.
  • FIG. 1 is a block diagram illustrating a configuration of a host vehicle position detection device according to an embodiment of the present disclosure.
  • FIG. 2A is a diagram showing a camera mounting position in a side view
  • FIG. 2B is a diagram showing a camera mounting position and an imaging range in a top view
  • FIG. 3 is a diagram illustrating an example of a GPS reception difficult area
  • FIG. 4 is a flowchart of the vehicle position calculation process.
  • FIG. 5 is a flowchart of the reset process.
  • FIG. 6 is a diagram showing an example of a characteristic road marking.
  • FIG. 7 is a flowchart of white line use update processing.
  • FIG. 8 is a flowchart of white line count processing.
  • FIG. 9 is a diagram showing an example of specifying the vehicle position using a broken-line white line.
  • FIG. 10 is a flowchart of the oncoming vehicle update process.
  • FIG. 11 is a diagram illustrating an example of updating the own vehicle position using travel data of the oncoming vehicle.
  • FIG. 12 is a diagram illustrating a state in which the imaging range of the captured image in the map image is specified by comparing the road marking in the captured image with the road marking in the map image in another embodiment of the present disclosure;
  • FIG. 13 is a diagram illustrating a calculation example of the own vehicle position according to another embodiment.
  • the own vehicle position detection device 10 includes a wireless communication device 11, an imaging unit 20, a GPS receiver 31, a position detection unit 40, a map database (map DB) 51, a control device 61, and a display device 71.
  • the wireless communication device 11 is a device for performing wireless communication (vehicle-to-vehicle communication) with other vehicles existing around the vehicle (in a communication area where radio waves reach).
  • the imaging unit 20 includes a front camera 21, a rear camera 22, a right camera 23, and a left camera 24 as shown in FIGS. 2 (a) and 2 (b).
  • the front camera 21 is a camera for imaging a road ahead of the vehicle 1, and is installed, for example, at the center of the front bumper of the vehicle 1.
  • the rear camera 22 is a camera for capturing an image of a road behind the vehicle 1, and is installed, for example, at the center of the rear bumper of the vehicle 1.
  • the right side camera 23 is a camera for imaging the road on the right side of the vehicle 1, and is installed at the tip position of the right door mirror of the vehicle 1, for example.
  • the left side camera 24 is a camera for imaging the road on the left side of the vehicle 1, and is installed at the tip position of the left door mirror of the vehicle 1, for example. In this way, the cameras 21 to 24 are arranged so as to image the road around the host vehicle 1. The cameras 21 to 24 are installed so as to face downward from the horizontal direction in order to take an image of the road.
  • the cameras 21 to 24 output imaging information (image data) representing the captured image to the control device 61.
  • Based on the imaging information from the cameras 21 to 24, the control device 61 generates a captured image whose viewpoint has been converted so that the road around the host vehicle 1 is viewed from directly above in the vertical direction.
  • The imaging range of the cameras 21 to 24 covers the entire circumference of the host vehicle 1, as indicated by the hatched portion 25 in FIG. 2B. Note that these four cameras 21 to 24 capture images at the same predetermined cycle.
  • The predetermined cycle (imaging cycle) here refers to, for example, a cycle at which an image is captured at least once every 3 m of traveling distance even when traveling at the assumed maximum speed.
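  • For instance, assuming a maximum speed of 120 km/h (a value not stated in the text; only the 3 m spacing is), the implied imaging period can be worked out as in the following sketch.

```python
# Minimal sketch: the longest imaging period that still captures an image at least once
# every `spacing_m` of travel at the assumed maximum speed. The 3 m spacing comes from
# the text above; the 120 km/h maximum speed is an assumed figure.

def required_imaging_period(spacing_m: float = 3.0, v_max_kmh: float = 120.0) -> float:
    """Return the longest allowable imaging period in seconds."""
    v_max_ms = v_max_kmh / 3.6        # km/h -> m/s
    return spacing_m / v_max_ms       # time needed to travel spacing_m at v_max

period = required_imaging_period()
print(f"imaging period <= {period:.3f} s (about {1.0 / period:.1f} frames per second)")
# -> imaging period <= 0.090 s (about 11.1 frames per second)
```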
  • the GPS receiver 31 receives a positioning signal from a GPS artificial satellite, and detects the vehicle position (latitude and longitude) that is the current position of the host vehicle 1 based on the received positioning signal.
  • the position detection unit 40 includes a gyroscope 41, a vehicle speed sensor 42, and a G sensor 43.
  • the gyroscope 41 detects the magnitude of the rotational motion applied to the host vehicle 1.
  • The vehicle speed sensor 42 detects the rotational speed of the axle based on the number of pulses per unit time output from a pulse generator attached to the axle of the host vehicle 1, and calculates the speed of the host vehicle 1 based on the detected rotational speed.
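  • As an illustration of this pulse-to-speed conversion, the sketch below uses hypothetical calibration values (pulses per axle revolution and tire circumference) that are not given in the text.

```python
# Minimal sketch (assumed calibration, not the patent's implementation): deriving the
# vehicle speed from the pulse generator attached to the axle, as described for the
# vehicle speed sensor 42.

def vehicle_speed_ms(pulse_count: int, dt_s: float,
                     pulses_per_rev: int = 48,
                     tire_circumference_m: float = 1.95) -> float:
    """Speed in m/s from the number of pulses counted over dt_s seconds."""
    rev_per_s = (pulse_count / dt_s) / pulses_per_rev   # axle rotational speed
    return rev_per_s * tire_circumference_m             # distance travelled per second

print(f"{vehicle_speed_ms(pulse_count=246, dt_s=1.0) * 3.6:.1f} km/h")   # ~36.0 km/h
```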
  • the G sensor 43 detects the acceleration in the front-rear direction of the host vehicle 1.
  • the map database (map DB) 51 is a storage device that stores map data representing a map showing roads on which the vehicle 1 can travel.
  • the road marking here is a sign indicating regulation or instruction regarding road traffic, and is a line, symbol, or character drawn on the road surface.
  • Markings indicating regulations include no-turn markings, no-overtaking markings, no-parking/no-stopping markings, and the like.
  • Markings indicating instructions include pedestrian crossings, stop lines, center lines, lane boundary lines, and the like.
  • the map data representing the map includes data that can specify the dimensions of each part of these road markings.
  • The map also shows an area Q where it is difficult to receive positioning signals from GPS satellites (referred to as a GPS reception difficult area), such as a zone densely built with high-rise buildings, as shown in FIG. 3.
  • the display device 71 is a device for displaying an image to the passenger of the vehicle 1. For example, on the display screen, a map image based on the map data input from the map database 51 and a mark indicating the position of the vehicle displayed superimposed on the map image are displayed.
  • The control device 61 is configured as a known microcomputer centered on a CPU 62, a ROM 63, and a RAM 64.
  • the control device 61 (specifically, the CPU 62) executes various processes based on the program stored in the ROM 63.
  • The control device 61 executes, for example, processing for reading map data representing a map around the vehicle position from the map database 51 and displaying it on the display device 71, processing for displaying a mark indicating the vehicle position on the map, and the like.
  • The vehicle position is calculated by well-known self-contained (dead-reckoning) navigation. Specifically, the amount of heading change since the own vehicle position calculated last time (the existing own vehicle position) is calculated based on the detection signal from the gyroscope 41, and the moving distance is calculated based on the detection signals from the vehicle speed sensor 42 and the G sensor 43. The displacement calculated from these behavior changes (heading change amount and moving distance) is then added to the existing own vehicle position to update the vehicle position.
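  • A minimal sketch of this dead-reckoning update (apparently the S10 step referred to later) follows; the East/North coordinate frame and variable names are illustrative assumptions, not the patent's notation.

```python
# Dead-reckoning update sketch: add the displacement derived from the behaviour change
# (heading change from the gyroscope, moving distance from the speed/G sensors) to the
# existing own vehicle position.
import math

def dead_reckoning_update(x: float, y: float, heading_rad: float,
                          delta_heading_rad: float, distance_m: float):
    """Return the updated (x, y, heading) after one update cycle."""
    new_heading = heading_rad + delta_heading_rad
    new_x = x + distance_m * math.cos(new_heading)   # East component (assumed frame)
    new_y = y + distance_m * math.sin(new_heading)   # North component (assumed frame)
    return new_x, new_y, new_heading

# One 100 ms step: a 0.5 degree turn while travelling 0.8 m.
x, y, heading = dead_reckoning_update(0.0, 0.0, 0.0, math.radians(0.5), 0.8)
print(x, y, heading)
```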
  • the host vehicle position calculated (estimated) by the control device 61 is also referred to as a calculated host vehicle position in order to distinguish it from the actual (true) host vehicle position.
  • The reset referred to here is a process for clearing the detection error accumulated by self-contained navigation, and corresponds to S130, S175, S235, S440, or S450 described later.
  • If it is not yet the reset period, the process proceeds to S30; if it is the reset period, the process proceeds to S40 and the reset process is executed.
  • The reset process of S40 executes one of the following: a map reset, which corrects the calculated vehicle position by comparing the shape of a road marking included in the map image with the shape of the road marking included in the captured image; an oncoming vehicle reset, which corrects the calculated host vehicle position using the position of an oncoming vehicle and information acquired from that oncoming vehicle; a GPS reset, which corrects the calculated host vehicle position based on a positioning signal from a GPS satellite; and the like.
  • The detection signals (output values) of the self-supporting sensors (the gyroscope 41, the vehicle speed sensor 42, and the G sensor 43) included in the position detection unit 40 contain errors, and these errors accumulate as the traveling distance increases.
  • As a result, the deviation (detection error) between the calculated own vehicle position based on self-contained navigation and the actual (true) own vehicle position increases.
  • detection errors are reset (corrected) by executing a map reset, a GPS reset, or the like.
  • In S30, to which the process proceeds when it is determined in S20 that the reset period has not been reached, it is determined based on the map image represented by the map data whether the calculated vehicle position is less than a predetermined distance from a GPS reception difficult area. For example, as shown in FIG. 3, assuming that the host vehicle 1 travels along the road from the host vehicle position P, it is determined whether the distance until the host vehicle 1 enters the GPS reception difficult area Q is less than the predetermined distance.
  • If the distance is not less than the predetermined distance, the present vehicle position calculation process is terminated.
  • If the distance is less than the predetermined distance, the process proceeds to S40, the reset process is executed, and the present vehicle position calculation process is terminated.
  • The reason the reset process is executed when the distance to the GPS reception difficult area is less than the predetermined distance is as follows: in a GPS reception difficult area it is difficult to receive positioning signals from GPS satellites and there are fewer opportunities to perform a GPS reset, so it is desirable to reset the accumulated detection error before entering such an area.
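  • As a rough illustration of the S10 to S40 flow just described, the sketch below strings the steps together; the helper callables and the 200 m threshold are placeholders rather than values from the patent.

```python
# One cycle of the vehicle position calculation process: dead-reckoning update (S10),
# reset-period check (S20), proximity check to a GPS reception difficult area (S30),
# and the reset process (S40).

def vehicle_position_step(update_position, reset_period_elapsed,
                          distance_to_gps_difficult_area_m, run_reset,
                          threshold_m: float = 200.0):
    position = update_position()                                  # S10
    if reset_period_elapsed():                                    # S20
        return run_reset(position)                                # S40
    if distance_to_gps_difficult_area_m(position) < threshold_m:  # S30
        return run_reset(position)                                # S40 (pre-emptive reset)
    return position                                               # keep dead-reckoned position

# Example with stand-in callables:
pos = vehicle_position_step(
    update_position=lambda: (10.0, 5.0),
    reset_period_elapsed=lambda: False,
    distance_to_gps_difficult_area_m=lambda p: 150.0,
    run_reset=lambda p: p,   # a real reset would correct p here
)
print(pos)
```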
  • S105 to S145 are processes for the map reset,
  • S150 to S160 are processes for the oncoming vehicle reset, and
  • S175 is a process for the GPS reset.
  • In S100, it is determined based on the map image represented by the map data whether any characteristic road marking exists within a predetermined range centered on the calculated vehicle position.
  • If no characteristic road marking exists within the predetermined range, the process proceeds to S140.
  • If a characteristic road marking exists, the process proceeds to S105.
  • A characteristic road marking is, as shown in the example of FIG. 6, any of the plural types of road markings 201 to 205 drawn on the road, excluding road markings that are drawn continuously such as the lane markings dividing the road (the white lines 204 and 205 and the broken white line 203).
  • The map reset is a process of identifying the vehicle position (latitude and longitude) by comparing the shape of a road marking included in the map image with the shape of the road marking included in the captured image.
  • Accordingly, the processing from S105 onward is performed using a characteristic road marking, which excludes road markings drawn continuously.
  • the pedestrian crossing 201 and the stop line 202 drawn in front of the pedestrian crossing correspond to characteristic road markings.
  • signs representing various regulations and instructions such as turning prohibition, parking and stopping prohibited, and traveling direction correspond to characteristic road markings.
  • In S105, the imaging information output from the imaging unit 20 (cameras 21 to 24) is acquired.
  • In S110, a captured image in which the road around the host vehicle is viewed from directly above in the vertical direction is generated.
  • In S115, the shape of the characteristic road marking is recognized in the captured image generated in S110. Specifically, among the various characteristic road markings drawn on roads, the characteristic road marking determined in S100 to exist within the predetermined range (or, when there are several, each of them) is recognized in the captured image by a known pattern matching technique.
  • In S120, map data representing a map image within the predetermined range centered on the calculated vehicle position, that is, the map image for which the presence or absence of a characteristic road marking was determined in S100, is acquired from the map database 51.
  • In S125, the vehicle position is identified by comparing the shape of the characteristic road marking recognized in the captured image in S115 with the shape of the characteristic road marking included in the map image represented by the map data acquired in S120. That is, since the arrangement of the cameras 21 to 24 with respect to the host vehicle 1 is fixed and the captured image is generated by a fixed method, the position of the host vehicle 1 within the captured image is constant. The position (imaging range) of the captured image within the map image is then specified by comparing the map image with the captured image, and as a result the vehicle position is specified in the map image.
  • In S130, the calculated vehicle position is corrected (the detection error is reset) by replacing it with the vehicle position specified in S125. The reset process then ends.
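  • The comparison in S125 can be pictured as sliding the top-view captured image over the map image and scoring the agreement of road-marking pixels; the sketch below uses toy binary rasters and assumed sizes purely for illustration.

```python
# Locate the imaging range of the captured image inside the map image by exhaustive
# template matching on road-marking masks, then place the vehicle at its fixed offset
# within the captured image (the offset is fixed because the camera arrangement is fixed).
import numpy as np

def locate_imaging_range(map_mask: np.ndarray, captured_mask: np.ndarray):
    """Return (row, col) of the map window that best matches the captured image."""
    h, w = captured_mask.shape
    best_score, best_rc = -1, (0, 0)
    for r in range(map_mask.shape[0] - h + 1):
        for c in range(map_mask.shape[1] - w + 1):
            score = int(np.sum(map_mask[r:r + h, c:c + w] == captured_mask))
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# Toy 1-bit rasters: 1 = road-marking pixel (a stop-line-like stroke plus a short
# perpendicular stroke so the pattern is not ambiguous).
map_mask = np.zeros((20, 20), dtype=np.uint8)
map_mask[8, 4:16] = 1
map_mask[4:9, 10] = 1
captured_mask = map_mask[6:12, 8:14].copy()      # what the top-view image would contain

r, c = locate_imaging_range(map_mask, captured_mask)   # -> (6, 8)
vehicle_cell = (r + 3, c + 3)   # vehicle sits at a fixed cell inside the captured image
print(vehicle_cell)
```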
  • In S140, to which the process proceeds when it is determined in S100 that no characteristic road marking exists within the predetermined range, it is determined whether a broken white line exists within the same predetermined range as in S100, centered on the calculated vehicle position.
  • the broken white line 203 is composed of a plurality of rectangular unit lines 210.
  • In S150, to which the process proceeds when it is determined in S140 that no broken white line exists within the predetermined range, the imaging information output from the imaging unit 20 is acquired.
  • In S155, it is determined whether another vehicle traveling in the opposite lane (on the right side of the host vehicle 1 in this embodiment) in the direction opposite to that of the host vehicle 1 (referred to as an oncoming vehicle) is detected in the image represented by the imaging information acquired in S150. If no oncoming vehicle is detected, the process proceeds to S170; if an oncoming vehicle is detected, the process proceeds to S160.
  • In S160, an oncoming vehicle update process for correcting the calculated host vehicle position using the information (travel data) acquired from the oncoming vehicle detected in S155 is executed (details will be described later). The reset process then ends.
  • In S170, to which the process proceeds when no oncoming vehicle is detected in S155, it is determined whether a positioning signal from a GPS satellite has been received.
  • The positioning signal is regarded as received when the positioning signals necessary for detecting the vehicle position have been obtained. If no GPS positioning signal has been received, the reset process is terminated.
  • If a GPS positioning signal has been received, in S175 the calculated vehicle position is corrected (the detection error is reset) by replacing it with the vehicle position based on the GPS positioning signal received in S170. The reset process then ends.
  • In the reset process described above, when a characteristic road marking exists around the calculated vehicle position, the calculated vehicle position is corrected using that characteristic road marking. When there is no characteristic road marking but a broken white line exists around the calculated vehicle position, the calculated vehicle position is corrected using the broken white line. When neither a characteristic road marking nor a broken white line exists around the calculated vehicle position and an oncoming vehicle is imaged by the imaging unit 20, the calculated vehicle position is corrected using the travel data acquired from the oncoming vehicle. When no characteristic road marking, broken white line, or oncoming vehicle is imaged by the imaging unit 20, the calculated vehicle position is corrected using a GPS positioning signal if one is received. If none of these applies, the calculated vehicle position from self-contained navigation is used as it is, without correction, in the vehicle position calculation process.
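  • The priority order summarised above can be sketched as a simple cascade; in the sketch below, None stands for "that correction source is unavailable this cycle", and all inputs are stand-ins rather than the patent's data structures.

```python
# Reset-process priority: characteristic road marking > broken white line >
# oncoming vehicle > GPS; otherwise keep the dead-reckoned position.
from typing import Optional, Tuple

Position = Tuple[float, float]

def reset_process(calculated_pos: Position,
                  pos_from_marking: Optional[Position],
                  pos_from_broken_line: Optional[Position],
                  pos_from_oncoming: Optional[Position],
                  pos_from_gps: Optional[Position]) -> Position:
    if pos_from_marking is not None:       # S100 -> S105..S130: map reset
        return pos_from_marking
    if pos_from_broken_line is not None:   # S140 -> white line use update
        return pos_from_broken_line
    if pos_from_oncoming is not None:      # S155 -> S160: oncoming vehicle update
        return pos_from_oncoming
    if pos_from_gps is not None:           # S170 -> S175: GPS reset
        return pos_from_gps
    return calculated_pos                  # no correction possible this cycle

print(reset_process((0.0, 0.0), None, None, (1.2, 0.3), (5.0, 5.0)))   # -> (1.2, 0.3)
```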
  • In S205 of the white line use update process (FIG. 7), the imaging information output from the imaging unit 20 (cameras 21 to 24) is acquired.
  • In S210, a captured image in which the road around the host vehicle 1 is viewed from directly above in the vertical direction is generated, as in S110.
  • In S215, the shape of the rectangular unit lines forming the broken white line in the captured image generated in S210 is recognized by a known pattern matching method.
  • The white line count information is information indicating which unit line, counted with a certain characteristic road marking as a reference, the unit line recognized in S215 corresponds to.
  • the white line count information includes reference information regarding the characteristic road marking as a reference, and an identification number of a unit line assigned based on the position of the characteristic road marking.
  • the white line count information is stored in the RAM 64 as an output of a white line count process (FIG. 8) executed in parallel with the vehicle position calculation process.
  • In S225, map data representing a map image within the predetermined range centered on the calculated vehicle position, that is, a map image including the unit line recognized in S220, is acquired from the map database 51.
  • In the map image represented by this map data, as shown in the example of FIG. 9, the reference characteristic road marking 220 and the identification numbers assigned to the unit lines 211 (211a to 211e) constituting the broken white line 206 can be recognized.
  • In S230, the unit line recognized in the captured image in S215 is compared with the unit lines included in the map image represented by the map data acquired in S225, and which unit line in the map image the unit line in the captured image corresponds to is specified based on the identification number.
  • the position (imaging range) of the captured image in the map image is specified, and as a result, the vehicle position is specified in the map image.
  • Here, the identification of the vehicle position using the broken white line 206 shown in FIG. 9 will be described.
  • one unit line 211d is included in the imaging range 230 of the captured image indicated by the alternate long and short dash line.
  • the map image within the predetermined range 231 represented by the map data acquired in S225 includes a plurality of unit lines (unit lines 211c and 211d) including the unit line 211d.
  • the unit line 211d is the eleventh unit line from the characteristic road marking 220 based on the white line count information (reference information, identification number). Therefore, the position of the captured image (imaging range 230) in the map image is specified by comparing the map image and the captured image, and the vehicle position is specified in the map image.
  • In S235, the calculated vehicle position is corrected (the detection error is reset) by replacing it with the vehicle position specified in S230. The white line use update process is then completed.
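  • A minimal sketch of the idea behind S220 to S235 follows: the unit line seen in the captured image is looked up in the map by its identification number counted from the reference characteristic road marking, which pins down the vehicle position. The table contents, key format, and offsets below are hypothetical.

```python
# White line use update sketch: map a (reference marking, identification number) pair to
# the unit line's absolute position, then apply the vehicle's measured offset from that
# unit line (measurable because the top-view camera geometry is fixed).
from typing import Dict, Optional, Tuple

UNIT_LINE_POSITIONS: Dict[Tuple[str, int], Tuple[float, float]] = {
    # (reference marking id, identification number) -> absolute (x, y) in metres (made up)
    ("crosswalk_220", 11): (1053.0, 208.5),
}

def white_line_use_update(reference_id: str, identification_number: int,
                          vehicle_offset: Tuple[float, float]) -> Optional[Tuple[float, float]]:
    """Corrected vehicle position, or None when the unit line is not found in the map."""
    unit_line_pos = UNIT_LINE_POSITIONS.get((reference_id, identification_number))
    if unit_line_pos is None:
        return None
    return (unit_line_pos[0] + vehicle_offset[0], unit_line_pos[1] + vehicle_offset[1])

print(white_line_use_update("crosswalk_220", 11, (-2.0, 1.5)))   # -> (1051.0, 210.0)
```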
  • The white line counting process is a process of counting, from a reference characteristic road marking, which unit line each unit line included in the captured image corresponds to. The white line counting process is repeatedly executed by the control device 61, in parallel with the vehicle position calculation process, in accordance with the imaging cycle described above.
  • In S305 of the white line counting process (FIG. 8), the imaging information output from the imaging unit 20 (cameras 21 to 24) is acquired.
  • In S310, a captured image in which the road around the host vehicle is viewed from directly above in the vertical direction is generated, as in S110.
  • In S315, the shape of the rectangular unit lines constituting the broken white line in the captured image generated in S310 is recognized by a known pattern matching method.
  • In S330, it is determined whether a new characteristic road marking different from the reference characteristic road marking is detected in the captured image generated in S310.
  • If no new characteristic road marking is detected, the white line counting process is terminated.
  • If a new characteristic road marking is detected, the process proceeds to S335.
  • In S335, the value of the identification number is reset (the identification number is set to 0 and stored in the RAM 64).
  • information on the newly detected characteristic road marking is stored in the RAM 64 as reference information.
  • The information about the characteristic road marking includes the type of the characteristic road marking and its absolute position (latitude and longitude).
  • the white line count process ends.
  • the reference information and the identification number stored in the RAM 64 correspond to the white line count information described above.
  • By repeatedly executing the white line counting process, the control device 61 assigns an identification number to each unit line, using the most recently detected characteristic road marking as a reference, and stores the identification numbers in the RAM 64 until the next new characteristic road marking is detected.
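  • The counting behaviour described above can be sketched as follows; the per-frame inputs, and the choice to number unit lines seen in the same frame as a reset, are simplifications of the S305 to S335 flow rather than the patent's exact logic.

```python
# White line counting sketch: each newly recognised unit line receives the next
# identification number, and the counter is reset to 0 (with new reference information
# stored) whenever a new characteristic road marking is detected.

class WhiteLineCounter:
    def __init__(self):
        self.reference = None            # info about the reference characteristic marking
        self.identification_number = 0

    def process_frame(self, new_marking=None, new_unit_lines: int = 0):
        if new_marking is not None:      # S330/S335: new reference marking detected
            self.identification_number = 0
            self.reference = new_marking
        for _ in range(new_unit_lines):  # number the unit lines recognised in this frame
            self.identification_number += 1
        return self.reference, self.identification_number

counter = WhiteLineCounter()
counter.process_frame(new_marking="stop_line_202")       # counter reset at the stop line
print(counter.process_frame(new_unit_lines=2))           # -> ('stop_line_202', 2)
```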
  • the wireless communication device 11 wirelessly transmits travel data about the host vehicle 1 to oncoming vehicles existing in the vicinity of the host vehicle 1 (in a communication area where radio waves reach).
  • the travel data includes the calculated host vehicle position P1 of the host vehicle 1 and the duration T1.
  • The duration T1 refers to the time for which the calculation of the vehicle position by self-contained navigation has continued since the calculated vehicle position P1 was last corrected.
  • the own vehicle position detection device 10 similar to the own vehicle 1 is also mounted on the oncoming vehicle.
  • In S415, based on the imaging information acquired in S150, a captured image in which the road around the host vehicle is viewed from directly above in the vertical direction is generated, as in S110.
  • In S420, the oncoming vehicle 2 is recognized in the captured image generated in S415 by a known pattern matching method.
  • In S425, the relative position P3 of the host vehicle 1 with respect to the oncoming vehicle 2 is calculated from the captured image.
  • In S430, the relative position P3 calculated in S425 is added to the calculated own vehicle position P2 received from the oncoming vehicle 2, and the own vehicle position P4 based on the calculated own vehicle position P2 of the oncoming vehicle 2 is calculated.
  • The duration T1 of the host vehicle 1 and the duration T2 of the oncoming vehicle 2 are then compared, and it is determined whether the difference between them is less than a predetermined duration threshold.
  • The duration threshold is set to a value close to zero.
  • If the difference is not less than the duration threshold, the process proceeds to S445; if it is less than the duration threshold, the process proceeds to S440.
  • In S445, it is determined whether the duration T2 of the oncoming vehicle 2 is less than the duration T1 of the host vehicle 1. If the duration T2 of the oncoming vehicle 2 is less than the duration T1 of the host vehicle 1, the process proceeds to S450. In S450, the calculated own vehicle position P1 of the host vehicle 1 is replaced with the own vehicle position P4 calculated in S430 based on the calculated own vehicle position P2 of the oncoming vehicle 2, thereby correcting the calculated own vehicle position P1 (resetting the detection error). The oncoming vehicle update process is then completed.
  • In this way, the own vehicle position P4 based on the calculated own vehicle position P2 of the oncoming vehicle 2 is calculated (S430). For example, when the duration T2 of the oncoming vehicle 2 is shorter than the duration T1 of the host vehicle 1 (S445: YES), the calculated own vehicle position P1 of the host vehicle 1 is replaced with the own vehicle position P4, which is estimated with a smaller accumulation of self-contained navigation error (S450).
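  • A compact sketch of the S430 to S450 decision follows; P2 and T2 stand for the data received from the oncoming vehicle, P3 for the relative position measured in the captured image, and the 1-second duration threshold is an assumed value (the text only says the threshold is close to zero).

```python
# Oncoming vehicle update sketch: derive P4 from the oncoming vehicle's calculated
# position, then either average (durations about equal), replace (oncoming estimate is
# fresher), or keep the own estimate.

def oncoming_vehicle_update(p1, t1, p2, t2, p3, duration_threshold_s: float = 1.0):
    p4 = (p2[0] + p3[0], p2[1] + p3[1])              # S430: own position seen from P2
    if abs(t1 - t2) < duration_threshold_s:          # S435 -> S440: durations about equal
        return ((p1[0] + p4[0]) / 2.0, (p1[1] + p4[1]) / 2.0)   # intermediate position
    if t2 < t1:                                      # S445 -> S450: oncoming data is fresher
        return p4                                    # replace P1 with P4
    return p1                                        # own estimate is fresher; keep it

print(oncoming_vehicle_update(p1=(100.0, 50.0), t1=40.0,
                              p2=(103.0, 56.0), t2=5.0, p3=(-1.0, -4.0)))
# -> (102.0, 52.0): P1 is replaced by P4 because T2 < T1
```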
  • In FIG. 11, the one-dot chain line 242 indicates the true travel locus of the host vehicle 1, and the solid line 243 indicates the travel locus based on the calculated own vehicle position P1. Before the calculated own vehicle position P1 is corrected (replaced) using the oncoming vehicle 2, the travel locus 243 based on the calculated own vehicle position P1 deviates from the true travel locus 242. After the calculated own vehicle position P1 is corrected (replaced) using the oncoming vehicle 2, the deviation of the travel locus (solid line 243) of the host vehicle 1 from the true travel locus (one-dot chain line 242) is suppressed compared with the travel locus (two-dot chain line 244) that the host vehicle 1 would follow if no correction using the oncoming vehicle 2 were performed.
  • Similarly, when the host vehicle 1 is recognized in the imaging range 245 of the oncoming vehicle 2, the oncoming vehicle 2 performs the same processing as the host vehicle 1. That is, after the oncoming vehicle 2 corrects (replaces) its calculated own vehicle position P2 using the host vehicle 1, the deviation of its travel locus (solid line 247) based on the calculated own vehicle position P2 from its true travel locus (one-dot chain line 248) is suppressed compared with the travel locus (two-dot chain line 246) that it would follow if no correction using the host vehicle 1 were performed.
  • As described above, in the present embodiment, the shape of the characteristic road marking in the captured image is compared with the shape of the characteristic road marking included in the map image represented by the map data, and the calculated vehicle position P1 of the host vehicle 1 is corrected accordingly. That is, since the position of the host vehicle 1 within the captured image is constant, the position (imaging range) of the captured image within the map image is specified by comparing the map image with the captured image, and as a result the calculated vehicle position P1 is specified in the map image (S125). The calculated vehicle position P1 is then corrected (the detection error is reset) by replacing it with the vehicle position specified in S125 (S130). With this configuration, since the calculated vehicle position P1 is specified (corrected) based on the comparison between the map image and the captured image, the vehicle position can be detected with higher accuracy than when it is detected based on a GPS positioning signal.
  • In the present embodiment, which unit line in the map image the unit line constituting the broken white line in the captured image corresponds to is specified based on the identification number determined by the white line counting process (S230). With this configuration, even when a broken white line is captured, that is, when a plurality of unit lines of the same shape appear in succession, each unit line can be identified by its identification number, so the calculated own vehicle position P1 of the host vehicle 1 in the map image can be specified by comparing the map image with the captured image.
  • In the present embodiment, the calculated own vehicle position P1 of the host vehicle 1 is also corrected using the calculated own vehicle position P2 and the duration T2 obtained from the oncoming vehicle 2. Specifically, the own vehicle position P4 is specified based on the calculated own vehicle position P2 of the oncoming vehicle 2 and the relative position P3 calculated from the captured image (S430).
  • Then, the calculated own vehicle position P1 is corrected so that its distance from the own vehicle position P4, which is based on the calculated own vehicle position P2 of the oncoming vehicle 2 and the relative position P3 calculated in S425, becomes shorter (S445).
  • In other words, the calculated vehicle position P1 is corrected toward the own vehicle position P4, which is calculated on the basis of the calculated own vehicle position P2 of an oncoming vehicle 2 whose duration of self-contained navigation is short. As a result, the vehicle position can be detected with high accuracy.
  • In the present embodiment, the correction replaces the calculated host vehicle position P1 with the host vehicle position P4 calculated using the calculated host vehicle position P2 of the oncoming vehicle 2 as a reference; that is, the correction of the vehicle position is realized by the simple operation of replacement.
  • In the present embodiment, when the duration T1 of self-contained navigation of the host vehicle 1 and the duration T2 of self-contained navigation of the oncoming vehicle 2 are substantially the same, the calculated vehicle position P1 is replaced with a position intermediate between the calculated host vehicle position P1 based on self-contained navigation and the own vehicle position P4 calculated in S430 (S435 to S440). In this way, the calculated vehicle position P1 can be corrected so that the detection error due to continued self-contained navigation is suppressed, and as a result the vehicle position can be detected with high accuracy.
  • In the present embodiment, since the calculated vehicle position based on self-contained navigation is corrected before entering a GPS reception difficult area, detection errors due to self-contained navigation can be suppressed even if a GPS reset cannot be executed within the GPS reception difficult area.
  • the GPS cold start time is a time required to detect the position of the vehicle using a positioning signal from the GPS artificial satellite from a state where there is no accurate information on the current state of the GPS artificial satellite.
  • the cameras 21 to 24 correspond to an example of “imaging device”, and the gyroscope 41, the vehicle speed sensor 42, and the G sensor 43 correspond to an example of “self-supporting sensor”.
  • the processes of S105 to S110, S205 to S210, and S305 to S310 correspond to an example of the process as the “image generation unit”, and the processes of S115, S215, and S315 correspond to an example of the process as the “marking recognition unit”.
  • the processing of S120 and S225 corresponds to an example of “map data acquisition unit”
  • the processing of S125 to S130 and S230 to S235 corresponds to an example of “position correction unit”.
  • the processing of S320 to S335 corresponds to an example of “counting unit”.
  • the processing of S150 to S155 corresponds to an example of processing as “another vehicle recognition unit”
  • the processing of S410 corresponds to an example of processing as “another vehicle position acquisition unit”
  • the processing of S415 to S425 corresponds to an example of processing as the "relative position calculation unit".
  • the process of S435 to S450 corresponds to an example of a process as “another vehicle use correction unit”
  • the process of S10 corresponds to an example of a process as a “position update unit”
  • the process of S410 also corresponds to an example of a process as the "continuation time acquisition unit".
  • the process of S430 corresponds to an example of the process as the “own vehicle position calculation unit”.
  • Each section of the processes described above is expressed, for example, as S10.
  • Further, each section can be divided into several sub-sections, while several sections can be combined into a single section.
  • Each section configured in this manner can be referred to as a circuit, a device, a module, or a means.
  • Each of the above sections, or a combination thereof, can be realized not only as (i) a software section combined with a hardware unit (for example, a computer) but also as (ii) a hardware section (for example, an integrated circuit or a hard-wired logic circuit), with or without the functions of related devices.
  • the hardware unit can be configured inside a microcomputer.
  • In the above embodiment, the position of the host vehicle 1 in the captured image is treated as constant, and the vehicle position in the map image is specified by comparing the road marking in the captured image with the road marking in the map image.
  • However, the vehicle position in the map image may instead be specified using the shape (dimensions) of the road marking, as follows.
  • That is, since the map represented by the map data stored in the map database 51 shows the road markings (their positions and shapes) in detail (exactly), the actual dimensions of a road marking can be calculated from the dimensions of the road marking in the map and the scale of the map represented by the map data. Accordingly, after the position (imaging range) of the captured image within the map image is specified by comparing the captured image with the map image, the calculated vehicle position P1 can be obtained based on the dimensions of the road marking in the captured image and the position of the host vehicle 1 with respect to the road marking.
  • For example, the absolute coordinates of the vehicle reference point B1 on the front right side in the traveling direction of the host vehicle 1 shown in FIG. 12 are calculated as follows.
  • As shown in FIG. 12, the length of the stop line 202 of the pedestrian crossing 201 is K1 and its width is K3, the distance in the traveling direction of the host vehicle 1 from the stop line 202 to the vehicle reference point B1 is the traveling-direction distance T, and the distance in the width direction of the road from the center line (white line) 207 to the vehicle reference point B1 is the width-direction distance S.
  • The center position of the host vehicle 1 is calculated as the host vehicle position P1, and actual values of the length SH and the width SW of the host vehicle 1 are stored in the map database 51 in advance.
  • The host vehicle position P1 is thus obtained as a position shifted from the vehicle reference point B1 by SH/2 in the traveling direction of the host vehicle 1 and by SW/2 in the width direction.
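  • As a worked sketch of this calculation (with assumed numeric values and sign conventions; the patent only defines the quantities T, S, SH, and SW), the vehicle centre P1 can be obtained from the reference point B1 as follows.

```python
# Local frame assumed for illustration: x points in the traveling direction, y points
# across the road away from the centre line 207. B1 is the front-right vehicle reference
# point; P1 is the vehicle centre, half a length behind and half a width inward of B1.

def vehicle_centre_from_reference_point(stop_line_x: float, centre_line_y: float,
                                        T: float, S: float, SH: float, SW: float):
    b1_x = stop_line_x + T        # B1: traveling-direction distance T from the stop line
    b1_y = centre_line_y + S      # B1: width-direction distance S from the centre line
    p1_x = b1_x - SH / 2.0        # shift back by half the vehicle length
    p1_y = b1_y - SW / 2.0        # shift inward by half the vehicle width
    return (p1_x, p1_y)

# Assumed figures: T = 4.0 m, S = 2.6 m, SH = 4.6 m, SW = 1.8 m.
print(vehicle_centre_from_reference_point(0.0, 0.0, T=4.0, S=2.6, SH=4.6, SW=1.8))
# -> approximately (1.7, 1.7): B1 = (4.0, 2.6), moved back 2.3 m and inward 0.9 m
```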
  • In the above example, the stop line 202 is used as the reference road marking for obtaining the calculated vehicle position P1, but the road marking used as the reference is not limited to this.
  • The calculated own vehicle position P1 may also be obtained based on the dimensions of other parts of the road markings in the captured image, such as the width K2 and length K4 of the pedestrian crossing 201 and the width F of the center line 207.
  • In the above embodiment, the calculated host vehicle position P1 of the host vehicle 1 is specified using the captured image based on the imaging information acquired in S150.
  • Alternatively, processing similar to S150 and S415 may be added so that new imaging information is acquired and a captured image is generated from it, and a process for calculating the vehicle position in the width direction may be added to specify the calculated vehicle position P1.
  • The calculated vehicle position P1 may then be specified by comparing the map image with the captured image.
  • For example, as in Modification 1, the calculated vehicle position P1 may be calculated from the vehicle reference point C1 on the left side in the traveling direction, based on the road width RW in the captured image shown in FIG. 13. Accordingly, since the road-width-direction component of the calculated vehicle position P1 is calculated from a captured image based on the most recently acquired imaging information, rather than from the imaging information acquired in S150 of FIG. 5, the own vehicle position P1 can be corrected more accurately.
  • In the above embodiment, as examples of the broken white line, the lane marking drawn at the center of the road (for example, the broken white line 203 shown in FIG. 6) and the lane marking indicating the boundary between a plurality of lanes (for example, the broken white line 206 shown in FIG. 9) were cited, but the broken white line is not limited to these.
  • The broken white line may also be part of a lane marking indicating a roadside strip, for example, a broken white line included in a no-parking/no-stopping roadside strip.
  • In that case, the shape of the unit line may be whatever shape is defined for that type of broken white line.
  • the imaging unit 20 includes the four cameras 21 to 24, but the number of cameras is not limited to this.
  • the imaging unit 20 may be configured to include two units, a front camera 21 and a rear camera 22.
  • the imaging unit 20 may have a configuration including one front camera 21 or a configuration including one rear camera 22.
  • the camera with which the imaging part 20 is provided may be installed in the vehicle 1 so as to image the road from directly above in the vertical direction.
  • the travel data of the oncoming vehicle 2 is used to correct the calculated own vehicle position P1 of the own vehicle 1, but the present invention is not limited to this.
  • the calculated host vehicle position P1 may be corrected using travel data of another vehicle whose travel direction is the same as the travel direction of the host vehicle 1.
  • In the above embodiment, the map data representing the map image is acquired from the map database 51 of the own vehicle position detection device 10 mounted in the vehicle 1, but the map data used in this embodiment is not restricted to this.
  • the map data used in the present embodiment may be map data acquired via the cloud.
  • The map data used in the present embodiment may also be map data representing a map that reflects road closures obtained from a local server such as a local road management office, or a map that reflects a situation such as a road closed for an event, for example a festival, acquired from a local server such as a tourist office.
  • In the above embodiment, the shapes of the road markings are shown in detail in the map image represented by the map data; however, the map data is not limited to this, and an accurate road shape may also be shown.
  • the accurate road shape includes the road width, the forward and backward inclination of the road, the inclination in the width direction, the maintenance state such as whether it is a gravel road or a paved road, and the like.
  • The map image represented by the map data may also show in detail the shape (and position) of structures around the road, such as road signs, signboards along the road, and buildings near the road. If such map data is used, the shape of the road or of the structures around the road in the captured image is compared with the corresponding shape in the map image, and the vehicle position can be detected accurately as in the above embodiment.
  • Modification 12: The functions of one constituent element in the embodiment may be distributed among a plurality of constituent elements, or the functions of a plurality of constituent elements may be integrated into one constituent element. Further, at least a part of the configuration of the above embodiment may be replaced with a known configuration having the same function. Moreover, a part of the configuration of the above embodiment may be omitted.
  • In addition to the vehicle position detection device 10 described above, the present disclosure can be realized in various forms, such as a control device constituting the vehicle position detection device 10, a program for causing a computer to function as that control device, a medium storing the program, and a vehicle position calculation method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention concerns an own vehicle position detection device used in a vehicle, the device comprising an image generation unit (S105-S110, S205-S210, S305-S310), a marking recognition unit (S115, S215, S315), a map data acquisition unit (S120, S225), and a position correction unit (S125-S130, S230-S235). The image generation unit generates a captured image of the road around the vehicle as viewed from the vertical direction, based on imaging information from imaging devices (21-24) mounted on the vehicle. The marking recognition unit recognizes the shape of a road marking drawn on the road in the captured image. The map data acquisition unit acquires map data representing a map image that includes the shape of the road marking. The position correction unit corrects the absolute position of the vehicle by comparing the shape of the road marking recognized by the marking recognition unit with the shape of a road marking around the vehicle that is included in the map image.
PCT/JP2014/005900 2013-12-09 2014-11-26 Dispositif d'autolocalisation de véhicule WO2015087502A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-254217 2013-12-09
JP2013254217A JP2015114126A (ja) 2013-12-09 2013-12-09 自車位置検出装置

Publications (1)

Publication Number Publication Date
WO2015087502A1 true WO2015087502A1 (fr) 2015-06-18

Family

ID=53370833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/005900 WO2015087502A1 (fr) 2013-12-09 2014-11-26 Dispositif d'autolocalisation de véhicule

Country Status (2)

Country Link
JP (1) JP2015114126A (fr)
WO (1) WO2015087502A1 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6593088B2 (ja) * 2015-10-19 2019-10-23 株式会社豊田中央研究所 車両位置推定装置及びプログラム
JP6747160B2 (ja) * 2016-02-18 2020-08-26 トヨタ自動車株式会社 車両位置推定装置
KR102552712B1 (ko) * 2016-05-12 2023-07-06 현대모비스 주식회사 차량의 위치 추정 시스템 및 이를 이용한 차량의 위치 추정 방법
JP2018028489A (ja) 2016-08-18 2018-02-22 トヨタ自動車株式会社 位置推定装置、位置推定方法
JP6901870B2 (ja) * 2017-02-28 2021-07-14 パイオニア株式会社 位置推定装置、制御方法、及びプログラム
WO2018212283A1 (fr) * 2017-05-19 2018-11-22 パイオニア株式会社 Dispositif de mesure, procédé de mesure et programme
JP6627135B2 (ja) * 2017-06-22 2020-01-08 本田技研工業株式会社 車両位置判定装置
JP6574224B2 (ja) * 2017-08-30 2019-09-11 本田技研工業株式会社 車両制御装置、車両、車両制御方法およびプログラム
JP6837948B2 (ja) * 2017-08-30 2021-03-03 本田技研工業株式会社 車両制御装置、車両、車両制御方法およびプログラム
WO2019155569A1 (fr) * 2018-02-08 2019-08-15 三菱電機株式会社 Dispositif de détection d'obstacle et procédé de détection d'obstacle
JP2020056740A (ja) * 2018-10-04 2020-04-09 三菱電機株式会社 位置補正システム、車載機、位置補正方法、および位置補正プログラム
KR102083571B1 (ko) * 2018-12-18 2020-03-02 박주환 차량 위치 분석 방법 및 네비게이션 장치
DE102020208082A1 (de) 2020-06-30 2021-12-30 Robert Bosch Gesellschaft mit beschränkter Haftung Ermitteln einer Ausgangsposition eines Fahrzeugs für eine Lokalisierung

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006153565A (ja) * 2004-11-26 2006-06-15 Nissan Motor Co Ltd 車載ナビゲーション装置及び自車位置補正方法
JP2007153031A (ja) * 2005-12-01 2007-06-21 Aisin Aw Co Ltd 車両位置算出方法及び車載装置
JP2007178271A (ja) * 2005-12-28 2007-07-12 Aisin Aw Co Ltd 自位置認識システム
JP2013050412A (ja) * 2011-08-31 2013-03-14 Aisin Aw Co Ltd 自車位置認識システム、自車位置認識プログラム、及び自車位置認識方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10753757B2 (en) 2015-09-30 2020-08-25 Sony Corporation Information processing apparatus and information processing method
CN105718860A (zh) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 基于驾驶安全地图及双目交通标志识别的定位方法及系统
FR3080448A1 (fr) * 2018-04-20 2019-10-25 Psa Automobiles Sa Dispositif et procede d’analyse de la position d’un vehicule par comparaison d’informations d’environnement determinees et connues
CN112703540A (zh) * 2018-09-11 2021-04-23 日产自动车株式会社 驾驶辅助方法和驾驶辅助装置
CN112703540B (zh) * 2018-09-11 2022-06-07 日产自动车株式会社 驾驶辅助方法和驾驶辅助装置
CN110554702A (zh) * 2019-09-30 2019-12-10 重庆元韩汽车技术设计研究院有限公司 基于惯性导航的无人驾驶汽车

Also Published As

Publication number Publication date
JP2015114126A (ja) 2015-06-22

Similar Documents

Publication Publication Date Title
WO2015087502A1 (fr) Dispositif d'autolocalisation de véhicule
JP6566132B2 (ja) 物体検出方法及び物体検出装置
JP6451844B2 (ja) 車両位置判定装置及び車両位置判定方法
JP6325806B2 (ja) 車両位置推定システム
JP6572930B2 (ja) 情報処理装置及び情報処理システム
JP6859927B2 (ja) 自車位置推定装置
JP2011013039A (ja) 車線判定装置及びナビゲーションシステム
CN110858405A (zh) 车载摄像头的姿态估计方法、装置和系统及电子设备
US11928871B2 (en) Vehicle position estimation device and traveling position estimation method
JP5365792B2 (ja) 車両用位置測定装置
JP2008008783A (ja) 車輪速パルス補正装置
JP2019168432A (ja) 自車位置推定装置
JP6828655B2 (ja) 自車位置推定装置
JP6520463B2 (ja) 車両位置判定装置及び車両位置判定方法
JP2020060369A (ja) 地図情報システム
WO2018109865A1 (fr) Machine de bord de route et système de communication véhicule-vers-route
JP2008157636A (ja) 自車位置特定方法及び自車位置特定装置
JP2016224714A (ja) 進入判定装置、進入判定方法
US20180208197A1 (en) Lane keeping assistance system
JP6627135B2 (ja) 車両位置判定装置
EP3859281B1 (fr) Appareil et procédé de collecte de données pour la génération de carte
JP2015118555A (ja) 対向車情報生成装置
JP5549468B2 (ja) 地物位置取得装置、方法およびプログラム
JP2016223846A (ja) 自車位置判定装置及び自車位置判定方法
JP7325296B2 (ja) 物体認識方法及び物体認識システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14869794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14869794

Country of ref document: EP

Kind code of ref document: A1