WO2019188874A1 - Structure de données, dispositif de traitement d'informations et dispositif de génération de données cartographiques - Google Patents

Structure de données, dispositif de traitement d'informations et dispositif de génération de données cartographiques Download PDF

Info

Publication number
WO2019188874A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
position estimation
vehicle
point
value
Prior art date
Application number
PCT/JP2019/012314
Other languages
English (en)
Japanese (ja)
Inventor
Masahiro Kato (加藤 正浩)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Publication of WO2019188874A1 publication Critical patent/WO2019188874A1/fr

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G01C21/32 - Structuring or formatting of map data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • the present invention relates to a self-position estimation technique.
  • Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of a feature registered in advance on a map.
  • Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
  • Non-Patent Document 1 discloses specifications related to a data format for collecting data detected by a vehicle-side sensor with a cloud server.
  • Various vehicle position estimation methods have been proposed, but each method has execution environments that are suitable for position estimation and environments that are not. If a vehicle position estimation method continues to be executed in an environment unsuitable for that method, the position estimation accuracy deteriorates, which may adversely affect processing that requires high position estimation accuracy, such as automatic driving control.
  • The present invention has been made to solve the above problems, and its main purpose is to provide a data structure for map data from which a self-position estimation method can be suitably selected according to the travel area, and an information processing apparatus that stores such map data.
  • The invention described in the claims is a data structure for map data that includes position information indicating a point or area on a map, and recommended value information indicating, for that point or area, a recommended value for each position estimation method available when a mobile body estimates its own position; the data structure is used for selecting the method when the mobile body estimates its own position.
  • The invention described in the claims is also an information processing apparatus that stores position information indicating a point or area on a map, and recommended value information for each position estimation method used when a mobile body estimates its position at that point or area.
  • The invention described in the claims is also a map data generation device including a generation unit that generates, for each point or area, recommended value information indicating a recommended value for each estimation method used for estimating the position of a moving body, based on position information indicating the position of the moving body estimated by a predetermined estimation method and on the average and standard deviation of the position estimation accuracy in a predetermined period including the time point when the position was estimated, and that generates map data in which the recommended value information is associated with the position information.
  • The functional blocks of the own vehicle position estimation unit in landmark-based position estimation are shown.
  • The functional blocks of the own vehicle position estimation unit in point cloud-based position estimation are shown.
  • An example of the schematic data structure of voxel data is shown.
  • An example of the data structure of the method recommendation information is shown.
  • In one embodiment, the data structure of the map data includes position information indicating a point or area on a map, and recommended value information indicating a recommended value for each position estimation method used when a mobile body estimates its own position at that point or area; it is a data structure used for selecting the method when the mobile body estimates its own position.
  • Here, "the mobile body estimates its own position" is not limited to the case where the mobile body itself performs the estimation; it also includes the case where a device mounted on the mobile body estimates the position of the mobile body.
  • According to this data structure, the self-position estimation method can be accurately selected when the mobile body estimates its own position.
  • In one aspect, the position information indicates a point or an area by a distance from a reference point for each lane on the road. According to this aspect, a self-position estimation method suited to the traveling lane can be selected accurately.
  • In another embodiment, the information processing apparatus stores position information indicating a point or area on a map, and recommended value information indicating a recommended value for each position estimation method used when a mobile body estimates its position at that point or area.
  • the information processing device can distribute recommended value information to other devices or perform highly accurate self-position estimation with reference to the recommended value information.
  • In one aspect, the information processing apparatus includes a position estimation unit that estimates the position of a moving body, and the position estimation unit selects the position estimation method based on the recommended value information associated with the position information indicating the point or area of the road where the moving body exists.
  • the information processing apparatus can accurately select the optimum self-position estimation method with reference to the recommended value information.
  • In another aspect, the information processing apparatus includes a position estimation unit that estimates the position of a moving body, and the position estimation unit determines the weight of the estimation result for each position estimation method based on the recommended value information associated with the position information indicating the point or area of the road where the moving body exists. According to this aspect, when the information processing apparatus executes a plurality of self-position estimation methods to estimate the position of the moving body, it can accurately determine the weight of each method's estimation result with reference to the recommended value information.
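As a concrete illustration of the weighting described above, a minimal sketch follows. It assumes a simple normalized weighted average over 2-D position estimates; the method names, values, and the fusion formula itself are illustrative assumptions, not the patent's specification.

```python
# Illustrative sketch (not the patent's formula): combining position estimates
# from several self-position estimation methods, weighting each estimate by
# the recommended value associated with the current point or area.

def fuse_estimates(estimates, recommended_values):
    """Weighted average of per-method position estimates.

    estimates: dict mapping method name -> (x, y) estimate
    recommended_values: dict mapping method name -> recommended value in [0, 1]
    """
    total = sum(recommended_values[m] for m in estimates)
    if total == 0:
        raise ValueError("no method is recommended at this point")
    x = sum(recommended_values[m] * estimates[m][0] for m in estimates) / total
    y = sum(recommended_values[m] * estimates[m][1] for m in estimates) / total
    return (x, y)

# Hypothetical values: landmark-based estimation is strongly recommended here,
# so it dominates the fused result.
fused = fuse_estimates(
    {"landmark": (10.0, 5.0), "point_cloud": (10.2, 5.1), "gnss": (11.0, 4.0)},
    {"landmark": 0.8, "point_cloud": 0.6, "gnss": 0.1},
)
```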
  • In another embodiment, the map data generation device includes a generation unit that generates, for each point or area, recommended value information indicating a recommended value for each estimation method, based on position information indicating the position of a moving body estimated by a predetermined estimation method and on the average and standard deviation of the position estimation accuracy in a predetermined period including the time point at which the position was estimated.
  • the map data generation device can suitably generate map data including recommended value information indicating recommended values for each position estimation method for each point or area.
  • In one aspect, the generation unit calculates standardized accuracy information, obtained by standardizing the accuracy information of position information estimated by different estimation methods into a common predetermined range, and generates the recommended value information based on the average of the standardized accuracy information for each point or area and for each estimation method.
  • the map data generation device can suitably generate recommended value information that excludes the influence of the difference in accuracy information distribution that occurs for each self-position estimation method and for each moving object.
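The standardization described above could be sketched as follows, assuming a z-score style normalization using the average and standard deviation reported by each vehicle; the exact common range and formula are not given in the text, so this is an assumption.

```python
# Illustrative sketch: standardizing accuracy information reported by
# different vehicles and estimation methods into a common scale, so that
# values can be compared and averaged per point and per method. The z-score
# form is an assumption; the patent only states that accuracy information is
# standardized using the average and standard deviation reported with it.

def standardize(value, mean, std):
    """Map a raw accuracy value onto a common scale using the reporting
    vehicle's own mean and standard deviation (z-score)."""
    if std == 0:
        return 0.0
    return (value - mean) / std

def average_standardized(samples):
    """samples: list of (value, mean, std) tuples collected for one point
    and one estimation method; returns the average standardized accuracy."""
    scores = [standardize(v, m, s) for v, m, s in samples]
    return sum(scores) / len(scores)

# Two hypothetical reports for the same point and method.
avg = average_standardized([(0.5, 0.4, 0.1), (0.3, 0.4, 0.2)])
```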
  • FIG. 1 shows the schematic configuration of the driving support system according to the present embodiment.
  • the driving support system includes an in-vehicle device 1 that moves together with each vehicle that is a moving body, and a server device 6 that communicates with each in-vehicle device 1 via a network.
  • The driving support system updates the distribution map DB 20, a map for distribution held by the server device 6, based on the information transmitted from each in-vehicle device 1.
  • the “map” includes data used for ADAS (Advanced Driver Assistance System) and automatic driving in addition to data referred to by a conventional in-vehicle device for route guidance.
  • The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5; based on their outputs, it detects predetermined objects and estimates the position of the vehicle in which it is mounted (also referred to as "the vehicle position").
  • The in-vehicle device 1 performs automatic driving control of the vehicle based on the estimated vehicle position.
  • The in-vehicle device 1 stores a map database (DB: DataBase) 10, and estimates the vehicle position by collating the output of the lidar 2 and other sensors with this map DB 10.
  • the in-vehicle device 1 transmits upload information “Iu” including information on the detected object to the server device 6.
  • the in-vehicle device 1 is an example of an information processing device and an information transmission device.
  • The lidar 2 emits pulsed laser light over predetermined angle ranges in the horizontal and vertical directions, thereby discretely measuring the distance to objects in the outside world, and generates three-dimensional point group information indicating the positions of the objects.
  • The lidar 2 includes an irradiation unit that irradiates laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) from the object, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit.
  • the scan data is point cloud data, and is generated based on the irradiation direction corresponding to the laser beam received by the light receiving unit and the distance to the object in the irradiation direction specified based on the light reception signal.
  • The lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
  • the server device 6 receives the upload information Iu from each in-vehicle device 1 and stores it. For example, the server device 6 updates the distribution map DB 20 based on the collected upload information Iu. In addition, the server device 6 transmits download information Id including update information of the distribution map DB 20 to each in-vehicle device 1.
  • the server device 6 is an example of an information processing device and a map data generation device.
  • FIG. 2A is a block diagram showing a functional configuration of the in-vehicle device 1.
  • the in-vehicle device 1 mainly includes an interface 11, a storage unit 12, a communication unit 13, an input unit 14, a control unit 15, and an information output unit 16. Each of these elements is connected to each other via a bus line.
  • the interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies the output data to the control unit 15.
  • the storage unit 12 stores a program executed by the control unit 15 and information necessary for the control unit 15 to execute a predetermined process.
  • the storage unit 12 stores a map DB 10 including method recommendation information IR, landmark information IL, and voxel data IB.
  • the method recommendation information IR is information indicating a recommendation level (also simply referred to as “recommended value”) for each vehicle position estimation method that can be executed by the vehicle-mounted device 1 for each point or area.
  • The method recommendation information IR is an example of recommended value information.
  • the landmark information IL is information relating to each object that is a landmark, and includes attribute information such as the position, size, and shape of each object.
  • the landmark is, for example, a kilometer post, a 100 m post, a delineator, a traffic infrastructure facility (for example, a sign, a direction signboard, a signal), a telephone pole, a streetlight, or the like that is periodically arranged along the road.
  • the landmark information IL is used in landmark base position estimation described later.
  • the voxel data IB is information on point cloud data indicating the measurement position of the stationary structure for each unit region (also referred to as “voxel”) when the three-dimensional space is divided into a plurality of regions.
  • the communication unit 13 performs transmission of the upload information Iu and reception of the download information Id based on the control of the control unit 15.
  • the input unit 14 is a button, a touch panel, a remote controller, a voice input device, or the like for a user to operate.
  • the information output unit 16 is, for example, a display or a speaker that outputs based on the control of the control unit 15.
  • the control unit 15 includes a CPU that executes a program and controls the entire vehicle-mounted device 1.
  • the control unit 15 includes a host vehicle position estimation unit 17, an upload control unit 18, and an automatic driving control unit 19.
  • The own vehicle position estimation unit 17 performs highly accurate estimation of the own vehicle position by selectively using, or combining, a plurality of own vehicle position estimation methods.
  • The vehicle position estimation unit 17 can execute position estimation using the landmark information IL (also referred to as "landmark-based position estimation"), position estimation using the voxel data IB (also referred to as "point cloud-based position estimation"), and position estimation based on GNSS (Global Navigation Satellite System) (also referred to as "GNSS-based position estimation").
  • The vehicle position estimation unit 17 performs vehicle position estimation based on the output of the lidar 2 in landmark-based and point cloud-based position estimation, and based on the output of the GPS receiver 5 in GNSS-based position estimation.
  • the vehicle position estimation unit 17 refers to the method recommendation information IR and determines the position estimation method to be executed.
  • The own vehicle position estimation unit 17 estimates the own vehicle position, generates information on the estimation accuracy of the own vehicle position (also referred to as "accuracy information"), and stores it in the storage unit 12 or the like. Details of the landmark-based and point cloud-based position estimation, and of the method of generating the accuracy information, will be described later.
  • The upload control unit 18 generates upload information Iu including information on a detected object when a predetermined object is detected based on the output of an external sensor such as the lidar 2, and transmits the upload information Iu to the server device 6. In the present embodiment, the upload control unit 18 also includes information on the executed vehicle position estimation (also referred to as "position estimation related information") in the upload information Iu, together with the estimated position information, and transmits it to the server device 6.
  • The position estimation related information includes the position information estimated by the vehicle position estimation unit 17, the accuracy information of the position estimation performed by the vehicle position estimation unit 17, information on the average and standard deviation of the accuracy information, and identification information (also referred to as "method information") indicating the executed vehicle position estimation method.
  • the average and standard deviation described above are the average and standard deviation of the accuracy information values of the vehicle position estimation calculated by the vehicle position estimation unit 17 within the past predetermined time.
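A minimal sketch of maintaining those statistics over a sliding window follows; the window length and the population-style standard deviation are illustrative assumptions, since the patent only specifies "within the past predetermined time".

```python
# Illustrative sketch: maintaining the average and standard deviation of the
# accuracy information values d(k) over the most recent fixed window, for
# inclusion in the position estimation related information of the upload
# information Iu. The window length (3 samples here) is an assumed parameter.
from collections import deque
import math

class AccuracyStats:
    def __init__(self, window=100):
        self.values = deque(maxlen=window)  # oldest samples drop out

    def add(self, d):
        self.values.append(d)

    def mean(self):
        return sum(self.values) / len(self.values)

    def std(self):
        mu = self.mean()
        return math.sqrt(sum((v - mu) ** 2 for v in self.values) / len(self.values))

stats = AccuracyStats(window=3)
for d in (0.2, 0.4, 0.6, 0.8):  # the first value falls out of the window
    stats.add(d)
```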
  • the upload control unit 18 is an example of a “transmission unit” and a “computer” that executes a program.
  • The automatic driving control unit 19 refers to the map DB 10 and transmits signals necessary for automatic driving control to the vehicle based on the set route and the vehicle position estimated by the own vehicle position estimation unit 17. The automatic driving control unit 19 sets a target track based on the set route, and transmits a guide signal to the vehicle so as to control the position of the vehicle such that the vehicle position estimated by the own vehicle position estimation unit 17 stays within a predetermined width from the target track.
  • FIG. 2B is a block diagram showing a functional configuration of the server device 6.
  • the server device 6 mainly includes a communication unit 61, a storage unit 62, and a control unit 65. Each of these elements is connected to each other via a bus line.
  • the communication unit 61 receives the upload information Iu and transmits the download information Id based on the control of the control unit 65.
  • the storage unit 62 stores a program executed by the control unit 65 and information necessary for the control unit 65 to execute a predetermined process.
  • The storage unit 62 stores a distribution map DB 20 having the same data structure as the map DB 10, and an accuracy information DB 27 that is a database of accuracy information based on the upload information Iu received from each vehicle-mounted device 1.
  • the control unit 65 includes a CPU that executes a program and controls the entire server device 6.
  • The control unit 65 updates the accuracy information DB 27 based on the position estimation related information included in the upload information Iu received from each in-vehicle device 1 by the communication unit 61, generates the method recommendation information IR based on the accuracy information DB 27, and transmits map update information, such as the generated method recommendation information IR, to each in-vehicle device 1 by the communication unit 61.
  • the control unit 65 is an example of a “generation unit”.
  • In landmark-based position estimation, the vehicle position estimation unit 17 corrects the vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle measurements of a landmark obtained by the lidar 2 and the landmark position information extracted from the map DB 10.
  • In the present embodiment, the vehicle position estimation unit 17 alternately executes a prediction step of predicting the vehicle position from the output data of the gyro sensor 3, the vehicle speed sensor 4, and the like, and a measurement update step of correcting the predicted value calculated in the immediately preceding prediction step.
  • Various filters developed for Bayesian estimation can be used as the state estimation filter in these steps, for example an extended Kalman filter, an unscented Kalman filter, or a particle filter. Below, an example in which the vehicle position estimation unit 17 performs vehicle position estimation using an extended Kalman filter is described.
  • FIG. 3 is a diagram showing the position of the vehicle to be estimated in two-dimensional orthogonal coordinates.
  • The vehicle position on the plane defined by the two-dimensional orthogonal x-y coordinates is represented by the coordinates "(x, y)" and the direction (yaw angle) "θ" of the vehicle.
  • The yaw angle θ is defined as the angle formed by the traveling direction of the vehicle and the x-axis.
  • When the height is also estimated, four variables (x, y, z, θ), taking into account the z-axis coordinate perpendicular to the x-axis and the y-axis, are used as the state variables of the vehicle position.
  • Since a general road has only a gentle slope, the pitch angle and roll angle of the vehicle are basically ignored in this embodiment.
  • FIG. 4 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
  • FIG. 5 shows an example of the functional blocks of the vehicle position estimation unit 17. As shown in FIG. 4, by repeating the prediction step and the measurement update step, calculation and update of the estimated value of the state variable vector "X" indicating the vehicle position are executed sequentially. As shown in FIG. 5, the own vehicle position estimation unit 17 has a position prediction unit 21 that performs the prediction step and a position estimation unit 22 that performs the measurement update step.
  • the position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search / extraction block 25 and a position correction block 26.
  • The state variable vector at the reference time (i.e., current time) "k" to be calculated is represented as "X⁻(k)" or "X̂(k)".
  • The provisional estimated value (predicted value) computed in the prediction step is marked with "⁻" over the character representing the value, and the more accurate estimated value updated in the measurement update step is marked with "^" over the character representing the value.
  • The position prediction block 24 of the control unit 15 adds the obtained moving distance and azimuth change to the state variable vector X̂(k−1) at time k−1 calculated in the immediately preceding measurement update step, thereby calculating the predicted value (also referred to as "predicted position") X⁻(k) at time k.
  • The landmark search/extraction block 25 associates the landmark position vectors registered in the landmark information IL of the map DB 10 with the scan data of the lidar 2. When an association is made, the landmark search/extraction block 25 acquires the measured value "Z(k)" of the associated landmark by the lidar 2, and the landmark measurement value (referred to as the "measurement prediction value") "Ẑ(k)" obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the landmark position vector registered in the map DB 10.
  • The measured value Z(k) is a vector in the vehicle coordinate system, converted from the landmark distance and scan angle measured by the lidar 2 at time k into components along the vehicle traveling direction and the lateral direction. The position correction block 26 then multiplies the difference between the measured value Z(k) and the measurement prediction value Ẑ(k) by the Kalman gain "K(k)" and adds the result to the predicted position X⁻(k), thereby calculating the updated state variable vector (also referred to as "estimated position") X̂(k), as in the following equation (1):

    X̂(k) = X⁻(k) + K(k){Z(k) − Ẑ(k)}   ... (1)
  • Similarly to the predicted position, the position correction block 26 obtains the covariance matrix P̂(k) (also simply expressed as P(k)) corresponding to the error distribution of the estimated position X̂(k) from the covariance matrix P⁻(k) of the predicted position. Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using the extended Kalman filter.
  • By repeatedly performing the prediction step and the measurement update step and sequentially calculating the predicted position X⁻(k) and the estimated position X̂(k), the most likely vehicle position is calculated.
  • the accuracy of position estimation can be determined by the value of the diagonal element of the covariance matrix P.
  • The covariance matrix calculated based on the measured value for the landmark at time "k" is denoted P(k), and is expressed by equation (2); its diagonal elements are the variances of the respective state variables.
  • The vehicle position estimation unit 17 regards the square roots "σx(k)", "σy(k)", "σz(k)", and "σθ(k)" of the diagonal elements of the covariance matrix P(k) as the accuracy information value "d(k)" for each of the state variables x, y, z, and θ.
  • When the covariance matrix is converted into the vehicle coordinate system (X, Y, Z), the vehicle position estimation unit 17 regards the square roots σX(k), σY(k), σZ(k), and σΨ(k) of the diagonal elements after conversion as the accuracy information value d(k) for each state variable.
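The measurement update of equation (1) and the extraction of the accuracy information values from the diagonal of the covariance matrix can be sketched as follows. The measurement matrix H, the noise covariance R, and the toy two-state example are illustrative assumptions; they are not the patent's lidar measurement model.

```python
# Illustrative sketch of the measurement update step (equation (1)) and the
# accuracy information values d(k) taken as the square roots of the diagonal
# elements of the covariance matrix P(k). H and R are placeholder assumptions.
import numpy as np

def measurement_update(x_pred, P_pred, z, z_pred, H, R):
    """x_hat(k) = x_pred(k) + K(k) * (z(k) - z_pred(k))   ... equation (1)"""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain K(k)
    x_hat = x_pred + K @ (z - z_pred)     # estimated position
    P_hat = (np.eye(len(x_pred)) - K @ H) @ P_pred
    d = np.sqrt(np.diag(P_hat))           # accuracy values sigma per state
    return x_hat, P_hat, d

# Simplified two-state (x, y) example with a direct observation model.
x_pred = np.array([10.0, 5.0])
P_pred = np.diag([1.0, 1.0])
H = np.eye(2)
R = np.diag([0.25, 0.25])
z = np.array([10.4, 5.2])
z_pred = x_pred                           # toy model: predicted measurement
x_hat, P_hat, d = measurement_update(x_pred, P_pred, z, z_pred, H, R)
```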
  • The voxel data IB used in point cloud-based position estimation includes data representing, by a normal distribution, the point cloud data measured for stationary structures in each voxel, and is used for scan matching using NDT (Normal Distributions Transform).
  • FIG. 6 shows an example of the vehicle position estimation unit 17 in point cloud-based position estimation. The difference from the landmark-based configuration shown in FIG. 5 is that, instead of the landmark search/extraction block 25, a point cloud data association block 27 is provided, which performs the association process between the point cloud data obtained from the lidar 2 and the voxel data acquired from the map DB 10.
  • FIG. 7 shows an example of a schematic data structure of the voxel data IB.
  • The voxel data IB includes information on the parameters obtained when the point cloud in each voxel is expressed by a normal distribution; as shown in FIG. 7, it includes a voxel ID, voxel coordinates, a mean vector, and a covariance matrix.
  • “voxel coordinates” indicate absolute three-dimensional coordinates of a reference position such as the center position of each voxel.
  • Each voxel is a cube obtained by dividing the space into a lattice shape, and since the shape and size are determined in advance, the space of each voxel can be specified by the voxel coordinates.
  • the voxel coordinates may be used as a voxel ID.
  • The mean vector "μn" and the covariance matrix "Vn" of voxel n are expressed by equations (5) and (6), respectively.
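Assuming equations (5) and (6) are the sample mean and sample covariance of the measured points falling in voxel n, which is the standard construction for NDT voxel maps, they could be computed as follows; this is a sketch under that assumption, since the equations are not reproduced above.

```python
# Illustrative sketch: computing the normal distribution parameters stored in
# the voxel data IB. Equations (5) and (6) are assumed here to be the sample
# mean and sample covariance of the measured points inside voxel n.
import numpy as np

def voxel_parameters(points):
    """points: (N, 3) array of 3D measurement positions inside one voxel.
    Returns the mean vector mu_n and covariance matrix V_n."""
    pts = np.asarray(points, dtype=float)
    mu = pts.mean(axis=0)                 # equation (5): sample mean
    diff = pts - mu
    V = diff.T @ diff / len(pts)          # equation (6): sample covariance
    return mu, V

# Four hypothetical points on a planar structure.
mu, V = voxel_parameters([[0.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0],
                          [1.0, 1.0, 0.0]])
```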
  • The in-vehicle device 1 calculates, using the coordinate-transformed point group and the mean vector μn and covariance matrix Vn included in the voxel data, the evaluation function value "En" for voxel n shown in equation (9), and the overall evaluation function value "E(k)" for all voxels to be matched shown in equation (10).
  • The evaluation function value En of each voxel is also referred to as the "individual evaluation function value", and E(k) as the "overall evaluation function value".
  • the in-vehicle device 1 calculates an estimation parameter P that maximizes the overall evaluation function value E (k) by an arbitrary root finding algorithm such as Newton's method.
  • The in-vehicle device 1 applies the estimation parameter P to the vehicle position X⁻(k) predicted by the position prediction unit 21 shown in FIG. 6, thereby estimating an accurate vehicle position X̂(k).
  • the accuracy information value d (k) of the point cloud base position estimation is defined so as to be smaller as the position estimation accuracy is better.
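Since equations (9) and (10) are not reproduced above, the following sketch assumes the common NDT form: each coordinate-transformed point is scored against its voxel's normal distribution (μn, Vn), and the overall value E(k) sums the scores over all matched voxels. This is a sketch under that assumption, not the patent's exact formulation.

```python
# Illustrative sketch of NDT-style matching scores, assuming the common
# Gaussian-score form for the individual evaluation function value E_n and a
# sum over matched voxels for the overall evaluation function value E(k).
import numpy as np

def individual_score(point, mu, V):
    """Gaussian score of one transformed point against voxel (mu_n, V_n)."""
    diff = np.asarray(point) - np.asarray(mu)
    return float(np.exp(-0.5 * diff @ np.linalg.inv(V) @ diff))

def overall_score(matches):
    """matches: list of (point, mu_n, V_n); E(k) = sum of individual scores.
    The estimation parameter P would be chosen to maximize this value."""
    return sum(individual_score(p, mu, V) for p, mu, V in matches)

V = np.eye(3) * 0.1   # hypothetical per-voxel covariance
E = overall_score([([0.0, 0.0, 0.0], [0.0, 0.0, 0.0], V),
                   ([1.0, 1.0, 1.0], [1.0, 1.0, 1.1], V)])
```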
  • In GNSS-based position estimation, the vehicle position estimation unit 17 estimates the vehicle position based on the output of the GPS receiver 5. When executing GNSS-based position estimation, the vehicle position estimation unit 17 acquires, for example, the DOP (Dilution Of Precision) obtained from the GPS receiver 5 as the accuracy information value d(k). In another example, the vehicle position estimation unit 17 acquires the standard deviations of the latitude, longitude, and altitude obtained from the GPS receiver 5 within a predetermined period as the accuracy information values d(k) for latitude, longitude, and altitude, respectively.
  • the GPS receiver 5 may be a receiver capable of positioning not only GPS but also GLONASS, Galileo, quasi-zenith satellite (QZSS), and the like.
  • FIG. 8 shows an example of the data structure of the method recommendation information IR.
  • the method recommendation information IR shown in FIG. 8 includes items of “position” and “recommended value”, and “position” includes “lane link ID”, “CRP (Common Reference Point)”, and “distance from CRP”. Contains each sub-item.
  • the “recommended value” includes “landmark base” corresponding to landmark base position estimation, “point cloud base” corresponding to point cloud base position estimation, and “GNSS base” corresponding to GNSS base position estimation. Contains sub-items.
  • the recommended value has a value range from 0 to 1.
  • "Lane link ID" indicates the lane link ID assigned to the lane to which the target point belongs.
  • "CRP" indicates the reference point serving as the reference in the lane to which the target point belongs.
  • Distance from CRP indicates the distance from the reference point defined in “CRP” to the target point.
  • In the example, the node IDs ("N1", "N4", "N5", "N9") of the nodes corresponding to the start points of the lanes to which the target points belong are specified as the reference points in "CRP".
  • the “position” indicates, for example, a point at a predetermined interval for each lane by a combination of “lane link ID”, “CRP”, and “distance from CRP”.
  • the “Recommended Value” “Landmark Base” has sub-items “traveling direction”, “lateral direction”, “height direction”, and “azimuth”, and state variables corresponding to the sub-items.
  • the recommended value indicating the effectiveness for improving the accuracy is specified.
  • Likewise, for the “point cloud base”, a recommended value indicating the effectiveness of point cloud matching for improving accuracy is defined.
  • The “GNSS base” under “recommended value” has the sub-items “latitude”, “longitude”, and “altitude”, and recommended values indicating the effectiveness for improving the accuracy of latitude, longitude, and altitude are specified.
  • In the method recommendation information IR shown in FIG. 8, for each of the points defined in “position”, a recommended value in the range of 0 to 1 is specified for each of the landmark-based position estimation, the point cloud-based position estimation, and the GNSS-based position estimation.
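The record layout described above can be sketched as a simple data structure. This is a minimal illustration only: the field names are assumptions for readability, not the patent's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class RecommendationRecord:
    """One record of the method recommendation information IR (illustrative field names)."""
    lane_link_id: str          # ID of the lane link the target point belongs to
    crp_node_id: str           # node ID of the reference point (CRP), e.g. "N1"
    distance_from_crp: float   # distance from the CRP to the target point
    # Recommended values, each in the range 0 to 1:
    landmark: dict = field(default_factory=dict)     # keys: traveling/lateral/height/azimuth
    point_cloud: dict = field(default_factory=dict)  # same keys as landmark
    gnss: dict = field(default_factory=dict)         # keys: latitude/longitude/altitude

rec = RecommendationRecord(
    lane_link_id="L12", crp_node_id="N1", distance_from_crp=50.0,
    landmark={"traveling": 0.8, "lateral": 0.9, "height": 0.6, "azimuth": 0.7},
    point_cloud={"traveling": 0.5, "lateral": 0.5, "height": 0.4, "azimuth": 0.5},
    gnss={"latitude": 0.3, "longitude": 0.3, "altitude": 0.2},
)
# Every recommended value must lie in [0, 1], matching the range defined above.
assert all(0.0 <= v <= 1.0
           for d in (rec.landmark, rec.point_cloud, rec.gnss)
           for v in d.values())
```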
  • For each position estimation method, there are areas suitable for executing the method and areas that are not, so the recommended value for each method differs from area to area.
  • On highways, road markings and signs such as white lines, direction signs, and kilometer posts are provided, and landmarks can be detected with high probability, so the recommended value for landmark-based position estimation is high.
  • On general roads in urban areas, there are places where the white lines are degraded, and there are many other vehicles, so landmarks may fail to be detected due to occlusion; the recommended value for landmark-based position estimation is therefore low.
  • recommended values for each vehicle position estimation method are defined for each lane.
  • For example, the recommended values for landmark-based and point cloud-based position estimation can be raised in the left lane, where no occlusion by other vehicles occurs, while the recommended value for GNSS-based position estimation can be lowered in the left lane, where the possibility of multipath increases; in this way, a recommended value can be set for each lane.
  • the own vehicle position estimating unit 17 can suitably determine the own vehicle position estimating method to be executed by referring to the method recommendation information IR having the data structure shown in FIG.
  • Specifically, the vehicle position estimation unit 17 extracts the record of the method recommendation information IR corresponding to the point, among the points specified in “position”, that is closest on the lane to which the vehicle position estimated at the previous time step belongs, and determines the vehicle position estimation method based on the recommended values for landmark-based, point cloud-based, and GNSS-based position estimation in the extracted record.
  • In the first example, the host vehicle position estimation unit 17 performs position estimation using the vehicle position estimation method with the highest recommended value specified for the point. That is, in this case, the host vehicle position estimation unit 17 estimates the host vehicle position by selectively executing only the method having the highest recommended value.
  • The highest of the recommended values may be used as the representative value of the recommended value of a vehicle position estimation method, or the average or median of those recommended values may be used as the representative value.
  • Further, based on the current position estimation accuracy of the own vehicle, the own vehicle position estimation unit 17 may use, as the representative value of the recommended value of the method, the recommended value corresponding to the most important state variable among the traveling direction, the lateral direction, the height direction, and the azimuth (for example, the state variable with the lowest estimation accuracy).
  • In the second example, the own vehicle position estimation unit 17 executes all the vehicle position estimation methods defined in the method recommendation information IR (except methods whose recommended value is 0), and calculates the final position estimate by weighted-averaging the position estimates obtained by the respective methods, using the corresponding recommended values as weights. Whether to adopt the first example or the second example is determined according to, for example, the capability of the CPU functioning as the vehicle position estimation unit 17.
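The two examples above can be sketched as follows. This assumes the per-state-variable recommended values have already been reduced to a single representative value per method, and that each method returns a position estimate; the numbers are illustrative.

```python
def representative(values, mode="max"):
    """Reduce per-state-variable recommended values to one representative value
    (highest, average, or median, as described in the text)."""
    vals = sorted(values)
    if mode == "max":
        return vals[-1]
    if mode == "mean":
        return sum(vals) / len(vals)
    # median
    n = len(vals)
    return vals[n // 2] if n % 2 else (vals[n // 2 - 1] + vals[n // 2]) / 2

# Representative recommended value per method, and each method's (x, y) estimate.
recommended = {"landmark": 0.9, "point_cloud": 0.6, "gnss": 0.2}
estimates = {"landmark": (10.2, 5.1), "point_cloud": (10.0, 5.3), "gnss": (10.6, 4.8)}

# First example: execute only the method with the highest recommended value.
best = max(recommended, key=recommended.get)  # "landmark"

# Second example: weighted average of all estimates (methods with value 0 excluded),
# using the recommended values as weights.
active = {m: r for m, r in recommended.items() if r > 0}
total = sum(active.values())
fused = tuple(sum(active[m] * estimates[m][i] for m in active) / total
              for i in range(2))
```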
  • The automatic driving control unit 19 may determine the target trajectory of the vehicle with reference to the method recommendation information IR. For example, the automatic driving control unit 19 compares the recommended value of the currently executed position estimation method across the lanes of the road being traveled, and determines the target trajectory so that the vehicle travels in the lane with the highest recommended value. In another example, in route search processing, the automatic driving control unit 19 may set a higher cost for roads with lower recommended values so that such roads are less likely to be set as the travel route. In yet another example, when traveling on a road with a low recommended value, the automatic driving control unit 19 may reduce the traveling speed of the vehicle to improve the measurement accuracy of landmarks and the like by the lidar 2.
  • FIG. 9 is a diagram showing an outline of the data structure of the upload information Iu transmitted by the in-vehicle device 1.
  • the upload information Iu includes header information, travel route information, event information, and media information.
  • the header information includes items of “version”, “transmission source”, and “vehicle metadata”.
  • The in-vehicle device 1 designates, in “version”, information on the version of the data structure of the upload information Iu, and, in “sender”, information on the name of the company (the OEM name or system vendor name of the vehicle that transmits the upload information Iu). Further, the in-vehicle device 1 specifies vehicle attribute information (for example, vehicle type, vehicle ID, vehicle width, and vehicle height) in “vehicle metadata”.
  • The travel route information includes the item “position estimation”. For this “position estimation”, the in-vehicle device 1 designates time stamp information indicating the position estimation time, the latitude, longitude, and altitude information indicating the estimated vehicle position, and information regarding the estimation accuracy.
  • Event information includes an item of “object recognition event”.
  • When the in-vehicle device 1 detects an object recognition event, it designates information on the detection result as the “object recognition event”.
  • the media information is a data type used when transmitting raw data that is output data (detection information) of an external sensor such as the lidar 2.
  • the in-vehicle device 1 includes the position estimation related information in the upload information Iu and transmits it to the server device 6.
  • The in-vehicle device 1 may designate each piece of position estimation related information under the item “position estimation”, or may newly provide an item for notifying the position estimation related information as event information and designate each piece of the information in that item.
  • In this way, the in-vehicle device 1 can suitably notify the server device 6 of the position estimation related information together with the estimated position information of the vehicle.
  • The value S(k) of the standardized accuracy information becomes negative when d(k) is smaller than the average μ(k), and positive when it is larger than μ(k). Further, S(k) approaches 0 as the standard deviation σ(k) increases, and grows in magnitude as σ(k) decreases. The standardized accuracy information value S(k) is thus a standardized expression based on the distribution of the accuracy information value d(k). If the vehicle position estimation method is landmark-based position estimation, the server device 6 calculates the standardized accuracy information value S(k) for each state variable (traveling direction, lateral direction, height direction, azimuth). Similarly, for the GNSS-based position estimation, the server device 6 calculates S(k) for each of latitude, longitude, and altitude.
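The behavior described above (negative below the mean, positive above it, shrinking as σ grows) matches the standard z-score transform, which equation (12) presumably is; a minimal sketch under that assumption:

```python
def standardize(d, mu, sigma):
    """Equation (12), assumed z-score form: S(k) = (d(k) - mu(k)) / sigma(k)."""
    return (d - mu) / sigma

# Two devices with very differently distributed accuracy values land on the same axis:
s1 = standardize(2.5, mu=2.0, sigma=0.5)     # one sigma above its own mean
s2 = standardize(0.12, mu=0.10, sigma=0.02)  # also about one sigma above its mean
```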
  • FIG. 10A shows the transition of the accuracy information values d1(k) and d2(k) obtained when in-vehicle devices 1 mounted on different vehicles perform vehicle position estimation at a certain point using different vehicle position estimation methods. The value d1(k) shown by the graph G1 has a distribution with mean “μ1(k)” and standard deviation “σ1(k)”, and the value d2(k) shown by the graph G2 has a distribution with mean “μ2(k)” and standard deviation “σ2(k)”.
  • The accuracy information differs in mean and standard deviation depending on the vehicle position estimation method executed and the accuracy of the sensors used.
  • FIG. 10B shows the transitions of the standardized accuracy information values S1(k) and S2(k) obtained by standardizing d1(k) and d2(k) based on equation (12). The value S1(k) shown by the graph G3 and the value S2(k) shown by the graph G4 both have a distribution with mean 0 and standard deviation 1. Thus, by equation (12), accuracy information having different distributions can be handled on the same axis.
  • The server device 6 converts the accuracy information, mean, and standard deviation of the position estimation related information included in the upload information Iu received from the in-vehicle device 1 into standardized accuracy information, and stores the converted standardized accuracy information in the accuracy information DB 27 in association with the method information and position information included in the position estimation related information. The standardized accuracy information can therefore be handled as a value S(p) with the position “p” as a parameter.
  • The server device 6 may calculate the standardized accuracy information when calculating the recommended values instead of when receiving the upload information Iu. In that case, the position estimation related information included in the upload information Iu received from the in-vehicle device 1 is recorded as-is in the accuracy information DB 27.
  • At the update timing of the method recommendation information IR, the server device 6 calculates the average “S̄(p)” of the standardized accuracy information recorded in the accuracy information DB 27 for each point p and for each vehicle position estimation method.
  • Specifically, the server device 6 classifies the standardized accuracy information for each point specified in the item “position” of the method recommendation information IR shown in FIG. 8, based on the associated position information, and also classifies it for each vehicle position estimation method, based on the associated method information. Then, the server device 6 averages the standardized accuracy information values classified for each point and for each method, thereby calculating the average value S̄(p) for each point and for each vehicle position estimation method.
  • In the case of landmark-based position estimation, the server device 6 averages the standardized accuracy information value S(p) for each state variable (traveling direction, lateral direction, height direction, azimuth). Similarly, for the GNSS-based position estimation, the server device 6 averages S(p) for each of latitude, longitude, and altitude.
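The classification-and-averaging step above amounts to a grouped mean over (point, method) keys; a minimal sketch, with an assumed flat record layout and illustrative values:

```python
from collections import defaultdict

# Each record: (point id, vehicle position estimation method, standardized value S).
records = [
    ("p1", "landmark", -0.4), ("p1", "landmark", -0.6),
    ("p1", "gnss", 0.8), ("p2", "landmark", 0.1),
]

# Accumulate sum and count per (point, method) key, then average.
sums = defaultdict(lambda: [0.0, 0])
for point, method, s in records:
    acc = sums[(point, method)]
    acc[0] += s
    acc[1] += 1

s_bar = {key: total / n for key, (total, n) in sums.items()}
# s_bar[("p1", "landmark")] == -0.5
```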
  • FIG. 11A shows the transition of the standardized accuracy information value Sx(p) in the traveling direction of the landmark-based position estimation calculated based on the upload information Iu of the plurality of in-vehicle devices 1, together with the transition of the averaged value S̄x(p).
  • FIG. 11B shows the transition of the standardized accuracy information value Sy(p) in the lateral direction of the landmark-based position estimation calculated based on the upload information Iu of the plurality of in-vehicle devices 1, together with the transition of the averaged value S̄y(p).
  • FIG. 11C shows the transition of the standardized accuracy information value S(p) of the point cloud-based position estimation calculated based on the upload information Iu of the plurality of in-vehicle devices 1, together with the transition of the averaged value S̄(p).
  • The server device 6 sets a recommended value “r(p)” for each vehicle position estimation method at each point based on the average value S̄(p) calculated for each point and for each method.
  • Since a smaller average value S̄(p) indicates better estimated position accuracy at the place, the server device 6 increases the recommended value r(p); conversely, since a larger average value S̄(p) indicates worse estimated position accuracy, the server device 6 decreases the recommended value r(p).
  • Specifically, the server device 6 calculates the recommended value r(p) with reference to any one of the following equations (13) to (15), so that r(p) approaches 1 as the average value S̄(p) becomes larger in the negative direction, and approaches 0 as S̄(p) becomes larger in the positive direction.
  • FIG. 12A shows the correspondence between the average value S̄(p) and the recommended value r(p) based on equations (13) to (15).
  • The graph G5 shows the correspondence between the average value S̄(p) and the recommended value r(p) based on equation (13), the graph G6 shows the correspondence based on equation (14), and the graph G7 shows the correspondence based on equation (15).
  • Further, by using any of the following equations (16) to (18), in which a coefficient c is introduced into equations (13) to (15), the server device 6 can suitably adjust the conversion ratio from the average value S̄(p) to the recommended value r(p).
  • FIG. 12B shows the correspondence between the average value S̄(p) and the recommended value r(p) based on equations (16) to (18) when the coefficient c is set to 0.5.
  • The graph G8 shows the correspondence between the average value S̄(p) and the recommended value r(p) based on equation (16), the graph G9 shows the correspondence based on equation (17), and the graph G10 shows the correspondence based on equation (18).
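The exact forms of equations (13) to (18) appear only in the patent's figures, not in this text. A plausible family with the described behavior (r approaches 1 as S̄ grows in the negative direction, approaches 0 in the positive direction, with the coefficient c scaling the conversion) is a decreasing logistic mapping; this is purely an assumption for illustration, not the patent's actual formula:

```python
import math

def recommended_value(s_bar, c=1.0):
    """Map averaged standardized accuracy S̄(p) to a recommended value r(p) in (0, 1).

    Assumed logistic form: monotonically decreasing in s_bar, with r(0) = 0.5,
    and the coefficient c adjusting how sharply S̄(p) converts into r(p)
    (cf. equations (16) to (18)).
    """
    return 1.0 / (1.0 + math.exp(c * s_bar))

assert recommended_value(0.0) == 0.5
assert recommended_value(-3.0) > 0.9   # good accuracy -> high recommendation
assert recommended_value(3.0) < 0.1    # poor accuracy -> low recommendation
```

A smaller c flattens the curve, so the same accuracy difference between two points produces a smaller difference in recommended values, matching the role of c described above.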
  • FIG. 13 is an example of a flowchart showing an outline of processing related to transmission / reception of upload information Iu and download information Id.
  • First, the in-vehicle device 1 estimates its own vehicle position, calculates accuracy information, and stores it in the storage unit 12 (step S101). In this case, for example, the in-vehicle device 1 determines the vehicle position estimation method to execute based on the recommended values of each method corresponding to the point, among the points included in the records of the method recommendation information IR, closest on the lane to which the vehicle position estimated at the previous time step belongs. The in-vehicle device 1 calculates accuracy information along with the vehicle position estimation and stores it in the storage unit 12. Next, the in-vehicle device 1 determines whether it is the transmission timing of the upload information Iu (step S102). When it is not the transmission timing (step S102; No), the in-vehicle device 1 continues the process of step S101.
  • When the in-vehicle device 1 determines that it is the transmission timing of the upload information Iu (step S102; Yes), it calculates the mean and standard deviation of the accuracy information values stored in the storage unit 12 within the past predetermined time. The in-vehicle device 1 then adds the accuracy information calculated in the immediately preceding step S101 and the above mean and standard deviation to the upload information Iu, together with the method information indicating the vehicle position estimation method executed in step S101 and the estimated position information, and transmits the upload information Iu to the server device 6 (step S103). Upload information Iu including the accuracy information, mean, and standard deviation for each executed vehicle position estimation method may be transmitted to the server device 6.
  • The server device 6 that has received the upload information Iu from the in-vehicle device 1 calculates the standardized accuracy information based on equation (12) from the accuracy information, mean, and standard deviation included in the upload information Iu, and stores it in the accuracy information DB 27 together with the method information and position information (step S201).
  • Next, the server device 6 determines whether it is the update timing of the distribution map DB 20 (step S202). When it is not the update timing (step S202; No), the server device 6 receives upload information Iu from the in-vehicle device 1 and performs the process of step S201.
  • When determining that it is the update timing of the distribution map DB 20 (step S202; Yes), the server device 6 refers to the accuracy information DB 27 and calculates a recommended value for each point and for each vehicle position estimation method (step S203). In this case, for example, the server device 6 averages the standardized accuracy information for each point and for each method, and calculates a recommended value for each point and for each method based on any one of equations (13) to (18). Thereby, the server device 6 suitably generates the method recommendation information IR. The server device 6 then updates the distribution map DB 20 based on the generated method recommendation information IR (step S204), and transmits download information Id including the generated method recommendation information IR to each in-vehicle device 1 (step S205).
  • When the in-vehicle device 1 receives the download information Id (step S104; Yes), it updates the map DB 10 using the download information Id (step S105). As a result, the latest method recommendation information IR is recorded in the map DB 10. On the other hand, when the in-vehicle device 1 has not received the download information Id from the server device 6 (step S104; No), the process returns to step S101.
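The client–server exchange in FIG. 13 can be summarized as a small state sketch. The class and method names mirror the step numbers for readability but are otherwise assumptions; timings and payload fields are simplified.

```python
import statistics

class InVehicleDevice:
    def __init__(self):
        self.accuracy_log = []

    def step_s101(self, d):
        """Estimate the vehicle position and store the accuracy value d(k) (step S101)."""
        self.accuracy_log.append(d)

    def step_s103(self):
        """At the transmission timing, build the upload information Iu (step S103):
        latest accuracy value plus mean and standard deviation over the stored window."""
        return {
            "method": "landmark",
            "accuracy": self.accuracy_log[-1],
            "mean": statistics.mean(self.accuracy_log),
            "stdev": statistics.pstdev(self.accuracy_log),
        }

class ServerDevice:
    def __init__(self):
        self.accuracy_db = []

    def step_s201(self, iu):
        """Standardize per equation (12) and store in the accuracy info DB (step S201)."""
        s = (iu["accuracy"] - iu["mean"]) / iu["stdev"]
        self.accuracy_db.append((iu["method"], s))

device, server = InVehicleDevice(), ServerDevice()
for d in (1.0, 2.0, 3.0):             # repeated step S101
    device.step_s101(d)
server.step_s201(device.step_s103())  # step S103 -> step S201
```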
  • (Modification 1) In the method recommendation information IR shown in FIG. 8, the recommended value of each vehicle position estimation method is recorded for each point. Instead, the recommended value of each method may be recorded for each section in the method recommendation information IR. In this case, for example, information indicating the positions of the start and end points of the target section is specified in the “position” of the method recommendation information IR. In another example, only a link ID representing a lane or a road may be specified in the “position” of the method recommendation information IR.
  • a point where a recommended value for each vehicle position estimation method is set is defined as a distance from the CRP in association with the lane. Instead, the point may be represented by latitude and longitude information.
  • In this case, the in-vehicle device 1 may determine the vehicle position estimation method to execute based on the recommended values of each method corresponding to the point, included in the records of the method recommendation information IR, that is closest to the vehicle position estimated immediately before.
  • the configuration of the driving support system shown in FIG. 1 is an example, and the configuration of the driving support system to which the present invention is applicable is not limited to the configuration shown in FIG.
  • For example, instead of the vehicle having the in-vehicle device 1, an electronic control device of the vehicle may execute the processes of the own vehicle position estimation unit 17, the upload control unit 18, and the automatic driving control unit 19 of the in-vehicle device 1. In this case, the map DB 10 is stored in, for example, a storage unit in the vehicle, and the electronic control device may exchange the upload information Iu and the download information Id with the server device 6 via the in-vehicle device 1 or a communication unit (not shown).


Abstract

In the present invention, method recommendation information IR contains items for a “position” and a “recommended value”. The “position” item contains sub-items for a “lane link ID”, a “CRP (Common Reference Point)”, and a “distance from the CRP”. In addition, the “recommended value” item contains sub-items for a “landmark base” corresponding to landmark-based position estimation, a “point cloud base” corresponding to point cloud-based position estimation, and a “GNSS base” corresponding to GNSS-based position estimation. The recommended value has a range of 0 to 1.
PCT/JP2019/012314 2018-03-27 2019-03-25 Data structure, information processing device, and map data generation device WO2019188874A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018060475 2018-03-27
JP2018-060475 2018-03-27

Publications (1)

Publication Number Publication Date
WO2019188874A1 true WO2019188874A1 (fr) 2019-10-03

Family

ID=68061775

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/012314 WO2019188874A1 (fr) 2018-03-27 2019-03-25 Structure de données, dispositif de traitement d'informations et dispositif de génération de données cartographiques

Country Status (1)

Country Link
WO (1) WO2019188874A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114526744A (zh) * 2020-11-05 Map update device and map update method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002036990A (ja) * 2000-07-26 Honda Motor Co Ltd Lane departure warning device for a vehicle
JP2010230501A (ja) * 2009-03-27 KDDI Corp Position information acquisition device, position information acquisition system, position information acquisition method, and program
WO2013128919A1 (fr) * 2012-02-27 Yamaha Motor Co., Ltd. Host computer, operation skill determination system, operation skill determination method, and operation skill determination program
WO2016059930A1 (fr) * 2014-10-17 Sony Corporation Device, method, and program
JP2018017668A (ja) * 2016-07-29 Panasonic IP Management Co., Ltd. Information processing device and information processing program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114526744A (zh) * 2020-11-05 Map update device and map update method
CN114526744B (zh) * 2020-11-05 2024-03-22 Map update device and map update method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19776481

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19776481

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP