WO2019188820A1 - Information transmission device, data structure, control method, program, and storage medium - Google Patents


Publication number
WO2019188820A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2019/012176
Other languages
French (fr)
Japanese (ja)
Inventor
Masahiro Kato (加藤 正浩)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Application filed by Pioneer Corporation
Publication of WO2019188820A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3859 Differential updating map data
    • G01C21/3863 Structures of map data
    • G01C21/387 Organisation of map data, e.g. version management or database structures
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/123 Indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/127 Indicating the position of vehicles to a central station; Indicators in a central station
    • G08G1/13 Indicating the position of vehicles, the indicator being in the form of a map
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram

Description

  • the present invention relates to a technique for updating a map.
  • in Patent Literature 1, when a change point of a partial map is detected based on the output of a sensor installed on a moving body such as a vehicle, the driving support device transmits change point information regarding the change point to a server device.
  • Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
  • Non-Patent Document 1 discloses specifications related to a data format for collecting data detected by a vehicle-side sensor with a cloud server.
  • the present invention has been made to solve the above-described problems, and its main purpose is to provide an information transmission apparatus capable of transmitting data suitable for the processing of generating position information of an object to be registered in a map, and a data structure for such data.
  • the invention described in the claims is an information transmission device comprising: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body, based on estimated position information indicating a position estimated for the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • the invention described in the claims is a data structure of data transmitted to an information processing apparatus that collects object measurement information related to an object measured by a measurement apparatus mounted on a moving body. The data includes the object measurement information, which contains object position information of the object generated based on estimated position information indicating a position estimated for the moving body, together with accuracy information indicating the estimation accuracy of the estimated position information, and is used by the information processing apparatus to determine the position of the object on the map.
  • the invention described in the claims is an information transmission device comprising: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to the information processing apparatus.
  • the invention described in the claims is a control method executed by the information transmission device, comprising: a generation step of generating object measurement information including object position information of an object measured by a measurement device mounted on the moving body, based on estimated position information indicating a position estimated for the moving body; an acquisition step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and a transmission step of transmitting the object measurement information and the accuracy information to the information processing apparatus.
  • a functional block diagram of the own vehicle position estimation unit in landmark-based position estimation is shown.
  • a functional block diagram of the own vehicle position estimation unit in point cloud-based position estimation is shown.
  • an example of the schematic data structure of voxel data is shown.
  • a graph showing the transition of the value of accuracy information and the transition of the value of standardized accuracy information is shown.
  • the information transmission device includes: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on the moving body, based on estimated position information indicating a position estimated for the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • when the information processing apparatus updates the map based on the object measurement information measured by the moving body, the information transmission apparatus can suitably transmit accuracy information, which serves as an index of the accuracy and reliability of the object measurement information, to the information processing apparatus together with the object measurement information.
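As a concrete illustration of this pairing, the upload data can be sketched as a record that carries object measurement information together with accuracy information. All field names and types below are illustrative assumptions; the patent does not fix a concrete format.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the uploaded record: object measurement
# information plus accuracy information. Field names are assumptions.

@dataclass
class ObjectMeasurement:
    position: tuple                                  # object position (lat, lon, alt)
    attributes: dict = field(default_factory=dict)   # e.g. type, shape, size

@dataclass
class UploadInfo:
    measurement: ObjectMeasurement   # object measurement information
    accuracy: dict                   # per-axis accuracy values d(k)

msg = UploadInfo(
    ObjectMeasurement((35.6812, 139.7671, 40.0), {"type": "obstacle"}),
    {"lat": 0.3, "lon": 0.4, "alt": 0.8},
)
```

A server receiving such records could weight each reported object position by its accompanying accuracy values when merging reports from many vehicles.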
  • in one aspect, the acquisition unit generates standardized accuracy information by standardizing the estimation accuracy based on the average and standard deviation of the estimation accuracy in a predetermined period including the time point at which the position of the moving body was estimated, and the transmission unit transmits the object measurement information and the standardized accuracy information to the information processing apparatus.
  • according to this aspect, even when accuracy information having different averages and variations is acquired depending on the position estimation method or the like, the information transmission apparatus can suitably transmit standardized accuracy information with a uniform average and variation to the information processing apparatus.
  • in another aspect, the position of the moving body is estimated by a weighted average of the position estimation results of a plurality of methods. The acquisition unit generates standardized accuracy information for each of the position estimation results of the plurality of methods and averages the standardized accuracy information with the same weighting as the weighted average, and the transmission unit transmits the object measurement information and the averaged standardized accuracy information to the information processing apparatus.
  • according to this aspect, the information transmission device can suitably generate standardized accuracy information and transmit it to the information processing device even when the position of the moving body is estimated by a weighted average of the position estimation results of a plurality of methods.
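The weighted combination described above can be sketched as follows; the helper name, the weights, and the S(k) values are illustrative assumptions, not values from the patent.

```python
def combine_standardized_accuracy(s_values, weights):
    """Average per-method standardized accuracy values S(k) using the same
    weights as the weighted average of the position estimates themselves."""
    return sum(s * w for s, w in zip(s_values, weights)) / sum(weights)

# e.g. landmark-based, point cloud-based, and GNSS-based results
# weighted 0.5 / 0.3 / 0.2 (made-up numbers)
s_avg = combine_standardized_accuracy([-0.8, 0.2, 1.5], [0.5, 0.3, 0.2])
```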
  • another aspect is a data structure of data transmitted to an information processing apparatus that collects object measurement information related to an object measured by a measurement apparatus mounted on the mobile object. The data includes the object measurement information, which contains object position information of the object generated based on estimated position information indicating a position estimated for the mobile object, together with accuracy information indicating the estimation accuracy of the estimated position information, and is used by the information processing apparatus to determine the position of the object on the map.
  • according to this data structure, the information processing apparatus can refer to the accuracy information, which serves as an index of the accuracy and reliability of the received object measurement information, and can thereby suitably determine the position of the object on the map.
  • another aspect is an information transmission device comprising: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • the information transmission device can suitably transmit accuracy information that is an index of accuracy and reliability of the object measurement information together with the object measurement information to the information processing device.
  • another aspect is a control method executed by the information transmitting apparatus, in which object measurement information including object position information of an object measured by a measuring device mounted on the moving body is generated based on estimated position information indicating a position estimated for the moving body, accuracy information indicating the estimation accuracy of the estimated position information is acquired, and the object measurement information and the accuracy information are transmitted to the information processing apparatus.
  • the information transmitting apparatus can suitably transmit accuracy information that is an index of accuracy and reliability of the object measurement information to the information processing apparatus together with the object measurement information.
  • a program causes a computer to execute the control method described above.
  • the computer functions as the information transmission device described above by executing this program.
  • the program is stored in a storage medium.
  • FIG. 1 shows a schematic configuration of the map update system according to the present embodiment.
  • the map update system includes an in-vehicle device 1 that moves together with each vehicle that is a moving body, and a server device 6 that communicates with each in-vehicle device 1 via a network. Then, the map update system updates the distribution map DB 20 that is a map for distribution held by the server device 6 based on the information transmitted from each in-vehicle device 1.
  • the “map” includes data used for ADAS (Advanced Driver Assistance System) and automatic driving in addition to data referred to by a conventional in-vehicle device for route guidance.
  • the in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and based on their outputs detects predetermined objects and estimates the position of the vehicle in which the in-vehicle device 1 is mounted (also referred to as "the vehicle position").
  • the in-vehicle device 1 is, for example, used in a vehicle that performs automatic driving.
  • the in-vehicle device 1 estimates the own vehicle position by collating the output of the lidar 2 and other sensors with the map DB 10.
  • the in-vehicle device 1 transmits upload information “Iu” including information on the detected object to the server device 6.
  • the in-vehicle device 1 is an example of an information transmission device.
  • the lidar 2 discretely measures the distance to objects existing in the outside world by emitting pulsed laser light over a predetermined angular range in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects.
  • in this case, the lidar 2 includes an irradiation unit that irradiates laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) returned by the object, and an output unit that outputs scan data based on the light reception signal produced by the light receiving unit.
  • the scan data is point cloud data, generated from the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, which is specified based on the light reception signal.
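As a sketch of how one scan point follows from the irradiation direction and the measured distance, a single lidar return can be converted into a 3-D point in the sensor frame with basic trigonometry. The angle convention below is an assumption for illustration; actual conventions vary by device.

```python
import math

def scan_point(r, azimuth_deg, elevation_deg):
    """Convert one lidar return (range r plus irradiation direction)
    into a 3-D point in the sensor frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)   # forward
    y = r * math.cos(el) * math.sin(az)   # left
    z = r * math.sin(el)                  # up
    return (x, y, z)

p = scan_point(10.0, 0.0, 0.0)   # a return straight ahead at 10 m
```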
  • the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
  • the server device 6 receives the upload information Iu from each in-vehicle device 1 and stores it. For example, the server device 6 updates the distribution map DB 20 based on the collected upload information Iu. In addition, the server device 6 transmits download information Id including update information of the distribution map DB 20 to each in-vehicle device 1.
  • the server device 6 is an example of a map data generation device.
  • FIG. 2A is a block diagram showing a functional configuration of the in-vehicle device 1.
  • the in-vehicle device 1 mainly includes an interface 11, a storage unit 12, a communication unit 13, an input unit 14, a control unit 15, and an information output unit 16. Each of these elements is connected to each other via a bus line.
  • the interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies the output data to the control unit 15.
  • the storage unit 12 stores a program executed by the control unit 15 and information necessary for the control unit 15 to execute a predetermined process.
  • the storage unit 12 stores a map DB 10 including object information IO, landmark information IL, and voxel data IB.
  • the object information IO is information related to obstacles, holes, broken vehicles, etc. existing on or around the road, and includes attribute information such as the position, size, and shape of each object.
  • the landmark information IL is information related to features on or around the road to be a landmark.
  • examples of landmarks include kilometer posts, 100 m posts, delineators, and traffic infrastructure facilities (e.g., signs and direction boards).
  • the landmark information IL is used in landmark base position estimation described later.
  • a landmark whose position or shape has changed due to an accident or the like is also registered in the object information IO.
  • the voxel data IB is information on point cloud data indicating the measurement position of the stationary structure for each unit region (also referred to as “voxel”) when the three-dimensional space is divided into a plurality of regions.
  • the voxel data IB is used in point cloud base position estimation described later.
  • the communication unit 13 performs transmission of the upload information Iu and reception of the download information Id based on the control of the control unit 15.
  • the input unit 14 is a button, a touch panel, a remote controller, a voice input device, or the like for a user to operate.
  • the information output unit 16 is, for example, a display or a speaker that outputs based on the control of the control unit 15.
  • the control unit 15 includes a CPU that executes a program and controls the entire vehicle-mounted device 1.
  • the control unit 15 includes a host vehicle position estimation unit 17 and an upload control unit 18.
  • the own vehicle position estimation unit 17 performs highly accurate estimation of the own vehicle position by selectively using, or combining, a plurality of own vehicle position estimation methods.
  • for example, the vehicle position estimation unit 17 performs position estimation using landmark measurement results and the landmark position information recorded in the landmark information IL (also referred to as "landmark-based position estimation"), position estimation using the voxel data (also referred to as "point cloud-based position estimation"), and position estimation using a global navigation satellite system (also referred to as "GNSS-based position estimation").
  • the vehicle position estimation unit 17 performs vehicle position estimation based on the output of the lidar 2 in the landmark-based and point cloud-based position estimation, and based on the output of the GPS receiver 5 in the GNSS-based position estimation.
  • the own vehicle position estimation unit 17 estimates the own vehicle position, generates information on the estimation accuracy of the own vehicle position (also referred to as "accuracy information"), and stores it in the storage unit 12 or the like. Details of the landmark-based and point cloud-based position estimation and of the method of generating the accuracy information will be described later.
  • when a predetermined object is detected based on the output of an external sensor such as the lidar 2, the upload control unit 18 generates upload information Iu including information related to the detected object and transmits it to the server device 6.
  • the information related to the object includes at least the position information of the object, and may further include attribute information such as object identification information, object shape, and size.
  • further, the upload control unit 18 standardizes the accuracy information calculated by the vehicle position estimation unit 17 so that it has a predetermined average value and standard deviation value (the result is also referred to as "standardized accuracy information"), includes the standardized accuracy information in the upload information Iu, and transmits it to the server device 6. The standardized accuracy information will be described later.
  • the upload control unit 18 is an example of a “generation unit”, a “transmission unit”, and a “computer” that executes a program.
  • FIG. 2B is a block diagram showing a functional configuration of the server device 6.
  • the server device 6 mainly includes a communication unit 61, a storage unit 62, and a control unit 65. Each of these elements is connected to each other via a bus line.
  • the communication unit 61 receives the upload information Iu and transmits the download information Id based on the control of the control unit 65.
  • the storage unit 62 stores a program executed by the control unit 65 and information necessary for the control unit 65 to execute a predetermined process.
  • the storage unit 62 stores a distribution map DB 20 having the same data structure as the map DB 10 and an upload information DB 27 that is a database of upload information Iu received from each in-vehicle device 1.
  • the control unit 65 includes a CPU that executes a program and controls the entire server device 6.
  • the control unit 65 updates the distribution map DB 20 based on the upload information DB 27 in which the upload information Iu received from each in-vehicle device 1 by the communication unit 61 is accumulated, and the download including the generated map update information. Processing for transmitting the information Id to each vehicle-mounted device 1 through the communication unit 61 is performed.
  • the control unit 65 is an example of a “reception unit”, a “generation unit”, and a “computer” that executes a program.
  • in the landmark-based position estimation, the vehicle position estimation unit 17 corrects the vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle measured by the lidar 2 with respect to a landmark and on the landmark position information extracted from the map DB 10.
  • the vehicle position estimation unit 17 alternately executes a prediction step of predicting the vehicle position from the output data of the gyro sensor 3, the vehicle speed sensor 4, and the like, and a measurement update step of correcting the predicted value calculated in the immediately preceding prediction step.
  • Various filters developed to perform Bayesian estimation can be used as the state estimation filter used in these steps, and examples thereof include an extended Kalman filter, an unscented Kalman filter, and a particle filter.
  • in the following, an example in which the vehicle position estimation unit 17 performs vehicle position estimation using an extended Kalman filter is described.
  • FIG. 3 is a diagram showing the position of the vehicle to be estimated in two-dimensional orthogonal coordinates.
  • as shown in FIG. 3, the vehicle position on the plane defined on the two-dimensional x-y orthogonal coordinates is represented by the coordinates (x, y) and the orientation (yaw angle) θ of the vehicle.
  • the yaw angle θ is defined as the angle formed by the traveling direction of the vehicle and the x-axis.
  • when three-dimensional coordinates are required, the vehicle position is estimated using the four state variables (x, y, z, θ), which add the z-axis coordinate perpendicular to the x-axis and the y-axis. Since a general road has only a gentle slope, the pitch angle and roll angle of the vehicle are basically ignored in this embodiment.
  • FIG. 4 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
  • FIG. 5 shows an example of the functional blocks of the vehicle position estimation unit 17. As shown in FIG. 4, the estimated value of the state variable vector X indicating the vehicle position is sequentially calculated and updated by repeating the prediction step and the measurement update step. As shown in FIG. 5, the vehicle position estimation unit 17 has a position prediction unit 21 that performs the prediction step and a position estimation unit 22 that performs the measurement update step.
  • the position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search / extraction block 25 and a position correction block 26.
  • the state variable vector at the reference time (that is, the current time) k to be calculated is represented as X⁻(k) or X^(k): the provisional estimated value (predicted value) obtained in the prediction step carries a superscript "−", and the more accurate estimated value updated in the measurement update step carries a superscript "^".
  • the position prediction block 24 of the control unit 15 calculates the predicted value (also referred to as the "predicted position") X⁻(k) at time k by adding the obtained movement distance and azimuth change to the state variable vector X^(k−1) at time k−1 calculated in the immediately preceding measurement update step.
  • the landmark search/extraction block 25 associates the landmark position vector registered in the landmark information IL of the map DB 10 with the scan data of the lidar 2. When the association succeeds, the landmark search/extraction block 25 acquires the measurement value Z(k) of the associated landmark obtained by the lidar 2, as well as the landmark measurement prediction value Z⁻(k), which is obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the landmark position vector registered in the landmark information IL.
  • the measurement value Z(k) is a vector in the vehicle coordinate system, obtained by converting the landmark distance and scan angle measured by the lidar 2 at time k into components whose axes are the traveling direction and the lateral direction of the vehicle. The position correction block 26 then multiplies the difference between the measurement value Z(k) and the measurement prediction value Z⁻(k) by the Kalman gain K(k) and adds the result to the predicted position X⁻(k), thereby calculating the updated state variable vector (also referred to as the "estimated position") X^(k), as in the following equation (1):
  X^(k) = X⁻(k) + K(k){Z(k) − Z⁻(k)}    (1)
  • in the measurement update step, the position correction block 26 also obtains the covariance matrix P^(k) (simply expressed as P(k)) corresponding to the error distribution of the estimated position X^(k) from the covariance matrix P⁻(k). Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using an extended Kalman filter.
  • by repeatedly performing the prediction step and the measurement update step and sequentially calculating the predicted position X⁻(k) and the estimated position X^(k), the most likely vehicle position is calculated.
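The measurement update step can be sketched per state component as follows, following the form X^(k) = X⁻(k) + K(k){Z(k) − Z⁻(k)}. The diagonal (scalar-per-axis) gain and all numeric values are illustrative assumptions; a real extended Kalman filter computes a full matrix gain K(k) from the covariances.

```python
def measurement_update(x_pred, z_meas, z_pred, gain):
    """Correct each predicted state component by the gain-weighted innovation,
    i.e. x_est = x_pred + K * (z_meas - z_pred), applied per axis."""
    return [xp + g * (zm - zp)
            for xp, g, zm, zp in zip(x_pred, gain, z_meas, z_pred)]

x_pred = [10.0, 5.0, 0.1]   # predicted position X-(k): (x, y, yaw)
z_meas = [2.0, 1.0, 0.0]    # landmark measurement Z(k), vehicle frame
z_pred = [2.4, 0.8, 0.05]   # modeled measurement prediction Z-(k)
gain   = [0.5, 0.5, 0.5]    # illustrative diagonal Kalman gain
x_est  = measurement_update(x_pred, z_meas, z_pred, gain)
```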
  • the accuracy of position estimation can be determined by the value of the diagonal element of the covariance matrix P.
  • when the covariance matrix calculated based on the measurement values for the landmark at time k is denoted P(k), the covariance matrix P(k) is expressed by equation (2); its diagonal elements are the variances of the state variables x, y, z, θ.
  • the vehicle position estimation unit 17 regards the square roots σx(k), σy(k), σz(k), σθ(k) of the diagonal elements of the covariance matrix P(k) as the accuracy information values d(k) for the state variables x, y, z, θ, respectively.
  • alternatively, the vehicle position estimation unit 17 regards the square roots σX(k), σY(k), σZ(k), σΘ(k) of the diagonal elements after conversion into the vehicle coordinate system (X, Y, Z) as the accuracy information values d(k) for the respective state variables.
  • the vehicle position estimation unit 17 may also calculate the accuracy information value d(k) for each coordinate axis of the global coordinate system adopted in the map (that is, latitude, longitude, and altitude, or x-y-z coordinates from a certain reference point).
  • in this case, the vehicle position estimation unit 17 converts the covariance matrix P(k) into the global coordinate system using the conversion matrix described above, and uses the diagonal elements after conversion as the accuracy information values d(k).
  • the voxel data IB used in the point cloud-based position estimation includes data in which the point cloud data measured for the stationary structures in each voxel is represented by a normal distribution, and is used for scan matching using NDT (Normal Distributions Transform).
  • FIG. 6 shows an example of the functional blocks of the vehicle position estimation unit 17 in the point cloud-based position estimation.
  • the difference from the vehicle position estimation unit 17 in the landmark-based position estimation shown in FIG. 5 is that, instead of the landmark search/extraction block 25, a point cloud data association block 27 is provided, which performs the association between the point cloud data obtained from the lidar 2 and the voxel data acquired from the map DB 10.
  • FIG. 7 shows an example of a schematic data structure of the voxel data IB.
  • the voxel data IB includes information on the parameters obtained when the point cloud in each voxel is expressed by a normal distribution, and includes a voxel ID, voxel coordinates, an average vector, and a covariance matrix.
  • “voxel coordinates” indicate absolute three-dimensional coordinates of a reference position such as the center position of each voxel.
  • Each voxel is a cube obtained by dividing the space into a lattice shape, and since the shape and size are determined in advance, the space of each voxel can be specified by the voxel coordinates.
  • the voxel coordinates may be used as a voxel ID.
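One record of the voxel data IB shown in FIG. 7 might be represented as follows; the concrete types are assumptions, since the patent only names the fields.

```python
from dataclasses import dataclass

# Hypothetical sketch of one voxel data record (voxel ID, voxel
# coordinates, average vector, covariance matrix).

@dataclass
class Voxel:
    voxel_id: int        # identifier (the voxel coordinates may serve as the ID)
    coordinates: tuple   # absolute 3-D coordinates of the voxel reference position
    mean: tuple          # average vector of the point cloud in the voxel
    covariance: list     # 3x3 covariance matrix, as a list of rows

v = Voxel(
    voxel_id=42,
    coordinates=(100.0, 200.0, 5.0),
    mean=(100.2, 200.1, 5.3),
    covariance=[[0.010, 0.0, 0.0],
                [0.0, 0.020, 0.0],
                [0.0, 0.0, 0.005]],
)
```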
  • the mean vector μn and the covariance matrix Vn at voxel n are expressed by the following equations (5) and (6), respectively.
  • the average value “L ′ n ” is expressed by the following equation (7).
  • the in-vehicle device 1 uses the point group obtained by coordinate transformation, together with the average vector μn and the covariance matrix Vn included in the voxel data, to calculate the evaluation function value En for voxel n expressed by equation (9), and the evaluation function value (also referred to as the "overall evaluation function value") E(k) for all voxels to be matched, expressed by equation (10).
  • the in-vehicle device 1 calculates an estimation parameter P that maximizes the overall evaluation function value E (k) by an arbitrary root finding algorithm such as Newton's method.
  • the in-vehicle device 1 then applies the estimation parameter P to the own vehicle position X⁻(k) predicted by the position prediction unit 21 shown in FIG. 6, thereby estimating the accurate own vehicle position X^(k).
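The matching score behind this estimation can be sketched with the standard NDT form, exp(-(1/2) (p - mu)^T V^-1 (p - mu)), summed over the matched voxels. The patent's exact equations (9) and (10) are not reproduced here, and the snippet additionally assumes diagonal covariance matrices so that the matrix inverse is trivial.

```python
import math

def ndt_score(point, mean, var_diag):
    """Per-voxel evaluation value under a diagonal-covariance assumption:
    exp(-0.5 * sum((p_i - mu_i)^2 / var_i))."""
    m = sum((p - mu) ** 2 / v for p, mu, v in zip(point, mean, var_diag))
    return math.exp(-0.5 * m)

def total_score(points, means, vars_diag):
    """Overall evaluation value: sum of per-voxel scores over matched voxels."""
    return sum(ndt_score(p, mu, v)
               for p, mu, v in zip(points, means, vars_diag))

# a transformed scan point that lands exactly on the voxel mean scores 1.0
s = ndt_score((1.0, 2.0, 0.0), (1.0, 2.0, 0.0), (0.1, 0.1, 0.1))
```

A root-finding or gradient-based routine would then search for the pose parameters that maximize the total score, as the text describes for Newton's method.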
  • as with the accuracy information value d(k) calculated in the landmark-based position estimation, the accuracy information value d(k) of the point cloud-based position estimation is defined so that the better the position estimation accuracy, the smaller the value.
  • in the GNSS-based position estimation, the own vehicle position estimation unit 17 estimates the own vehicle position based on the output of the GPS receiver 5. When executing the GNSS-based position estimation, the own vehicle position estimation unit 17 acquires, for example, the DOP (Dilution Of Precision) obtained from the GPS receiver 5 as the accuracy information value d(k). In another example, it acquires the standard deviation values of latitude, longitude, and altitude obtained from the GPS receiver 5 within a predetermined past period as the accuracy information values d(k) for latitude, longitude, and altitude, respectively.
  • The GPS receiver 5 may be a receiver capable of positioning not only with GPS but also with GLONASS, Galileo, the Quasi-Zenith Satellite System (QZSS), and the like.
  • At the transmission timing of the upload information Iu, the upload control unit 18 of the in-vehicle device 1 calculates standardized accuracy information based on the accuracy information generated by the own vehicle position estimation unit 17 at that timing, and on the average and standard deviation of the accuracy information values calculated by the own vehicle position estimation unit 17 within the past predetermined period. Specifically, letting the accuracy information calculated by the own vehicle position estimation unit 17 at the transmission timing of the upload information Iu be "d(k)", and the average and standard deviation of the accuracy information values calculated within the past predetermined period be "μ(k)" and "σ(k)" respectively, the upload control unit 18 calculates the standardized accuracy information value "S(k)" by the following equation (12).
  • The standardized accuracy information value S(k) is negative when the accuracy information d(k) is smaller than the average μ(k), and positive when d(k) is larger than μ(k). Further, S(k) approaches 0 as the standard deviation σ(k) increases, and increases in magnitude as σ(k) decreases. Therefore, the standardized accuracy information value S(k) is a standardized expression based on the distribution of the accuracy information values d(k). The upload control unit 18 calculates the standardized accuracy information value S(k) for each of latitude, longitude, and altitude.
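Under the properties just described (negative below the recent average, positive above it, scaled down by a larger spread), equation (12) corresponds to the familiar z-score; the sketch below assumes that form, since the equation image itself is not reproduced in this text.

```python
import statistics

def standardized_accuracy(d_k, history):
    """Standardized accuracy information S(k), assumed to be the z-score
    of the current accuracy value d(k) against the mean mu(k) and standard
    deviation sigma(k) of the accuracy values within the past predetermined
    period (`history`), consistent with equation (12) as described."""
    mu = statistics.mean(history)
    sigma = statistics.pstdev(history)
    return (d_k - mu) / sigma
```

For example, an accuracy value equal to the recent average yields S(k) = 0, while values worse (larger) than average yield positive S(k).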
  • FIG. 8A is a diagram showing the transitions of the accuracy information values dA(k) and dB(k) obtained when in-vehicle devices 1 mounted on different vehicles at a certain point perform own vehicle position estimation using different position estimation methods.
  • The accuracy information value dA(k) indicated by graph G1 has a distribution with average "μA(k)" and standard deviation "σA(k)", and the accuracy information value dB(k) indicated by graph G2 has a distribution with average "μB(k)" and standard deviation "σB(k)".
  • In this way, the average and standard deviation of the accuracy information differ depending on the own vehicle position estimation method executed and the accuracy of the sensors used.
  • FIG. 8B is a diagram showing the transitions of the standardized accuracy information value SA(k) obtained by standardizing the accuracy information value dA(k) based on equation (12), and the standardized accuracy information value SB(k) obtained by standardizing the accuracy information value dB(k) based on equation (12).
  • The standardized accuracy information value SA(k) indicated by graph G3 and the standardized accuracy information value SB(k) indicated by graph G4 both have distributions with average 0 and standard deviation 1. Thus, by using equation (12), accuracy information having different distributions can be handled on the same axis.
  • Next, a case will be described in which the own vehicle position estimation unit 17 estimates the own vehicle position by weighting own vehicle position estimation method A, whose accuracy information value is dA(k) (see graph G1 in FIG. 8A), and own vehicle position estimation method B, whose accuracy information value is dB(k) (see graph G2 in FIG. 8A), at a ratio of α:β (for example, 7:3).
  • In this case, the upload control unit 18 first calculates the standardized accuracy information SA (see graph G3 in FIG. 8B) based on the accuracy information of own vehicle position estimation method A, and the standardized accuracy information SB (see graph G4 in FIG. 8B) based on the accuracy information of own vehicle position estimation method B. Then, the upload control unit 18 calculates standardized accuracy information SAB obtained by weighting the calculated SA and SB at the ratio α:β.
  • FIG. 9A is a graph showing the transition of the standardized accuracy information SAB calculated by weighting the standardized accuracy information SA and SB shown in FIG. 8B with α:β = 7:3. In this case, SAB is calculated by adding SA multiplied by α (= 0.7) and SB multiplied by β (= 0.3).
  • Furthermore, the upload control unit 18 standardizes the standardized accuracy information SAB using its average "μAB" and standard deviation "σAB" within the past predetermined period, so that it has a distribution with average 0 and standard deviation 1. FIG. 9B shows the transition of the value S(k) obtained by standardizing SAB in this way. That is, the upload control unit 18 subtracts the average μAB from SAB, divides the result by the standard deviation σAB, and uses the obtained value as the standardized accuracy information value S(k) to be included in the upload information Iu.
  • In this way, even when the own vehicle position estimation unit 17 executes a plurality of own vehicle position estimation methods, the upload control unit 18 can suitably calculate standardized accuracy information whose average and standard deviation are standardized, and include it in the upload information Iu.
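The weighting-and-restandardization procedure described above can be sketched as follows. This is an illustrative implementation assuming SAB = α·SA + β·SB followed by a z-score over the recent window, consistent with the description; the equation images themselves are not reproduced in this text.

```python
import statistics

def combine_and_restandardize(S_A, S_B, alpha=0.7, beta=0.3):
    """Weight two standardized accuracy sequences at ratio alpha:beta
    (S_AB = alpha*S_A + beta*S_B), then re-standardize S_AB with its own
    average mu_AB and standard deviation sigma_AB over the window, so the
    result again has average 0 and standard deviation 1."""
    S_AB = [alpha * a + beta * b for a, b in zip(S_A, S_B)]
    mu_AB = statistics.mean(S_AB)
    sigma_AB = statistics.pstdev(S_AB)
    return [(s - mu_AB) / sigma_AB for s in S_AB]
```

The returned sequence plays the role of the value S(k) included in the upload information Iu.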
  • FIG. 10 is a diagram showing an outline of the data structure of the upload information Iu transmitted by the in-vehicle device 1.
  • the upload information Iu includes header information, travel route information, event information, and media information.
  • the header information includes items of “version”, “transmission source”, and “vehicle metadata”.
  • The in-vehicle device 1 designates, in "Version", information on the version of the data structure used for the upload information Iu, and, in "Sender", information on the name of the company that transmits the upload information Iu (the OEM name or system vendor name of the vehicle). Further, the in-vehicle device 1 specifies, in "Vehicle metadata", vehicle attribute information (for example, vehicle type, vehicle ID, vehicle width, vehicle height, etc.).
  • The travel route information includes an item "position estimation". For this "position estimation", the in-vehicle device 1 designates time stamp information indicating the position estimation time, latitude, longitude, and altitude information indicating the estimated own vehicle position, and information regarding the estimation accuracy.
  • Event information includes an item of “object recognition event”.
  • When the in-vehicle device 1 detects an object recognition event, it designates information on the detection result in the "object recognition event".
  • the “object recognition event” includes elements of “time stamp”, “object ID”, “offset position”, “object type”, “object size”, “object size accuracy”, and “media ID”.
  • the media information is a data type used when transmitting raw data that is output data (detection information) of an external sensor such as the lidar 2.
  • the in-vehicle device 1 transmits the upload information Iu including at least the position information of the detected object and the standardization accuracy information to the server device 6.
  • the in-vehicle device 1 specifies the standardization accuracy information in the item “position estimation”, and also specifies the position information of the detected object in the “object recognition event” of the event information.
  • As the position information of the object designated in the "object recognition event", the in-vehicle device 1 designates the relative position from the own vehicle position in the sub-item "offset position" of the "object recognition event".
  • the absolute position of the vehicle is specified by the item “position estimation”.
  • Note that the in-vehicle device 1 may include in the upload information Iu information on the absolute position of the object, generated based on the estimated own vehicle position and the measured relative position of the object, instead of including information on the relative position of the object in the upload information Iu.
  • The upload information Iu is an example of "object measurement information", and the information on the absolute position of the vehicle specified by the item "position estimation" or the like is an example of "estimated position information".
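As a loose illustration of the data structure described above, the following sketch arranges the named items (header, travel route, event, and media information) as a JSON-like object. All field names and values here are hypothetical; the actual schema is defined by the data format that the upload information Iu follows.

```python
# Illustrative sketch of the upload information Iu. Keys follow the items
# named in the text ("Version", "Sender", "Vehicle metadata",
# "position estimation", "object recognition event"), but the exact
# schema is not fixed by this excerpt and all values are hypothetical.
upload_info_iu = {
    "header": {
        "version": "1.0",
        "sender": "ExampleOEM",  # OEM name or system vendor name
        "vehicle_metadata": {"vehicle_type": "sedan", "vehicle_id": "V123",
                             "vehicle_width_m": 1.8, "vehicle_height_m": 1.5},
    },
    "travel_route": {
        "position_estimation": {
            "timestamp": "2019-03-22T10:15:00Z",
            "latitude": 35.0, "longitude": 139.0, "altitude": 30.0,
            # standardized accuracy information S(k) per axis
            "standardized_accuracy": {"lat": -0.4, "lon": 0.1, "alt": 0.7},
        }
    },
    "events": [{
        "object_recognition_event": {
            "timestamp": "2019-03-22T10:15:00Z",
            "object_id": 42, "object_type": "sign",
            # relative position from the own vehicle position
            "offset_position": {"x": 12.3, "y": -1.5, "z": 0.8},
            "object_size": None, "object_size_accuracy": None,
            "media_id": None,
        }
    }],
    "media": [],  # raw sensor output (detection information), if any
}
```

The absolute position of a detected object can then be recovered server-side from "position estimation" plus "offset position".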
  • The server device 6 calculates the position information of the objects detected by the in-vehicle devices 1 by statistical processing based on the upload information Iu received from each in-vehicle device 1. At this time, the server device 6 calculates the position information of an object to be registered in the distribution map DB 20 by weighting the position information of the corresponding object using the weighting value "w(k)" calculated based on the standardized accuracy information S(k).
  • The server device 6 sets the weighting value w(k) so that it increases as the standardized accuracy information S(k) decreases, since a smaller S(k) indicates better position estimation accuracy at that time. In other words, the server device 6 sets the weighting value w(k) to be smaller as S(k) increases, since a larger S(k) indicates lower position estimation accuracy at that time.
  • By calculating the weighting value w(k) with reference to any one of the following equations (13) to (15), the server device 6 can generate the weighting value w(k) so that it approaches 0 as the standardized accuracy information S(k) increases, and approaches 2 as S(k) increases in the negative direction.
  • FIG. 11A shows the correspondence between the standardized accuracy information S (k) based on the equations (13) to (15) and the weighting value w (k).
  • Graph G5 shows the correspondence between the standardized accuracy information S(k) and the weighting value w(k) based on equation (13), graph G6 shows that based on equation (14), and graph G7 shows that based on equation (15).
  • By using the following equations (16) to (18), in which a coefficient c is introduced into equations (13) to (15), the server device 6 can suitably adjust the conversion ratio from the standardized accuracy information S(k) to the weighting value w(k).
  • FIG. 11B shows the correspondence between the standardized accuracy information S (k) based on the equations (16) to (18) and the weighting value w (k) when the coefficient c is set to “0.5”.
  • Graph G8 shows the correspondence between the standardized accuracy information S(k) and the weighting value w(k) based on equation (16), graph G9 shows that based on equation (17), and graph G10 shows that based on equation (18).
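Equations (13)-(18) themselves are not reproduced in this text; they are characterized only by their behavior (w(k) → 0 as S(k) → +∞, w(k) → 2 as S(k) → −∞, w(k) = 1 at S(k) = 0, with the coefficient c adjusting the conversion ratio). The mappings below are hypothetical candidates exhibiting exactly that behavior, not the patent's actual equations.

```python
import math

# Hypothetical monotone mappings from standardized accuracy S(k) to a
# weighting value in (0, 2): near 0 for large positive S (poor accuracy),
# near 2 for large negative S (good accuracy), and 1 at S = 0.
# The coefficient c steepens or flattens the conversion.
def w_logistic(S, c=1.0):
    return 2.0 / (1.0 + math.exp(c * S))

def w_tanh(S, c=1.0):
    return 1.0 - math.tanh(c * S)

def w_atan(S, c=1.0):
    return 1.0 - (2.0 / math.pi) * math.atan(c * S)
```

Setting c = 0.5, for example, flattens the curve so that the same S(k) converts to a weight closer to 1, matching the role described for the coefficient c.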
  • The upload control unit 18 calculates the weighting value w(k) based on the standardized accuracy information for each of latitude, longitude, and altitude, or for each xyz coordinate value from a certain reference point. When the accuracy information value d(k) is not obtained for each direction (that is, only one accuracy information value d(k) is calculated), as in the point cloud based position estimation, the upload control unit 18 regards the accuracy information values d(k) of latitude, longitude, and altitude as the same, and calculates the weighting value w(k) for each of latitude, longitude, and altitude.
  • Here, letting the absolute position of the object indicated by each piece of upload information Iu (also referred to as the "object measurement position") be [px(k), py(k), pz(k)]T, and using the weighting values w(k) corresponding to each piece of upload information Iu, the server device 6 can suitably calculate the estimated object position to be reflected in the distribution map DB 20 by weighted averaging processing.
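The weighted averaging described above can be sketched per axis as a weighted mean. This assumes the common form p̂ = Σ w(k)·p(k) / Σ w(k) for each of the three axes, since the corresponding equation images are not reproduced in this text.

```python
def estimated_object_position(measurements):
    """Weighted average of object measurement positions.
    `measurements` is a list of (w_xyz, p_xyz) pairs, where w_xyz holds the
    per-axis weighting values w(k) derived from the standardized accuracy
    information, and p_xyz = [p_x(k), p_y(k), p_z(k)] is the object
    measurement position. Each axis of the estimated object position is
    the weighted mean sum(w*p) / sum(w)."""
    est = []
    for axis in range(3):
        num = sum(w[axis] * p[axis] for w, p in measurements)
        den = sum(w[axis] for w, p in measurements)
        est.append(num / den)
    return est
```

A measurement from a vehicle with better position estimation accuracy (larger w(k)) pulls the estimate toward its reported position.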
  • In addition to the estimated object position, the server device 6 further calculates the number of object measurement positions used to calculate the estimated object position (also referred to as the "number of samples") and an index value indicating the variation of the object measurement positions used for the calculation (also referred to as the "variation index value"). The variation index value is, for example, a variance or a standard deviation.
  • The server device 6 registers, in the distribution map DB 20, the estimated object position associated with the number of samples and the variation index value.
  • Generally, the greater the number of samples, the higher the reliability of the calculated estimated object position.
  • Therefore, by registering the information on the number of samples and the variation index value in the distribution map DB 20 together with the estimated position information of the object, the server device 6 can suitably add to the distribution map DB 20 information serving as a reliability index of the registered position information of the object.
  • FIG. 12 is an example of a flowchart showing an outline of processing related to transmission / reception of upload information Iu and download information Id.
  • the vehicle-mounted device 1 estimates its own vehicle position, calculates accuracy information, and stores it in the storage unit 12 (step S101).
  • Next, the in-vehicle device 1 determines whether or not a predetermined object has been detected based on the output of an external sensor such as the lidar 2 (step S102). That is, the in-vehicle device 1 determines whether an object whose position information should be reported by the upload information Iu has been detected. When the predetermined object is detected (step S102; Yes), the in-vehicle device 1 calculates the standardized accuracy information (step S103).
  • Then, the in-vehicle device 1 transmits to the server device 6 the upload information Iu containing information on the detected object, such as its position information and identification information, together with the standardized accuracy information calculated in step S103 (step S104). On the other hand, when the predetermined object is not detected (step S102; No), the in-vehicle device 1 returns the process to step S101.
  • the server device 6 that has received the upload information Iu from the in-vehicle device 1 stores the upload information Iu in the upload information DB 27 (step S201).
  • the server device 6 receives, for the same object, upload information Iu at a plurality of times from the in-vehicle devices 1 of the plurality of vehicles.
  • Next, the server device 6 determines whether or not it is the update timing of the distribution map DB 20 (step S202). When it is not the update timing of the distribution map DB 20 (step S202; No), the server device 6 continues to receive the upload information Iu from the in-vehicle devices 1 and performs the process of step S201.
  • When the server device 6 determines that it is the update timing of the distribution map DB 20 (step S202; Yes), it refers to the upload information DB 27 and calculates the estimated position of each object detected by the in-vehicle devices 1 by weighting based on the standardized accuracy information.
  • the server device 6 determines the weighting value w (k) from the standardized accuracy information S (k) with reference to, for example, any one of the equations (13) to (18).
  • Next, the server device 6 calculates, for each object to be updated, the number of samples of the object measurement positions used for calculating the estimated object position and the variation index value indicating the variation of those object measurement positions, and registers them in the distribution map DB 20 together with the estimated object position (step S204). Then, the server device 6 transmits to each in-vehicle device 1, as map update information, download information Id including, for each object, the combination of the estimated object position, the number of samples, and the variation index value (step S205).
  • the in-vehicle device 1 When the in-vehicle device 1 receives the download information Id (step S105; Yes), it updates the map DB 10 using the download information Id (step S106). For example, the in-vehicle device 1 updates the object information IO based on the download information Id. Thereby, the number of samples and the variation index value are associated with the position information of each object included in the object information IO. Thereafter, the in-vehicle device 1 may select, for example, a landmark to be used for landmark base position estimation with reference to the number of samples and the variation index value included in the object information IO.
  • For example, the in-vehicle device 1 regards an object whose number of samples is equal to or greater than a predetermined value and whose variation indicated by the variation index value is equal to or less than a predetermined degree as having highly reliable registered position information, and selects such an object as a landmark.
  • On the other hand, when the download information Id is not received from the server device 6 (step S105; No), the in-vehicle device 1 returns the process to step S101.
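The landmark selection rule described above (number of samples at or above a threshold and variation at or below a threshold) can be sketched as follows; the threshold values used here are illustrative.

```python
def select_landmarks(objects, min_samples=10, max_variation=0.5):
    """Select landmark candidates whose registered position information is
    judged reliable: the number of samples is at or above a threshold and
    the variation index value (e.g. a standard deviation) is at or below a
    threshold. `objects` is a list of dicts with keys "id", "num_samples",
    and "variation"; the thresholds are illustrative, not from the patent."""
    return [o["id"] for o in objects
            if o["num_samples"] >= min_samples
            and o["variation"] <= max_variation]
```

An in-vehicle device could then restrict landmark based position estimation to the objects returned by this filter.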
  • FIGS. 13A to 13C are diagrams showing the positions of the vehicles A to D and the vehicle position estimation accuracy when the obstacle 30 is detected by the lidar 2 at different times.
  • The point 31a indicates the estimated position coordinates of each vehicle, and the ellipse 32a indicates the own vehicle position estimation accuracy of each vehicle.
  • the size of the ellipse 32a is expressed as being smaller as the vehicle position estimation accuracy is higher, and larger as the vehicle position estimation accuracy is lower.
  • The length of the ellipse 32a indicates the own vehicle position estimation accuracy with respect to the extension direction of the ellipse 32a; the better the position estimation accuracy, the shorter the length.
  • The in-vehicle device 1 of the vehicle A detects the obstacle 30 at the position shown in FIG. 13A with the lidar 2, calculates the coordinates of the obstacle 30 based on the estimated position coordinates of its own vehicle, and transmits the upload information Iu indicating the detection result of the obstacle 30 to the server device 6.
  • Similarly, the in-vehicle device 1 of the vehicle B detects the obstacle 30 at the position shown in FIG. 13B at a time different from that of FIG. 13A, and transmits the upload information Iu indicating the detection result to the server device 6.
  • the plurality of vehicles A to D transmit the upload information Iu indicating the detection results to the server device 6 for the same object (here, the obstacle 30).
  • the vehicles A to D have different own vehicle position estimation accuracy (see the ellipse 32a).
  • the upload information Iu is transmitted to the server device 6 at a plurality of times from the vehicle-mounted devices 1 of a plurality of vehicles having different estimation accuracy of the vehicle position for the same object.
  • The server device 6 calculates, for each object to be updated, the estimated object position by weighted averaging processing based on the standardized accuracy information included in the upload information Iu received from the in-vehicle devices 1 of the plurality of vehicles. Accordingly, regardless of the position estimation method, the weight of an object measurement position based on upload information Iu from the in-vehicle device 1 of a vehicle with high own vehicle position estimation accuracy is increased, and the weight of an object measurement position based on upload information Iu from the in-vehicle device 1 of a vehicle with low own vehicle position estimation accuracy is decreased. Thereby, the server device 6 can accurately determine the estimated object position to be registered in the distribution map DB 20.
  • The server device 6 may further set a weighting value (also referred to as the "vehicle weighting value") for each in-vehicle device 1 that is the transmission source of the upload information Iu (that is, for each vehicle in which the in-vehicle device 1 is mounted), and calculate the estimated object position [p̂x, p̂y, p̂z]T of the object based on the vehicle weighting values.
  • This vehicle weighting value is set by a method to be described later so as to be a value according to the accuracy and performance of the sensor used for detecting the object.
  • In this case, the server device 6 calculates the estimated object position [p̂x, p̂y, p̂z]T based on the following equations (22) to (24).
  • The server device 6 stores information in which a vehicle weighting value is associated with each vehicle ID (also referred to as the "vehicle weighting information IV"), and calculates the estimated object position of the object based on equations (22) to (24) with reference to the vehicle weighting information IV.
  • the server device 6 updates the vehicle weight value recorded in the vehicle weight information IV during the map update process.
  • For example, the vehicle weighting value is set to one of 11 levels from 0 to 1 in steps of 0.1. When the amount of deviation between the calculated estimated object position and the object measurement position indicated by certain upload information Iu (that is, the distance between the indicated positions) is equal to or greater than a predetermined threshold, the server device 6 lowers the vehicle weighting value of the transmission source vehicle of that upload information Iu by 0.1 from the value recorded in the vehicle weighting information IV.
  • On the other hand, when the amount of deviation between the calculated estimated object position and the object measurement position indicated by certain upload information Iu (that is, the distance between the indicated positions) is less than the predetermined threshold, the server device 6 raises the vehicle weighting value of the transmission source vehicle of that upload information Iu by 0.1 from the value recorded in the vehicle weighting information IV. Therefore, for a vehicle whose object measurement positions often deviate greatly from the calculated estimated object positions, the vehicle weighting value approaches 0 and contributes little to the calculation of the estimated object position. Conversely, for a vehicle whose object measurement positions often deviate only slightly from the calculated estimated object positions, the vehicle weighting value approaches 1 and contributes greatly to the calculation of the estimated object position.
  • When the server device 6 receives upload information Iu from a vehicle for which no vehicle weighting value is recorded in the vehicle weighting information IV, it sets the vehicle weighting value for that vehicle to a predetermined initial value (for example, 0.5), and thereafter updates this value based on the amount of deviation between the estimated object position and the object measurement position indicated by the upload information Iu.
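The vehicle weighting value update described above (11 levels from 0 to 1 in steps of 0.1, initial value 0.5, lowered or raised by 0.1 depending on whether the deviation reaches a threshold) can be sketched as:

```python
def update_vehicle_weight(weights, vehicle_id, deviation, threshold,
                          step=0.1, initial=0.5):
    """Update the vehicle weighting value recorded in the vehicle
    weighting information IV (modeled here as the dict `weights`).
    A vehicle with no recorded value starts at the initial value (0.5);
    the value is lowered by 0.1 when the deviation between the estimated
    object position and the object measurement position is at or above
    the threshold, raised by 0.1 otherwise, and kept within [0, 1]."""
    w = weights.get(vehicle_id, initial)
    w = w - step if deviation >= threshold else w + step
    weights[vehicle_id] = min(1.0, max(0.0, round(w, 1)))
    return weights[vehicle_id]
```

Repeated large deviations drive a vehicle's weight toward 0, while consistently small deviations drive it toward 1, matching the behavior described above.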
  • By calculating the estimated object position by averaging processing using the vehicle weighting values in this way, it is possible to reduce the influence of measurement values having large errors and to increase the estimation accuracy of the object position. For example, the weight of an object measurement position measured by a vehicle whose external sensor such as the lidar 2 has a defect such that its object measurement positions constantly deviate, or by a vehicle having an external sensor with low measurement accuracy, is reduced. Therefore, according to this modification, even when upload information Iu is received from a vehicle having a defect in its external sensor, a vehicle whose external sensor has low measurement accuracy, or the like, the calculation of the estimated object position is less affected. Conversely, if the object measurement positions of such a vehicle come to match the calculated estimated object positions, the vehicle weighting value of that vehicle gradually increases and is again reflected in the calculation of the estimated object position.
  • FIG. 14 is a flowchart showing a procedure related to map update processing of the server device 6 according to the present modification.
  • the server device 6 that has received the upload information Iu from the in-vehicle device 1 stores the upload information Iu in the upload information DB 27 (step S211).
  • When the server device 6 determines that it is the update timing of the distribution map DB 20 (step S212; Yes), it refers to the upload information DB 27 and the vehicle weighting information IV, and calculates, for each object detected by the in-vehicle devices 1, the estimated object position [p̂x, p̂y, p̂z]T from equations (22) to (24) using the vehicle weighting values and the weighting values based on the standardized accuracy information (step S213).
  • Next, the server device 6 calculates, for each object, the number of samples indicating the number of object measurement positions used to calculate the estimated object position and the variation index value indicating the variation of those object measurement positions, and registers them in the distribution map DB 20 together with the estimated object position (step S214).
  • Furthermore, the server device 6 updates the vehicle weighting values recorded in the vehicle weighting information IV based on the amount of deviation between the estimated object position of each object calculated in step S213 and each object measurement position used for calculating that estimated object position (step S215).
  • the server device 6 transmits download information Id including a combination of the estimated object position, the number of samples, and the variation index value for each object to each vehicle-mounted device 1 (step S216).
  • Note that the server device 6 may calculate the estimated object position by weighted averaging processing using only the vehicle weighting values, without using the weighting values based on the standardized accuracy information. In this case, the server device 6 calculates the estimated object position [p̂x, p̂y, p̂z]T based on the following equations (25) to (27).
  • the server device 6 can preferably estimate the position information of the object to be registered in the distribution map DB 20 based on the reliability of the object detection result for each vehicle.
  • the in-vehicle device 1 may not include the standardization accuracy information in the upload information Iu.
  • the server device 6 may execute the standardization accuracy information calculation process instead of the in-vehicle device 1.
  • In this case, when the in-vehicle device 1 detects an object in step S102 of FIG. 12, it includes in the upload information Iu the accuracy information at the current time and the average and standard deviation of the accuracy information within the past predetermined period, and transmits the upload information Iu to the server device 6.
  • Then, the server device 6 calculates the standardized accuracy information for each piece of received upload information Iu, after receiving the upload information Iu in step S201 or after determining that it is the update timing of the distribution map DB 20 in step S202.
  • the configuration of the map update system shown in FIG. 1 is an example, and the configuration of the map update system to which the present invention is applicable is not limited to the configuration shown in FIG.
  • For example, instead of providing the in-vehicle device 1, an electronic control device of the vehicle may execute the processes of the own vehicle position estimation unit 17, the upload control unit 18, and the automatic driving control unit 19 of the in-vehicle device 1. In this case, the map DB 10 is stored in, for example, a storage unit in the vehicle, and the electronic control device of the vehicle may exchange the upload information Iu and the download information Id with the server device 6 via the in-vehicle device 1 or a communication unit (not shown).

Abstract

An in-vehicle device 1 estimates the own vehicle position, and calculates accuracy information and stores it in a storage unit 12 (step S101). Then, the in-vehicle device 1 determines whether or not a prescribed object has been detected based on the output of an external sensor such as a LIDAR 2 (step S102). When the prescribed object is detected (step S102; Yes), the in-vehicle device 1 calculates standardized accuracy information based on the accuracy information generated at the current time and on the average and standard deviation of the accuracy information values within a prescribed past period stored in step S101 (step S103). Then, the in-vehicle device 1 transmits to a server device 6 upload information Iu which includes the standardized accuracy information calculated in step S103 and information about the object, such as position information and identification information of the object (step S104).

Description

Information transmission device, data structure, control method, program, and storage medium
 The present invention relates to a technique for updating a map.
 Conventionally, techniques for updating map data based on the outputs of sensors installed in vehicles are known. For example, Patent Literature 1 discloses a driving support device that, when a change point of a partial map is detected based on the output of a sensor installed on a moving body such as a vehicle, transmits change point information regarding the change point to a server device. Patent Literature 2 discloses an own vehicle position estimation technique using a Kalman filter. Furthermore, Non-Patent Literature 1 discloses specifications of a data format for collecting data detected by vehicle-side sensors with a cloud server.
Japanese Patent Laid-Open No. 2016-156973 (JP 2016-156973 A); Japanese Patent Laid-Open No. 2017-72422 (JP 2017-72422 A)
 In a system in which a server collects data on objects on or around roads detected by sensors mounted on vehicles and uses the data to generate and update map data, the position of each object recognized by each vehicle is calculated from the position of the own vehicle and the measured distance to the object. Therefore, if the own vehicle position estimation at that time is highly accurate, a relatively accurate object position detection result is transmitted to the server, and an accurate position is registered in the map data. On the other hand, when the estimated position accuracy of the vehicle at the time of object detection is poor, the accuracy of the object position detection result transmitted to the server is also low, and an accurate object position may not be registered in the map data.
 The present invention has been made to solve the above problems, and its main object is to provide an information transmission device capable of transmitting data suitable for the process of generating position information of objects to be registered in a map, and a data structure of such data.
 The invention described in the claims is an information transmission device comprising: a generation unit that generates, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
 The invention described in the claims is also a data structure of data transmitted to an information processing device that collects object measurement information on an object measured by a measurement device mounted on a moving body, the data structure including: the object measurement information, generated based on estimated position information indicating a position estimated for the moving body and including object position information of the object; and accuracy information indicating the estimation accuracy of the estimated position information, the data being used by the information processing device to determine the position of the object on a map.
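 As a concrete illustration only, one record of such a data structure could be modeled as follows; the field names (`object_id`, `latitude`, `longitude`, `accuracy`) and the JSON serialization are assumptions for this sketch, not the format defined by this specification.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ObjectMeasurement:
    """Hypothetical layout of one upload record (illustrative field names)."""
    object_id: str    # identification information of the measured object
    latitude: float   # object position, derived from the estimated host
    longitude: float  # vehicle position plus the measured offset
    accuracy: float   # accuracy information of the host position estimate

    def to_payload(self) -> str:
        """Serialize to the payload sent to the information processing device."""
        return json.dumps(asdict(self))

m = ObjectMeasurement("obstacle-42", 35.6812, 139.7671, 0.8)
payload = m.to_payload()
restored = ObjectMeasurement(**json.loads(payload))
```

Carrying the accuracy value in the same record as the position lets the receiving side weight each report without any extra lookup.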
 The invention described in the claims is also an information transmission device comprising: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
 The invention described in the claims is also a control method executed by an information transmission device, the method comprising: a generation step of generating, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body; an acquisition step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and a transmission step of transmitting the object measurement information and the accuracy information to an information processing device.
A schematic configuration diagram of the map update system.
A block diagram showing the functional configurations of the in-vehicle device and the server device.
A diagram representing the state variable vector in two-dimensional orthogonal coordinates.
A diagram showing the schematic relationship between the prediction step and the measurement update step.
A diagram showing the functional blocks of the host vehicle position estimation unit in landmark-based position estimation.
A diagram showing the functional blocks of the host vehicle position estimation unit in point-cloud-based position estimation.
An example of the schematic data structure of the voxel data.
A graph showing the transition of accuracy information values and of standardized accuracy information values.
A graph showing the transition of standardized accuracy information values calculated from the accuracy information of a plurality of host vehicle position estimation methods.
An example of the data structure of the upload information.
A diagram showing the correspondence between standardized accuracy information values and weighting values.
A flowchart showing an outline of the processing for transmitting and receiving the upload information.
A diagram showing the position of a vehicle that detected an obstacle and the transition of the host vehicle position estimation accuracy.
A flowchart showing an outline of the processing of the server device in a modification.
 According to a preferred embodiment of the present invention, an information transmission device comprises: a generation unit that generates, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device. When a map is updated based on object measurement information measured by a moving body, the position estimation accuracy of the moving body that performed the measurement affects the accuracy of the map update. Accordingly, in this aspect, when the information processing device updates the map based on object measurement information measured by the moving body, the information transmission device can suitably transmit to the information processing device, together with the object measurement information, accuracy information that serves as an index of the accuracy and reliability of the object measurement information.
 In one aspect of the information transmission device, the acquisition unit generates standardized accuracy information, indicating the estimation accuracy with its mean and standard deviation standardized, based on the mean and standard deviation of the estimation accuracy over a predetermined period including the time at which the position of the moving body was estimated, and the transmission unit transmits the object measurement information and the standardized accuracy information to the information processing device. According to this aspect, even when accuracy information with differing means and variances is acquired, the information transmission device can suitably transmit to the information processing device standardized accuracy information having a uniform mean and variance regardless of the position estimation method used.
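 The standardization described above can be sketched as a z-score transformation over a window of recent accuracy values; the window contents, the target mean and standard deviation, and the function name below are illustrative assumptions, not values fixed by this specification.

```python
import statistics

def standardize_accuracy(window, target_mean=0.0, target_std=1.0):
    """Standardize the most recent accuracy value against the mean and
    standard deviation over a recent window, so that different estimation
    methods yield values with a uniform mean and variance."""
    mu = statistics.fmean(window)
    sigma = statistics.pstdev(window)
    if sigma == 0.0:  # degenerate window: no spread to normalize by
        return target_mean
    z = (window[-1] - mu) / sigma  # z-score of the latest value
    return target_mean + target_std * z

# accuracy values from a sliding window ending at the estimation time
recent = [0.30, 0.32, 0.28, 0.35, 0.40]
standardized = standardize_accuracy(recent)
```

Because every method's output is rescaled to the same target mean and spread, the server can compare reports from heterogeneous estimators directly.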
 In another aspect of the information transmission device, the position of the moving body is estimated by a weighted average of position estimation results obtained by a plurality of methods; the acquisition unit generates standardized accuracy information for each of the position estimation results obtained by the plurality of methods and averages the standardized accuracy information using the same weighting as the weighted average; and the transmission unit transmits the object measurement information and the averaged standardized accuracy information to the information processing device. According to this aspect, the information transmission device can suitably generate standardized accuracy information and transmit it to the information processing device even when the position of the moving body is estimated by a weighted average of position estimation results obtained by a plurality of methods.
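 A minimal sketch of this weighted averaging, assuming the per-method standardized accuracy values and the fusion weights are already available (the example weights below are invented for illustration):

```python
def fused_standardized_accuracy(std_accuracies, weights):
    """Average per-method standardized accuracy values with the same
    weights used for the weighted-average position fusion."""
    assert len(std_accuracies) == len(weights)
    total = sum(weights)
    return sum(a * w for a, w in zip(std_accuracies, weights)) / total

# e.g. landmark-based, point-cloud-based, GNSS-based standardized accuracies
acc = [0.9, 0.6, -0.3]
w = [0.5, 0.3, 0.2]  # same weights as the position fusion
fused = fused_standardized_accuracy(acc, w)
```

Using identical weights for the accuracy average and the position average keeps the reported accuracy consistent with how much each method actually contributed to the fused position.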
 According to another preferred embodiment of the present invention, there is provided a data structure of data transmitted to an information processing device that collects object measurement information on an object measured by a measurement device mounted on a moving body, the data structure including: the object measurement information, generated based on estimated position information indicating a position estimated for the moving body and including object position information of the object; and accuracy information indicating the estimation accuracy of the estimated position information, the data being used by the information processing device to determine the position of the object on a map. By receiving data having such a data structure, the information processing device can refer to the accuracy information, which serves as an index of the accuracy and reliability of the received object measurement information, and can suitably determine the position of the object on the map.
 According to another preferred embodiment of the present invention, an information transmission device comprises: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device. According to this aspect as well, the information transmission device can suitably transmit accuracy information, which serves as an index of the accuracy and reliability of the object measurement information, together with the object measurement information to the information processing device.
 According to another preferred embodiment of the present invention, a control method executed by an information transmission device comprises: a generation step of generating, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body; an acquisition step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and a transmission step of transmitting the object measurement information and the accuracy information to an information processing device. By executing this control method, the information transmission device can suitably transmit accuracy information, which serves as an index of the accuracy and reliability of the object measurement information, together with the object measurement information to the information processing device.
 According to another preferred embodiment of the present invention, a program causes a computer to execute the control method described above. By executing this program, the computer functions as the information transmission device described above. Preferably, the program is stored in a storage medium.
 Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings. In this specification, for convenience, a character with “^” or “−” attached above an arbitrary symbol is represented as “A^” or “A−” (where “A” is the arbitrary character).
 [Overview of Map Update System]
 FIG. 1 is a schematic configuration of the map update system according to the present embodiment. The map update system includes in-vehicle devices 1, each of which moves together with a vehicle serving as a moving body, and a server device 6 that communicates with each in-vehicle device 1 via a network. The map update system updates the distribution map DB 20, which is a map for distribution held by the server device 6, based on the information transmitted from each in-vehicle device 1. Hereinafter, “map” includes not only data referenced by a conventional in-vehicle device for route guidance but also data used for ADAS (Advanced Driver Assistance Systems) and autonomous driving.
 The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 and, based on their outputs, detects predetermined objects and estimates the position of the vehicle on which the in-vehicle device 1 is mounted (also referred to as the “host vehicle position”). Based on the estimation result of the host vehicle position, the in-vehicle device 1 performs automatic driving control of the vehicle and the like so that the vehicle travels along a route to a set destination. The in-vehicle device 1 stores a map database (DB) 10 in which road data and information on features, lane markings, and the like near roads that serve as marks (also referred to as “landmarks”) are registered. Based on this map DB 10, the in-vehicle device 1 estimates the host vehicle position by matching the map against the output of the lidar 2 and the like. The in-vehicle device 1 also transmits upload information “Iu”, which includes information on detected objects, to the server device 6. The in-vehicle device 1 is an example of the information transmission device.
 The lidar 2 discretely measures the distance to objects in the external environment by emitting pulsed laser light over predetermined angular ranges in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects. In this case, the lidar 2 includes an emitting unit that emits laser light while changing the emission direction, a light receiving unit that receives the reflected light (scattered light) of the emitted laser light reflected by an object, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit. The scan data is point cloud data, generated from the emission direction corresponding to the laser light received by the light receiving unit and the distance to the object in that emission direction, which is specified based on the light reception signal. The lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
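 One return of such a scan can be converted from its emission direction and measured range into a Cartesian point roughly as follows; this is a generic polar-to-Cartesian sketch in the sensor frame, assumed for illustration rather than taken from the lidar 2's actual internal processing.

```python
import math

def scan_to_point(distance, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus emission direction) into a
    3-D point in the sensor frame; a full scan yields the point cloud."""
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

p = scan_to_point(10.0, 0.0, 0.0)  # a level return straight ahead
```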
 The server device 6 receives and stores the upload information Iu from each in-vehicle device 1. For example, the server device 6 updates the distribution map DB 20 based on the collected upload information Iu. The server device 6 also transmits download information Id, which includes update information of the distribution map DB 20, to each in-vehicle device 1. The server device 6 is an example of a map data generation device.
 FIG. 2(A) is a block diagram showing the functional configuration of the in-vehicle device 1. The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, a communication unit 13, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
 The interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies the data to the control unit 15.
 The storage unit 12 stores programs executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processes. In the present embodiment, the storage unit 12 stores the map DB 10, which includes object information IO, landmark information IL, and voxel data IB. The object information IO is information on obstacles, holes, broken-down vehicles, and the like existing on or around roads, and includes attribute information such as the position, size, and shape of each object. The landmark information IL is information on features on or around roads that serve as landmarks, such as kilometer posts, 100 m posts, and delineators periodically lined up along the roadside, traffic infrastructure facilities (e.g., signs, direction signboards, traffic signals), utility poles, street lamps, and lane markings. The landmark information IL is used in the landmark-based position estimation described later. Note that a landmark whose position or shape has changed due to an accident or the like also becomes part of the object information IO. The voxel data IB is information on point cloud data indicating the measured positions of stationary structures for each unit region (also referred to as a “voxel”) obtained by dividing three-dimensional space into a plurality of regions. The voxel data IB is used in the point-cloud-based position estimation described later.
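 As a minimal sketch of how a measured point could be assigned to such a unit region, the following assumes an axis-aligned grid of cubic voxels; the grid layout and voxel size are illustrative assumptions, not the actual structure of the voxel data IB.

```python
import math

def voxel_index(point, voxel_size=1.0):
    """Map a 3-D point (metres) to the integer index of the voxel that
    contains it, for a grid of cubes voxel_size metres on a side."""
    x, y, z = point
    return (math.floor(x / voxel_size),
            math.floor(y / voxel_size),
            math.floor(z / voxel_size))

idx = voxel_index((2.3, -0.7, 4.9))
```

Grouping the point cloud by such an index is what allows per-voxel statistics of the stationary structures to be stored and matched against new scans.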
 Under the control of the control unit 15, the communication unit 13 transmits the upload information Iu and receives the download information Id. The input unit 14 is a button, a touch panel, a remote controller, a voice input device, or the like for user operation. The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
 The control unit 15 includes a CPU that executes programs, and controls the entire in-vehicle device 1. In the present embodiment, the control unit 15 includes a host vehicle position estimation unit 17 and an upload control unit 18.
 The host vehicle position estimation unit 17 performs highly accurate estimation of the host vehicle position by executing a plurality of host vehicle position estimation methods selectively or in combination. In the present embodiment, as an example, the host vehicle position estimation unit 17 executes, selectively or in combination: position estimation using landmark measurement results and the landmark position information recorded in the landmark information IL (also referred to as “landmark-based position estimation”); position estimation using the voxel data (also referred to as “point-cloud-based position estimation”); and position estimation using a Global Navigation Satellite System (GNSS) (also referred to as “GNSS-based position estimation”). The host vehicle position estimation unit 17 performs host vehicle position estimation based on the output of the lidar 2 in landmark-based and point-cloud-based position estimation, and based on the output of the GPS receiver 5 in GNSS-based position estimation.
 Furthermore, the host vehicle position estimation unit 17 estimates the host vehicle position and also generates information on the estimation accuracy of the estimated host vehicle position (also referred to as “accuracy information”), storing it in the storage unit 12 or the like. Details of landmark-based and point-cloud-based position estimation and of the method of generating their accuracy information will be described later.
 When a predetermined object is detected based on the output of an external sensor such as the lidar 2, the upload control unit 18 generates upload information Iu including information on the detected object, and transmits the upload information Iu to the server device 6. The information on the object includes at least the position information of the object, and may further include identification information of the object and attribute information such as its shape and size. In the present embodiment, the upload control unit 18 also converts the accuracy information calculated by the host vehicle position estimation unit 17 into accuracy information standardized to a predetermined mean and standard deviation (also referred to as “standardized accuracy information”), includes the standardized accuracy information in the upload information Iu, and transmits it to the server device 6. The standardized accuracy information will be described later. The upload control unit 18 is an example of the “generation unit”, the “transmission unit”, and the “computer” that executes a program.
 FIG. 2(B) is a block diagram showing the functional configuration of the server device 6. The server device 6 mainly includes a communication unit 61, a storage unit 62, and a control unit 65. These elements are connected to one another via a bus line.
 Under the control of the control unit 65, the communication unit 61 receives the upload information Iu and transmits the download information Id. The storage unit 62 stores programs executed by the control unit 65 and information necessary for the control unit 65 to execute predetermined processes. In the present embodiment, the storage unit 62 stores the distribution map DB 20, which has the same data structure as the map DB 10, and an upload information DB 27, which is a database of the upload information Iu received from each in-vehicle device 1.
 The control unit 65 includes a CPU that executes programs, and controls the entire server device 6. In the present embodiment, the control unit 65 performs processes such as updating the distribution map DB 20 based on the upload information DB 27, in which the upload information Iu received from each in-vehicle device 1 via the communication unit 61 is accumulated, and transmitting download information Id, including the generated map update information, to each in-vehicle device 1 via the communication unit 61. The control unit 65 is an example of the “reception unit”, the “generation unit”, and the “computer” that executes a program.
 [Specific Examples of Host Vehicle Position Estimation Methods]
 Below, overviews of landmark-based position estimation, point-cloud-based position estimation, and GNSS-based position estimation are given as specific examples of the host vehicle position estimation methods.
 (1) Landmark-Based Position Estimation
 In landmark-based position estimation, the host vehicle position estimation unit 17 corrects the host vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle to a landmark measured by the lidar 2 and the landmark position information extracted from the map DB 10. In the present embodiment, as an example, the host vehicle position estimation unit 17 alternately executes a prediction step of predicting the host vehicle position from the output data of the gyro sensor 3, the vehicle speed sensor 4, and the like, and a measurement update step of correcting the predicted host vehicle position calculated in the immediately preceding prediction step. Various filters developed for Bayesian estimation can be used as the state estimation filter in these steps, for example the extended Kalman filter, the unscented Kalman filter, and the particle filter. In the following, as a representative example of landmark-based position estimation, a case in which the host vehicle position estimation unit 17 performs host vehicle position estimation using an extended Kalman filter is described.
 FIG. 3 is a diagram representing the host vehicle position to be estimated in two-dimensional orthogonal coordinates. As shown in FIG. 3, the host vehicle position on a plane defined in the x-y two-dimensional orthogonal coordinate system is represented by the coordinates “(x, y)” and the orientation (yaw angle) “ψ” of the host vehicle. Here, the yaw angle ψ is defined as the angle between the traveling direction of the vehicle and the x-axis. In the present embodiment, in addition to the coordinates (x, y) and the yaw angle ψ, host vehicle position estimation is performed using four variables (x, y, z, ψ) as the state variables of the host vehicle position, also taking into account the z-axis coordinate perpendicular to the x-axis and the y-axis. Since general roads have gentle gradients, the pitch angle and roll angle of the vehicle are in principle ignored in the present embodiment.
 FIG. 4 is a diagram showing the schematic relationship between the prediction step and the measurement update step, and FIG. 5 shows an example of the functional blocks of the host vehicle position estimation unit 17. As shown in FIG. 4, by repeating the prediction step and the measurement update step, calculation and update of the estimated value of the state variable vector “X” indicating the host vehicle position are executed sequentially. As shown in FIG. 5, the host vehicle position estimation unit 17 includes a position prediction unit 21 that executes the prediction step and a position estimation unit 22 that executes the measurement update step. The position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search/extraction block 25 and a position correction block 26. In FIG. 4, the state variable vector at the reference time (i.e., the current time) “t” to be calculated is written as “X−(k)” or “X^(k)”. Here, the provisional estimate (predicted value) obtained in the prediction step is marked with “−” above the character representing the value, and the more accurate estimate updated in the measurement update step is marked with “^” above the character representing the value.
 In the prediction step, the dead reckoning block 23 uses the moving speed v and the angular velocity ω of the vehicle (collectively written as the control value u(k) = (v(k), ω(k))ᵀ) to obtain the travel distance and the change in orientation since the previous time. The position prediction block 24 of the control unit 15 adds the obtained travel distance and orientation change to the state variable vector X̂(k−1) at time t−1 calculated in the immediately preceding measurement update step, thereby calculating the predicted value of the vehicle position at time t (also called the "predicted position") X⁻(k). At the same time, the covariance matrix P⁻(k) corresponding to the error distribution of the predicted position X⁻(k) is calculated from the covariance matrix P̂(k−1) at time t−1 calculated in the immediately preceding measurement update step.
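The dead-reckoning update described above can be sketched as follows. This is a minimal illustration under a simple planar motion model; the function name and the time step dt are assumptions for illustration, not part of the embodiment:

```python
import math

def predict_position(x, y, psi, v, omega, dt):
    """Dead-reckoning prediction: advance the pose (x, y, psi) by one
    time step dt using the control value u = (v, omega)."""
    x_pred = x + v * dt * math.cos(psi)   # travel distance along the heading
    y_pred = y + v * dt * math.sin(psi)
    psi_pred = psi + omega * dt           # change in orientation
    return x_pred, y_pred, psi_pred

# Example: moving straight along the x-axis at 10 m/s for 0.1 s
print(predict_position(0.0, 0.0, 0.0, v=10.0, omega=0.0, dt=0.1))
```

In the embodiment, this prediction is combined with the propagation of the covariance matrix P⁻(k), which this sketch omits.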
 In the measurement update step, the landmark search/extraction block 25 associates the position vectors of the landmarks registered in the landmark information IL of the map DB 10 with the scan data of the lidar 2. When this association succeeds, the landmark search/extraction block 25 acquires the measurement value Z(k) of the associated landmark produced by the lidar 2, and the landmark measurement value (called the "measurement prediction value") Z⁻(k) obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the position vector of the landmark registered in the landmark information IL. The measurement value Z(k) is a vector value in the vehicle coordinate system (a coordinate system whose axes are the traveling direction and the lateral direction of the vehicle), converted from the distance and scan angle of the landmark measured by the lidar 2 at time t. Then, as shown in equation (1) below, the position correction block 26 multiplies the difference between the measurement value Z(k) and the measurement prediction value Z⁻(k) by the Kalman gain K(k) and adds the result to the predicted position X⁻(k), thereby calculating the updated state variable vector (also called the "estimated position") X̂(k).
  X̂(k) = X⁻(k) + K(k){Z(k) − Z⁻(k)}   …(1)

 In the measurement update step, the position correction block 26 also obtains, as in the prediction step, the covariance matrix P̂(k) (also written simply as P(k)) corresponding to the error distribution of the estimated position X̂(k) from the covariance matrix P⁻(k). Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using an extended Kalman filter, for example.
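The correction of equation (1) is the standard Kalman-filter measurement update. A minimal numerical sketch follows; the array shapes and all numeric values (including the gain matrix) are illustrative assumptions:

```python
import numpy as np

def measurement_update(X_pred, Z, Z_pred, K):
    """Correct the predicted state X_pred with the Kalman gain K and the
    innovation Z - Z_pred, as in equation (1)."""
    return X_pred + K @ (Z - Z_pred)

# 4-state vector (x, y, z, psi), 2-dimensional landmark measurement
X_pred = np.array([10.0, 5.0, 0.0, 0.1])
Z = np.array([3.0, 1.0])        # lidar measurement of the landmark
Z_pred = np.array([2.5, 1.2])   # measurement predicted from X_pred
K = np.array([[0.5, 0.0],
              [0.0, 0.5],
              [0.0, 0.0],
              [0.0, 0.0]])      # Kalman gain (illustrative values)
X_est = measurement_update(X_pred, Z, Z_pred, K)
```

A full filter would also update the covariance P̂(k) here; only the state correction of equation (1) is shown.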
 In this way, the prediction step and the measurement update step are performed repeatedly, and the predicted position X⁻(k) and the estimated position X̂(k) are calculated sequentially, so that the most likely vehicle position is obtained.
 Here, the accuracy of the position estimation can be judged from the values of the diagonal elements of the covariance matrix P. Letting P(k) be the covariance matrix calculated based on the measurement values for the landmarks at time k, the covariance matrix P(k) is expressed by equation (2) below.
  [Equation (2): the 4 × 4 covariance matrix P(k), whose diagonal elements are the variances σx²(k), σy²(k), σz²(k), σψ²(k) of the state variables x, y, z, ψ]

 In this case, the vehicle position estimation unit 17 regards the square roots σx(k), σy(k), σz(k), σψ(k) of the diagonal elements of the covariance matrix P(k) as the accuracy information values d(k) for the respective state variables x, y, z, ψ.
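Reading off the accuracy values d(k) from the covariance diagonal, as described above, can be sketched as follows; the example matrix values are made-up assumptions:

```python
import numpy as np

# Covariance matrix P(k) for the state (x, y, z, psi); values are illustrative.
# A real P(k) generally has non-zero off-diagonal elements as well.
P = np.diag([0.04, 0.09, 0.01, 0.0025])

# d(k) per state variable: square roots of the diagonal elements,
# i.e. the standard deviations sigma_x, sigma_y, sigma_z, sigma_psi
d = np.sqrt(np.diag(P))
```

Smaller values of d indicate better estimation accuracy for the corresponding state variable.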
 When the state variables x, y, z are expressed in the global coordinate system adopted by the map, they can be converted into the vehicle coordinate system (X, Y, Z) by an operation using the matrix Cψ(k) based on the current estimated azimuth ψ, as shown in equation (3) below.
  P′(k) = Cψ(k) P(k) Cψ(k)ᵀ   …(3)
 The matrix Cψ(k) is given by equation (4) below.
  [Equation (4): Cψ(k), the rotation matrix about the z-axis by the current estimated yaw angle ψ(k)]

 In this case, the vehicle position estimation unit 17 regards the square roots σX(k), σY(k), σZ(k), σΨ(k) of the diagonal elements after conversion into the vehicle coordinate system (X, Y, Z) as the accuracy information values d(k) for the respective state variables.
 Preferably, as in the GNSS-based position estimation described later, the vehicle position estimation unit 17 calculates the accuracy information value d(k) for each coordinate axis of the global coordinate system adopted by the map (i.e., latitude, longitude, and altitude, or xyz coordinate values from a certain reference point). In this case, when the state variables x, y, z estimated in the landmark-based position estimation are in the vehicle coordinate system, the vehicle position estimation unit 17 converts the covariance matrix P(k) into the global coordinate system using the matrix Cψ(k) described above, and sets the diagonal elements after conversion as the accuracy information values d(k).
 (2) Point cloud-based position estimation
 Next, point cloud-based position estimation using the voxel data IB will be described. The voxel data IB used in point cloud-based position estimation includes data in which the measured point cloud of stationary structures in each voxel is represented by a normal distribution, and is used for scan matching based on NDT (Normal Distributions Transform).
 FIG. 6 shows an example of the vehicle position estimation unit 17 for point cloud-based position estimation. The difference from the vehicle position estimation unit 17 for landmark-based position estimation shown in FIG. 5 is that, in place of the landmark search/extraction block 25, a point cloud data association block 27 is provided as the process that associates the point cloud data obtained by the lidar 2 with the voxels acquired from the map DB.
 FIG. 7 shows an example of the schematic data structure of the voxel data IB. The voxel data IB includes parameter information for representing the point cloud in each voxel by a normal distribution; in this embodiment, as shown in FIG. 7, it includes a voxel ID, voxel coordinates, a mean vector, and a covariance matrix. Here, the "voxel coordinates" indicate the absolute three-dimensional coordinates of a reference position such as the center position of each voxel. Since each voxel is a cube obtained by dividing space into a grid and its shape and size are determined in advance, the space of each voxel can be identified by its voxel coordinates. The voxel coordinates may also be used as the voxel ID.
 The "mean vector" and "covariance matrix" are the mean vector and covariance matrix corresponding to the parameters for representing the point cloud within the target voxel by a normal distribution. Defining the coordinates of an arbitrary point i in an arbitrary voxel n as Xn(i) = [xn(i), yn(i), zn(i)]ᵀ and the number of points in voxel n as Nn, the mean vector μn and the covariance matrix Vn for voxel n are expressed by equations (5) and (6) below, respectively.
  μn = (1/Nn) Σᵢ Xn(i)   …(5)

  Vn = (1/Nn) Σᵢ (Xn(i) − μn)(Xn(i) − μn)ᵀ   …(6)

 Scan matching by NDT for a vehicle estimates the estimation parameter P = [tx, ty, tz, tψ]ᵀ, whose elements are the amount of movement within the road plane (here taken as the xy coordinates) and the orientation of the vehicle. Here, tx denotes the amount of movement in the x direction, ty the amount of movement in the y direction, tz the amount of movement in the z direction, and tψ the yaw angle. The pitch angle and roll angle, although caused by road gradients and vibration, are small enough to be ignored.
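The per-voxel sample mean and covariance of equations (5) and (6) can be sketched with NumPy as follows; the sample points are made-up values:

```python
import numpy as np

def voxel_stats(points):
    """Mean vector mu_n and covariance matrix V_n of the N_n points
    X_n(i) in one voxel, as in equations (5) and (6)."""
    pts = np.asarray(points, dtype=float)  # shape (N_n, 3)
    mu = pts.mean(axis=0)                  # equation (5)
    diff = pts - mu
    V = diff.T @ diff / len(pts)           # equation (6)
    return mu, V

mu, V = voxel_stats([[0.0, 0.0, 0.0],
                     [2.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [1.0, -1.0, 0.0]])
```

These per-voxel statistics are what the voxel data IB stores in place of the raw measured point cloud.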
 Further, the point cloud data obtained by the lidar 2 is associated with the voxels to be matched. Letting the coordinates of an arbitrary point associated with voxel n be XL(i) = [xn(i), yn(i), zn(i)]ᵀ, the average value L′n of XL(i) in voxel n is expressed by equation (7) below.
  L′n = (1/Nn) Σᵢ XL(i)   …(7)

 Then, when the average value L′n is coordinate-transformed using the estimation parameter P described above, the transformed coordinates Ln are expressed by equation (8) below.
  Ln = R(tψ) L′n + [tx, ty, tz]ᵀ   …(8)

 where R(tψ) is the rotation matrix about the z-axis by the yaw angle tψ. In this embodiment, the in-vehicle device 1 then uses the coordinate-transformed point cloud together with the mean vector μn and the covariance matrix Vn included in the voxel data to calculate the evaluation function value En of voxel n given by equation (9) below, and the overall evaluation function value E(k) (also called the "total evaluation function value") over all the voxels subject to matching, given by equation (10) below.
  En = exp{−(1/2)(Ln − μn)ᵀ Vn⁻¹ (Ln − μn)}   …(9)

  E(k) = Σn En   …(10)
 Thereafter, the in-vehicle device 1 calculates the estimation parameter P that maximizes the total evaluation function value E(k) by an arbitrary root-finding algorithm such as Newton's method. Then, by applying the estimation parameter P to the vehicle position X⁻(k) predicted by the position prediction unit 21 shown in FIG. 6, the in-vehicle device 1 estimates a highly accurate vehicle position X̂(k) using equation (11) below.
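The per-voxel score of equation (9) and its sum over matched voxels in equation (10) can be sketched as follows. The Gaussian-exponential form is the usual NDT score and is assumed here from the context; the sample values are made up:

```python
import numpy as np

def voxel_score(L, mu, V):
    """E_n = exp(-0.5 * (L - mu)^T V^-1 (L - mu)): evaluation function
    value of one voxel for the transformed point-cloud mean L."""
    e = L - mu
    return float(np.exp(-0.5 * e @ np.linalg.inv(V) @ e))

def total_score(voxels):
    """E(k): sum of the per-voxel evaluation values over all matched voxels."""
    return sum(voxel_score(L, mu, V) for L, mu, V in voxels)

# One perfectly matched voxel scores 1.0; a mismatched voxel scores less
V = np.eye(3) * 0.1
voxels = [(np.zeros(3), np.zeros(3), V),
          (np.array([0.5, 0.0, 0.0]), np.zeros(3), V)]
E = total_score(voxels)
```

The optimization over the estimation parameter P then searches for the transform that maximizes this total score.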
  X̂(k) = X⁻(k) + P   …(11)

 Here, a larger total evaluation function value E indicates a higher degree of matching and therefore better position estimation accuracy. Accordingly, in this embodiment, the vehicle position estimation unit 17 determines the accuracy information value d(k) of point cloud-based position estimation by the following equation:
        d(k) = −E(k)
 In the above equation, as with the accuracy information value d(k) calculated in the landmark-based position estimation, the accuracy information value d(k) of the point cloud-based position estimation is defined so that it becomes smaller as the position estimation accuracy improves.
 (3) GNSS-based position estimation
 In GNSS-based position estimation, the vehicle position estimation unit 17 estimates the vehicle position based on the output of the GPS receiver 5. When executing GNSS-based position estimation, the vehicle position estimation unit 17 acquires, for example, the DOP (Dilution of Precision) obtained from the GPS receiver 5 as the accuracy information value d(k). In another example, the vehicle position estimation unit 17 acquires the standard deviations of the latitude, longitude, and altitude obtained from the GPS receiver 5 within a predetermined time as the accuracy information values d(k) for latitude, longitude, and altitude, respectively. The GPS receiver 5 may be a receiver capable of positioning not only with GPS but also with GLONASS, Galileo, the Quasi-Zenith Satellite System (QZSS), and the like.
 [Standardized accuracy information]
 Next, the method of calculating the standardized accuracy information will be described. The upload control unit 18 of the in-vehicle device 1 standardizes the accuracy information generated by the vehicle position estimation unit 17 at the transmission timing of the upload information Iu, based on the mean and standard deviation of the accuracy information values of the vehicle position estimation calculated by the vehicle position estimation unit 17 within a predetermined past period, thereby calculating the standardized accuracy information.
 Here, letting d(k) be the accuracy information calculated by the vehicle position estimation unit 17 at the transmission timing of the upload information Iu, and μ(k) and σ(k) be the mean and standard deviation, respectively, of the accuracy information values calculated by the vehicle position estimation unit 17 within the predetermined past period, the upload control unit 18 calculates the standardized accuracy information value S(k) by equation (12) below.

  S(k) = {d(k) − μ(k)} / σ(k)   …(12)

 As shown in equation (12), the standardized accuracy information value S(k) is negative when the accuracy information d(k) is smaller than the mean μ(k), and positive when it is larger than the mean μ(k). Also, the standardized accuracy information value S(k) approaches 0 as the standard deviation σ(k) increases, and becomes larger in magnitude as the standard deviation σ(k) decreases. Therefore, the standardized accuracy information value S(k) is a representation standardized using the distribution of the accuracy information values d(k). The upload control unit 18 then calculates the standardized accuracy information value S(k) for each of latitude, longitude, and altitude. When the vehicle position estimation method is point cloud-based position estimation, for example, the upload control unit 18 sets the same standardized accuracy information value S(k) for latitude, longitude, and altitude, using the accuracy information value based on the total evaluation function value E (d(k) = −E(k)).
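Equation (12) is an ordinary z-score computed over the recent history of d(k). A sketch follows; the window contents are illustrative values:

```python
import statistics

def standardize(d_k, history):
    """S(k) = (d(k) - mu(k)) / sigma(k), where mu(k) and sigma(k) are the
    mean and standard deviation of the accuracy values in the recent window."""
    mu = statistics.mean(history)
    sigma = statistics.pstdev(history)  # population standard deviation
    return (d_k - mu) / sigma

history = [1.0, 2.0, 3.0, 4.0, 5.0]  # d(k) values from the past window
print(standardize(2.0, history))     # below the mean -> negative
```

A value below the window mean (better-than-usual accuracy) yields a negative S(k), matching the sign convention described above.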
 FIG. 8A shows the transition of the accuracy information values dA(k) and dB(k) obtained when in-vehicle devices 1 mounted on different vehicles perform vehicle position estimation at the same point by different vehicle position estimation methods. Here, the accuracy information value dA(k) shown by graph G1 has a distribution with mean μA(k) and standard deviation σA(k), and the accuracy information value dB(k) shown by graph G2 has a distribution with mean μB(k) and standard deviation σB(k). In this way, the accuracy information differs in mean and standard deviation depending on the vehicle position estimation method executed, the accuracy of the sensors used, and so on.
 FIG. 8B shows the transition of the standardized accuracy information value SA(k) obtained by standardizing the accuracy information value dA(k) based on equation (12), and the transition of the standardized accuracy information value SB(k) obtained by standardizing the accuracy information value dB(k) based on equation (12). Here, the standardized accuracy information value SA(k) shown by graph G3 and the standardized accuracy information value SB(k) shown by graph G4 both have distributions with mean 0 and standard deviation 1. By calculating the standardized accuracy information based on equation (12) in this way, accuracy information with different distributions can be handled on the same axis.
 Next, the method of calculating the standardized accuracy information when the vehicle position estimation unit 17 executes a plurality of vehicle position estimation methods will be described. Here, a case is described in which the vehicle position estimation unit 17 estimates the vehicle position by weighting, at a ratio of α:β (for example, 7:3), vehicle position estimation method A whose accuracy information value is dA(k) (see graph G1 in FIG. 8A) and vehicle position estimation method B whose accuracy information value is dB(k) (see graph G2 in FIG. 8A).
 In this case, the upload control unit 18 first calculates the standardized accuracy information SA based on the accuracy information of vehicle position estimation method A (see graph G3 in FIG. 8B) and the standardized accuracy information SB based on the accuracy information of vehicle position estimation method B (see graph G4 in FIG. 8B). The upload control unit 18 then calculates the standardized accuracy information SAB by weighting the calculated standardized accuracy information SA and SB at the ratio α:β. FIG. 9A is a graph showing the transition of the standardized accuracy information SAB calculated by weighting the standardized accuracy information SA and SB shown in FIG. 8B with α:β set to 7:3. As shown in FIG. 9A, in this case the upload control unit 18 calculates the standardized accuracy information SAB by adding the standardized accuracy information SA multiplied by 0.7 (= 7/(7+3)) and the standardized accuracy information SB multiplied by 0.3 (= 3/(7+3)).
 Then, using the mean μAB and standard deviation σAB of the standardized accuracy information SAB over the predetermined past period, the upload control unit 18 standardizes the standardized accuracy information SAB so that it has a distribution with mean 0 and standard deviation 1. FIG. 9B shows the transition of the value S(k) obtained by standardizing the standardized accuracy information SAB. In this case, as in equation (12), the upload control unit 18 calculates, as the standardized accuracy information value S(k) to be included in the upload information Iu, the value obtained by subtracting the mean μAB from the standardized accuracy information SAB and dividing the result by the standard deviation σAB.
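The two-stage procedure described above (combine the two standardized series at the ratio α:β, then re-standardize the result to mean 0 and standard deviation 1) might be sketched as follows; the series values and window are made up:

```python
import statistics

def combine_and_restandardize(S_A, S_B, alpha, beta):
    """Weight two standardized accuracy series at alpha:beta and
    re-standardize the combined series to mean 0, standard deviation 1."""
    w_a = alpha / (alpha + beta)
    w_b = beta / (alpha + beta)
    S_AB = [w_a * a + w_b * b for a, b in zip(S_A, S_B)]
    mu = statistics.mean(S_AB)
    sigma = statistics.pstdev(S_AB)
    return [(s - mu) / sigma for s in S_AB]

S = combine_and_restandardize([1.0, -1.0, 0.5, -0.5],
                              [0.5, -0.5, 1.0, -1.0], alpha=7, beta=3)
```

In the embodiment, the last element of the re-standardized series would be the value S(k) placed in the upload information Iu at the transmission timing.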
 In this way, even when the vehicle position estimation unit 17 executes a plurality of vehicle position estimation methods, the upload control unit 18 can suitably calculate standardized accuracy information with a standardized mean and standard deviation and include it in the upload information Iu.
 [Upload information]
 Next, the upload information Iu transmitted by the in-vehicle device 1 will be described.
 FIG. 10 is a diagram showing an outline of the data structure of the upload information Iu transmitted by the in-vehicle device 1. As shown in FIG. 10, the upload information Iu includes header information, travel route information, event information, and media information.
 The header information includes the items "version", "sender", and "vehicle metadata". The in-vehicle device 1 specifies, in "version", the version of the data structure of the upload information Iu being used, and in "sender", the name of the company transmitting the upload information Iu (the OEM name of the vehicle or the system vendor name). In "vehicle metadata", the in-vehicle device 1 specifies attribute information of the vehicle (for example, vehicle type, vehicle ID, vehicle width, and vehicle height). The travel route information includes the item "position estimation". In this "position estimation", the in-vehicle device 1 specifies, in addition to time stamp information indicating the position estimation time, the latitude, longitude, and altitude indicating the estimated vehicle position, and information on their estimation accuracy.
 The event information includes the item "object recognition event". When the in-vehicle device 1 detects an object recognition event, it specifies the information resulting from that detection in "object recognition event". The "object recognition event" includes the elements "time stamp", "object ID", "offset position", "object type", "object size", "object size accuracy", and "media ID". The media information is a data type used when transmitting raw data, i.e., the output data (detection information) of an external sensor such as the lidar 2.
 In this embodiment, the in-vehicle device 1 transmits to the server device 6 upload information Iu including at least the position information of the detected object and the standardized accuracy information. In this case, the in-vehicle device 1 specifies, for example, the standardized accuracy information in the item "position estimation", and the position information and so on of the detected object in the "object recognition event" of the event information. Specifically, as the position information of the object specified in the "object recognition event", the in-vehicle device 1 specifies the position relative to the vehicle position in the sub-item "offset position" of the "object recognition event". The absolute position of the vehicle is specified in the item "position estimation". Instead of including the relative position of the object in the upload information Iu, the in-vehicle device 1 may include in the upload information Iu the absolute position of the object generated from the estimated vehicle position and the measured relative position of the object. The upload information Iu is an example of "object measurement information", and the information on the absolute position of the vehicle specified in the item "position estimation" and the like is an example of "estimated position information".
 [Map update processing]
 Next, the map update processing executed by the server device 6 will be described.
 Based on the upload information Iu received from each in-vehicle device 1, the server device 6 calculates the position information of the objects detected by each in-vehicle device 1 by statistical processing. At this time, the server device 6 calculates the object position information to be registered in the distribution map DB 20 by weighting the position information of the corresponding object with a weighting value w(k) calculated based on the standardized accuracy information S(k).
 First, the method of determining the weighting value w(k) will be described. The server device 6 sets the weighting value w(k) to be larger as the standardized accuracy information S(k) is smaller, since a smaller value indicates better position estimation accuracy at that time. Conversely, the server device 6 sets the weighting value w(k) to be smaller as the standardized accuracy information S(k) is larger, since a larger value indicates lower position estimation accuracy at that time.
 For example, the server device 6 can calculate the weighting value w(k) by referring to any of equations (13) to (15) below, so that the weighting value w(k) approaches 0 as the standardized accuracy information S(k) becomes larger, and approaches 2 as the standardized accuracy information S(k) becomes larger in the negative direction.
  [Equations (13)–(15): three alternative monotonically decreasing functions, each mapping the standardized accuracy information S(k) to a weighting value w(k) in the range from 0 to 2]
 FIG. 11A shows the correspondence between the standardized accuracy information S(k) and the weighting value w(k) based on equations (13) to (15). Here, graph G5 shows the correspondence based on equation (13), graph G6 shows the correspondence based on equation (14), and graph G7 shows the correspondence based on equation (15). As shown in FIG. 11A, whichever of equations (13) to (15) is used to determine the weighting value w(k) from the standardized accuracy information S(k), the weighting value w(k) approaches 2 as the standardized accuracy information S(k) becomes larger in the negative direction, and approaches 0 as the standardized accuracy information S(k) becomes larger.
 Further, by using any of equations (16) to (18) below, obtained by introducing a coefficient c into equations (13) to (15), the server device 6 can suitably adjust the conversion rate from the standardized accuracy information S(k) to the weighting value w(k).
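The exact forms of equations (13)–(18) are not reproduced in this text. As one illustrative mapping with the stated properties (w(k) → 2 as S(k) → −∞, w(k) = 1 at S(k) = 0, w(k) → 0 as S(k) → +∞, with c controlling the conversion rate), a logistic function can be used; this specific function is an assumption for illustration, not the patent's equations:

```python
import math

def weight(S, c=1.0):
    """Illustrative (assumed) mapping from standardized accuracy S(k) to a
    weighting value w(k) in (0, 2): decreasing, w(0) = 1, steepness set by c."""
    return 2.0 / (1.0 + math.exp(c * S))

print(weight(0.0))   # 1.0 at the mean accuracy
print(weight(5.0))   # close to 0 for poor (large positive) S(k)
print(weight(-5.0))  # close to 2 for good (large negative) S(k)
```

A smaller c flattens the curve, so a given change in S(k) moves w(k) less, which is the kind of adjustment the coefficient c provides.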
  [Equations (16)–(18): equations (13)–(15) with a coefficient c introduced to adjust the conversion rate from S(k) to w(k)]
 FIG. 11B shows the correspondence between the standardized accuracy information S(k) and the weighting value w(k) based on equations (16) to (18) when the coefficient c is set to 0.5. Here, graph G8 shows the correspondence based on equation (16), graph G9 shows the correspondence based on equation (17), and graph G10 shows the correspondence based on equation (18). As shown in FIG. 11B, when the weighting value w(k) is determined from the standardized accuracy information S(k) by any of equations (16) to (18), the conversion rate from the standardized accuracy information S(k) to the weighting value w(k) differs from the case where it is determined by any of equations (13) to (15). Thus, by adjusting the coefficient c, the conversion rate from the standardized accuracy information S(k) to the weighting value w(k) can be suitably adjusted.
The upload control unit 18 then calculates a weighting value w(k) based on the standardized accuracy information for each of the latitude, longitude, and altitude, or for each of the x, y, and z coordinate values relative to a given reference point. When an accuracy value d(k) is not obtained per direction (i.e., only a single accuracy value d(k) is calculated), as in point-cloud-based position estimation, the upload control unit 18 may treat the accuracy values d(k) for latitude, longitude, and altitude as identical and calculate the weighting value w(k) for each of latitude, longitude, and altitude accordingly.
Next, a method of calculating the object position information to be reflected in the distribution map DB 20 will be described. Let the absolute position of the object indicated by each piece of upload information Iu (also called the "object measured position") be [p_x(k), p_y(k), p_z(k)]^T, and let the weighting values calculated from the standardized accuracy information included in that upload information Iu be [w_x(k), w_y(k), w_z(k)]^T. When the server device 6 has received upload information Iu from a plurality of vehicles at a plurality of times, m pieces of information in total, it calculates the estimated position of the object to be reflected in the distribution map DB 20 (also called the "object estimated position") [p̂_x, p̂_y, p̂_z]^T by the following equations (19) to (21).
[Math. 19]
p̂_x = Σ[k=1..m] w_x(k)·p_x(k) / Σ[k=1..m] w_x(k)   …(19)
[Math. 20]
p̂_y = Σ[k=1..m] w_y(k)·p_y(k) / Σ[k=1..m] w_y(k)   …(20)
[Math. 21]
p̂_z = Σ[k=1..m] w_z(k)·p_z(k) / Σ[k=1..m] w_z(k)   …(21)
In this way, the server device 6 can suitably calculate, by the weighted averaging process, the object estimated position to be reflected in the distribution map DB 20.
In addition to the object estimated position, the server device 6 further calculates the number of object measured positions used to calculate the estimate (also called the "sample count") and an index of the dispersion of those measured positions (also called the "dispersion index value"). The dispersion index value is, for example, the variance or the standard deviation. The server device 6 then registers in the distribution map DB 20 the object estimated position with the sample count and dispersion index value associated per object. In general, the larger the sample count, the more reliable the calculated object estimated position; likewise, the smaller the dispersion indicated by the dispersion index value, the more reliable the estimate. By registering the sample count and dispersion index value together with the estimated object position information, the server device 6 can thus suitably add to the distribution map DB 20 information that serves as a reliability index for the registered object position information.
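A minimal sketch of the weighted averaging of equations (19) to (21), together with the sample count and dispersion index described above, for a single coordinate axis; the function name and plain-list interface are illustrative assumptions, not from the publication.

```python
def estimate_object_position(positions, weights):
    """Weighted average of object measured positions along one axis
    (cf. equations (19)-(21)), returned together with the sample count
    and a dispersion index (here the standard deviation)."""
    m = len(positions)
    total_weight = sum(weights)
    estimate = sum(w * p for w, p in zip(weights, positions)) / total_weight
    mean = sum(positions) / m
    dispersion = (sum((p - mean) ** 2 for p in positions) / m) ** 0.5
    return estimate, m, dispersion
```

Applying this once per axis to the collected p_x(k), p_y(k), and p_z(k) values with the corresponding weights yields [p̂_x, p̂_y, p̂_z]^T along with the reliability indices registered in the distribution map DB 20.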
[Processing Flow]
FIG. 12 is an example of a flowchart showing an overview of the processing related to the transmission and reception of the upload information Iu and the download information Id.
First, the in-vehicle device 1 estimates its own vehicle position, calculates accuracy information, and stores it in the storage unit 12 (step S101). Next, the in-vehicle device 1 determines, based on the output of an external sensor such as the lidar 2, whether a predetermined object has been detected (step S102), i.e., whether it has detected an object whose position information should be reported by upload information Iu. When a predetermined object has been detected (step S102; Yes), the in-vehicle device 1 calculates the standardized accuracy information based on the accuracy information generated at the current time and on the mean and standard deviation of the accuracy values stored in step S101 over the preceding predetermined period (step S103). The in-vehicle device 1 then transmits to the server device 6 upload information Iu that includes information on the detected object, such as its position information and identification information, together with the standardized accuracy information calculated in step S103 (step S104). When no predetermined object has been detected (step S102; No), the in-vehicle device 1 returns to step S101.
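Step S103 can be sketched as follows; the function name and window handling are assumptions, but the standardization itself, subtracting the mean and dividing by the standard deviation of the recently stored accuracy values, follows the description above.

```python
from statistics import mean, stdev

def standardize_accuracy(recent_values, current_value):
    """Standardize the current accuracy value d(k) against the accuracy
    values stored over the preceding predetermined period (step S103):
    S(k) = (d(k) - mean) / standard deviation.

    recent_values: accuracy values d(k) stored in the storage unit
    (step S101) within the recent window, at least two values."""
    mu = mean(recent_values)
    sigma = stdev(recent_values)
    # Guard against a degenerate window with no spread.
    return (current_value - mu) / sigma if sigma > 0.0 else 0.0
```

Because S(k) expresses the current accuracy relative to each vehicle's own recent behaviour, values from vehicles using different position estimation methods become comparable on the server side.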
Next, the server device 6, having received the upload information Iu from the in-vehicle device 1, stores it in the upload information DB 27 (step S201). In this case, as will be described later with reference to FIG. 13, the server device 6 receives, for the same object, upload information Iu from the in-vehicle devices 1 of a plurality of vehicles at a plurality of times.
Next, the server device 6 determines whether it is time to update the distribution map DB 20 (step S202). If it is not (step S202; No), the server device 6 continues to receive upload information Iu from the in-vehicle devices 1 and executes the processing of step S201.
On the other hand, when the server device 6 determines that it is time to update the distribution map DB 20 (step S202; Yes), it refers to the upload information DB 27 and, for each object detected by an in-vehicle device 1, performs weighted averaging based on the standardized accuracy information to calculate the object estimated position [p̂_x, p̂_y, p̂_z]^T given by equations (19) to (21) (step S203). In this case, the server device 6 determines the weighting value w(k) from the standardized accuracy information S(k) with reference to, for example, any of equations (13) to (18). As a result, the higher the position estimation accuracy of the in-vehicle device 1 that detected the object, the higher the weighting value assigned to the object measured position reported by that device, and the more accurately the object estimated position can be calculated.
The server device 6 also calculates, for each object to be updated, the sample count of object measured positions used to calculate the object estimated position and the dispersion index value indicating the dispersion of those measured positions, and registers them in the distribution map DB 20 together with the object estimated position (step S204). The server device 6 then transmits to each in-vehicle device 1 download information Id that includes, as map update information, the combination of object estimated position, sample count, and dispersion index value for each object (step S205).
When the in-vehicle device 1 receives the download information Id (step S105; Yes), it updates the map DB 10 using that information (step S106). For example, the in-vehicle device 1 updates the object information IO based on the download information Id, whereby the sample count and dispersion index value become associated with the position information of each object included in the object information IO. Thereafter, the in-vehicle device 1 may, for example, refer to the sample count and dispersion index value included in the object information IO to select the landmarks used for landmark-based position estimation. In that case, the in-vehicle device 1 judges, for example, that an object whose sample count is at least a predetermined value and whose dispersion index value indicates a dispersion no greater than a predetermined degree has highly reliable registered position information, and selects it as a landmark. On the other hand, when the in-vehicle device 1 has not received download information Id from the server device 6 (step S105; No), it returns to step S101.
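The landmark selection rule just described can be sketched as below; the dictionary fields and threshold values are illustrative assumptions, since the publication specifies only that the sample count must be at least a predetermined value and the dispersion no greater than a predetermined degree.

```python
def select_landmarks(object_info, min_samples=10, max_dispersion=0.3):
    """Keep only objects whose registered position information is judged
    reliable: sample count >= min_samples and dispersion index
    <= max_dispersion (field names are hypothetical)."""
    return [obj["id"] for obj in object_info
            if obj["samples"] >= min_samples
            and obj["dispersion"] <= max_dispersion]
```

Objects failing either test are simply not used as landmarks; they remain in the map DB 10 and may qualify later as more upload information accumulates.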
Note that the flowchart of FIG. 12 is executed by the in-vehicle devices 1 of a plurality of vehicles whose own-vehicle position estimation accuracies differ, and upload information Iu for the same object is transmitted to the server device 6 from each of those in-vehicle devices 1 at various times. FIGS. 13A to 13C show the positions and own-vehicle position estimation accuracies of vehicles A to D, which detected the obstacle 30 with the lidar 2 at different times. The point 31a indicates the estimated position coordinates of a vehicle, and the ellipse 32a indicates that vehicle's own-vehicle position estimation accuracy. The ellipse 32a is drawn smaller the better the position estimation accuracy, and larger the worse it is. The length of each diameter of the ellipse 32a indicates the position estimation accuracy in the direction in which that diameter extends; the better the accuracy in that direction, the shorter the diameter.
The in-vehicle device 1 of vehicle A detects the obstacle 30 with the lidar 2 at the positions shown in FIG. 13A, calculates the coordinates of the obstacle 30 based on the estimated position coordinates of its own vehicle, and transmits to the server device 6 upload information Iu indicating the detection result of the obstacle 30 at each position. The in-vehicle device 1 of vehicle B detects the obstacle 30 at the position shown in FIG. 13B, at a time different from that of FIG. 13A, and transmits upload information Iu indicating the detection result to the server device 6. The same applies to vehicles C and D shown in FIG. 13C. In this way, the plurality of vehicles A to D each transmit upload information Iu indicating their detection results for the same object (here, the obstacle 30) to the server device 6. As shown in FIGS. 13A to 13C, vehicles A to D differ in own-vehicle position estimation accuracy (see the ellipses 32a).
Thus, in this embodiment, upload information Iu for the same object is transmitted to the server device 6 at a plurality of times from the in-vehicle devices 1 of a plurality of vehicles whose own-vehicle position estimation accuracies differ. In step S203, the server device 6 calculates the object estimated position for each object to be updated by performing weighted averaging based on the standardized accuracy information included in the upload information Iu received from each of those in-vehicle devices 1. Regardless of the position estimation method used, this assigns a high weighting value to object measured positions based on upload information Iu from in-vehicle devices 1 of vehicles with high own-vehicle position estimation accuracy, and a low weighting value to those from vehicles with low own-vehicle position estimation accuracy. The server device 6 can thereby accurately determine the object estimated positions to be registered in the distribution map DB 20.
[Modifications]
Modifications suitable for the embodiment will be described below. The following modifications may be applied to the embodiment in combination.
(Modification 1)
The server device 6 may further set a weighting value for each in-vehicle device 1 that is a transmission source of upload information Iu, i.e., for each vehicle on which an in-vehicle device 1 is mounted (also called a "vehicle weighting value"), and calculate the object estimated position [p̂_x, p̂_y, p̂_z]^T based on the vehicle weighting values. Each vehicle weighting value is set, by the method described later, to a value that reflects the accuracy and performance of the sensors used to detect objects.
Here, letting the vehicle weighting value be W_v, the server device 6 calculates the object estimated position [p̂_x, p̂_y, p̂_z]^T based on the following equations (22) to (24).
[Math. 22]
p̂_x = Σ[k=1..m] W_v(k)·w_x(k)·p_x(k) / Σ[k=1..m] W_v(k)·w_x(k)   …(22)
[Math. 23]
p̂_y = Σ[k=1..m] W_v(k)·w_y(k)·p_y(k) / Σ[k=1..m] W_v(k)·w_y(k)   …(23)
[Math. 24]
p̂_z = Σ[k=1..m] W_v(k)·w_z(k)·p_z(k) / Σ[k=1..m] W_v(k)·w_z(k)   …(24)
In this case, the server device 6 stores information associating a vehicle weighting value with each vehicle ID (also called "vehicle weighting information IV"), and calculates the object estimated position based on equations (22) to (24) with reference to the vehicle weighting information IV.
Preferably, the server device 6 also updates the vehicle weighting values recorded in the vehicle weighting information IV during the map update processing. For example, the vehicle weighting value takes one of eleven levels from 0 to 1 in steps of 0.1. When the deviation between the calculated object estimated position and the object measured position indicated by a piece of upload information Iu (i.e., the distance between the positions they indicate) is at least a predetermined threshold, the server device 6 lowers the weighting value of the vehicle that transmitted that upload information Iu by 0.1 from the value recorded in the vehicle weighting information IV. Conversely, when the deviation is less than the predetermined threshold, the server device 6 raises that vehicle's weighting value by 0.1 from the recorded value. Consequently, for a vehicle whose object measured positions frequently deviate greatly from the calculated object estimated positions, the vehicle weighting value approaches 0 and the vehicle contributes little to the calculation of object estimated positions. For a vehicle whose object measured positions frequently deviate only slightly, the vehicle weighting value approaches 1 and the vehicle contributes greatly to that calculation. When the server device 6 receives upload information Iu from a vehicle for which no vehicle weighting value is recorded in the vehicle weighting information IV, it sets the vehicle weighting value for that vehicle to a predetermined initial value (for example, 0.5) and updates that initial value based on the deviation between the calculated object estimated position and the object measured position indicated by that upload information Iu.
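The update rule for the vehicle weighting values can be sketched as follows; the function interface is an assumption, but the 0.1 step, the eleven levels from 0 to 1, and the threshold test follow the description above.

```python
def update_vehicle_weight(current_weight, deviation, threshold, step=0.1):
    """Raise the per-vehicle weighting value by one step when a reported
    object measured position agrees with the calculated object estimated
    position (deviation below the threshold), lower it otherwise, and
    clamp it to the eleven levels 0.0, 0.1, ..., 1.0."""
    if deviation < threshold:
        new_weight = current_weight + step
    else:
        new_weight = current_weight - step
    return round(min(1.0, max(0.0, new_weight)), 1)
```

A vehicle not yet present in the vehicle weighting information IV would start from the initial value 0.5 and drift toward 0 or 1 as its reports disagree or agree with the consensus estimates.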
By calculating the object estimated position through averaging that uses the vehicle weighting values in this way, the influence of measurements with large errors can be reduced and the estimation accuracy of object positions increased. For example, object measured positions reported by a vehicle whose external sensor, such as the lidar 2, is faulty and systematically biased, or by a vehicle equipped with an external sensor of low measurement accuracy, receive a small weight. According to this modification, therefore, even when upload information Iu is received from a vehicle with a faulty external sensor or an external sensor of low measurement accuracy, the calculation of object estimated positions is little affected by it. Conversely, once maintenance or the like eliminates the bias in such a vehicle's object measured positions, its vehicle weighting value gradually increases and its reports come to be reflected in the calculation of object estimated positions.
FIG. 14 is a flowchart showing the procedure of the map update processing of the server device 6 according to this modification.
The server device 6, having received upload information Iu from an in-vehicle device 1, stores it in the upload information DB 27 (step S211). When the server device 6 determines that it is time to update the distribution map DB 20 (step S212; Yes), it refers to the upload information DB 27 and the vehicle weighting information IV and, for each object detected by an in-vehicle device 1, calculates the object estimated position [p̂_x, p̂_y, p̂_z]^T from equations (22) to (24) using the vehicle weighting values and the weighting values based on the standardized accuracy information (step S213). The server device 6 then calculates, for each object, the sample count indicating the number of object measured positions used to calculate the object estimated position and the dispersion index value indicating the dispersion of those measured positions, and registers them in the distribution map DB 20 in association with the object estimated position (step S214).
Next, the server device 6 updates the vehicle weighting values recorded in the vehicle weighting information IV based on the deviation between the object estimated position of each object calculated in step S213 and each object measured position used to calculate it (step S215). The server device 6 then transmits to each in-vehicle device 1 download information Id that includes the combination of object estimated position, sample count, and dispersion index value for each object (step S216).
Note that the server device 6 may calculate the object estimated position by weighted averaging that uses only the vehicle weighting values, without the weighting values based on the standardized accuracy information. In this case, the server device 6 calculates the object estimated position [p̂_x, p̂_y, p̂_z]^T based on the following equations (25) to (27).
[Math. 25]
p̂_x = Σ[k=1..m] W_v(k)·p_x(k) / Σ[k=1..m] W_v(k)   …(25)
[Math. 26]
p̂_y = Σ[k=1..m] W_v(k)·p_y(k) / Σ[k=1..m] W_v(k)   …(26)
[Math. 27]
p̂_z = Σ[k=1..m] W_v(k)·p_z(k) / Σ[k=1..m] W_v(k)   …(27)
Even in this case, the server device 6 can suitably estimate the object position information to be registered in the distribution map DB 20 based on the reliability of each vehicle's object detection results. In this case, the in-vehicle device 1 need not include the standardized accuracy information in the upload information Iu.
(Modification 2)
The server device 6 may execute the calculation of the standardized accuracy information instead of the in-vehicle device 1. In this case, when the in-vehicle device 1 detects an object in step S102 of FIG. 12, it includes in the upload information Iu the accuracy information at the current time and the mean and standard deviation of the accuracy values over the preceding predetermined period, and transmits the upload information Iu to the server device 6. The server device 6 then calculates the standardized accuracy information for each piece of received upload information Iu, for example after receiving it in step S201 or after determining in step S202 that it is time to update the distribution map DB 20.
(Modification 3)
The configuration of the map update system shown in FIG. 1 is an example, and the configurations of map update systems to which the present invention is applicable are not limited to it. For example, instead of the system including the in-vehicle device 1, an electronic control unit of the vehicle may execute the processing of the own-vehicle position estimation unit 17, the upload control unit 18, and the automatic driving control unit 19 of the in-vehicle device 1. In this case, the map DB 10 is stored, for example, in a storage unit within the vehicle, and the electronic control unit of the vehicle may exchange the upload information Iu and download information Id with the server device 6 via the in-vehicle device 1 or via a communication unit (not shown).
[Description of Reference Symbols]
1 In-vehicle device
2 Lidar
3 Gyro sensor
4 Vehicle speed sensor
5 GPS receiver
6 Server device
10 Map DB
20 Distribution map DB

Claims (8)

1. An information transmission device comprising:
a generation unit configured to generate, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body;
an acquisition unit configured to acquire accuracy information indicating the estimation accuracy of the estimated position information; and
a transmission unit configured to transmit the object measurement information and the accuracy information to an information processing device.
2. The information transmission device according to claim 1, wherein
the acquisition unit generates standardized accuracy information indicating the estimation accuracy standardized by the mean and standard deviation of the estimation accuracy over a predetermined period including the time when the position of the moving body was estimated, and
the transmission unit transmits the object measurement information and the standardized accuracy information to the information processing device.
3. The information transmission device according to claim 2, wherein
the position of the moving body is estimated by a weighted average of position estimation results obtained by a plurality of methods,
the acquisition unit generates standardized accuracy information for each of the position estimation results obtained by the plurality of methods and averages the standardized accuracy information with the same weights as the weighted average, and
the transmission unit transmits the object measurement information and the averaged standardized accuracy information to the information processing device.
4. A data structure of data transmitted to an information processing device that collects object measurement information on an object measured by a measurement device mounted on a moving body, the data structure comprising:
the object measurement information, generated based on estimated position information indicating a position estimated for the moving body and including object position information of the object; and
accuracy information indicating the estimation accuracy of the estimated position information,
the data being used by the information processing device to determine the position of the object on a map.
5. An information transmission device comprising:
a generation unit configured to generate object measurement information including object position information of an object measured by a measurement device mounted on a moving body;
an acquisition unit configured to acquire accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and
a transmission unit configured to transmit the object measurement information and the accuracy information to an information processing device.
6. A control method executed by an information transmission device, comprising:
a generation step of generating, based on estimated position information indicating a position estimated for a moving body, object measurement information including object position information of an object measured by a measurement device mounted on the moving body;
an acquisition step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and
a transmission step of transmitting the object measurement information and the accuracy information to an information processing device.
  7.  A program that causes a computer to execute the control method according to claim 6.
  8.  A storage medium storing the program according to claim 7.
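The claims above describe transmitting each object measurement together with accuracy information for the vehicle's own position estimate, so that the collecting server can weight reports when determining the object's position on a map. A minimal Python sketch of one such record and an inverse-variance fusion step follows; the field names, the scalar-sigma accuracy convention, and the fusion rule are illustrative assumptions, not details from the specification:

```python
from dataclasses import dataclass


@dataclass
class ObjectReport:
    """One transmitted record (field names are illustrative)."""
    obj_x: float   # object position on the map, metres
    obj_y: float
    sigma: float   # std. dev. of the vehicle's own position estimate


def fuse(reports):
    """Inverse-variance weighted mean: a report from a vehicle whose
    own position was estimated accurately (small sigma) counts for
    more than one from a poorly localised vehicle."""
    wx = wy = wsum = 0.0
    for r in reports:
        w = 1.0 / (r.sigma ** 2)
        wx += w * r.obj_x
        wy += w * r.obj_y
        wsum += w
    return wx / wsum, wy / wsum


reports = [
    ObjectReport(10.0, 5.0, 0.5),   # well-localised vehicle
    ObjectReport(12.0, 6.0, 2.0),   # poorly localised vehicle
]
x, y = fuse(reports)
print(round(x, 3), round(y, 3))   # prints 10.118 5.059
```

The fused position lands close to the well-localised vehicle's report, which is the point of sending the accuracy information alongside each measurement rather than the measurement alone.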
PCT/JP2019/012176 2018-03-28 2019-03-22 Information transmission device, data structure, control method, program, and storage medium WO2019188820A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018063279 2018-03-28
JP2018-063279 2018-03-28

Publications (1)

Publication Number Publication Date
WO2019188820A1 true WO2019188820A1 (en) 2019-10-03

Family

ID=68061873

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/012176 WO2019188820A1 (en) 2018-03-28 2019-03-22 Information transmission device, data structure, control method, program, and storage medium

Country Status (1)

Country Link
WO (1) WO2019188820A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002036990A (en) * 2000-07-26 2002-02-06 Honda Motor Co Ltd Lane deviation warning device for vehicle
JP2004198997A (en) * 2002-12-20 2004-07-15 Denso Corp Map evaluation system, collating device and map evaluation device
JP2016156973A (en) * 2015-02-25 2016-09-01 パイオニア株式会社 Map data storage device, control method, program and recording medium
JP2016180980A (en) * 2015-03-23 2016-10-13 株式会社豊田中央研究所 Information processing device, program, and map data updating system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021149567A (en) * 2020-03-19 2021-09-27 株式会社Soken Moving object sensing system
JP7328170B2 (en) 2020-03-19 2023-08-16 株式会社Soken Moving object detection system
CN114526744A (en) * 2020-11-05 2022-05-24 丰田自动车株式会社 Map updating device and map updating method
CN114526744B (en) * 2020-11-05 2024-03-22 丰田自动车株式会社 Map updating device and map updating method

Similar Documents

Publication Publication Date Title
CN106352867B (en) Method and device for determining the position of a vehicle
JP2022113746A (en) Determination device
JP7155284B2 (en) Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium
JP6806891B2 (en) Information processing equipment, control methods, programs and storage media
JP2007333385A (en) Immobile object position recording device
JP6589570B2 (en) Center processing apparatus, map generation system, and program
JP6980010B2 (en) Self-position estimator, control method, program and storage medium
EP3671278B1 (en) Road surface detection
JP2022176322A (en) Self-position estimation device, control method, program, and storage medium
JP2023164553A (en) Position estimation device, estimation device, control method, program and storage medium
JP2023054314A (en) Information processing device, control method, program, and storage medium
JP2019174191A (en) Data structure, information transmitting device, control method, program, and storage medium
WO2019188820A1 (en) Information transmission device, data structure, control method, program, and storage medium
JP2023174739A (en) Data structure, information processing device, and map data generator
WO2019188886A1 (en) Terminal device, information processing method, and storage medium
JP2023076673A (en) Information processing device, control method, program and storage medium
WO2019188874A1 (en) Data structure, information processing device, and map data generation device
JP2019174675A (en) Data structure, map data generator, control method, program, and storage medium
JP2020046411A (en) Data structure, storage device, terminal device, server device, control method, program, and storage medium
WO2019188877A1 (en) Information transmission device, data structure, control method, program, and storage medium
Verentsov et al. Bayesian framework for vehicle localization using crowdsourced data
WO2012089274A1 (en) System and method for automatic road detection
WO2019189098A1 (en) Self-position estimation device, self-position estimation method, program, and recording medium
CN115979257A (en) Hierarchical fusion positioning method, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19775550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19775550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP