WO2019188745A1 - Information processing device, control method, program, and storage medium - Google Patents


Info

Publication number
WO2019188745A1
Authority
WO
WIPO (PCT)
Prior art keywords
measurement
unit
predicted
distance
measurement unit
Prior art date
Application number
PCT/JP2019/011974
Other languages
French (fr)
Japanese (ja)
Inventor
KATO Masahiro (加藤 正浩)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation
Publication of WO2019188745A1 publication Critical patent/WO2019188745A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Definitions

  • The present invention relates to a technique for detecting a positional deviation of a measurement unit.
  • Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of a feature registered in advance on a map.
  • Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
  • Data obtained from measurement units such as radars and cameras are values in a coordinate system based on the measurement unit, and thus depend on the attitude of the measurement unit with respect to the vehicle; they need to be converted into a coordinate system based on the vehicle. Therefore, when a deviation occurs in the attitude of the measurement unit, an error may occur in the measurement values after conversion into the vehicle-based coordinate system.
  • The present invention has been made to solve the above-described problem, and has as its object to provide an information processing apparatus that can suitably detect a positional deviation, with respect to a moving body, of a measurement unit that measures the distance to an object.
  • The invention according to the claims is an information processing apparatus including: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on a moving body, a measurement distance measured by the measurement unit to an object included in the measurement range of that measurement unit; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit configured to detect, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measured distance acquired for each measurement unit and the predicted distance.
  • The invention described in the claims is also a control method executed by an information processing apparatus, including a step of acquiring, for each of at least three measurement units provided on a moving body, a measurement distance measured by the measurement unit to an object included in the measurement range of that measurement unit.
  • The invention described in the claims is also a program executed by a computer, the program causing the computer to function as: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on a moving body, a measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the measured distance acquired for each measurement unit and the predicted distance.
  • The information processing apparatus acquires a measurement distance measured by each of at least three measurement units provided on the moving body to an object included in its measurement range, and includes a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the plurality of difference values.
  • Here, the "positional deviation" is not limited to a deviation of the center-of-gravity position of the measurement unit with respect to the moving body, and also includes a deviation of the orientation (attitude) that does not involve a deviation of the center-of-gravity position.
  • The information processing apparatus can suitably detect a measurement unit in which a positional deviation has occurred by comparing the difference values calculated from the measurement distances of the three or more measurement units.
  • In one mode, the detection unit detects, from the at least three measurement units, a measurement unit in which a positional deviation has occurred based on the average value of the difference values of each measurement unit over a predetermined time. According to this aspect, the information processing apparatus can accurately detect a measurement unit whose difference value deviates steadily as a measurement unit in which a positional deviation has occurred.
  • In another mode, the information processing apparatus includes a predicted position acquisition unit that acquires a predicted position of the moving body for each measurement unit, and a correction unit that corrects the predicted position for each measurement unit by a value obtained by multiplying the difference value by a predetermined gain. The predicted position acquisition unit acquires, as the predicted position corresponding to one measurement unit, the predicted position corrected based on the difference value corresponding to another of the measurement units.
  • According to this mode, the information processing apparatus can perform highly accurate position estimation using the measurement distances of the three or more measurement units.
  • In another mode, the correction unit corrects the predicted position based on the difference values of measurement units other than the measurement unit detected by the detection unit. According to this aspect, it is possible to suitably suppress a decrease in position estimation accuracy caused by using the measurement result of the measurement unit in which the positional deviation has occurred.
  • In another mode, the predicted distance acquisition unit acquires the predicted distance based on the predicted position corrected based on the difference values of measurement units other than the measurement unit detected by the detection unit, and on the position information of the object measured by the detected measurement unit; the information processing apparatus further includes an estimation unit that estimates the position of the detected measurement unit based on that predicted distance and the measured distance.
  • Here, the "position" estimated by the estimation unit is not limited to the position of the center of gravity of the measurement unit, and may include the orientation (attitude) of the measurement unit.
  • According to this mode, the information processing apparatus can suitably estimate the position (including the attitude) of the measurement unit in which the positional deviation has occurred, so that the measured distance to the object by that measurement unit matches or approximates the predicted distance to the object calculated based on the predicted position of the moving body obtained with the measurement units in which no positional deviation has occurred and the position of the object on the map.
  • In another mode, the information processing apparatus includes an estimation unit configured to estimate the position of the measurement unit detected by the detection unit, based on the predicted position corrected based on the difference value of the measurement unit detected by the detection unit and the predicted position corrected based on the difference values of measurement units other than the detected measurement unit.
  • According to this mode, the information processing apparatus can suitably estimate the position (including the attitude) of the measurement unit in which the positional deviation has occurred.
  • In another mode, the estimation unit estimates the amount of the positional deviation based on the estimated position of the measurement unit and the position of the measurement unit stored in the storage unit. According to this aspect, the information processing apparatus can perform processing such as correcting the measurement results of the measurement unit in which the positional deviation has occurred, and can suitably suppress a decrease in the accuracy of processing that uses that measurement unit.
  • Preferably, the estimation unit calculates, as the deviation amount, at least one of the deviation amounts in the pitch direction, yaw direction, and roll direction of the measurement unit detected by the detection unit, and the deviation amounts of the center-of-gravity position of the detected measurement unit in three-dimensional space.
  • In another mode, a control method executed by the information processing apparatus includes: a measurement distance acquisition step of acquiring, for each of at least three measurement units provided on the moving body, a measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition step of acquiring a predicted distance from the moving body to the object predicted based on position information of the object; and a detection step of detecting, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measured distance and the predicted distance. By using this control method, the information processing apparatus can suitably detect the measurement unit in which the positional deviation has occurred.
  • In another mode, a program executed by a computer causes the computer to function as: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on the moving body, a measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the measured distance acquired for each measurement unit and the predicted distance.
  • By executing this program, the computer can suitably detect the measurement unit in which the positional deviation has occurred.
  • Preferably, the program is stored in a storage medium.
  • FIG. 1A is a schematic configuration diagram of a driving support system according to the present embodiment.
  • The driving support system shown in FIG. 1(A) is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, lidars (Lidar: Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2 (2A to 2C), a gyro sensor 3, a vehicle speed sensor 4, and a GPS receiver 5.
  • FIG. 1B is an overhead view of the vehicle showing an example of the arrangement of the lidar 2.
  • The in-vehicle device 1 is electrically connected to the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and estimates, based on their outputs, the position of the vehicle on which the in-vehicle device 1 is mounted (also referred to as the "own vehicle position"); based on this estimate, the in-vehicle device 1 performs driving support such as automatic driving.
  • The in-vehicle device 1 stores a map database (DB: DataBase) 10 containing road data and feature information, which is information on landmarks provided in the vicinity of roads.
  • The above-mentioned landmarks are, for example, features lined up periodically along the roadside, such as kilometer posts, 100-meter posts, delineators, traffic infrastructure facilities (e.g., signs, direction signs, and signals), utility poles, and street lights.
  • The feature information is information in which at least an index assigned to each feature, position information of the feature, and information on the direction of the feature are associated with one another.
  • The in-vehicle device 1 estimates the own vehicle position by collating the output of the lidar 2 and the like with this feature information.
  • The in-vehicle device 1 detects a lidar 2 in which a positional deviation (including an attitude deviation) has occurred (also referred to as a "deviation-occurring lidar") based on the own vehicle position estimation results obtained with each lidar 2, and estimates the position and attitude angle of the deviation-occurring lidar. The in-vehicle device 1 then performs processing such as correcting the output of the deviation-occurring lidar based on this estimation result.
  • The in-vehicle device 1 is an example of the "information processing device" in the present invention.
  • The lidars 2 (2A to 2C) emit pulsed laser light over predetermined angle ranges in the horizontal and vertical directions, thereby discretely measuring the distance to objects existing in the outside world and generating three-dimensional point cloud information indicating the positions of those objects.
  • Each lidar 2 includes an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the irradiated laser light, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit.
  • The scan data is generated based on the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction specified based on the light reception signal, and is supplied to the in-vehicle device 1.
  • The lidar 2 is an example of the "measurement unit" in the present invention.
  • The lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
  • FIG. 2 is a block diagram showing a functional configuration of the in-vehicle device 1.
  • The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
  • The interface 11 acquires the output data from sensors such as the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies it to the control unit 15. In addition, the interface 11 supplies signals related to the traveling control of the vehicle generated by the control unit 15 to an electronic control unit (ECU: Electronic Control Unit) of the vehicle.
  • The storage unit 12 stores the program executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processing.
  • The storage unit 12 stores the map DB 10 and lidar installation information IL.
  • The lidar installation information IL is information on the relative three-dimensional position and attitude angle of each lidar 2 measured at a certain reference time (for example, when there is no positional deviation, such as immediately after alignment adjustment of the lidars 2).
  • The attitude angle of each lidar 2 is represented by a roll angle, a pitch angle, and a yaw angle (that is, Euler angles).
  • The input unit 14 is, for example, a button, a touch panel, a remote controller, or a voice input device operated by the user, and receives inputs such as specifying a destination for route search and switching automatic driving on/off.
  • The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
  • The control unit 15 includes a CPU that executes programs, and controls the entire in-vehicle device 1.
  • The control unit 15 estimates the own vehicle position based on the output signals of the sensors supplied from the interface 11 and the map DB 10, and performs control of driving support for the vehicle, including automatic driving control, based on the estimation result of the own vehicle position.
  • The control unit 15 converts the measurement data output by the lidars 2 from the coordinate system based on the attitude angle and position of each lidar 2 recorded in the lidar installation information IL (also referred to as the "lidar coordinate system") into a coordinate system based on the vehicle (also referred to as the "vehicle coordinate system").
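The lidar-to-vehicle coordinate conversion described above can be sketched as follows. This is a minimal illustration, not the embodiment's actual implementation: the function names are invented, and the rotation order Rz·Ry·Rx (yaw, then pitch, then roll) is assumed for the Euler angles stored in the lidar installation information IL.

```python
import numpy as np

def euler_to_rotation(roll, pitch, yaw):
    """Rotation matrix from roll (about x), pitch (about y), yaw (about z)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx  # assumed composition order

def lidar_to_vehicle(p_lidar, install_pos, install_angles):
    """Convert one measured point from the lidar coordinate system into the
    vehicle coordinate system using the installation position and Euler
    angles (as stored in the lidar installation information IL)."""
    R = euler_to_rotation(*install_angles)
    return R @ np.asarray(p_lidar, dtype=float) + np.asarray(install_pos, dtype=float)
```

A lidar mounted 1.2 m up, 0.5 m forward, and rotated 90 degrees in yaw would, for example, map a point 1 m ahead of the lidar to a point beside the vehicle.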
  • Furthermore, the control unit 15 detects a lidar 2 in which a positional deviation has occurred, and estimates the position and attitude angle of that lidar 2.
  • The control unit 15 is an example of the "measurement distance acquisition unit", "predicted distance acquisition unit", "detection unit", "predicted position acquisition unit", "correction unit", and "estimation unit" in the present invention, and of the "computer" that executes the program.
  • The control unit 15 corrects the own vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle measured by the lidars 2 with respect to a landmark and the position information of the landmark extracted from the map DB 10.
  • Based on a state estimation method based on Bayesian estimation, the control unit 15 alternately executes a prediction step of predicting the own vehicle position from the output data of the gyro sensor 3 and the vehicle speed sensor 4, and a measurement update step of correcting the predicted value of the own vehicle position calculated in the preceding prediction step.
  • Various filters developed to perform Bayesian estimation can be used as the state estimation filter in these steps; examples include the extended Kalman filter, the unscented Kalman filter, and the particle filter. As described above, various methods have been proposed for position estimation based on Bayesian estimation.
  • FIG. 3 is a diagram showing the state variable vector x in two-dimensional orthogonal coordinates.
  • The vehicle position on the plane defined by the two-dimensional orthogonal xy coordinates is represented by the coordinates "(x, y)" and the direction (yaw angle) "ψ" of the vehicle.
  • The yaw angle ψ is defined as the angle formed by the traveling direction of the vehicle and the x-axis.
  • The description here estimates the vehicle position using a state variable of four variables (x, y, z, ψ), taking into account the coordinate of the z-axis perpendicular to the x-axis and y-axis in addition to the coordinates (x, y) and the yaw angle ψ described above. This is because a general road has only a gentle slope, so the pitch angle and roll angle of the vehicle can be ignored. An example of vehicle position estimation using a state variable of six variables that includes the pitch angle and roll angle of the vehicle will also be described later.
  • FIG. 4 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
  • FIG. 5 shows an example of functional blocks of the control unit 15. As shown in FIG. 4, by repeating the prediction step and the measurement update step, the calculation and update of the estimated value of the state variable vector "X" indicating the own vehicle position are executed sequentially. As shown in FIG. 5, the control unit 15 has a position prediction unit 21 that performs the prediction step and a position estimation unit 22 that performs the measurement update step.
  • The position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search/extraction block 25 and a position correction block 26.
  • In the following, the provisional estimated value (predicted value) estimated in the prediction step is denoted by appending the superscript "⁻" to the character representing the value, and the more accurate estimated value updated in the measurement update step is denoted by appending the superscript "ˆ".
  • The dead reckoning block 23 of the control unit 15 uses the outputs of the gyro sensor 3 and the vehicle speed sensor 4, together with the elapsed time from the previous time, to obtain the movement distance and azimuth change from the previous time.
  • The position prediction block 24 of the control unit 15 adds the obtained movement distance and azimuth change to the state variable vector Xˆ(k-1) at time k-1 calculated in the immediately preceding measurement update step, thereby calculating the predicted value (also referred to as the "predicted position") X⁻(k) at time k.
  • The landmark search/extraction block 25 of the control unit 15 associates the landmark position vector registered in the map DB 10 with the scan data of the lidar 2. When this association is possible, the landmark search/extraction block 25 acquires the measurement value "Z(k)" of the associated landmark by the lidar 2, and the landmark measurement value (referred to as the "measurement predicted value") "Z⁻(k)" obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the landmark position vector registered in the map DB 10.
  • The measurement value Z(k) is a vector value in the vehicle coordinate system obtained by converting the landmark distance and scan angle measured by the lidar 2 at time k into components with the vehicle traveling direction and the lateral direction as axes. The position correction block 26 of the control unit 15 then calculates the difference value between the measurement value Z(k) and the measurement predicted value Z⁻(k).
  • The position correction block 26 of the control unit 15 then multiplies the difference value between the measurement value Z(k) and the measurement predicted value Z⁻(k) by the Kalman gain "K(k)" and adds the result to the predicted position X⁻(k), thereby calculating the updated state variable vector (also referred to as the "estimated position") Xˆ(k), as shown in the following equation (1):

    Xˆ(k) = X⁻(k) + K(k){Z(k) - Z⁻(k)}   (1)
  • Similarly to the prediction step, the position correction block 26 of the control unit 15 obtains the covariance matrix Pˆ(k) (also written simply as P(k)) corresponding to the error distribution of the estimated position Xˆ(k) from the covariance matrix P⁻(k). Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using the extended Kalman filter.
  • The prediction step and the measurement update step are performed repeatedly, and the predicted position X⁻(k) and the estimated position Xˆ(k) are calculated sequentially, so that the most likely own vehicle position is calculated.
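The alternating prediction and measurement update steps can be sketched as an extended-Kalman-filter-style pair of functions. This is a simplified illustration under assumptions not stated in the text: the state is [x, y, z, ψ] with z held constant in the motion model, and the function names, noise matrices Q and R, and observation matrix H are placeholders.

```python
import numpy as np

def predict(x_est, P_est, v, omega, dt, Q):
    """Prediction step: dead reckoning with vehicle speed v and yaw rate omega.
    State x = [x, y, z, psi]; z is held constant in this simple motion model."""
    x_pred = x_est.copy()
    x_pred[0] += v * dt * np.cos(x_est[3])
    x_pred[1] += v * dt * np.sin(x_est[3])
    x_pred[3] += omega * dt
    # Jacobian F of the motion model with respect to the state
    F = np.eye(4)
    F[0, 3] = -v * dt * np.sin(x_est[3])
    F[1, 3] = v * dt * np.cos(x_est[3])
    P_pred = F @ P_est @ F.T + Q          # covariance P-(k)
    return x_pred, P_pred

def update(x_pred, P_pred, z_meas, z_pred, H, R):
    """Measurement update: correct the predicted position X-(k) by the
    Kalman-gain-weighted difference Z(k) - Z-(k), as in equation (1)."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain K(k)
    x_est = x_pred + K @ (z_meas - z_pred)
    P_est = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_est, P_est
```

Here the measurement prediction z_pred would come from the landmark position vector in the map DB 10 converted into the vehicle coordinate system using the predicted position.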
  • Next, own vehicle position estimation using a plurality of lidars 2 will be described.
  • In the present embodiment, the own vehicle position is estimated with high accuracy by using the Kalman filter update equations corresponding to equation (1) with the measurement values of the lidars 2A to 2C.
  • The landmarks detected by the lidars 2A to 2C may be the same or different.
  • Let the measurement values for the landmark at time "k" in the point cloud data of the lidars 2A to 2C be "Z1(k)", "Z2(k)", and "Z3(k)", respectively.
  • Let the corresponding measurement predicted values be "Z⁻1(k)", "Z⁻2(k)", and "Z⁻3(k)".
  • The Kalman filter update formulas for the lidars 2A to 2C are expressed by the following formulas (2-1) to (2-3):

    Xˆ(k) = X⁻(k) + K1(k){Z1(k) - Z⁻1(k)}   (2-1)
    Xˆ(k) = X⁻(k) + K2(k){Z2(k) - Z⁻2(k)}   (2-2)
    Xˆ(k) = X⁻(k) + K3(k){Z3(k) - Z⁻3(k)}   (2-3)
  • The control unit 15 substitutes the estimated position Xˆ(k) obtained from equation (2-1) based on the measurement value Z1(k) of the lidar 2A as the predicted position X⁻(k) for the lidar 2B, and calculates the estimated position Xˆ(k) from equation (2-2) based on the measurement value Z2(k) of the lidar 2B.
  • The control unit 15 then substitutes this estimated position Xˆ(k) as the predicted position X⁻(k) for the lidar 2C, and calculates the estimated position Xˆ(k) from equation (2-3) based on the measurement value Z3(k) of the lidar 2C. In this way, the estimated position Xˆ(k) can be calculated with high accuracy using the measurement values of the lidars 2A to 2C.
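The sequential update, in which the estimate obtained with one lidar is substituted as the predicted position for the next lidar, can be sketched as follows. The function names and the linear measurement model h(x) = Hx used in the test are illustrative assumptions.

```python
import numpy as np

def kf_update(x, P, z_meas, z_pred, H, R):
    """Single Kalman measurement update (cf. equation (1))."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z_meas - z_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def sequential_update(x_pred, P_pred, lidar_measurements):
    """Apply the measurement update once per lidar, feeding the estimate
    produced with one lidar in as the predicted position for the next,
    as in equations (2-1) to (2-3)."""
    x, P = x_pred, P_pred
    for z_meas, h, H, R in lidar_measurements:
        # h(x): measurement prediction Z-(k) computed from the current estimate
        x, P = kf_update(x, P, z_meas, h(x), H, R)
    return x, P
```

With each successive update the covariance shrinks, so later lidars pull the estimate less strongly than the first one, which is the expected information-fusion behavior.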
  • "xmi(j)", "ymi(j)", and "zmi(j)" denote the landmark position (the "landmark coordinates") of index number j extracted from the map DB 10.
  • The landmark coordinates are converted into the vehicle coordinate system using the predicted positions x⁻(k), y⁻(k), z⁻(k) and a rotation matrix using the vehicle yaw angle ψ.
  • In the following, the directions along the respective coordinate axes of the vehicle coordinate system are also simply referred to as the "x direction", "y direction", and "z direction".
  • This difference value ΔZi(k) is used for detecting a deviation-occurring lidar, as will be described later.
  • The control unit 15 calculates the average over a predetermined time of the difference value between the measurement predicted value Z⁻i(k) and the measurement value Zi(k) of each lidar 2, and detects a lidar 2 whose average value deviates from 0 as a deviation-occurring lidar.
  • The predetermined time is set to a time length such that the difference value ΔZ calculated based on the measurement values of a lidar 2 in which no positional deviation has occurred becomes substantially zero.
  • FIG. 6A is a graph showing the temporal changes of the coordinate values ΔxL1, ΔyL1, ΔzL1 of the difference value ΔZ1(k) of the lidar 2A over the predetermined time.
  • FIG. 6B is a graph showing the temporal changes of the coordinate values ΔxL2, ΔyL2, ΔzL2 of the difference value ΔZ2(k) of the lidar 2B over the predetermined time.
  • FIG. 6C is a graph showing the temporal changes of the coordinate values ΔxL3, ΔyL3, ΔzL3 of the difference value ΔZ3(k) of the lidar 2C over the predetermined time.
  • As shown in FIG. 6A, the magnitude of the time average of the coordinate values ΔxL1, ΔyL1, ΔzL1 of the difference value ΔZ1(k) over the predetermined time is close to zero.
  • The magnitude of the time average of the coordinate values ΔxL2, ΔyL2, ΔzL2 of the difference value ΔZ2(k) shown in FIG. 6B is likewise close to zero. On the other hand, a displacement has occurred in the lidar 2C, and the z-coordinate measurement value of the lidar 2C is affected by the displacement. Therefore, in this case, as shown in FIG. 6C, the time average of the corresponding coordinate value of the difference value ΔZ3(k) deviates from zero.
  • In the difference value ΔZ based on the measurement values of a lidar 2 in which a positional deviation has occurred, at least one of the coordinate values xL, yL, zL of the measurement value Z becomes an abnormal value due to the positional deviation, so at least one of the coordinate values ΔxL, ΔyL, ΔzL of the difference value ΔZ deviates from 0.
  • Accordingly, the control unit 15 calculates the time averages of the coordinate values ΔxL, ΔyL, ΔzL of the difference value ΔZ of each lidar 2 over the predetermined time, and when the magnitude of any of the time averages is larger than a predetermined value, regards the corresponding lidar 2 as a deviation-occurring lidar.
  • When the time-average magnitudes of the coordinate values of the difference values ΔZ are larger than the predetermined value for all the lidars 2, the control unit 15 determines that the deviation-occurring lidar cannot be specified, and may cause the information output unit 16 to output a warning or the like indicating that a positional deviation has occurred in at least one of the lidars 2.
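The detection rule described above (flag a lidar whose time-averaged difference value deviates from 0 beyond a threshold in any coordinate, and give up identification when every lidar is flagged) can be sketched as follows; the data layout and names are assumptions for illustration.

```python
import numpy as np

def detect_displaced_lidars(dz_history, threshold):
    """dz_history: mapping of lidar id -> sequence of difference values
    [dx_L, dy_L, dz_L] collected over the predetermined time window.
    A lidar whose time-averaged difference value exceeds the threshold in
    any coordinate is flagged as deviation-occurring. If every lidar is
    flagged, the culprit cannot be specified and None is returned (a
    warning would be output via the information output unit instead)."""
    flagged = [lidar_id for lidar_id, dz in dz_history.items()
               if np.any(np.abs(np.asarray(dz).mean(axis=0)) > threshold)]
    if flagged and len(flagged) == len(dz_history):
        return None
    return flagged
```

The averaging window corresponds to the "predetermined time" chosen so that a deviation-free lidar's difference values average out to roughly zero.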
  • The control unit 15 calculates the measurement predicted value of a landmark, computed based on the own vehicle position estimation result obtained with the lidars 2 other than the deviation-occurring lidar and the landmark coordinates of the landmark measured by the deviation-occurring lidar, as the expected value (also referred to as the "measurement expected value") of the measurement value of the deviation-occurring lidar.
  • The control unit 15 then applies the least squares method using the Newton method to the set of measurement expected values and actual measurement values of the deviation-occurring lidar, and estimates the attitude angle and position of the deviation-occurring lidar.
  • The measurement expected value Z⁻3(k) = [xb(k), yb(k), zb(k)]ᵀ of the lidar 2C, which is the deviation-occurring lidar, is calculated based on the own vehicle position estimation result obtained with the lidars 2A and 2B in which no positional deviation has occurred, and is expressed by the following equation (5).
  • Each estimated value xˆ, yˆ, zˆ, ψˆ indicates the own vehicle position estimated based on the measurement values of the lidars 2A and 2B in which no positional deviation has occurred, as shown in the following equation (7).
  • That is, the measurement expected value [xb(k), yb(k), zb(k)]ᵀ is the value obtained by converting the landmark coordinates [xm3(j), ym3(j), zm3(j)]ᵀ of the landmark of index number j measured by the lidar 2C, which is the deviation-occurring lidar, into the vehicle coordinate system using the own vehicle position estimation result (xˆ, yˆ, zˆ, ψˆ) obtained with the lidars 2A and 2B in which no positional deviation has occurred.
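The conversion behind the measurement expected value can be sketched as follows: the landmark coordinates from the map are expressed relative to the estimated vehicle position and rotated by the estimated yaw angle into the vehicle coordinate system. Pitch and roll of the vehicle are ignored here, matching the four-variable state, and the names are illustrative.

```python
import numpy as np

def measurement_expected_value(landmark_map, x_est, y_est, z_est, psi_est):
    """Convert landmark map coordinates [xm3, ym3, zm3] into the vehicle
    coordinate system using the vehicle position estimate (x^, y^, z^, psi^)
    obtained with the lidars without positional deviation (cf. equation (5))."""
    c, s = np.cos(psi_est), np.sin(psi_est)
    # offset from the vehicle to the landmark in the map (world) frame
    dx = np.asarray(landmark_map, dtype=float) - np.array([x_est, y_est, z_est])
    # rotate the world-frame offset by -psi into the vehicle frame
    R = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])
    return R @ dx
```

A landmark directly ahead of the vehicle should come out with a positive x component and zero y component, regardless of the vehicle's heading in the map frame.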
  • The control unit 15 applies the least-squares method using the Newton method to the expected measurement values shown in equation (5), and estimates the roll angle “L φ”, pitch angle “L θ”, yaw angle “L ψ”, position “L x” in the x direction, position “L y” in the y direction, and position “L z” in the z direction of the deviation occurrence lidar.
  • the relationship between the vehicle coordinate system and the lidar coordinate system is determined by these posture angles L ⁇ , L ⁇ , L ⁇ and positions L x , L y , L z .
  • the relationship between the vehicle coordinate system and the lidar coordinate system will be described in detail in the section “(3) Coordinate system conversion”.
  • Here, the measurement value of the landmark at time k by the deviation occurrence lidar is expressed as [x L (k), y L (k), z L (k)] T.
  • The measurement values [x L (k), y L (k), z L (k)] T are expressed by the following equation (8) using the posture angles L φ, L θ, L ψ and the positions L x, L y, L z.
  • The solution for these six variables is obtained by substituting the measurement values and expected measurement values obtained at two or more times into equation (9).
  • The accuracy of the least-squares solution improves as the number of measurement values and expected measurement values used increases; preferably, all of the obtained measurement values and expected measurement values are used.
  • Assuming that the initial values of the posture angles and positions stored in the lidar installation information IL are “L φ0”, “L θ0”, “L ψ0”, “L x0”, “L y0”, and “L z0”, the following equation (14), similar to equation (9), is established.
  • In equation (15), the 9-row, 1-column matrix of the measurement values for three times (that is, the left side) is denoted “z”, the 9-row, 6-column Jacobian matrix is denoted “C”, the 6-row, 1-column matrix indicating the difference between the initial values and the current values of the posture angles and positions is denoted “Δx”, and the 9-row, 1-column matrix of the initial measurement values x L0, y L0, z L0 obtained by equation (14) is denoted “d”. Equation (15) is then expressed by the following expression.
  • ⁇ x is expressed by the following equation (20).
  • The control unit 15 can suitably estimate the posture angles and position of the deviation occurrence lidar by performing the following first to fifth steps.
  • the control unit 15 calculates ⁇ x based on Expression (20).
  • Otherwise, the control unit 15 sets the positions L x0, L y0, L z0 and the posture angles L φ0, L θ0, L ψ0 to the values indicated by x, and performs the first step again.
  • In the fifth step, the control unit 15 regards x obtained in the fourth step as the final solution, and regards the positions L x, L y, L z and the posture angles L φ, L θ, L ψ indicated by x as the position and posture angles of the deviation occurrence lidar.
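The first to fifth steps above amount to an iterative least-squares (Gauss-Newton) estimation. The following is a minimal 2D sketch, not the patent's exact formulation: it recovers a lidar's mounting offset (L x, L y) and yaw L ψ from pairs of expected measurement values (landmark positions in the vehicle frame) and actual measurement values (the same landmarks in the deviated lidar's frame). The function names, the finite-difference Jacobian, and the 2D simplification are illustrative assumptions.

```python
import math

def predict(params, p_v):
    """Expected lidar-frame measurement of a vehicle-frame point (cf. eq. (23))."""
    lx, ly, psi = params
    dx, dy = p_v[0] - lx, p_v[1] - ly
    c, s = math.cos(psi), math.sin(psi)
    return (c * dx + s * dy, -s * dx + c * dy)

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with partial pivoting."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def estimate_pose(pts_vehicle, pts_lidar, init=(0.0, 0.0, 0.0), iters=15):
    """Gauss-Newton: repeatedly solve (J^T J) dx = J^T e and update the pose."""
    params = list(init)
    for _ in range(iters):
        jtj = [[0.0] * 3 for _ in range(3)]
        jte = [0.0] * 3
        eps = 1e-6
        for pv, pl in zip(pts_vehicle, pts_lidar):
            pred = predict(params, pv)
            err = (pred[0] - pl[0], pred[1] - pl[1])
            # finite-difference Jacobian of the predicted measurement
            jac = []
            for j in range(3):
                pp = list(params)
                pp[j] += eps
                pe = predict(pp, pv)
                jac.append(((pe[0] - pred[0]) / eps, (pe[1] - pred[1]) / eps))
            for a in range(3):
                jte[a] += jac[a][0] * err[0] + jac[a][1] * err[1]
                for b in range(3):
                    jtj[a][b] += jac[a][0] * jac[b][0] + jac[a][1] * jac[b][1]
        dx = solve3(jtj, jte)
        params = [p - d for p, d in zip(params, dx)]
    return params
```

With exact (noise-free) data the iteration converges to the true offset and yaw in a few steps; in practice the number of landmark pairs should be as large as possible, mirroring the note above that accuracy improves with more measurement/expected-value sets.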
  • FIG. 7 is an example of a flowchart showing the execution procedure of the detection process for the deviation occurrence lidar and the position and posture angle estimation process based on the first estimation method.
  • The control unit 15 repeatedly executes the process of this flowchart.
  • First, while the vehicle is traveling, the control unit 15 performs vehicle position estimation using a Kalman filter with three or more lidars (here, the lidars 2A to 2C) (step S101). In this case, the control unit 15 calculates the estimated position X^ reflecting the measurement value of each lidar 2 by sequentially executing the Kalman filter update formulas shown in equations (2-1) to (2-3).
  • Next, the control unit 15 averages the difference values ΔZ i between the predicted measurement value Z⁻ i and the measurement value Z i of each lidar 2 over a predetermined time up to the current processing reference time (step S102). In this case, the control unit 15 stores the difference values ΔZ i for the past predetermined time in a buffer such as the storage unit 12, and averages each coordinate value Δx Li, Δy Li, Δz Li of the stored difference values ΔZ i.
  • Next, the control unit 15 determines whether there is a lidar 2 whose averaged difference value ΔZ i deviates from 0 (step S103). For example, the control unit 15 determines whether the absolute value of any of the coordinate values Δx Li, Δy Li, Δz Li of the averaged difference value ΔZ i is equal to or greater than a predetermined value.
  • The above-mentioned predetermined value is set, in consideration of the length of the predetermined time used for averaging and the like, to a value larger than the averaged absolute values of Δx Li, Δy Li, Δz Li obtained from the measurement values of a lidar 2 in which no deviation has occurred.
  • When there is a lidar 2 whose averaged difference value ΔZ i deviates from 0 in step S103 (step S103; Yes), the control unit 15 regards the lidar 2 corresponding to that difference value ΔZ i as the deviation occurrence lidar, and executes the vehicle position estimation process using the lidars 2 other than the deviation occurrence lidar (step S104). On the other hand, when there is no lidar 2 whose averaged difference value ΔZ i deviates from 0 (step S103; No), the control unit 15 regards that there is no deviation occurrence lidar and returns the process to step S101.
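Steps S102 and S103 can be sketched as a sliding-window average of each lidar's residuals followed by a threshold test. This is a minimal illustration; the window length, threshold value, and class name are assumptions, not values from the embodiment.

```python
from collections import deque

class DeviationDetector:
    """Flags lidars whose time-averaged residual dZ stays away from zero."""

    def __init__(self, n_lidars, window=50, threshold=0.3):
        # one fixed-length buffer of recent (dx, dy, dz) residuals per lidar
        self.buffers = [deque(maxlen=window) for _ in range(n_lidars)]
        self.threshold = threshold

    def update(self, diffs):
        """diffs[i] is the latest (dx, dy, dz) residual of lidar i.
        Returns the indices of lidars whose averaged residual exceeds the
        threshold in any coordinate (step S103 determination)."""
        for buf, d in zip(self.buffers, diffs):
            buf.append(d)
        flagged = []
        for i, buf in enumerate(self.buffers):
            means = [sum(c) / len(buf) for c in zip(*buf)]
            if any(abs(m) >= self.threshold for m in means):
                flagged.append(i)
        return flagged
```

Averaging before thresholding is what lets a steady bias (a mounting deviation) be distinguished from zero-mean measurement noise.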
  • After executing the vehicle position estimation process using the lidars 2 other than the deviation occurrence lidar in step S104, the control unit 15 calculates the predicted measurement value Z⁻ from the vehicle position estimation result and the landmark coordinates of the landmark registered in the map DB 10, and sets it as the expected measurement value [x b (k), y b (k), z b (k)] T of the deviation occurrence lidar (step S105).
  • Here, the control unit 15 selects, as the above-described landmark, a landmark that can be measured by the deviation occurrence lidar. Then, the control unit 15 acquires a plurality of sets of measurement values and expected measurement values of the deviation occurrence lidar for that landmark.
  • The control unit 15 then performs the least-squares method using the Newton method on the plurality of sets of expected measurement values and measurement values, and estimates the posture angles and position of the deviation occurrence lidar (step S106).
  • The use of the estimated posture angles and position of the deviation occurrence lidar will be described in the section “(3) Application Example”.
  • In the second estimation method, the control unit 15 updates the position and posture angles of the deviation occurrence lidar based on the difference value between the vehicle position estimated by the lidars 2 other than the deviation occurrence lidar (also referred to as the “reference position”) and the vehicle position estimated by the deviation occurrence lidar (also referred to as the “temporary position”).
  • Here, the state variables of the vehicle position estimated by the Kalman filter are the six variables of position x, y, z, yaw angle ψ, roll angle φ, and pitch angle θ.
  • The predicted measurement value Z⁻ i (k) for the landmark at time k of each lidar 2 is expressed by the following equation (21).
  • the control unit 15 detects the deviation occurrence lidar based on the average of the difference values ⁇ Z i (k) as described in [Detection of deviation occurrence lidar].
  • Here, it is assumed that the control unit 15 detects the lidar 2C as the deviation occurrence lidar.
  • In this case, the control unit 15 uses the reference position [x⁻ r (k), y⁻ r (k), z⁻ r (k), ψ⁻ r (k), φ⁻ r (k), θ⁻ r (k)] T estimated by the lidars 2A and 2B other than the deviation occurrence lidar, and the temporary position estimated by the lidar 2C.
  • The control unit 15 adds the difference value shown in equation (22) to the position and posture angles of the deviation occurrence lidar recorded in the lidar installation information IL, thereby updating them.
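The second estimation method can be sketched as a fixed-point iteration: the lidar pose estimate is repeatedly corrected by the reference-minus-temporary difference until the two vehicle positions agree. The following is a hypothetical toy model; the function names, the convergence tolerance, and the linear relation between pose error and position error are illustrative assumptions.

```python
def refine_lidar_pose(lidar_pose, estimate_reference, estimate_temporary,
                      tol=1e-3, max_iter=100):
    """Iterate steps S204-S207: add (reference - temporary) to the pose
    estimate until the difference is close to zero.

    estimate_reference(): vehicle pose from the non-deviated lidars.
    estimate_temporary(pose): vehicle pose from the deviated lidar, which
    depends on the currently assumed lidar pose."""
    for _ in range(max_iter):
        ref = estimate_reference()
        tmp = estimate_temporary(lidar_pose)
        diff = [r - t for r, t in zip(ref, tmp)]
        if max(abs(d) for d in diff) < tol:
            break  # step S207; Yes: estimate is sufficiently accurate
        # step S206: add the difference value to the pose estimate
        lidar_pose = [p + d for p, d in zip(lidar_pose, diff)]
    return lidar_pose
```

In the toy model below (used as a check), an error in the assumed lidar pose shifts the temporary position by exactly that error, so the loop converges in one correction; with real measurements several iterations of steps S204 to S206 would be needed.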
  • FIG. 8 is an example of a flowchart showing the execution procedure of the detection process for the deviation occurrence lidar and the position and posture angle estimation process based on the second estimation method.
  • The control unit 15 repeatedly executes the process of this flowchart.
  • First, while the vehicle is traveling, the control unit 15 performs vehicle position estimation using a Kalman filter with three or more lidars (here, the lidars 2A to 2C) (step S201).
  • Next, the control unit 15 averages the difference values ΔZ i between the predicted measurement value Z⁻ i and the measurement value Z i of each lidar 2 over a predetermined time up to the current processing reference time (step S202).
  • When there is a lidar 2 whose averaged difference value ΔZ i deviates from 0 in step S203 (step S203; Yes), the control unit 15 regards the lidar 2 corresponding to that difference value ΔZ i as the deviation occurrence lidar, and sets the vehicle position estimated from the measurement values of the lidars 2 other than the deviation occurrence lidar as the reference position (step S204). Further, the control unit 15 sets the vehicle position estimated from the measurement values of the deviation occurrence lidar as the temporary position (step S205). Then, the control unit 15 updates the estimated values of the position and posture angles of the deviation occurrence lidar based on the difference value between the reference position and the temporary position (step S206).
  • When the difference value between the reference position and the temporary position is close to 0 (step S207; Yes), the control unit 15 determines that the estimated values of the position and posture angles of the deviation occurrence lidar are sufficiently accurate, and ends the process of the flowchart. On the other hand, when the difference value between the reference position and the temporary position is not close to 0 (step S207; No), the control unit 15 returns the process to step S204 and recalculates the reference position.
  • FIG. 9 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system represented by two-dimensional coordinates.
  • The vehicle coordinate system has, with the vehicle center as the origin, a coordinate axis “x v” along the traveling direction of the vehicle and a coordinate axis “y v” along the lateral direction of the vehicle.
  • The lidar coordinate system has a coordinate axis “x L” along the front direction of the lidar 2 (see arrow A2) and a coordinate axis “y L” along the side direction of the lidar 2.
  • The measurement point [x v (k), y v (k)] T at time “k” viewed from the vehicle coordinate system is converted into the coordinates [x L (k), y L (k)] T of the lidar coordinate system by the following equation (23) using the rotation matrix “C ψ”.
  • The transformation from the lidar coordinate system to the vehicle coordinate system may be performed using the inverse matrix (i.e., the transpose) of the rotation matrix. Therefore, the measurement point [x L (k), y L (k)] T at time k obtained in the lidar coordinate system can be converted into the coordinates [x v (k), y v (k)] T of the vehicle coordinate system by the following equation (24).
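The two 2D conversions of equations (23) and (24) can be written out as follows. This sketch assumes the conversion also applies the lidar's offset (L x, L y) in addition to the rotation C ψ; if equation (23) is rotation-only, the translation terms would simply be dropped.

```python
import math

def vehicle_to_lidar(p_v, L_x, L_y, L_psi):
    """Vehicle frame -> lidar frame (cf. eq. (23)): subtract the lidar's
    offset, then rotate by the transpose of the yaw rotation C_psi."""
    c, s = math.cos(L_psi), math.sin(L_psi)
    dx, dy = p_v[0] - L_x, p_v[1] - L_y
    return (c * dx + s * dy, -s * dx + c * dy)

def lidar_to_vehicle(p_l, L_x, L_y, L_psi):
    """Lidar frame -> vehicle frame (cf. eq. (24)): rotate by C_psi (the
    inverse, i.e. transpose, of the matrix above), then add the offset."""
    c, s = math.cos(L_psi), math.sin(L_psi)
    return (c * p_l[0] - s * p_l[1] + L_x,
            s * p_l[0] + c * p_l[1] + L_y)
```

Because a rotation matrix is orthonormal, the two functions are exact inverses of each other: converting a point to the lidar frame and back recovers the original vehicle-frame coordinates.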
  • FIG. 10 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system represented by three-dimensional coordinates.
  • a coordinate axis perpendicular to the coordinate axes x v and y v is “z v ”
  • a coordinate axis perpendicular to the coordinate axes x L and y L is “z L ”.
  • The roll angle of the lidar 2 with respect to the vehicle coordinate system is “L φ”, the pitch angle is “L θ”, and the yaw angle is “L ψ”; the positions of the lidar 2 on the coordinate axes x v, y v, and z v are “L x”, “L y”, and “L z”, respectively.
  • The measurement point [x v (k), y v (k), z v (k)] T viewed from the vehicle coordinate system is converted into the coordinates [x L (k), y L (k), z L (k)] T of the lidar coordinate system by the following equation (25) using the direction cosine matrix “C” composed of the rotation matrices “C φ”, “C θ”, and “C ψ” corresponding to roll, pitch, and yaw.
  • Similarly, the transformation from the lidar coordinate system to the vehicle coordinate system may be performed using the inverse matrix (i.e., the transpose) of the direction cosine matrix. Therefore, the measurement point [x L (k), y L (k), z L (k)] T at time k acquired in the lidar coordinate system can be converted into the coordinates [x v (k), y v (k), z v (k)] T of the vehicle coordinate system.
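A minimal sketch of the direction cosine matrix and its transpose-based inverse follows. The z-y-x (yaw-pitch-roll) multiplication order is an assumption for illustration; the exact composition is defined by equation (25).

```python
import math

def dcm(phi, theta, psi):
    """Direction cosine matrix C built from the roll (phi), pitch (theta),
    and yaw (psi) rotations, here in the standard z-y-x aerospace sequence
    (an assumed order; eq. (25) fixes the actual one)."""
    cph, sph = math.cos(phi), math.sin(phi)
    cth, sth = math.cos(theta), math.sin(theta)
    cps, sps = math.cos(psi), math.sin(psi)
    return [
        [cth * cps,                    cth * sps,                    -sth],
        [sph * sth * cps - cph * sps,  sph * sth * sps + cph * cps,  sph * cth],
        [cph * sth * cps + sph * sps,  cph * sth * sps - sph * cps,  cph * cth],
    ]

def transform(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(m):
    """Transpose of a 3x3 matrix; for an orthonormal C this is its inverse,
    which is why the lidar->vehicle conversion can use C^T."""
    return [[m[j][i] for j in range(3)] for i in range(3)]
```

Applying C and then C^T returns the original vector, which is the property the text relies on when inverting equation (25).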
  • The control unit 15 calculates the amounts of change of the estimated posture angles and position of the position-deviation occurrence lidar with respect to the posture angles and position before the occurrence of the deviation recorded in the lidar installation information IL, and corrects each measurement value of the point cloud data output by the position-deviation occurrence lidar based on the amounts of change.
  • For example, the control unit 15 stores a map or the like indicating the correction amount of the measurement value for each amount of change, and corrects the above-described measurement values by referring to the map or the like.
  • Alternatively, the measurement values may be corrected using a value obtained by multiplying the amount of change by a predetermined ratio as the correction amount.
  • Further, when detecting a deviation in position or posture angle whose amount of change is equal to or greater than a predetermined threshold, the control unit 15 may stop using the position-deviation occurrence lidar and output a predetermined warning through the information output unit 16.
  • In another example, the control unit 15 may convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system into the vehicle coordinate system using the calculated roll angle L φ, pitch angle L θ, yaw angle L ψ, x-direction position L x, y-direction position L y, and z-direction position L z, and execute the vehicle position estimation or automatic driving control based on the converted data. Accordingly, the measurement values of the lidar 2 can be appropriately converted into the vehicle coordinate system based on the posture angles and position of the lidar 2 after the occurrence of the deviation.
  • In still another example, when the in-vehicle device 1 includes an adjustment mechanism such as an actuator for correcting the posture angle and position of each lidar 2, the posture angle and position of the lidar 2 may be corrected based on the estimation result. In this case, the in-vehicle device 1 calculates the estimated amounts of change in posture angle and position with respect to the posture angle and position recorded in the lidar installation information IL, and performs control to drive the adjustment mechanism so as to correct the posture angle and position of the lidar 2 by those amounts of change.
  • As described above, the in-vehicle device 1 acquires the measurement value Z i obtained by each of at least three lidars 2 provided in the vehicle for a landmark included in the measurement range of each lidar 2. The in-vehicle device 1 also acquires the predicted measurement value Z⁻ i predicted based on the landmark coordinates indicated by the position information of the landmark registered in the map DB 10. Then, based on the plurality of difference values ΔZ i between the measurement value Z i and the predicted measurement value Z⁻ i acquired for each lidar 2, the in-vehicle device 1 detects at least one lidar 2 in which a displacement of the mounting position with respect to the vehicle has occurred. As a result, the in-vehicle device 1 can accurately detect a lidar 2 in which a positional deviation has occurred.
  • In a modification, the control unit 15 may individually execute the Kalman filter update formula shown in equation (1) based on the measurement values of each lidar 2. That is, in this case, the control unit 15 performs, based on equation (1), vehicle position estimation from the measurement values of the lidar 2A, vehicle position estimation from the measurement values of the lidar 2B, and vehicle position estimation from the measurement values of the lidar 2C, and compares the difference values ΔZ between the measurement value Z and the predicted measurement value Z⁻ calculated when executing each instance of equation (1). Then, when there is a difference value ΔZ that is equal to or greater than a predetermined value, the control unit 15 detects the lidar 2 corresponding to that difference value ΔZ as the deviation occurrence lidar. In this case, the landmarks to be measured by the respective lidars 2 do not have to be the same.
  • the configuration of the driving support system shown in FIG. 1 is an example, and the configuration of the driving support system to which the present invention is applicable is not limited to the configuration shown in FIG.
  • For example, instead of the vehicle having the in-vehicle device 1, the electronic control device of the vehicle may execute the processing described above. In this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control device of the vehicle is configured to be able to receive the output data of various sensors such as the lidars 2.


Abstract

An on-vehicle machine 1 acquires a measurement value Z i, obtained by each of at least three LIDARs 2 provided on a vehicle, for a landmark included in the respective measurement ranges of the LIDARs 2. The on-vehicle machine 1 acquires a predicted measurement value Z⁻ i predicted on the basis of the landmark coordinates indicated by the position information of the landmark registered in the map DB 10. The on-vehicle machine 1 detects, on the basis of a plurality of difference values ΔZ i between the measurement value Z i acquired for each of the LIDARs 2 and the predicted measurement value Z⁻ i, at least one LIDAR 2 in which misalignment with respect to its mounting position on the vehicle has occurred.

Description

Information processing apparatus, control method, program, and storage medium
 The present invention relates to a technique for detecting a positional deviation of a measurement unit.
 Conventionally, techniques for estimating a vehicle's position based on measurement data from a measurement unit such as a radar or a camera are known. For example, Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of features registered in advance on a map. Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
JP 2013-257742 A; JP 2017-72422 A
 Data obtained from a measurement unit such as a radar or a camera are values in a coordinate system based on the measurement unit and depend on the attitude of the measurement unit with respect to the vehicle, and therefore need to be converted into values in a coordinate system based on the vehicle. Accordingly, when a deviation occurs in the attitude of the measurement unit, an error may occur in the measurement values after conversion into the vehicle-based coordinate system.
 The present invention has been made to solve the above-described problems, and its main object is to provide an information processing apparatus capable of suitably detecting a positional deviation, with respect to a moving body, of a measurement unit that measures the distance to an object.
 The invention described in the claims is an information processing apparatus including: a measurement distance acquisition unit that acquires a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of each measurement unit; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of its attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit.
 The invention described in the claims is also a control method executed by an information processing apparatus, including: a measurement distance acquisition step of acquiring a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of each measurement unit; a predicted distance acquisition step of acquiring a predicted distance from the moving body to the object predicted based on position information of the object; and a detection step of detecting, from the at least three measurement units, at least one measurement unit in which a displacement of its attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit.
 The invention described in the claims is also a program executed by a computer, the program causing the computer to function as: a measurement distance acquisition unit that acquires a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of each measurement unit; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of its attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit.
FIG. 1 is a schematic configuration diagram of a driving support system.
FIG. 2 is a block diagram showing the functional configuration of the in-vehicle device.
FIG. 3 is a diagram showing the state variable vector in two-dimensional orthogonal coordinates.
FIG. 4 is a diagram showing the schematic relationship between the prediction step and the measurement update step.
FIG. 5 shows an example of the functional blocks of the control unit.
FIG. 6 is a graph showing the time variation of the difference value for each lidar over a predetermined time.
FIG. 7 is an example of a flowchart showing the execution procedure of the detection process for the deviation occurrence lidar and the position and posture estimation process based on the first estimation method.
FIG. 8 is an example of a flowchart showing the execution procedure of the detection process for the deviation occurrence lidar and the position and posture estimation process based on the second estimation method.
FIG. 9 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system represented by two-dimensional coordinates.
FIG. 10 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system represented by three-dimensional coordinates.
 According to a preferred embodiment of the present invention, the information processing apparatus includes: a measurement distance acquisition unit that acquires a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of each measurement unit; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of its attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit. Here, the “positional deviation” is not limited to a deviation of the position of the center of gravity of the measurement unit with respect to the moving body, but also includes a deviation of the orientation (posture) that does not involve a deviation of the center-of-gravity position. According to this aspect, the information processing apparatus can suitably detect a measurement unit in which a positional deviation has occurred by comparing the difference values calculated from the measurement distances of three or more measurement units.
 In one aspect of the information processing apparatus, the detection unit detects, from the at least three measurement units, a measurement unit in which a positional deviation has occurred based on the average value of the difference values for each measurement unit over a predetermined time. According to this aspect, the information processing apparatus can accurately detect a measurement unit whose difference value deviates steadily as a measurement unit in which a positional deviation has occurred.
 In another aspect of the information processing apparatus, the information processing apparatus further includes: a predicted position acquisition unit that acquires a predicted position of the moving body for each measurement unit; and a correction unit that corrects the predicted position for each measurement unit by a value obtained by multiplying the difference value by a predetermined gain, wherein the predicted position acquisition unit acquires the predicted position corrected based on the difference value corresponding to one of the measurement units as the predicted position corresponding to the measurement units other than that measurement unit. According to this aspect, the information processing apparatus can perform highly accurate position estimation using the measurement distances of each of the three measurement units.
 In another aspect of the information processing apparatus, the correction unit corrects the predicted position based on the difference values of the measurement units other than the measurement unit detected by the detection unit. According to this aspect, it is possible to suitably suppress a decrease in position estimation accuracy caused by using the measurement results of a measurement unit in which a positional deviation has occurred.
 In another aspect of the information processing apparatus, the predicted distance acquisition unit acquires the predicted distance based on the predicted position corrected based on the difference values of the measurement units other than the measurement unit detected by the detection unit and on the position information of the object measured by the detected measurement unit, and the information processing apparatus includes an estimation unit that estimates the position of the detected measurement unit based on that predicted distance and the measurement distance measured by the detected measurement unit. Here, the “position” estimated by the estimation unit is not limited to the position of the center of gravity of the measurement unit, and may include the orientation (posture) of the measurement unit. According to this aspect, the information processing apparatus can suitably estimate the position (including the posture) of the measurement unit in which the positional deviation has occurred, such that the predicted distance to the object, based on the predicted position of the moving body obtained from the measurement units in which no positional deviation has occurred and the position of the object on the map, matches or approximates the measurement distance to the object measured by the measurement unit in which the positional deviation has occurred.
 In another aspect of the information processing apparatus, the information processing apparatus includes an estimation unit that estimates the position of the measurement unit detected by the detection unit based on the predicted position corrected based on the difference value of the detected measurement unit and the predicted position corrected based on the difference values of the measurement units other than the detected measurement unit. According to this aspect as well, the information processing apparatus can suitably estimate the position (including the posture) of the measurement unit in which the positional deviation has occurred.
 In another aspect of the information processing apparatus, the estimation unit estimates the amount of the positional deviation based on the estimated position of the measurement unit and the position of the measurement unit stored in a storage unit. According to this aspect, the information processing apparatus can perform processing such as correcting the measurement results of the measurement unit in which the positional deviation has occurred, and can suitably suppress a decrease in the accuracy of processing that uses that measurement unit. In a preferred example, the estimation unit calculates, as the deviation amount, at least one of the deviation amount in the pitch direction, yaw direction, or roll direction of the measurement unit detected by the detection unit, and the deviation amount of the center-of-gravity position of the detected measurement unit in three-dimensional space.
 According to another preferred embodiment of the present invention, there is provided a control method executed by an information processing apparatus, the method including: a measurement distance acquisition step of acquiring, for each of at least three measurement units provided on a moving body, the distance measured by the measurement unit to an object within its measurement range; a predicted distance acquisition step of acquiring a predicted distance from the moving body to the object, predicted based on position information of the object; and a detection step of detecting, from the at least three measurement units, at least one measurement unit whose mounting position on the moving body has deviated, based on a plurality of difference values between the measured distance and the predicted distance acquired for each measurement unit. By using this control method, the information processing apparatus can suitably detect a measurement unit in which a positional deviation has occurred.
 According to another preferred embodiment of the present invention, there is provided a program executed by a computer, the program causing the computer to function as: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on a moving body, the distance measured by the measurement unit to an object within its measurement range; a predicted distance acquisition unit that acquires a predicted distance from the moving body to the object, predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit whose mounting position on the moving body has deviated, based on a plurality of difference values between the measured distance and the predicted distance acquired for each measurement unit. By executing this program, the computer can suitably detect a measurement unit in which a positional deviation has occurred. Preferably, the program is stored in a storage medium.
 Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
 [Schematic configuration]
 FIG. 1(A) is a schematic configuration diagram of a driving support system according to the present embodiment. The driving support system shown in FIG. 1(A) is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, lidars (Lidar: Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2 (2A to 2C), a gyro sensor 3, a vehicle speed sensor 4, and a GPS receiver 5. FIG. 1(B) is an overhead view of the vehicle showing an example arrangement of the lidars 2.
 The in-vehicle device 1 is electrically connected to the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and based on their outputs, estimates the position of the vehicle on which the in-vehicle device 1 is mounted (also called the "own vehicle position"). Based on the estimation result, the in-vehicle device 1 then performs automatic driving control and the like so that the vehicle travels along the route to the set destination. The in-vehicle device 1 stores a map database (DB: DataBase) 10 containing road data and feature information, which is information on landmark features provided near roads. Such landmark features are, for example, kilometer posts, 100 m posts, delineators, traffic infrastructure facilities (e.g., signs, direction signboards, traffic signals), utility poles, and street lamps arranged periodically along the roadside. The feature information associates at least an index assigned to each feature, position information of the feature, and information on the orientation of the feature. The in-vehicle device 1 estimates the own vehicle position by matching this feature information against the output of the lidars 2 and the like.
 The in-vehicle device 1 also detects, based on the own-position estimation results obtained with each lidar 2, a lidar 2 in which a positional deviation (including an attitude deviation) has occurred (also called a "deviated lidar"), and estimates the position and attitude angles of the deviated lidar. Based on this estimation result, the in-vehicle device 1 performs processing such as correcting each measurement value of the point cloud data output by the deviated lidar. The in-vehicle device 1 is an example of the "information processing apparatus" in the present invention.
 The lidars 2 (2A to 2C) emit pulsed laser light over predetermined horizontal and vertical angle ranges, thereby discretely measuring the distance to objects in the environment and generating three-dimensional point cloud information indicating the positions of those objects. Each lidar 2 has an emitting unit that emits laser light while changing the emission direction, a light receiving unit that receives the reflected (scattered) light of the emitted laser, and an output unit that outputs scan data based on the light reception signal produced by the light receiving unit. The scan data is generated from the emission direction corresponding to the laser light received by the light receiving unit and the distance to the object in that emission direction, identified from the light reception signal, and is supplied to the in-vehicle device 1. In the present embodiment, as shown in FIG. 1(B), the lidars 2A and 2B are provided at the front of the vehicle and the lidar 2C at the rear. The lidar 2 is an example of the "measurement unit" in the present invention. In this way, at least three lidars 2 are provided in the present embodiment. The lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
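For illustration only, the conversion of one lidar return (emission direction plus measured distance) into a 3-D point in the lidar coordinate system may be sketched as follows. The axis convention (x forward, y left, z up) and the function name are assumptions of this sketch, not part of the embodiment:

```python
import math

def scan_to_point(range_m, azimuth_rad, elevation_rad):
    """Convert one lidar return (distance, horizontal angle, vertical angle)
    into a 3-D point in the lidar coordinate system (x forward, y left, z up)."""
    horiz = range_m * math.cos(elevation_rad)  # projection onto the horizontal plane
    return (horiz * math.cos(azimuth_rad),
            horiz * math.sin(azimuth_rad),
            range_m * math.sin(elevation_rad))
```

Collecting such points over a full scan yields the point cloud supplied to the in-vehicle device 1.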
 FIG. 2 is a block diagram showing the functional configuration of the in-vehicle device 1. The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are interconnected via a bus line.
 The interface 11 acquires output data from sensors such as the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies it to the control unit 15. The interface 11 also supplies signals related to vehicle travel control generated by the control unit 15 to the vehicle's electronic control unit (ECU: Electronic Control Unit).
 The storage unit 12 stores the programs executed by the control unit 15 and the information necessary for the control unit 15 to execute predetermined processing. In the present embodiment, the storage unit 12 holds the map DB 10 and lidar installation information IL. The lidar installation information IL is information on the relative three-dimensional position and attitude angles of each lidar 2 measured at a certain reference time (for example, a time at which no positional deviation is present, such as immediately after alignment adjustment of the lidars 2). In the present embodiment, the attitude angles of a lidar 2 and the like are expressed by a roll angle, a pitch angle, and a yaw angle (i.e., Euler angles).
 The input unit 14 is a button, touch panel, remote controller, voice input device, or the like operated by the user, and accepts inputs such as specifying a destination for route search and turning automatic driving on and off. The information output unit 16 is, for example, a display or speaker that produces output under the control of the control unit 15.
 The control unit 15 includes a CPU that executes programs and the like, and controls the entire in-vehicle device 1. The control unit 15 estimates the own vehicle position based on the output signals of the sensors supplied from the interface 11 and on the map DB 10, and performs control related to driving support of the vehicle, including automatic driving control, based on the estimation result. When using the output data of a lidar 2, the control unit 15 converts the measurement data output by that lidar 2 from the coordinate system referenced to the lidar 2 (also called the "lidar coordinate system") to the coordinate system referenced to the vehicle (also called the "vehicle coordinate system"), based on the attitude angles and position of the lidar 2 recorded in the lidar installation information IL. Further, in the present embodiment, the control unit 15 detects a lidar 2 in which a positional deviation has occurred, and estimates the position and attitude angles of that lidar 2. The control unit 15 is an example of the "measurement distance acquisition unit", "predicted distance acquisition unit", "detection unit", "predicted position acquisition unit", "correction unit", and "estimation unit" in the present invention, and of the "computer" that executes the program.
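The lidar-to-vehicle conversion using the installation information IL can be sketched as follows. The Z-Y-X (yaw-pitch-roll) rotation convention and the key names are assumptions of this sketch; the embodiment only specifies that Euler angles and a 3-D offset are stored:

```python
import math

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll) (Z-Y-X convention assumed)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def lidar_to_vehicle(point, install):
    """Transform a point from the lidar frame to the vehicle frame using the
    installation info (positions L_x, L_y, L_z and angles L_phi, L_theta, L_psi)."""
    R = euler_to_matrix(install["L_phi"], install["L_theta"], install["L_psi"])
    x, y, z = point
    return tuple(R[i][0] * x + R[i][1] * y + R[i][2] * z + t
                 for i, t in enumerate((install["L_x"], install["L_y"], install["L_z"])))
```

With zero angles the transform reduces to a pure translation by the mounting offset.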
 [Outline of the own-position estimation processing]
 First, an outline of the own-position estimation processing performed by the control unit 15 will be described.
 The control unit 15 corrects the own vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle of a landmark measured by the lidars 2 and on the position information of the landmark extracted from the map DB 10. In the present embodiment, as one example, the control unit 15 alternately executes, following a state estimation method based on Bayesian estimation, a prediction step that predicts the own vehicle position from the output data of the gyro sensor 3, the vehicle speed sensor 4, and the like, and a measurement update step that corrects the predicted own vehicle position computed in the immediately preceding prediction step. Various filters developed for Bayesian estimation can be used as the state estimation filter in these steps, for example the extended Kalman filter, the unscented Kalman filter, and the particle filter. Various methods have thus been proposed for position estimation based on Bayesian estimation.
 In the following, own-position estimation using the extended Kalman filter is briefly described.
 FIG. 3 shows the state variable vector x in two-dimensional orthogonal coordinates. As shown in FIG. 3, the own vehicle position on the plane defined by the two-dimensional x-y coordinates is expressed by the coordinates "(x, y)" and the own vehicle's orientation (yaw angle) "ψ". Here, the yaw angle ψ is defined as the angle between the traveling direction of the vehicle and the x axis. In the description here, own-position estimation is performed with four state variables (x, y, z, ψ), taking into account, in addition to the coordinates (x, y) and the yaw angle ψ, the coordinate along the z axis perpendicular to the x and y axes. This is possible because ordinary roads have gentle gradients, so the pitch angle and roll angle of the vehicle can be neglected; an example of own-position estimation with six state variables that also include the vehicle's pitch angle and roll angle will be described later.
 FIG. 4 shows the schematic relationship between the prediction step and the measurement update step, and FIG. 5 shows an example of the functional blocks of the control unit 15. As shown in FIG. 4, by repeating the prediction step and the measurement update step, the estimated value of the state variable vector "X" indicating the own vehicle position is computed and updated sequentially. As shown in FIG. 5, the control unit 15 has a position prediction unit 21 that executes the prediction step and a position estimation unit 22 that executes the measurement update step. The position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search/extraction block 25 and a position correction block 26. In FIG. 4, the state variable vector at the reference time (i.e., current time) "k" to be computed is written as "X⁻(k)" or "X^(k)" (with the state variable vector X(k) = (x(k), y(k), z(k), ψ(k))^T). Here, the provisional estimate (predicted value) obtained in the prediction step is marked with "⁻" above the symbol, and the more accurate estimate updated in the measurement update step is marked with "^" above the symbol.
 In the prediction step, the dead reckoning block 23 of the control unit 15 uses the vehicle's moving speed "v" and angular velocity "ω" (collectively written as the control value "u(k) = (v(k), ω(k))^T") to obtain the movement distance and orientation change since the previous time. The position prediction block 24 of the control unit 15 adds the obtained movement distance and orientation change to the state variable vector X^(k−1) at time k−1 computed in the immediately preceding measurement update step, and thereby computes the predicted own vehicle position (also called the "predicted position") X⁻(k) at time k. At the same time, it computes the covariance matrix "P⁻(k)", corresponding to the error distribution of the predicted position X⁻(k), from the covariance matrix "P^(k−1)" at time k−1 computed in the immediately preceding measurement update step.
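The dead-reckoning part of the prediction step can be sketched with a simple planar motion model driven by v and ω. This is one possible realization under stated assumptions (a unicycle model with z held constant); the embodiment does not fix the motion model:

```python
import math

def predict(x, y, z, psi, v, omega, dt):
    """Dead-reckoning prediction: advance the pose (x, y, z, psi) over one
    time step dt using speed v and yaw rate omega (z assumed unchanged)."""
    return (x + v * dt * math.cos(psi),
            y + v * dt * math.sin(psi),
            z,
            psi + omega * dt)
```

The covariance propagation P⁻(k) from P^(k−1) would accompany this in a full filter but is omitted from the sketch.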
 In the measurement update step, the landmark search/extraction block 25 of the control unit 15 associates the position vector of a landmark registered in the map DB 10 with the scan data of the lidars 2. When this association succeeds, the landmark search/extraction block 25 acquires both the measurement value "Z(k)" of the associated landmark produced by the lidar 2, and the landmark measurement value (called the "measurement prediction value") "Z⁻(k)" obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the position vector of the landmark registered in the map DB 10. The measurement value Z(k) is a vector value in the vehicle coordinate system obtained by converting the landmark's distance and scan angle measured by the lidar 2 at time k into components along the vehicle's traveling direction and lateral direction. The position correction block 26 of the control unit 15 then computes the difference between the measurement value Z(k) and the measurement prediction value Z⁻(k).
 The position correction block 26 of the control unit 15 then multiplies the difference between the measurement value Z(k) and the measurement prediction value Z⁻(k) by the Kalman gain "K(k)" and adds the result to the predicted position X⁻(k), thereby computing the updated state variable vector (also called the "estimated position") X^(k), as shown in equation (1) below.
  X^(k) = X⁻(k) + K(k){Z(k) − Z⁻(k)}    …(1)
 In the measurement update step, as in the prediction step, the position correction block 26 of the control unit 15 also obtains the covariance matrix P^(k) (also written simply as P(k)) corresponding to the error distribution of the estimated position X^(k) from the covariance matrix P⁻(k). Parameters such as the Kalman gain K(k) can be computed in the same manner as in known self-position estimation techniques using, for example, the extended Kalman filter.
 In this way, the prediction step and the measurement update step are repeatedly performed, and the predicted position X⁻(k) and the estimated position X^(k) are computed sequentially, yielding the most probable own vehicle position.
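The correction of equation (1) can be sketched as follows, with the state and measurements as plain lists and the gain as a matrix. Function and variable names are illustrative; gain computation is outside the sketch:

```python
def measurement_update(x_pred, z_meas, z_pred, K):
    """Equation (1): X^(k) = X-(k) + K(k){Z(k) - Z-(k)}.
    x_pred: predicted state vector, z_meas/z_pred: measurement and its
    prediction, K: Kalman gain matrix (rows = state dim, cols = meas dim)."""
    innov = [zm - zp for zm, zp in zip(z_meas, z_pred)]  # innovation Z - Z-
    return [xp + sum(K[i][j] * innov[j] for j in range(len(innov)))
            for i, xp in enumerate(x_pred)]
```

With an identity gain the estimate moves by exactly the innovation, which is a convenient sanity check.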
 Next, own-position estimation using a plurality of lidars 2 is described. In the present embodiment, when no positional deviation has occurred in any lidar 2, the own vehicle position is estimated with high accuracy by applying the Kalman filter update equation corresponding to equation (1) to the measurement values of the lidars 2A to 2C. In this case, the landmarks detected by the lidars 2A to 2C may be the same or may be different.
 Here, let the measurement values for the landmark at time "k" in the point cloud data of the lidars 2A to 2C be "Z_1(k)", "Z_2(k)", and "Z_3(k)", respectively, and let the corresponding measurement prediction values be "Z⁻_1(k)", "Z⁻_2(k)", and "Z⁻_3(k)". In this case, the Kalman filter update equations for the lidars 2A to 2C are expressed by equations (2-1) to (2-3) below.
  X^(k) = X⁻(k) + K_1(k){Z_1(k) − Z⁻_1(k)}    …(2-1)
  X^(k) = X⁻(k) + K_2(k){Z_2(k) − Z⁻_2(k)}    …(2-2)
  X^(k) = X⁻(k) + K_3(k){Z_3(k) − Z⁻_3(k)}    …(2-3)
 In the present embodiment, the control unit 15 substitutes the estimated position X^(k) obtained from equation (2-1) using the measurement value Z_1(k) of the lidar 2A as the predicted position X⁻(k) for the lidar 2B, and computes an estimated position X^(k) from equation (2-2) using the measurement value Z_2(k) of the lidar 2B. The control unit 15 further substitutes this estimated position X^(k) as the predicted position X⁻(k) for the lidar 2C, and computes an estimated position X^(k) from equation (2-3) using the measurement value Z_3(k) of the lidar 2C. In this way, a highly accurate estimated position X^(k) can be computed using the measurement values of all the lidars 2A to 2C.
 The measurement prediction value Z⁻_i(k) (i = 1, 2, 3) is expressed by equation (3) below.
              [  cos ψ⁻(k)   sin ψ⁻(k)   0 ] [ x_mi(j) − x⁻(k) ]
  Z⁻_i(k) =   [ −sin ψ⁻(k)   cos ψ⁻(k)   0 ] [ y_mi(j) − y⁻(k) ]    …(3)
              [      0           0       1 ] [ z_mi(j) − z⁻(k) ]
 Here, "x_mi(j)", "y_mi(j)", and "z_mi(j)" denote the position of the landmark with index number j extracted from the map DB 10 (also called the "landmark coordinates"). Since the measurement prediction value is expressed in the vehicle coordinate system, whose axis is the traveling direction of the vehicle, the landmark coordinates are converted into the vehicle coordinate system using the predicted own position x⁻(k), y⁻(k), z⁻(k) and a rotation matrix based on the vehicle's yaw angle ψ. Hereinafter, the directions along the coordinate axes of the vehicle coordinate system (the vehicle's traveling direction, lateral direction, and height direction) are also simply called the "x direction", "y direction", and "z direction", respectively.
 The difference value "ΔZ_i(k)" (= [Δx_Li(k), Δy_Li(k), Δz_Li(k)]^T) between the measurement prediction value Z⁻_i(k) (= [x⁻_Li(k), y⁻_Li(k), z⁻_Li(k)]^T) and the measurement value Z_i(k) (= [x_Li(k), y_Li(k), z_Li(k)]^T) is expressed by equation (4) below.
  ΔZ_i(k) = Z_i(k) − Z⁻_i(k) = [x_Li(k) − x⁻_Li(k), y_Li(k) − y⁻_Li(k), z_Li(k) − z⁻_Li(k)]^T    …(4)
 As described later, this difference value ΔZ_i(k) is used to detect a deviated lidar.
 [Detection of a deviated lidar]
 The control unit 15 computes, for each lidar 2, the average over a predetermined time of the difference value between the measurement prediction value Z⁻_i(k) and the measurement value Z_i(k), and detects any lidar 2 whose average departs from 0 as a deviated lidar. The predetermined time is set, for example, to a length over which the difference value ΔZ computed from the measurement values of a lidar 2 with no positional deviation becomes substantially 0.
 FIG. 6(A) is a graph showing the time variation, over the predetermined time, of the coordinate values Δx_L1, Δy_L1, Δz_L1 of the difference value ΔZ_1(k) of the lidar 2A. Similarly, FIG. 6(B) is a graph of the coordinate values Δx_L2, Δy_L2, Δz_L2 of the difference value ΔZ_2(k) of the lidar 2B, and FIG. 6(C) is a graph of the coordinate values Δx_L3, Δy_L3, Δz_L3 of the difference value ΔZ_3(k) of the lidar 2C. In the examples of FIGS. 6(A) to 6(C), a positional deviation has occurred only in the lidar 2C.
 In this case, as shown in FIG. 6(A), the magnitudes of the time averages over the predetermined time of the coordinate values Δx_L1, Δy_L1, Δz_L1 of the difference value ΔZ_1(k) are close to 0. Similarly, the magnitudes of the time averages of the coordinate values Δx_L2, Δy_L2, Δz_L2 of the difference value ΔZ_2(k) shown in FIG. 6(B) are close to 0. In the lidar 2C, on the other hand, a positional deviation has occurred, and the z-coordinate measurements of the lidar 2C are affected by that deviation. Accordingly, in this case, as shown in FIG. 6(C), while the time averages of the coordinate values Δx_L3 and Δy_L3 of the difference value ΔZ_3(k) are close to 0, the magnitude of the time average of the coordinate value Δz_L3 clearly departs from 0.
 Thus, for a lidar 2 in which a positional deviation has occurred, at least one of the coordinate values x_L, y_L, z_L of its measurement value Z becomes anomalous because of the deviation, and as a result at least one of the coordinate values Δx_L, Δy_L, Δz_L of the difference value ΔZ departs from 0.
 In view of the above, in the present embodiment the control unit 15 computes the time average, over the predetermined time, of each coordinate value Δx_L, Δy_L, Δz_L of the difference value ΔZ of each lidar 2, and when any coordinate value of a difference value ΔZ has a time-average magnitude larger than a predetermined value, regards the corresponding lidar 2 as a deviated lidar. This makes it possible to detect a deviated lidar with high accuracy. If the time-average magnitudes of the coordinate values of the difference values ΔZ exceed the predetermined value for all lidars 2, the control unit 15 may judge that the deviated lidar cannot be identified, and may output, via the information output unit 16, a warning or the like indicating that a positional deviation has occurred in at least one of the lidars 2.
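The detection rule above can be sketched as follows. The data layout (a history of per-lidar difference values over the averaging window) and the names are assumptions of this sketch:

```python
def detect_deviated_lidars(diff_history, threshold):
    """diff_history: {lidar_id: [(dx, dy, dz), ...]} difference values over the
    averaging window. A lidar is flagged when the magnitude of the time average
    of any coordinate of its difference value exceeds the threshold."""
    deviated = []
    for lidar_id, diffs in diff_history.items():
        n = len(diffs)
        means = [abs(sum(d[i] for d in diffs) / n) for i in range(3)]
        if any(m > threshold for m in means):
            deviated.append(lidar_id)
    return deviated
```

If the returned list covers every lidar, the caller would fall back to the warning path rather than single out one sensor.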
 [Estimation of attitude angles and position]
 Next, methods for estimating the attitude angles and position of a deviated lidar (a first estimation method and a second estimation method) are described. In the following, as an example, the method of estimating the position and attitude angles of the lidar 2C is described for the case where the lidar 2C is the deviated lidar.
 (1) First estimation method
 In the first estimation method, the control unit 15 computes the measurement prediction value of a landmark measured by the deviated lidar, calculated from the own-position estimate obtained with the lidars 2 other than the deviated lidar and from the landmark coordinates of that landmark, as the expected measurement value of the deviated lidar (also called the "measurement expectation value"). The control unit 15 then performs a least-squares fit using Newton's method on the pairs of the measurement expectation value and the actual measurement value of the deviated lidar, and estimates the attitude angles and position of the deviated lidar.
 Here, when the measurement prediction value Z⁻_3(k) of the lidar 2C, the deviated lidar, computed from the own-position estimate obtained with the lidars 2A and 2B in which no positional deviation has occurred, is taken as the measurement expectation value, the measurement expectation value [x_b(k), y_b(k), z_b(k)]^T is expressed by equation (5) below.
Figure JPOXMLDOC01-appb-M000005
 また、ずれ発生ライダであるライダ2Cの計測予測値Z (k)は、以下の式(6)により表される。
Figure JPOXMLDOC01-appb-M000005
Further, the measurement predicted value Z rider 2C is a deviation occurs rider - 3 (k) is expressed by the following equation (6).
Figure JPOXMLDOC01-appb-M000006
Here, the predicted values x̂, ŷ, ẑ, ψ̂ denote the estimated position computed from the measurement values of the lidars 2A and 2B, which are free of misalignment, as shown in the following equation (7).
Figure JPOXMLDOC01-appb-M000007
In this way, the expected measurement value [x_b(k), y_b(k), z_b(k)]ᵀ is obtained by converting the landmark coordinates [x_m3(j), y_m3(j), z_m3(j)]ᵀ of the landmark with index number j measured by the lidar 2C (the misaligned lidar) into the vehicle coordinate system using the own-vehicle position estimate (x̂, ŷ, ẑ, ψ̂) based on the lidars 2A and 2B. The control unit 15 then applies the least-squares method using Newton's method to the expected measurement values of equation (5), and estimates the roll angle "L_φ", pitch angle "L_θ", yaw angle "L_ψ", position "L_x" in the x direction, position "L_y" in the y direction, and position "L_z" in the z direction of the misaligned lidar 2C. These attitude angles L_φ, L_θ, L_ψ and positions L_x, L_y, L_z determine the relationship between the vehicle coordinate system and the lidar coordinate system, which is described in detail in the section "(3) Coordinate System Conversion".
Next, a specific example of calculating the attitude angles L_φ, L_θ, L_ψ and positions L_x, L_y, L_z of the misaligned lidar by the least-squares method using Newton's method will be described. Hereinafter, the measurement value of a landmark at time k by the misaligned lidar is written as [x_L(k), y_L(k), z_L(k)]ᵀ.
In this case, the measurement value [x_L(k), y_L(k), z_L(k)]ᵀ is expressed, using the attitude angles L_φ, L_θ, L_ψ and positions L_x, L_y, L_z, by the following equation (8).
Figure JPOXMLDOC01-appb-M000008
The following equation (9) is equivalent to equation (8).
Figure JPOXMLDOC01-appb-M000009
In general, at least six equations are required to determine the six variables (attitude angles L_φ, L_θ, L_ψ and positions L_x, L_y, L_z). Since each measurement supplies three equations, substituting at least two pairs of measurement values and expected measurement values into equation (9) yields a solution for the six variables. The more measurement values and expected measurement values are used, the more accurate the least-squares solution becomes, so preferably all available pairs are used.
The functions for x_L(k), y_L(k), and z_L(k) in equation (9) are nonlinear, so a solution cannot be obtained analytically. In this embodiment, the equations are therefore linearized around initial values and the solution is obtained iteratively by Newton's method. To that end, the following equation (10) is formed.
Figure JPOXMLDOC01-appb-M000010
Here, the partial derivatives of x_L(k) are expressed by the following equation (11).
Figure JPOXMLDOC01-appb-M000011
The partial derivatives of y_L(k) are expressed by the following equation (12).
Figure JPOXMLDOC01-appb-M000012
Furthermore, the partial derivatives of z_L(k) are expressed by the following equation (13).
Figure JPOXMLDOC01-appb-M000013
Here, denoting the initial values of the attitude angle and position stored in the lidar installation information IL (i.e., the values before any misalignment) as "L_φ0", "L_θ0", "L_ψ0", "L_x0", "L_y0", and "L_z0", the following equation (14), analogous to equation (9), holds.
Figure JPOXMLDOC01-appb-M000014
Taking as an example the case where the solution is obtained from three measurements, using the three pairs of measurement values and expected measurement values {[x_L1(k), y_L1(k), z_L1(k)]ᵀ, [x_b1(k), y_b1(k), z_b1(k)]ᵀ}, {[x_L2(k), y_L2(k), z_L2(k)]ᵀ, [x_b2(k), y_b2(k), z_b2(k)]ᵀ}, and {[x_L3(k), y_L3(k), z_L3(k)]ᵀ, [x_b3(k), y_b3(k), z_b3(k)]ᵀ}, the linear form shown in the following equation (15) holds.
Figure JPOXMLDOC01-appb-M000015
Hereinafter, in equation (15), the 9-row, 1-column matrix of the three measurements (i.e., the left-hand side) is denoted "z", the 9-by-6 Jacobian matrix "C", the 6-row, 1-column matrix of differences between the initial and current attitude angles and positions "Δx", and the 9-row, 1-column matrix of the initial measurement values x_L0, y_L0, z_L0 obtained from equation (14) "d". Equation (15) is then expressed as:
       z = CΔx + d
Considering that an error "v" exists between the z calculated by the linearized expression "z = CΔx + d" and the actually observed measurement value "z_ob",
       z_ob = z + v
holds. Therefore,
       v = −CΔx − d + z_ob
holds.
Setting Δz = z_ob − d, the sum of squared errors "vᵀv" is expressed by the following equation (16).
Figure JPOXMLDOC01-appb-M000016
Accordingly, the Δx that minimizes the right-hand side of equation (16) is the desired solution.
When the sum of squared errors "vᵀv" is minimized, its partial derivatives are zero, so the relationship shown in the following equation (17) holds.
Figure JPOXMLDOC01-appb-M000017
From equation (17), the following equation (18) holds.
Figure JPOXMLDOC01-appb-M000018
Transposing both sides of equation (18) yields the following equation (19), which corresponds to the normal equations.
Figure JPOXMLDOC01-appb-M000019
Therefore, Δx is expressed by the following equation (20).
Figure JPOXMLDOC01-appb-M000020
Substituting the Δx obtained from equation (20) into the expression "x = Δx + x₀" yields the least-squares solution. The recurrence is repeated with this x as the new x₀, and when the change in x becomes negligibly small, x gives the final position and attitude of the misaligned lidar.
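The iteration of equations (10) to (20) is an ordinary Gauss-Newton least-squares loop. The following is a minimal runnable sketch of the same procedure reduced to a two-dimensional setting (unknowns L_x, L_y, L_ψ only), with a finite-difference Jacobian standing in for the analytic derivatives of equations (11) to (13); the function names `measure`, `solve3`, and `gauss_newton` are illustrative, not taken from the embodiment.

```python
import math

def measure(params, landmark):
    # 2-D analogue of equation (8): rotate the vehicle-frame landmark
    # into the lidar frame after subtracting the lidar mounting offset.
    lx, ly, psi = params
    bx, by = landmark
    c, s = math.cos(psi), math.sin(psi)
    return [c * (bx - lx) + s * (by - ly),
            -s * (bx - lx) + c * (by - ly)]

def solve3(a, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[r][3] / m[r][r] for r in range(3)]

def gauss_newton(landmarks, observed, x0, iters=50, tol=1e-10):
    x = list(x0)
    for _ in range(iters):
        dz, jac = [], []           # stacked residuals Δz and Jacobian C
        for lm, z_ob in zip(landmarks, observed):
            d = measure(x, lm)     # measurement at current estimate, cf. eq. (14)
            for r in range(2):
                dz.append(z_ob[r] - d[r])
                row = []
                for j in range(3):           # finite-difference Jacobian row
                    xp = list(x)
                    xp[j] += 1e-7
                    row.append((measure(xp, lm)[r] - d[r]) / 1e-7)
                jac.append(row)
        # Normal equations, cf. eq. (19): (CᵀC) Δx = Cᵀ Δz
        n = len(jac)
        ctc = [[sum(jac[k][i] * jac[k][j] for k in range(n))
                for j in range(3)] for i in range(3)]
        ctz = [sum(jac[k][i] * dz[k] for k in range(n)) for i in range(3)]
        dx = solve3(ctc, ctz)
        x = [xi + di for xi, di in zip(x, dx)]   # x = Δx + x0
        if max(abs(v) for v in dx) < tol:
            break
    return x
```

Feeding this loop synthetic pairs of expected and measured values generated from known mounting parameters recovers those parameters, mirroring the first to fifth steps described below.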
In view of the above, the control unit 15 can suitably estimate the attitude angle and position of the misaligned lidar by performing the following first to fifth steps.
First, in the first step, the control unit 15 calculates the Jacobian matrix C and the initial measurement values [x_L0, y_L0, z_L0]ᵀ based on equation (14), from the initial positions L_x0, L_y0, L_z0 and attitude angles L_φ0, L_θ0, L_ψ0 stored in the lidar installation information IL together with the plurality of expected measurement values. In the second step, the control unit 15 forms the matrix "z_ob" from the plurality of measurement values and subtracts the matrix "d" constructed from the initial measurement values, thereby calculating Δz (= z_ob − d).
Next, in the third step, the control unit 15 calculates Δx based on equation (20). In the fourth step, the control unit 15 obtains the least-squares solution from the expression "x = Δx + x₀". In the fifth step, if Δx is larger than a predetermined value, the control unit 15 sets x as the new positions L_x0, L_y0, L_z0 and attitude angles L_φ0, L_θ0, L_ψ0 and returns to the first step. If Δx is equal to or smaller than the predetermined value, the control unit 15 regards the x obtained in the fourth step as the final solution, and takes the positions L_x, L_y, L_z and attitude angles L_φ, L_θ, L_ψ indicated by that x as the position and attitude angle of the misaligned lidar.
FIG. 7 is an example of a flowchart showing the procedure for detecting the misaligned lidar and estimating its position and attitude angle by the first estimation method. The control unit 15 repeatedly executes the process of this flowchart.
First, while the vehicle is traveling, the control unit 15 estimates the own-vehicle position with a Kalman filter using three or more lidars (here, the lidars 2A to 2C) (step S101). In this case, the control unit 15 sequentially executes the Kalman filter update equations shown in equations (2-1) to (2-3), thereby calculating an estimated position X̂ that reflects the measurement values of each lidar 2. As described above, the control unit 15 substitutes the estimated position X̂ based on the measurement values of one lidar 2 as the predicted position X⁻ for another lidar 2.
Next, the control unit 15 averages the difference values ΔZ_i between the predicted measurement value Z⁻_i and the measurement value Z_i of each lidar 2 over a predetermined period up to the current processing reference time (step S102). To do so, the control unit 15 stores the past difference values ΔZ_i for that period in a buffer such as the storage unit 12, and averages each coordinate component Δx_Li, Δy_Li, Δz_Li of the stored difference values ΔZ_i.
The control unit 15 then determines whether there is a lidar 2 whose averaged difference value ΔZ_i deviates from 0 (step S103). For example, the control unit 15 determines whether the absolute value of any of the coordinate components Δx_Li, Δy_Li, Δz_Li of the averaged difference value ΔZ_i is equal to or greater than a predetermined value. This predetermined value is set, taking into account factors such as the length of the averaging period, to be larger than the absolute value of the averaged Δx_Li, Δy_Li, Δz_Li obtained from the measurement values of a correctly mounted lidar 2. When a lidar 2 whose averaged difference value ΔZ_i deviates from 0 exists (step S103; Yes), the control unit 15 identifies the lidar 2 corresponding to that difference value ΔZ_i as the misaligned lidar, and executes the own-vehicle position estimation process using the lidars 2 other than the misaligned lidar (step S104). On the other hand, when no lidar 2 whose averaged difference value ΔZ_i deviates from 0 exists (step S103; No), the control unit 15 determines that there is no misaligned lidar and returns to step S101.
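Steps S102 and S103 amount to a sliding-window average of the residuals ΔZ_i followed by a threshold test. A minimal sketch, assuming three-component residuals; the class name and the `window` and `threshold` parameters are illustrative, not values from the embodiment:

```python
from collections import deque

class MisalignmentDetector:
    """Sliding-window average of measurement-minus-prediction residuals
    per lidar; flags any lidar whose mean residual moves away from zero."""

    def __init__(self, window, threshold):
        self.window = window          # number of samples to average over
        self.threshold = threshold    # allowed |mean residual| per axis
        self.buffers = {}

    def update(self, lidar_id, residual):
        # residual = (dx_L, dy_L, dz_L): one ΔZ_i sample for this lidar
        buf = self.buffers.setdefault(lidar_id, deque(maxlen=self.window))
        buf.append(residual)

    def misaligned(self):
        flagged = []
        for lidar_id, buf in sorted(self.buffers.items()):
            if len(buf) < self.window:
                continue              # not enough history yet (step S102)
            means = [sum(r[axis] for r in buf) / len(buf) for axis in range(3)]
            if any(abs(m) >= self.threshold for m in means):
                flagged.append(lidar_id)   # step S103: mean away from 0
        return flagged
```

The averaging suppresses zero-mean measurement noise, so only a persistent bias in the residuals, such as that caused by a shifted mounting, trips the threshold.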
After executing the own-vehicle position estimation process with the lidars 2 other than the misaligned lidar in step S104, the control unit 15 calculates the predicted measurement value Z from that estimation result and the landmark coordinates of a landmark registered in the map DB 10, and uses it as the expected measurement value [x_b(k), y_b(k), z_b(k)]ᵀ of the misaligned lidar (step S105). As this landmark, the control unit 15 selects a landmark that can be measured by the misaligned lidar. The control unit 15 then acquires a plurality of pairs of measurement values and expected measurement values of the misaligned lidar for such landmarks.
Next, the control unit 15 performs the least-squares method using Newton's method based on the plurality of pairs of expected measurement values and measurement values, and estimates the attitude angle and position of the misaligned lidar (step S106). Uses of the estimated attitude angle and position of the misaligned lidar are described in the section "(4) Application Examples".
(2) Second Estimation Method
In the second estimation method, the control unit 15 updates the position and attitude angle of the misaligned lidar based on the difference between the own-vehicle position estimated with the lidars 2 other than the misaligned lidar (also referred to as the "reference position") and the own-vehicle position estimated with the misaligned lidar (also referred to as the "provisional position"). Hereinafter, the state variables of the own-vehicle position estimated by the Kalman filter are the six variables: positions x, y, z, yaw angle ψ, roll angle φ, and pitch angle θ.
First, the predicted measurement value Z⁻_i(k) of each lidar 2 for a landmark at time k is expressed by the following equation (21).
Figure JPOXMLDOC01-appb-M000021
Then, as described in [Detection of the Misaligned Lidar], the control unit 15 detects the misaligned lidar based on the average of the difference values ΔZ_i(k). Here, the control unit 15 is assumed to have detected the lidar 2C as the misaligned lidar. The control unit 15 calculates the reference position [x̂_r(k), ŷ_r(k), ẑ_r(k), φ̂_r(k), θ̂_r(k), ψ̂_r(k)]ᵀ estimated with the lidars 2A and 2B, which are not the misaligned lidar, and the provisional position [x̂_3(k), ŷ_3(k), ẑ_3(k), φ̂_3(k), θ̂_3(k), ψ̂_3(k)]ᵀ estimated with the lidar 2C, the misaligned lidar, and computes their difference values [Δx, Δy, Δz, Δφ, Δθ, Δψ]ᵀ by the following equation (22).
Figure JPOXMLDOC01-appb-M000022
The control unit 15 then adds the difference values of equation (22) to the position and attitude angle recorded in the lidar installation information IL, or to their previously updated values, thereby updating the current position and attitude angle of the misaligned lidar. Thereafter, when the difference values are sufficiently close to 0 (for example, when the absolute values of all the elements Δx, Δy, Δz, Δφ, Δθ, Δψ are equal to or smaller than a predetermined value), the control unit 15 ends the update of the position and attitude angle; otherwise, it computes new difference values from a recalculated reference position and provisional position and updates the position and attitude angle again.
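The update loop around equation (22) can be sketched as follows. The two pose-estimation callbacks stand in for the Kalman-filter runs described above, and the direct addition of the pose difference to the mounting parameters follows the description in the text; all names are illustrative assumptions, not the embodiment's implementation.

```python
def refine_mounting(mount, reference_pose, provisional_pose,
                    tol=1e-6, max_iter=50):
    """Update the misaligned lidar's mounting parameters by repeatedly
    adding the (reference - provisional) pose difference of eq. (22).

    mount: current [Lx, Ly, Lz, Lphi, Ltheta, Lpsi] estimate.
    reference_pose(): pose estimated with the other lidars.
    provisional_pose(mount): pose estimated with the misaligned lidar
        under the current mounting estimate.
    """
    for _ in range(max_iter):
        ref = reference_pose()
        prov = provisional_pose(mount)
        diff = [r - p for r, p in zip(ref, prov)]     # eq. (22)
        mount = [m + d for m, d in zip(mount, diff)]  # add to recorded values
        if max(abs(d) for d in diff) < tol:           # difference close to 0
            break
    return mount
```

In a toy model where the provisional pose is offset from the reference pose by exactly the mounting error, the loop converges in one pass; with real Kalman-filter estimates, several iterations may be needed.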
FIG. 8 is an example of a flowchart showing the procedure for detecting the misaligned lidar and estimating its attitude angle and position by the second estimation method. The control unit 15 repeatedly executes the process of this flowchart.
First, while the vehicle is traveling, the control unit 15 estimates the own-vehicle position with a Kalman filter using three or more lidars (here, the lidars 2A to 2C) (step S201). Next, the control unit 15 averages the difference values ΔZ_i between the predicted measurement value Z⁻_i and the measurement value Z_i of each lidar 2 over a predetermined period up to the current processing reference time (step S202).
When a lidar 2 whose averaged difference value ΔZ_i deviates from 0 exists (step S203; Yes), the control unit 15 identifies the lidar 2 corresponding to that difference value ΔZ_i as the misaligned lidar, and sets the own-vehicle position estimated from the measurement values of the lidars 2 other than the misaligned lidar as the reference position (step S204). The control unit 15 also sets the own-vehicle position estimated from the measurement values of the misaligned lidar as the provisional position (step S205). The control unit 15 then updates the estimated position and attitude angle of the misaligned lidar based on the difference between the reference position and the provisional position (step S206). When the difference between the reference position and the provisional position is close to 0 (step S207; Yes), the control unit 15 judges that the estimated position and attitude angle of the misaligned lidar are sufficiently accurate, and ends the process of the flowchart. On the other hand, when the difference is not close to 0 (step S207; No), the control unit 15 returns to step S204 and recalculates the reference position and so on.
(3) Coordinate System Conversion
FIG. 9 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system expressed in two-dimensional coordinates. Here, the vehicle coordinate system has its origin at the center of the vehicle, with a coordinate axis "x_v" along the traveling direction of the vehicle and a coordinate axis "y_v" along the lateral direction of the vehicle. The lidar coordinate system has a coordinate axis "x_L" along the front direction of the lidar 2 (see arrow A2) and a coordinate axis "y_L" along the lateral direction of the lidar 2.
Here, when the yaw angle of the lidar 2 with respect to the vehicle coordinate system is "L_ψ" and the position of the lidar 2 is [L_x, L_y]ᵀ, the measurement point [x_v(k), y_v(k)]ᵀ at time "k" seen from the vehicle coordinate system is converted into the coordinates [x_L(k), y_L(k)]ᵀ of the lidar coordinate system by the following equation (23) using the rotation matrix "C_ψ".
Figure JPOXMLDOC01-appb-M000023
Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse (i.e., the transpose) of the rotation matrix. Thus, the measurement point [x_L(k), y_L(k)]ᵀ at time k acquired in the lidar coordinate system can be converted into the coordinates [x_v(k), y_v(k)]ᵀ of the vehicle coordinate system by the following equation (24).
Figure JPOXMLDOC01-appb-M000024
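Equations (23) and (24) correspond to the following pair of helper functions. They assume the common convention that C_ψ = [[cos ψ, sin ψ], [−sin ψ, cos ψ]] rotates vehicle-frame coordinates into the lidar frame; the function names are illustrative.

```python
import math

def vehicle_to_lidar_2d(p, mount):
    # Eq. (23): subtract the lidar offset [Lx, Ly], then rotate by C_psi.
    xv, yv = p
    lx, ly, psi = mount
    c, s = math.cos(psi), math.sin(psi)
    return (c * (xv - lx) + s * (yv - ly),
            -s * (xv - lx) + c * (yv - ly))

def lidar_to_vehicle_2d(p, mount):
    # Eq. (24): rotate back with the transpose of C_psi, then add the offset.
    xl, yl = p
    lx, ly, psi = mount
    c, s = math.cos(psi), math.sin(psi)
    return (c * xl - s * yl + lx,
            s * xl + c * yl + ly)
```

Because the rotation matrix is orthogonal, applying one function and then the other returns the original point, which is a quick sanity check for the sign convention.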
FIG. 10 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system expressed in three-dimensional coordinates. Here, the coordinate axis perpendicular to the coordinate axes x_v, y_v is "z_v", and the coordinate axis perpendicular to the coordinate axes x_L, y_L is "z_L".
When the roll angle of the lidar 2 with respect to the vehicle coordinate system is "L_φ", the pitch angle "L_θ", and the yaw angle "L_ψ", and the position of the lidar 2 on the coordinate axis x_v is "L_x", on the coordinate axis y_v "L_y", and on the coordinate axis z_v "L_z", the measurement point [x_v(k), y_v(k), z_v(k)]ᵀ at time "k" seen from the vehicle coordinate system is converted into the coordinates [x_L(k), y_L(k), z_L(k)]ᵀ of the lidar coordinate system by the following equation (25) using the direction cosine matrix "C" formed from the rotation matrices "C_φ", "C_θ", and "C_ψ" corresponding to roll, pitch, and yaw.
Figure JPOXMLDOC01-appb-M000025
Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse (i.e., the transpose) of the direction cosine matrix. Thus, the measurement point [x_L(k), y_L(k), z_L(k)]ᵀ at time k acquired in the lidar coordinate system can be converted into the coordinates [x_v(k), y_v(k), z_v(k)]ᵀ of the vehicle coordinate system by the following equation (26).
Figure JPOXMLDOC01-appb-M000026
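A three-dimensional sketch of equations (25) and (26) follows. It assumes the direction cosine matrix is composed as C = C_φ C_θ C_ψ, which is one common roll-pitch-yaw convention; the embodiment's exact composition is given by equation (25) itself, and all function names are illustrative.

```python
import math

def rot_roll(a):   # C_phi: rotation about the x axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, s], [0, -s, c]]

def rot_pitch(a):  # C_theta: rotation about the y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, -s], [0, 1, 0], [s, 0, c]]

def rot_yaw(a):    # C_psi: rotation about the z axis
    c, s = math.cos(a), math.sin(a)
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(a, v):
    return [sum(a[i][k] * v[k] for k in range(3)) for i in range(3)]

def vehicle_to_lidar_3d(p, mount):
    # Eq. (25): subtract [Lx, Ly, Lz], then apply C = C_phi C_theta C_psi.
    lx, ly, lz, phi, theta, psi = mount
    c = matmul(rot_roll(phi), matmul(rot_pitch(theta), rot_yaw(psi)))
    return matvec(c, [p[0] - lx, p[1] - ly, p[2] - lz])

def lidar_to_vehicle_3d(p, mount):
    # Eq. (26): apply the transpose of C, then add the offset back.
    lx, ly, lz, phi, theta, psi = mount
    c = matmul(rot_roll(phi), matmul(rot_pitch(theta), rot_yaw(psi)))
    ct = [[c[j][i] for j in range(3)] for i in range(3)]
    v = matvec(ct, p)
    return [v[0] + lx, v[1] + ly, v[2] + lz]
```

As in the two-dimensional case, Cᵀ C = I for any composition order, so the round trip vehicle → lidar → vehicle recovers the original point regardless of which convention is chosen.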
(4) Application Examples
Here, specific examples of uses of the position and attitude angle of the misaligned lidar estimated by the first estimation method or the second estimation method are supplementarily described.
For example, the control unit 15 calculates the amount of change of the estimated attitude angle and position of the misaligned lidar relative to the attitude angle and position before the misalignment recorded in the lidar installation information IL, and corrects each measurement value of the point cloud data output by the misaligned lidar based on that amount of change. In this case, the control unit 15 may, for example, store a map indicating the correction amount of the measurement values for each magnitude of change and correct the measurement values by referring to that map, or it may use a predetermined fraction of the amount of change as the correction amount. When the control unit 15 detects a positional or angular deviation whose amount of change is equal to or greater than a predetermined threshold, it may stop using the misaligned lidar and output a predetermined warning via the information output unit 16.
In another example, the control unit 15 may use the calculated roll angle L_φ, pitch angle L_θ, yaw angle L_ψ, and positions L_x, L_y, L_z to convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system into the vehicle body coordinate system, and execute own-vehicle position estimation, autonomous driving control, and the like based on the converted data. This allows the measurement values of the lidar 2 to be appropriately converted into the vehicle coordinate system based on the attitude angle and position of the lidar 2 after the misalignment has occurred.
In yet another example, when each lidar 2 is provided with an adjustment mechanism such as an actuator for correcting its attitude angle and position, the in-vehicle device 1 may correct the attitude angle and position of the lidar 2 based on the estimation result. In this case, the in-vehicle device 1 calculates the amount of change of the estimated attitude angle and position relative to the attitude angle and position recorded in the lidar installation information IL, and drives the adjustment mechanism so as to correct the attitude angle and position of the lidar 2 by that amount.
As described above, the in-vehicle device 1 in this embodiment acquires the measurement value Z_i, taken by each of at least three lidars 2 provided on the vehicle, of a landmark included in the measurement range of that lidar 2. The in-vehicle device 1 also acquires the predicted measurement value Z⁻_i predicted based on the landmark coordinates indicated by the position information of the landmark registered in the map DB 10. Based on the plurality of difference values ΔZ_i between the measurement value Z_i and the predicted measurement value Z⁻_i acquired for each lidar 2, the in-vehicle device 1 detects at least one lidar 2 whose mounting position with respect to the vehicle has deviated. The in-vehicle device 1 can thereby accurately detect a lidar 2 in which a positional deviation has occurred.
[Modifications]
Modifications suitable for the embodiment are described below. The following modifications may be applied to the embodiment in combination.
(Modification 1)
The control unit 15 may execute the Kalman filter update equation shown in equation (1) individually for the measurement values of each lidar 2. That is, in this case, the control unit 15 separately performs, based on equation (1), own-vehicle position estimation from the measurement values of the lidar 2A, from those of the lidar 2B, and from those of the lidar 2C, and compares the difference values ΔZ between the measurement value Z and the predicted measurement value Z⁻ calculated at each execution of equation (1). When a difference value ΔZ larger than a predetermined value exists, the control unit 15 detects the lidar 2 corresponding to that difference value ΔZ as the misaligned lidar. In this case, as in the embodiment, the landmarks measured by the respective lidars 2 need not be the same.
 (Modification 2)
 The configuration of the driving support system shown in FIG. 1 is an example, and the configuration of a driving support system to which the present invention can be applied is not limited to the configuration shown in FIG. 1. For example, instead of the driving support system having the in-vehicle device 1, an electronic control unit of the vehicle may execute the processing shown in FIG. 7, FIG. 8, and the like. In this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control unit of the vehicle is configured to be able to receive the output data of various sensors such as the lidars 2.
DESCRIPTION OF REFERENCE SIGNS
 1 In-vehicle device
 2 Lidar
 3 Gyro sensor
 4 Vehicle speed sensor
 5 GPS receiver
 10 Map DB

Claims (11)

  1.  An information processing device comprising:
     a measurement distance acquisition unit configured to acquire a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of the measurement unit;
     a predicted distance acquisition unit configured to acquire a predicted distance from the moving body to the object, predicted based on position information of the object; and
     a detection unit configured to detect, from the at least three measurement units, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit, at least one measurement unit whose mounting position on the moving body has shifted.
  2.  The information processing device according to claim 1, wherein the detection unit detects, from the at least three measurement units, a measurement unit in which a positional deviation has occurred, based on an average value over a predetermined time of the difference value for each measurement unit.
  3.  The information processing device according to claim 1 or 2, further comprising:
     a predicted position acquisition unit configured to acquire, for each measurement unit, a predicted position of the moving body; and
     a correction unit configured to correct the predicted position for each measurement unit by a value obtained by multiplying the difference value by a predetermined gain,
     wherein the predicted position acquisition unit acquires the predicted position corrected based on the difference value corresponding to one of the measurement units as the predicted position corresponding to the measurement units other than that one measurement unit.
  4.  The information processing device according to claim 3, wherein the correction unit corrects the predicted position based on the difference values of the measurement units other than the measurement unit detected by the detection unit.
  5.  The information processing device according to any one of claims 1 to 4, wherein the predicted distance acquisition unit acquires the predicted distance based on the predicted position corrected based on the difference values of the measurement units other than the measurement unit detected by the detection unit, and on position information of the object measured by the measurement unit detected by the detection unit,
     the information processing device further comprising an estimation unit configured to estimate the position of the measurement unit detected by the detection unit, based on the predicted distance and on the measurement distance measured by the measurement unit detected by the detection unit.
  6.  The information processing device according to any one of claims 1 to 4, further comprising an estimation unit configured to estimate the position of the measurement unit detected by the detection unit, based on the predicted position corrected based on the difference value of the measurement unit detected by the detection unit and on the predicted position corrected based on the difference values of the measurement units other than the measurement unit detected by the detection unit.
  7.  The information processing device according to claim 5 or 6, wherein the estimation unit estimates the amount of the positional deviation based on the estimated position of the measurement unit and the position of the measurement unit stored in a storage unit.
  8.  The information processing device according to claim 7, wherein the estimation unit calculates, as the amount of deviation, at least one of an amount of deviation in a pitch direction, a yaw direction, or a roll direction of the measurement unit detected by the detection unit, and an amount of deviation of a position of a center of gravity in three-dimensional space of the measurement unit detected by the detection unit.
  9.  A control method executed by an information processing device, the control method comprising:
     a measurement distance acquisition step of acquiring a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of the measurement unit;
     a predicted distance acquisition step of acquiring a predicted distance from the moving body to the object, predicted based on position information of the object; and
     a detection step of detecting, from the at least three measurement units, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit, at least one measurement unit whose mounting position on the moving body has shifted.
  10.  A program executed by a computer, the program causing the computer to function as:
     a measurement distance acquisition unit configured to acquire a measurement distance, measured by each of at least three measurement units provided on a moving body, to an object included in the measurement range of the measurement unit;
     a predicted distance acquisition unit configured to acquire a predicted distance from the moving body to the object, predicted based on position information of the object; and
     a detection unit configured to detect, from the at least three measurement units, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit, at least one measurement unit whose mounting position on the moving body has shifted.
  11.  A storage medium storing the program according to claim 10.
PCT/JP2019/011974 2018-03-28 2019-03-22 Information processing device, control method, program, and storage medium WO2019188745A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-063186 2018-03-28
JP2018063186 2018-03-28

Publications (1)

Publication Number Publication Date
WO2019188745A1 true WO2019188745A1 (en) 2019-10-03

Family

ID=68061748

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/011974 WO2019188745A1 (en) 2018-03-28 2019-03-22 Information processing device, control method, program, and storage medium

Country Status (1)

Country Link
WO (1) WO2019188745A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111879287A (en) * 2020-07-08 2020-11-03 河南科技大学 Forward terrain three-dimensional construction method of low-speed vehicle based on multiple sensors
WO2021112177A1 (en) 2019-12-04 2021-06-10 パイオニア株式会社 Information processing apparatus, control method, program, and storage medium
WO2021112074A1 (en) 2019-12-02 2021-06-10 パイオニア株式会社 Information processing device, control method, program, and storage medium
CN113790720A (en) * 2021-08-16 2021-12-14 北京自动化控制设备研究所 Disturbance-rejection coarse alignment method based on recursive least squares
WO2022102577A1 (en) 2020-11-13 2022-05-19 パイオニア株式会社 Information processing apparatus, control method, program, and storage medium
WO2022124115A1 (en) 2020-12-07 2022-06-16 パイオニア株式会社 Information processing device, control method, program, and recording medium
WO2022208617A1 (en) 2021-03-29 2022-10-06 パイオニア株式会社 Map data structure, storage device, information processing device, control method, program, and storage medium
WO2023037500A1 (en) 2021-09-10 2023-03-16 パイオニア株式会社 Information processing device, determining method, program, and storage medium
WO2023037502A1 (en) 2021-09-10 2023-03-16 パイオニア株式会社 Server device, control method, program, and storage medium
WO2023062782A1 (en) 2021-10-14 2023-04-20 パイオニア株式会社 Information processing apparatus, control method, program, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004037239A (en) * 2002-07-03 2004-02-05 Fuji Heavy Ind Ltd Identical object judging method and system, and misregistration correcting method and system
JP2014238297A (en) * 2013-06-06 2014-12-18 トヨタ自動車株式会社 Vehicle position recognition device
JP2016183911A (en) * 2015-03-26 2016-10-20 三菱電機株式会社 Radar axial deviation determination device, radar device and radar axial deviation determination method
JP2016197081A (en) * 2015-04-06 2016-11-24 日立建機株式会社 Transport vehicle
WO2017060947A1 (en) * 2015-10-05 2017-04-13 パイオニア株式会社 Estimation apparatus, control method, program, and storage medium
JP2017227580A (en) * 2016-06-24 2017-12-28 三菱電機株式会社 Object recognition device, object recognition method and automatic driving system


Similar Documents

Publication Publication Date Title
WO2019188745A1 (en) Information processing device, control method, program, and storage medium
JP2022113746A (en) Determination device
EP4170282A1 (en) Method for calibrating mounting deviation angle between sensors, combined positioning system, and vehicle
US8041472B2 (en) Positioning device, and navigation system
KR102086270B1 (en) Control method and traveling control device of the traveling control device
WO2012086401A1 (en) Driving assist device
JP6806891B2 (en) Information processing equipment, control methods, programs and storage media
JP2011122921A (en) Position location apparatus, position location method, position location program, velocity vector calculation apparatus, velocity vector calculation method, and velocity vector calculation program
JP6980010B2 (en) Self-position estimator, control method, program and storage medium
US20210278217A1 (en) Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium
WO2021112074A1 (en) Information processing device, control method, program, and storage medium
JP2022031266A (en) Self-position estimation device, control method, program, and storage medium
KR20160120467A (en) Azimuth correction apparatus and method of 2-dimensional radar for vehicle
JP2023164553A (en) Position estimation device, estimation device, control method, program and storage medium
JP7418196B2 (en) Travel trajectory estimation method and travel trajectory estimation device
CN113795726B (en) Self-position correction method and self-position correction device
WO2019188886A1 (en) Terminal device, information processing method, and storage medium
TWI635302B (en) Real-time precise positioning system of vehicle
JP6707627B2 (en) Measuring device, measuring method, and program
WO2017168588A1 (en) Measurement device, measurement method, and program
WO2019188802A1 (en) Information processing device, control method, program, and storage medium
JP2017181195A (en) Measurement device, measurement method, and program
WO2019188874A1 (en) Data structure, information processing device, and map data generation device
WO2018212290A1 (en) Information processing device, control method, program and storage medium
CN112985385A (en) Positioning and orientation system and positioning and orientation method applying high-precision map

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19776476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19776476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP