WO2020045057A1 - Posture estimation device, control method, program, and storage medium - Google Patents
- Publication number
- WO2020045057A1 (PCT/JP2019/031632)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- point group
- feature
- lidar
- measurement point
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present invention relates to a technique for estimating the attitude of a measuring device.
- Patent Literature 1 discloses a technique for estimating a self-position by comparing an output of a measurement sensor with position information of a feature registered on a map in advance.
- Patent Document 2 discloses a vehicle position estimation technology using a Kalman filter.
- the data obtained from the measuring device is expressed in a coordinate system based on the measuring device and therefore depends on the attitude of the measuring device with respect to the vehicle; it is accordingly converted into a value in a coordinate system based on the vehicle.
- the present invention has been made to solve the above-described problem, and its main object is to provide a posture estimating apparatus capable of suitably estimating the mounting posture, with respect to a moving body, of a measuring device attached to the moving body.
- the invention according to the claims is a posture estimating device comprising: acquisition means for acquiring a group of measurement points, measured by a measurement device attached to a moving body, of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and estimating means for estimating the attachment posture of the measuring device to the moving body based on the inclination of the perpendicular plane, calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- the invention described in the claims is a control method executed by the posture estimating apparatus, comprising: an acquisition step of acquiring a group of measurement points, measured by a measurement device attached to a moving body, of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and an estimation step of estimating the attachment posture of the measuring device to the moving body based on the inclination of the perpendicular plane, calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- the invention according to the claims is a program executed by a computer, causing the computer to function as: acquisition means for acquiring a group of measurement points, measured by a measurement device attached to a moving body, of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and estimating means for estimating the attachment posture of the measuring device to the moving body based on the inclination of the perpendicular plane, calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- FIG. 1 is a schematic configuration diagram of a vehicle system. FIG. 2 is a block diagram showing the functional configuration of an in-vehicle device.
- FIG. 3 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in two-dimensional coordinates.
- FIG. 4 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in three-dimensional coordinates.
- FIG. 5 is a bird's-eye view of the vicinity of the vehicle when the vehicle travels near a road sign that is the detection target feature for estimating the roll angle of the lidar. FIG. 6 shows the positional relationship between the road sign and the target measurement point group when the measured surface of the road sign is viewed from the front.
- FIG. 7 shows the target measurement point group and the extracted outer edge point group in the lidar coordinate system. FIG. 8 shows the approximate straight line of each side of the rectangle formed by the outer edge point group. FIG. 9 is a flowchart showing the procedure of the roll angle estimation process.
- an overhead view shows the vicinity of the vehicle when the vehicle travels near a road sign that is the detection target feature for estimating the pitch angle of the lidar, and a flowchart shows the procedure of the pitch angle estimation process.
- a bird's-eye view shows the periphery of the vehicle when the vehicle travels along a lane division line serving as the detection target feature for estimating the yaw angle of the lidar.
- a diagram shows the center point in each detection window by a circle, together with the angle between the center line of the division line calculated for each detection window and the yaw direction reference angle. Flowcharts show the procedure of the yaw angle estimation process and a specific example of processing based on the estimated values of the roll angle, pitch angle, and yaw angle.
- the posture estimating device acquires a group of measurement points of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path, and its estimating means estimates the attachment posture of the measuring device to the moving body.
- with this configuration, the posture estimating device can suitably estimate the attachment posture of the measuring device to the moving body, with reference to a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path.
- the acquiring unit recognizes the position of the feature based on feature information about the feature, so that the measurement point group measured by the measurement device at that position can be associated with the feature.
- with this configuration, the posture estimating apparatus grasps the position of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path, and can suitably obtain a measurement point group of the feature.
- the estimating means estimates the attitude of the measuring device in the roll direction based on the inclination calculated from a group of measurement points of a feature having a plane perpendicular to the traveling direction of the travel path, and estimates the attitude of the measuring device in the pitch direction based on the inclination calculated from a group of measurement points of a feature having a plane perpendicular to the width direction of the travel path. According to this aspect, the posture estimation device can appropriately estimate the posture of the measurement device in both the roll direction and the pitch direction.
- the feature is a road sign provided on the travel path.
- the estimation unit extracts an outer edge point group forming an edge of the measurement point group, and calculates the inclination of the feature based on the outer edge point group.
- the posture estimating apparatus can extract the outer edge point group representing the shape of the feature, and can appropriately calculate the inclination of the feature.
- the vertical plane is a rectangle.
- the estimation unit calculates the inclination corresponding to each side of the rectangle from the outer edge point group, thereby calculating the inclination of the feature.
- the posture estimation device can appropriately calculate the inclination of the feature from the outer edge point group.
- the slope of the feature is calculated by weighting the slope corresponding to each side based on the number of points forming that side. According to this aspect, the posture estimation device can calculate the inclination of the feature from the outer edge point group with higher accuracy.
- a control method executed by the posture estimating device refers to a feature having a plane perpendicular to the traveling direction of the traveling path of a moving body or a plane perpendicular to the width direction of the traveling path.
- the posture estimating device can appropriately estimate the mounting posture of the measuring device on the moving body.
- a program executed by a computer comprises an acquisition unit that acquires a group of measurement points, measured by a measurement device attached to a moving body, of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path.
- the computer is caused to function as estimating means for estimating the mounting posture of the measuring device on the moving body based on the inclination of the perpendicular plane, calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- the computer can appropriately estimate the mounting posture of the measuring device on the moving body.
- the program is stored in a storage medium.
- FIG. 1 is a schematic configuration diagram of the vehicle system according to the present embodiment.
- the vehicle system shown in FIG. 1 includes an in-vehicle device 1 that performs control related to driving support of a vehicle, and a sensor group such as a lidar (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2, a gyro sensor 3, an acceleration sensor 4, and a GPS receiver 5.
- the on-vehicle device 1 is electrically connected to the sensor group of the lidar 2, the gyro sensor 3, the acceleration sensor 4, and the GPS receiver 5, and obtains their output data.
- the in-vehicle device 1 stores a map database (DB: DataBase) 10 that contains road data and feature information about features provided near roads.
- the in-vehicle device 1 estimates the position of the vehicle based on the output data and the map DB 10, and performs control related to driving support of the vehicle such as automatic driving control based on the position estimation result.
- the in-vehicle device 1 estimates the attitude of the lidar 2 based on the output of the lidar 2 and the like. The vehicle-mounted device 1 then corrects each measurement value of the point cloud data output by the lidar 2 based on the estimation result.
- the in-vehicle device 1 is an example of the “posture estimation device” in the present invention.
- the lidar 2 emits pulsed laser light over a predetermined angle range in the horizontal and vertical directions, thereby discretely measuring the distance to objects in the external world and generating three-dimensional point cloud information indicating the positions of those objects.
- the lidar 2 includes an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the emitted laser light, and an output unit that outputs scan data based on the light reception signal produced by the light receiving unit.
- the scan data is generated from the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, specified based on the light reception signal, and is supplied to the in-vehicle device 1.
- the lidar 2 is an example of the "measuring device" in the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the vehicle-mounted device 1.
- the in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These components are interconnected via a bus line.
- the interface 11 acquires output data from the sensors such as the lidar 2, the gyro sensor 3, the acceleration sensor 4, and the GPS receiver 5, and supplies the output data to the control unit 15. The interface 11 also supplies signals related to the traveling control of the vehicle, generated by the control unit 15, to the electronic control unit (ECU: Electronic Control Unit) of the vehicle.
- the storage unit 12 stores a program executed by the control unit 15 and information necessary for the control unit 15 to execute a predetermined process.
- the storage unit 12 stores the map DB 10 and lidar installation information IL.
- the map DB 10 is a database including, for example, road data, facility data, and feature data around roads.
- the road data includes lane network data for route search, road shape data, traffic regulation data, and the like.
- the feature data includes information (for example, position information and type information) on a signboard such as a road sign, a road marking such as a stop line, a road division line such as a white line, and a structure along a road.
- the feature data may include highly accurate point cloud information of the feature to be used for the vehicle position estimation.
- various data required for position estimation may be stored in the map DB.
- the lidar installation information IL is information on the relative posture and position of the lidar 2 with respect to the vehicle at a certain reference time (for example, a time at which no posture shift exists, such as immediately after alignment adjustment of the lidar 2).
- the attitude of the lidar 2 is represented by a roll angle, a pitch angle, and a yaw angle (that is, Euler angles).
- the lidar installation information IL may be updated based on the estimation result whenever the posture estimation process of the lidar 2, described later, is executed.
- the input unit 14 is a button for a user to operate, a touch panel, a remote controller, a voice input device, and the like, and receives an input for designating a destination for a route search, an input for designating ON and OFF of automatic driving, and the like.
- the information output unit 16 is, for example, a display, a speaker, or the like that outputs under the control of the control unit 15.
- the control unit 15 includes a CPU that executes a program, and controls the entire vehicle-mounted device 1.
- the control unit 15 estimates the position of the own vehicle based on the output signal of each sensor supplied from the interface 11 and the map DB 10, and performs control related to driving support of the vehicle including automatic driving control based on the estimation result of the own vehicle position. And so on.
- the control unit 15 converts the measurement data output by the lidar 2 from a coordinate system based on the lidar 2 into a coordinate system based on the vehicle (hereinafter, the "reference coordinate system"), based on the attitude and position recorded in the lidar installation information IL.
- the control unit 15 estimates the current attitude of the lidar 2 with respect to the vehicle (that is, the attitude at the time of processing) and calculates the amount of change with respect to the attitude recorded in the lidar installation information IL.
- the measurement data output by the lidar 2 are corrected based on this amount of change.
- in this way, the control unit 15 corrects the measurement data output by the lidar 2 so that they are not affected by the posture shift.
- the control unit 15 is an example of the "computer" that executes the program according to the present invention.
- FIG. 3 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in two-dimensional coordinates.
- the vehicle coordinate system has a coordinate axis x_b along the traveling direction of the vehicle and a coordinate axis y_b along the lateral direction of the vehicle, with the center of the vehicle as the origin.
- the lidar coordinate system has a coordinate axis x_L along the front direction of the lidar 2 (see arrow A2) and a coordinate axis y_L along the lateral direction of the lidar 2.
- a measurement point [x_b(k), y_b(k)]^T at time k, viewed from the vehicle coordinate system, is converted into the coordinates [x_L(k), y_L(k)]^T of the lidar coordinate system by equation (1), using the rotation matrix C_ψ.
- the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse matrix (transposed matrix) of the rotation matrix: the measurement point [x_L(k), y_L(k)]^T at time k obtained in the lidar coordinate system is converted into the coordinates [x_b(k), y_b(k)]^T of the vehicle coordinate system by equation (2).
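The two-dimensional conversion described above can be sketched as follows. The images of equations (1) and (2) are not reproduced in the text, so the rotation convention and the lidar mounting offset (L_x, L_y) used here are assumptions rather than the patent's exact formulation:

```python
import math

def vehicle_to_lidar_2d(p_b, L_x, L_y, L_psi):
    """Vehicle frame -> lidar frame (one conventional form of equation (1)).

    Assumes the lidar sits at (L_x, L_y) in the vehicle frame and is
    rotated by the yaw angle L_psi."""
    dx, dy = p_b[0] - L_x, p_b[1] - L_y
    c, s = math.cos(L_psi), math.sin(L_psi)
    # Apply the transpose (inverse) of the rotation matrix C_psi.
    return (c * dx + s * dy, -s * dx + c * dy)

def vehicle_from_lidar_2d(p_L, L_x, L_y, L_psi):
    """Lidar frame -> vehicle frame (the inverse conversion, cf. equation (2))."""
    c, s = math.cos(L_psi), math.sin(L_psi)
    # Rotate by C_psi, then translate by the lidar mounting position.
    return (c * p_L[0] - s * p_L[1] + L_x,
            s * p_L[0] + c * p_L[1] + L_y)
```

A round trip through both functions returns the original point, which mirrors the text's statement that the inverse (transposed) rotation matrix converts back from the lidar to the vehicle coordinate system.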
- FIG. 4 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in three-dimensional coordinates.
- the coordinate axis perpendicular to the coordinate axes x_b and y_b is z_b, and the coordinate axis perpendicular to the coordinate axes x_L and y_L is z_L.
- the roll angle of the lidar 2 with respect to the vehicle coordinate system is L_φ, the pitch angle is L_θ, the yaw angle is L_ψ, the position of the lidar 2 on the coordinate axis x_b is L_x, and the position on the coordinate axis y_b is L_y.
- a measurement point [x_b(k), y_b(k), z_b(k)]^T at time k, viewed from the vehicle coordinate system, is converted into the coordinates [x_L(k), y_L(k), z_L(k)]^T of the lidar coordinate system by equation (3), using the direction cosine matrix C composed of the rotation matrices C_φ, C_θ, and C_ψ corresponding to roll, pitch, and yaw.
- the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse matrix (transposed matrix) of the direction cosine matrix: the measurement point [x_L(k), y_L(k), z_L(k)]^T at time k obtained in the lidar coordinate system is converted into the coordinates [x_b(k), y_b(k), z_b(k)]^T of the vehicle coordinate system by equation (4).
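The three-dimensional direction cosine matrix can be sketched as below. The multiplication order of the patent's equation (3) is not visible in the text, so a roll-pitch-yaw (x-y-z) sequence is assumed; the round trip via the transpose corresponds to equation (4):

```python
import math

def rot_x(a):  # roll C_phi
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, s], [0, -s, c]]

def rot_y(a):  # pitch C_theta
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, -s], [0, 1, 0], [s, 0, c]]

def rot_z(a):  # yaw C_psi
    c, s = math.cos(a), math.sin(a)
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def apply(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def dcm(L_phi, L_theta, L_psi):
    """Direction cosine matrix C built from C_phi, C_theta, C_psi.

    The multiplication order is an assumption made for this sketch."""
    return matmul(rot_x(L_phi), matmul(rot_y(L_theta), rot_z(L_psi)))
```

Because C is orthonormal, its transpose is its inverse, which is exactly why the text can use the transposed matrix for the lidar-to-vehicle conversion.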
- the on-vehicle device 1 regards a square (rectangular) road sign having a plane perpendicular to the traveling direction of the traveling path (that is, facing the traveling path) as the target feature to be detected by the lidar 2 (also referred to as the "detection target feature Ftag"), and estimates the roll angle L_φ of the lidar 2 from the inclination of the road sign calculated based on the measurement point group measured by the lidar 2.
- FIG. 5 is a bird's-eye view of the vicinity of the vehicle when the vehicle travels near the road sign 22, which is the detection target feature Ftag for estimating the roll angle of the lidar 2.
- a dashed circle 20 indicates the range in which a feature can be detected by the lidar 2 (also referred to as the "lidar detection range"), and a dashed frame 21 indicates a detection window for detecting the road sign 22, which is the detection target feature Ftag.
- the vehicle-mounted device 1 refers to the map DB 10, recognizes that the rectangular road sign 22 having a plane perpendicular to the traveling direction of the traveling path exists within the lidar detection range, and sets the detection window indicated by the dashed frame 21 based on the feature data relating to the road sign 22.
- the in-vehicle device 1 stores in advance, for example, type information of road signs that can serve as the detection target feature Ftag, and determines, with reference to the feature data in the map DB 10, whether or not a road sign of the type indicated by the type information exists within the lidar detection range.
- the on-vehicle device 1 may determine the size and/or shape of the detection window based on the size and/or shape information of the road sign 22 serving as the detection target feature Ftag.
- the on-vehicle device 1 extracts the measurement points existing in the set detection window (also referred to as the "target measurement point group Ptag") from the measurement point group measured by the lidar 2 as the measurement point group of the road sign 22.
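Extracting the target measurement point group Ptag from the lidar output can be sketched as a simple window filter. The patent derives the window placement from map feature data; here the window bounds are simply given as inputs, which is an assumption of this sketch:

```python
def points_in_window(points, lo, hi):
    """Keep the measurement points that fall inside an axis-aligned
    detection window (the target measurement point group Ptag).

    `points` is an iterable of (x, y, z) tuples; `lo` and `hi` are the
    opposite corner coordinates of the window."""
    return [p for p in points
            if all(lo[i] <= p[i] <= hi[i] for i in range(3))]
```

In the actual system the bounds would be computed from the road sign's position, size, and shape information in the map DB 10.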
- FIG. 6A shows the relationship between the target measurement point group Ptag and the scanning lines 23 of the laser light of the lidar 2 when the measurement target surface of the road sign 22 is viewed from the front.
- here, an example is taken in which the initial attitude angles L_φ, L_θ, and L_ψ of the lidar 2 are all 0 degrees.
- FIG. 6A shows the relationship between the target measurement point group Ptag and the scanning lines 23 of the lidar 2 when there is no shift of the lidar 2 in the roll direction, while FIGS. 6B and 6C show this relationship when a deviation ΔL_φ in the roll direction has occurred.
- FIG. 6B shows the relationship between the target measurement point group Ptag and the scanning lines 23 of the lidar 2 in the y_b-z_b plane of the vehicle coordinate system, and FIG. 6C shows the relationship in the y_b'-z_b' plane of the reference coordinate system.
- here, the longitudinal direction of the road sign 22 is assumed to be parallel to the y_b axis, and the vehicle is assumed to be on a flat road.
- in FIG. 6A, the lidar 2 scans in a direction parallel to the lateral direction (that is, the horizontal or longitudinal direction) of the road sign 22.
- in this case, the number of points in the vertical direction of the target measurement point group Ptag is the same (six in FIG. 6A) regardless of the horizontal position.
- in FIGS. 6B and 6C, on the other hand, the scanning lines are inclined with respect to the road sign 22.
- in FIG. 6B, the longitudinal direction of the road sign 22 is not inclined with respect to the vehicle, and the scanning lines of the lidar 2 are inclined with respect to the vehicle: the y_b axis is parallel to the longitudinal direction of the road sign 22, and the scanning lines of the lidar 2 are inclined with respect to the y_b axis.
- in FIG. 6C, the y_b' axis and the scanning lines of the lidar 2 are parallel, and the longitudinal direction of the road sign 22 is inclined with respect to the y_b' axis.
- the in-vehicle device 1 evaluates the received light intensity of each measurement point of the target measurement point group Ptag using a predetermined threshold, and excludes from the target measurement point group Ptag the measurement points whose received light intensity is equal to or less than the threshold. The in-vehicle device 1 then extracts the point group forming the outer edge (also referred to as the "outer edge point group Pout") from the target measurement point group Ptag remaining after this exclusion process. For example, the in-vehicle device 1 extracts, as the outer edge point group Pout, those measurement points of the target measurement point group Ptag that have no adjacent measurement point in at least one of the vertical and horizontal directions.
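The two-step extraction above (intensity thresholding, then keeping points with a missing four-neighbor) can be sketched as follows. Representing each measurement point by a (scan line, point-on-line) grid index is an assumption made for this sketch:

```python
def extract_outer_edge(ptag, threshold):
    """Extract the outer edge point group Pout from the target
    measurement point group Ptag.

    `ptag` maps a (row, col) grid index to a received-light intensity."""
    # Step 1: drop points whose received light intensity is at or below
    # the predetermined threshold.
    kept = {rc for rc, intensity in ptag.items() if intensity > threshold}

    # Step 2: a point belongs to the outer edge if at least one of its
    # four neighbors (up, down, left, right) is missing.
    def is_edge(rc):
        r, c = rc
        neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        return any(n not in kept for n in neighbors)

    return {rc for rc in kept if is_edge(rc)}
```

On a fully populated 3x3 patch, for example, only the center point has all four neighbors, so the eight surrounding points form the outer edge.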
- FIG. 7A shows the target measurement point group Ptag in the lidar coordinate system
- FIG. 7B shows the target measurement point group Ptag after excluding the measurement points corresponding to the light receiving intensity equal to or less than the predetermined threshold value.
- FIG. 7C shows an outer edge point group Pout extracted from the target measurement point group Ptag shown in FIG. 7B.
- measurement points at which the laser light was only partially reflected by the road sign, so that the received light intensity is equal to or less than the predetermined threshold, are excluded from the target measurement point group Ptag.
- in FIGS. 7B and 7C, measurement points having no adjacent measurement point in at least one of the up, down, left, and right directions are extracted as the outer edge point group Pout.
- the in-vehicle device 1 calculates the inclination of the road sign 22 in the longitudinal direction from the outer edge point group Pout.
- the outer edge point group Pout forms a quadrangle
- the outer edge point group Pout is classified by side of the rectangle, and the slope of the approximate straight line of each side is calculated from the points classified to that side by a regression analysis method such as the least squares method.
- FIG. 8A is a diagram in which a group of outer edge points Pout forming the upper side of a rectangle is extracted and a straight line is drawn by the least square method.
- FIG. 8B is a diagram in which a group of outer edge points Pout forming the base of the rectangle is extracted and a straight line is drawn by the least square method.
- FIG. 8C is a diagram in which a group of outer edge points Pout forming the left side of the rectangle is extracted and a straight line is drawn by the least square method.
- FIG. 8D shows a diagram in which a group of outer edge points Pout forming the right side of the rectangle is extracted and a straight line is drawn by the least square method.
- the on-vehicle device 1 calculates each straight line shown in FIGS. 8A to 8D, and calculates the slopes φ_1 to φ_4 of these straight lines with respect to the y_b' axis. The in-vehicle device 1 then calculates the average of the slopes φ_1 to φ_4 as the slope φ of the road sign 22, as in equation (5): φ = (φ_1 + φ_2 + φ_3 + φ_4) / 4.
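Fitting each side and averaging per equation (5) can be sketched as below. The patent names ordinary least squares; this sketch uses a principal-axis (total least squares) fit instead, so that the near-vertical left and right sides can be handled by the same routine — a deliberate substitution, not the patent's method:

```python
import math

def side_tilt(points, vertical=False):
    """Tilt (radians, relative to the y_b' axis) of the approximate
    straight line through one side's outer edge points.

    `points` are (y, z) pairs; a principal-axis fit gives the line's
    orientation without dividing by a near-zero variance."""
    n = len(points)
    my = sum(p[0] for p in points) / n
    mz = sum(p[1] for p in points) / n
    syy = sum((p[0] - my) ** 2 for p in points)
    szz = sum((p[1] - mz) ** 2 for p in points)
    syz = sum((p[0] - my) * (p[1] - mz) for p in points)
    ang = 0.5 * math.atan2(2 * syz, syy - szz)
    if vertical:  # measure a vertical side's tilt as its deviation from the z axis
        ang += math.pi / 2
        if ang > math.pi / 2:
            ang -= math.pi
    return ang

def sign_slope(top, bottom, left, right):
    """Equation (5): simple average of the four side tilts."""
    phis = [side_tilt(top), side_tilt(bottom),
            side_tilt(left, vertical=True), side_tilt(right, vertical=True)]
    return sum(phis) / 4
```

For a rigid rectangle rolled by a single angle, all four side tilts agree, so the average simply suppresses measurement noise.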
- the in-vehicle device 1 may weight each of the inclinations φ_1 to φ_4 based on the number of outer edge points Pout constituting each side of the rectangle.
- let the number of outer edge points Pout forming the upper side of the rectangle be n_1, the number forming the base be n_2, the number forming the left side be n_3, and the number forming the right side be n_4.
- the vehicle-mounted device 1 then calculates the inclination φ as a weighted average based on equation (6), with the slopes φ_1 to φ_4 weighted by n_1 to n_4.
- alternatively, the vehicle-mounted device 1 may weight the slopes φ_1 to φ_4 based on the horizontal interval I_H and the vertical interval I_V between the points of the target measurement point group Ptag measured by the lidar 2.
- the horizontal interval I_H and the vertical interval I_V are the intervals shown in FIG. 7B.
- since the accuracy of the inclinations φ_1 and φ_2 of the top and bottom sides of the rectangle decreases as the vertical interval I_V increases, and the accuracy of the inclinations φ_3 and φ_4 of the left and right sides decreases as the horizontal interval I_H increases, the in-vehicle device 1 calculates the slope φ by a weighted average using the reciprocals of the vertical interval I_V and the horizontal interval I_H, as in equation (7).
- the vehicle-mounted device 1 may also calculate the inclination φ by a weighted average that takes into account both the numbers n_1 to n_4 of points constituting the sides of the rectangle and the vertical interval I_V and horizontal interval I_H, based on equation (8).
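The three weighting schemes of equations (6) to (8) can be sketched as below. The exact equation bodies are not reproduced in the text, so the specific weight expressions (counts, reciprocal intervals, and their product) are assumed forms consistent with the description:

```python
def weighted_slope(phis, weights):
    """Weighted average of the four side slopes phi_1..phi_4."""
    return sum(p * w for p, w in zip(phis, weights)) / sum(weights)

def slope_eq6(phis, counts):
    """Equation (6): weight each side by its number of edge points n_i."""
    return weighted_slope(phis, counts)

def slope_eq7(phis, I_V, I_H):
    """Equation (7): weight top/bottom by 1/I_V and left/right by 1/I_H
    (assumed form; larger point spacing means a less reliable slope)."""
    return weighted_slope(phis, [1 / I_V, 1 / I_V, 1 / I_H, 1 / I_H])

def slope_eq8(phis, counts, I_V, I_H):
    """Equation (8): combine both weightings (assumed form n_i / I)."""
    w = [counts[0] / I_V, counts[1] / I_V,
         counts[2] / I_H, counts[3] / I_H]
    return weighted_slope(phis, w)
```

With equal weights, equation (6) reduces to the simple average of equation (5); shrinking I_V shifts weight toward the top and bottom sides, as the text describes.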
- the on-vehicle device 1 calculates the roll angle φ_0 of the vehicle body based on the outputs of the gyro sensor 3 and the acceleration sensor 4, and calculates the roll angle L_φ of the lidar 2 by taking the difference between the inclination φ calculated by any of equations (5) to (8) and the roll angle φ_0. In this case, the vehicle-mounted device 1 calculates the roll angle L_φ of the lidar 2 based on equation (9).
- in this way, the vehicle-mounted device 1 can suitably calculate the roll angle L_φ of the lidar 2 with the influence of the inclination of the vehicle removed.
- the vehicle-mounted device 1 preferably performs the calculation of the roll angle L_φ of the lidar 2 for many road signs serving as the detection target feature Ftag and averages the results.
- the vehicle-mounted device 1 can thereby calculate a more reliable roll-direction inclination L_φ of the lidar 2.
- moreover, since the roll angle φ_0 of the vehicle body can be regarded as 0° on average, it is not necessary to acquire the roll angle φ_0 of the vehicle body if the averaging time is made sufficiently long.
- when a sufficiently large number N of inclinations φ have been acquired, the vehicle-mounted device 1 may therefore define the roll angle L_φ of the lidar 2 by equation (10), averaging the N inclinations.
- N is a number of samples large enough that the averaged vehicle body roll angle φ_0 can be regarded as substantially 0°, and is predetermined based on, for example, experiments.
- FIG. 9 is a flowchart showing the procedure of the roll angle estimation process.
- the vehicle-mounted device 1 refers to the map DB 10 and sets a detection window for the detection target feature Ftag around the vehicle (step S101).
- the on-vehicle device 1 selects, as the detection target feature Ftag, a rectangular road sign having a plane perpendicular to the traveling direction of the traveling path, and identifies from the map DB 10 such a road sign existing within a predetermined distance from the current position. Then, the on-vehicle device 1 sets a detection window by acquiring position information and the like regarding the identified road sign from the map DB 10.
- the vehicle-mounted device 1 acquires the measurement point group of the lidar 2 within the detection window set in step S101 as the point group of the detection target feature Ftag (that is, the target measurement point group Ptag) (step S102). Then, the vehicle-mounted device 1 extracts, from the target measurement point group Ptag acquired in step S102, the outer edge point group Pout that forms the outer edge of the point group (step S103).
- the on-vehicle device 1 calculates the lateral inclination φ of the outer edge point group Pout extracted in step S103 (step S104).
- specifically, the in-vehicle device 1 obtains the slopes φ1 to φ4 of the straight lines corresponding to the respective sides from the point groups corresponding to the sides of the rectangle formed by the outer edge point group Pout, and calculates the inclination φ using any of equations (5) to (8).
- then, the vehicle-mounted unit 1 calculates the roll angle Lφ of the lidar 2 (step S105).
- the on-vehicle device 1 may obtain a plurality of inclinations φ by performing the processing of steps S101 to S104 on a plurality of detection target features Ftag, and calculate the average of the obtained inclinations φ as the roll angle Lφ.
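The flow of steps S103 to S105 above can be sketched as follows. For brevity only the upper side of the rectangle is fitted, whereas the specification combines the slopes φ1 to φ4 of all four sides; the point format and helper names are illustrative, not part of the specification.

```python
import math

def fit_slope(xs, zs):
    # Least-squares slope of z against x (the line fit used in step S104).
    n = len(xs)
    mx, mz = sum(xs) / n, sum(zs) / n
    num = sum((x - mx) * (z - mz) for x, z in zip(xs, zs))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def estimate_roll(edge_points, vehicle_roll=0.0):
    # `edge_points` are hypothetical (y, z) outer-edge points of the
    # upper side of the sign rectangle; detection-window setup and
    # point-cloud extraction (steps S101-S103) are assumed done.
    ys = [p[0] for p in edge_points]
    zs = [p[1] for p in edge_points]
    phi = math.atan(fit_slope(ys, zs))   # lateral inclination phi (step S104)
    return phi - vehicle_roll            # equation (9): L_phi = phi - phi_0
```

With `vehicle_roll=0.0` this reduces to the long-averaging case of equation (10), where the body roll is treated as zero on average.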
- next, the in-vehicle device 1 regards a rectangular road sign having a plane perpendicular to the width direction of the traveling road (that is, a plane extending along the traveling road) as the detection target feature Ftag, and estimates the pitch angle of the lidar 2 by calculating the inclination of the road sign based on the measurement point group output by the lidar 2.
- FIG. 10 is a bird's-eye view of the vicinity of the vehicle when the vehicle travels near the road sign 24, which is the detection target feature Ftag used when estimating the pitch angle of the lidar 2.
- a dashed circle 20 indicates the lidar detection range in which the lidar 2 can detect a feature.
- a dashed frame 21 indicates the detection window set for the road sign 24, which is the detection target feature Ftag.
- the in-vehicle device 1 refers to the map DB 10 to recognize that the rectangular road sign 24 having a plane perpendicular to the width direction of the traveling path exists within the lidar detection range indicated by the dashed circle 20, and sets the detection window indicated by the dashed frame 21 based on position information and the like regarding the road sign 24.
- the in-vehicle device 1 stores in advance, for example, type information of road signs to be used as the detection target feature Ftag, and determines, with reference to the feature data in the map DB 10, whether or not a road sign of the type indicated by the type information exists within the lidar detection range. Then, the on-vehicle device 1 extracts, as the target measurement point group Ptag, the measurement points existing in the set detection window from the measurement point group output by the lidar 2.
- the on-vehicle device 1 estimates the pitch angle of the lidar 2 by executing the same processing as the roll angle estimation method on the extracted target measurement point group Ptag.
- the in-vehicle device 1 first compares the received light intensity of each measurement point of the target measurement point group Ptag with a predetermined threshold, and excludes from the target measurement point group Ptag any measurement point whose received light intensity is equal to or less than the threshold. Then, the vehicle-mounted device 1 extracts the outer edge point group Pout constituting the outer edge from the target measurement point group Ptag remaining after the intensity-based exclusion. Next, the in-vehicle device 1 calculates the slopes "θ1" to "θ4" of the approximate straight lines of the sides of the rectangle formed by the outer edge point group Pout by a regression analysis method such as the least squares method.
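The intensity-based exclusion described above can be sketched as follows; the (x, y, z, intensity) tuple layout and the threshold value are assumptions for illustration, since the actual data layout depends on the lidar output format.

```python
def filter_by_intensity(points, threshold):
    # Keep only measurement points whose received light intensity is
    # strictly above the threshold (points at or below it are excluded).
    return [p for p in points if p[3] > threshold]

# Hypothetical (x, y, z, intensity) returns on and around a sign face.
sample = [(1.0, 0.0, 2.0, 200), (1.1, 0.1, 2.0, 40), (1.2, 0.2, 2.1, 180)]
kept = filter_by_intensity(sample, threshold=100)
```

Because a road sign face is retroreflective, a fixed intensity threshold is a simple way to separate sign returns from background returns inside the detection window.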
- then, the in-vehicle device 1 calculates the inclination "θ" of the road sign 24 by performing an averaging process based on the inclinations θ1 to θ4.
- as in equations (6) to (8), the vehicle-mounted device 1 may calculate the inclination θ based on a weighted average that takes into account at least one of the numbers n1 to n4 of points forming the sides of the rectangle, the vertical interval I_V, and the horizontal interval I_H.
- the on-vehicle device 1 calculates the pitch angle "θ0" of the vehicle body based on the outputs of the gyro sensor 3 and the acceleration sensor 4, and calculates the pitch angle Lθ of the lidar 2 by taking the difference from the inclination θ.
- that is, the vehicle-mounted device 1 calculates the pitch-direction inclination Lθ of the lidar 2 based on the following equation (11).
- in this way, the vehicle-mounted device 1 can suitably calculate the pitch angle Lθ of the lidar 2 with the influence of the inclination of the vehicle removed.
- in consideration of the possibility that the road sign 24 serving as the detection target feature Ftag is inclined with respect to the horizontal direction, measurement errors, and the like, the vehicle-mounted unit 1 preferably performs the calculation of the pitch angle Lθ of the lidar 2 for many road signs serving as the detection target feature Ftag and averages the results.
- thereby, the vehicle-mounted device 1 can suitably calculate a reliable pitch angle Lθ of the lidar 2.
- since the pitch angle θ0 of the vehicle body can be regarded as 0° on average, acquiring the pitch angle θ0 of the vehicle body becomes unnecessary if the averaging time is lengthened.
- in that case, the pitch angle Lθ of the lidar 2 is expressed by the following equation (12).
- FIG. 11 is a flowchart showing the procedure of pitch angle estimation processing.
- the in-vehicle device 1 sets a detection window for the detection target feature Ftag around the vehicle with reference to the map DB 10 (Step S201).
- the on-vehicle device 1 selects, as the detection target feature Ftag, a rectangular road sign having a plane perpendicular to the width direction of the traveling path, and identifies from the map DB 10 such a road sign existing within a predetermined distance from the current position.
- the on-vehicle device 1 sets a detection window by acquiring position information and the like regarding the specified road sign from the map DB 10.
- the vehicle-mounted device 1 acquires the measurement point group of the lidar 2 within the detection window set in step S201 as the point group of the detection target feature Ftag (that is, the target measurement point group Ptag) (step S202). Then, the vehicle-mounted device 1 extracts, from the point group of the detection target feature Ftag acquired in step S202, the outer edge point group Pout that forms the outer edge of the point group (step S203).
- the vehicle-mounted device 1 calculates the inclination θ in the longitudinal direction of the outer edge point group Pout extracted in step S203 (step S204). Thereafter, the on-vehicle device 1 calculates the pitch-direction inclination Lθ of the lidar 2 by averaging the inclinations θ and/or taking the difference from the pitch angle θ0 of the vehicle body (step S205). In the averaging process described above, the vehicle-mounted unit 1 obtains a plurality of inclinations θ by performing the processing of steps S201 to S204 on a plurality of target measurement point groups Ptag, and calculates the average of these as Lθ.
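Under the simplification that the longitudinal inclination θ of each sign has already been obtained in step S204, step S205 reduces to an average and a difference, mirroring equations (11) and (12). The function name and inputs below are illustrative.

```python
def lidar_pitch(sign_slopes, body_pitch=0.0):
    # Average the longitudinal inclinations theta measured over several
    # signs (the averaging of step S205), then subtract the vehicle body
    # pitch angle theta_0 as in equation (11). With body_pitch left at
    # 0.0 this reduces to the plain average of equation (12).
    theta = sum(sign_slopes) / len(sign_slopes)
    return theta - body_pitch
```

The same structure applies to the roll angle, with φ in place of θ.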
- next, the in-vehicle device 1 regards a lane marking such as a white line as the detection target feature Ftag, and estimates the yaw angle Lψ of the lidar 2 based on the direction of the center line of the lane marking, calculated from the measurement point group measured by the lidar 2.
- FIG. 12 is a bird's-eye view of the vicinity of the vehicle when the vehicle travels along lane markings that are the detection target features Ftag used when estimating the yaw angle of the lidar 2.
- in FIG. 12, a continuous line 30, which is a continuous lane marking, exists on the left side of the vehicle, and a broken line 31, which is an intermittent lane marking, exists on the right side of the vehicle.
- the arrow 25 indicates the reference direction in the yaw direction when no yaw-direction deviation has occurred in the lidar 2; it is the direction obtained by rotating the coordinate axis "x_L" of the lidar 2 by the yaw angle Lψ of the lidar 2, and coincides with the coordinate axis "x_b" along the traveling direction of the vehicle.
- in other words, if no deviation has occurred in the yaw angle Lψ of the lidar 2, the arrow 25 indicates the yaw angle regarded as the traveling direction of the vehicle (also referred to as the "yaw direction reference angle").
- the arrow 26 indicates the yaw direction reference angle in the case where a deviation ΔLψ has occurred in the yaw direction of the lidar 2. Unless the yaw direction reference angle is corrected, the arrow 26 does not match the traveling direction of the vehicle.
- the yaw direction reference angle is stored in advance in, for example, the storage unit 12 or the like.
- here, it is assumed that the map DB 10 includes lane marking information indicating the discrete coordinate positions of the continuous line 30 and of the broken line 31.
- the in-vehicle device 1 refers to the lane marking information, extracts, for each of the left front, left rear, right front, and right rear of the vehicle, the coordinate position nearest to a position a predetermined distance (for example, 5 m) away from the vehicle, and sets a rectangular area centered on each extracted coordinate position as a detection window indicated by the dashed frames 21A to 21D.
- the in-vehicle device 1 extracts, from the measurement point group measured by the lidar 2, the points within each detection window that lie on the road surface and have a reflection intensity equal to or higher than a predetermined threshold, as the target measurement point group Ptag. Then, the vehicle-mounted device 1 obtains the center point of the target measurement point group Ptag for each scanning line, and calculates, for each detection window, a straight line passing through these center points (see arrows 27A to 27D) as the center line of the lane marking.
- FIG. 13A is a diagram in which the center points within the detection window of the dashed frame 21A are indicated by circles, and FIG. 13B is a diagram in which the center points within the detection window of the dashed frame 21B are indicated by circles.
- in FIGS. 13A and 13B, the scanning lines 28 of the laser light emitted by the lidar 2 are explicitly shown.
- as shown, the intervals between the scanning lines become shorter as they approach the vehicle.
- the vehicle-mounted device 1 calculates a center point for each scanning line.
- the center point may be the position obtained by averaging the coordinate positions of the measurement points on each scanning line 28, the intermediate point between the leftmost and rightmost measurement points on each scanning line 28, or the measurement point existing nearest the middle of each scanning line 28.
- the in-vehicle device 1 then calculates the center line of the lane marking (see arrows 27A and 27B) for each detection window from the calculated center points by a regression analysis method such as the least squares method.
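The per-scanning-line center-point computation and the least-squares center-line fit can be sketched as follows, using the averaging option described above. The (scan_id, x, y) point format is an assumption for illustration, with x along the traveling direction and y across it.

```python
import math
from collections import defaultdict

def scanline_centers(points):
    # Group lane-marking returns by scanning line and take the average
    # position of each line's points as its center point.
    lines = defaultdict(list)
    for sid, x, y in points:
        lines[sid].append((x, y))
    return [(sum(x for x, _ in pts) / len(pts),
             sum(y for _, y in pts) / len(pts))
            for pts in lines.values()]

def centerline_angle(centers):
    # Least-squares slope of the center points gives the direction of
    # the lane-marking center line (cf. arrows 27A-27D).
    xs = [c[0] for c in centers]
    ys = [c[1] for c in centers]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return math.atan2(num, den)
```

For a marking running exactly along the x axis the returned angle is 0, so any nonzero angle reflects the deviation between the marking direction and the yaw direction reference.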
- FIGS. 14A to 14D are diagrams showing the slopes ψ1 to ψ4, respectively. As shown in FIGS. 14A to 14D, the inclinations ψ1 to ψ4 correspond to the angles formed between the arrows 27A to 27D indicating the center lines of the lane markings and the arrow 26 indicating the yaw direction reference angle.
- the vehicle-mounted device 1 calculates the slope ψ by averaging the slopes ψ1 to ψ4 as shown in the following equation (13).
- the in-vehicle device 1 may also weight the slopes ψ1 to ψ4 based on the number of center points used for calculating the center line of each lane marking.
- letting "n1" be the number of center points used to calculate the center line having the slope ψ1, "n2" the number for the slope ψ2, "n3" the number for the slope ψ3, and "n4" the number for the slope ψ4, the vehicle-mounted device 1 calculates the slope ψ based on the following equation (14).
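The weighting by center-point counts in equation (14) can be sketched as follows (the angles and counts are hypothetical):

```python
def weighted_yaw(psis, counts):
    # Equation (14) analogue: weight each center-line angle psi_i by the
    # number n_i of center points used to fit it, so sparsely sampled
    # lane markings (e.g. broken lines) contribute less.
    return sum(p * n for p, n in zip(psis, counts)) / sum(counts)

# Hypothetical angles (radians) for the four detection windows; the last
# window covers a broken line fitted from only a few center points.
psi = weighted_yaw([0.010, 0.014, 0.009, 0.030], [20, 18, 22, 4])
```

Here the outlier angle from the sparsely sampled window is down-weighted, so the result stays close to the densely sampled windows.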
- thereby, the on-vehicle device 1 can relatively lower the weight of the inclination of the center line of a lane marking calculated from a small number of center points (the broken line 31 in FIG. 12), and can accurately obtain the inclination ψ. For example, the weight is lower in the detection situation of FIG. 13B than in that of FIG. 13A.
- in FIGS. 12 to 14, an example in which the inclinations ψ1 to ψ4 for four lane marking portions are calculated at the same time is shown, but the inclination may be calculated for at least one lane marking.
- since the vehicle body fluctuates within the lane, the in-vehicle device 1 may execute the process of calculating the inclination of the center line for many lane markings and average the resulting values.
- thereby, the vehicle-mounted device 1 can suitably calculate a reliable deviation ΔLψ of the yaw angle of the lidar 2.
- in this case, the yaw angle Lψ of the lidar 2 is expressed by the following equation (15).
- here, Lψ0 indicates the yaw angle Lψ when no deviation in the yaw angle has occurred (that is, the yaw direction reference angle).
- FIG. 15 is a flowchart showing the procedure of the yaw angle estimation process.
- first, the in-vehicle device 1 refers to the map DB 10 and sets one or more detection windows for lane markings (step S301). Then, the vehicle-mounted device 1 acquires the measurement point group of the lidar 2 within the detection windows set in step S301 as the lane marking point group (that is, the target measurement point group Ptag) (step S302).
- next, the in-vehicle device 1 calculates the center line of the lane marking in each set detection window (step S303).
- specifically, the vehicle-mounted device 1 obtains the center point of the target measurement point group Ptag for each scanning line of the lidar 2, and calculates the center line of the lane marking for each detection window from the center points based on the least squares method or the like.
- next, the vehicle-mounted device 1 calculates the angle ψ between the yaw direction reference angle stored in advance in the storage unit 12 and the center line of the lane marking (step S304).
- in this case, the in-vehicle device 1 obtains the angle between the yaw direction reference angle and the center line of the lane marking for each detection window, and calculates the above-described angle ψ by averaging these angles or by weighted averaging based on the numbers of center points.
- then, the on-vehicle device 1 averages the plurality of angles ψ calculated by executing steps S301 to S304 on different lane markings, and calculates the yaw angle Lψ of the lidar 2 based on equation (15) (step S305).
- FIG. 16 is a flowchart illustrating a specific example of the processing based on the estimated values of the roll angle, the pitch angle, and the yaw angle.
- the above-mentioned threshold value is set for determining whether or not the measurement data of the lidar 2 can continue to be used by performing the correction processing of the measurement data of the lidar 2 in step S404 described later.
- in this case, the in-vehicle device 1 stops using the output data of the target lidar 2 (that is, for obstacle detection and the like), and the information output unit 16 outputs a warning to the effect that the alignment adjustment needs to be performed again for the target lidar 2 (step S403).
- thereby, a decrease in safety or the like caused by using the measurement data of a lidar 2 whose posture and position have significantly shifted due to an accident or the like is reliably suppressed.
- on the other hand, the in-vehicle device 1 corrects the measurement values of the point cloud data output by the lidar 2 based on the amount of change from the angles recorded in the lidar installation information IL (step S404).
- in this case, the in-vehicle device 1 stores, for example, a map indicating the correction amount of the measurement value with respect to the above-described change amount, and corrects the measurement value by referring to this map or the like. Alternatively, the measurement value may be corrected using a predetermined ratio of the change amount as the correction amount.
- as described above, the in-vehicle device 1 according to the present embodiment acquires the target measurement point group Ptag obtained by measuring, with the lidar 2 attached to the vehicle, a road sign having a plane perpendicular to the traveling direction of the vehicle's travel path or a plane perpendicular to the width direction of the travel path.
- the in-vehicle device 1 then estimates the mounting posture of the lidar 2 on the vehicle based on the inclination of the road sign in the longitudinal direction, calculated from the target measurement point group Ptag with reference to the coordinate system based on the installation information of the lidar 2. Thereby, the vehicle-mounted device 1 can accurately estimate the mounting attitude of the lidar 2 in the roll direction and the pitch direction.
- further, the vehicle-mounted device 1 in the present embodiment acquires the measurement point group obtained by measuring lane markings with the lidar 2 attached to the vehicle, and calculates, based on the measurement point group, the center line of the lane marking along the traveling direction of the vehicle. Then, the on-vehicle device 1 estimates the mounting posture of the lidar 2 on the vehicle based on the direction of the center line of the lane marking with respect to the yaw direction reference angle referenced by the lidar 2. Thereby, the vehicle-mounted device 1 can appropriately estimate the mounting posture of the lidar 2 in the yaw direction.
- in FIG. 13 and in step S303 of FIG. 15, the vehicle-mounted device 1 calculates the center line of the lane marking by calculating center points in the width direction of the lane marking within the detection window. Instead, the on-vehicle device 1 may extract the right or left end point of each scanning line in the detection window and calculate a straight line passing through the right or left edge of the lane marking. In this way, too, the on-vehicle device 1 can appropriately identify a line parallel to the traveling path and calculate the angle ψ.
- in the example of FIG. 12, the vehicle-mounted device 1 regards the lane marking as the detection target feature Ftag.
- instead, the in-vehicle device 1 may regard a curbstone or the like as the detection target feature Ftag. In this manner, the in-vehicle device 1 may regard any feature whose surface to be measured is parallel to and formed along the traveled road surface, without being limited to lane markings, as the detection target feature Ftag, and execute the yaw angle estimation process described with reference to FIG. 15.
- in step S404 of FIG. 16, instead of correcting each measurement value of the point cloud data output by the lidar 2, the on-vehicle device 1 may convert each measurement value of the point cloud data output by the lidar 2 into the vehicle body coordinate system based on the roll angle, pitch angle, and yaw angle calculated by the processing of the flowcharts of FIG. 9, FIG. 11, or FIG. 15.
- that is, the in-vehicle device 1 may use the calculated roll angle Lφ, pitch angle Lθ, and yaw angle Lψ to convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system into the vehicle body coordinate system based on equation (4), and execute own-vehicle position estimation, automatic driving control, and the like based on the converted data.
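The conversion from the lidar coordinate system into the vehicle body coordinate system can be sketched as follows. The Z-Y-X (yaw-pitch-roll) rotation order and the translation handling here are assumptions for illustration; the exact convention is fixed by equation (4) of the specification.

```python
import math

def lidar_to_body(point, roll, pitch, yaw, translation=(0.0, 0.0, 0.0)):
    # Rotate a lidar-frame point by the estimated roll/pitch/yaw angles
    # and add the mounting translation of the lidar on the vehicle body.
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation matrix R = Rz(yaw) * Ry(pitch) * Rx(roll)
    R = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    x, y, z = point
    tx, ty, tz = translation
    return (
        R[0][0] * x + R[0][1] * y + R[0][2] * z + tx,
        R[1][0] * x + R[1][1] * y + R[1][2] * z + ty,
        R[2][0] * x + R[2][1] * y + R[2][2] * z + tz,
    )
```

Applying this per point with the estimated Lφ, Lθ, and Lψ yields point cloud data expressed in the vehicle body frame, ready for own-vehicle position estimation or driving control.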
- further, control for driving an adjustment mechanism of the lidar 2 may be performed such that the attitude of the lidar 2 is corrected by the amount of deviation from the angles recorded in the lidar installation information IL.
- the configuration of the vehicle system shown in FIG. 1 is an example, and the configuration of the vehicle system to which the present invention can be applied is not limited to the configuration shown in FIG.
- the electronic control unit of the vehicle may execute the processes shown in FIGS. 9, 11, 15, 16, and the like.
- in this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control unit of the vehicle is configured to be able to receive the output data of various sensors such as the lidar 2.
- further, the vehicle system may include a plurality of lidars 2.
- in this case, the in-vehicle device 1 estimates the roll angle, pitch angle, and yaw angle of each lidar 2 by executing the processing of the flowcharts of FIGS. 9, 11, and 15 for each lidar 2.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
In the present invention, an on-board device 1 acquires a target measurement point group Ptag obtained by measuring, by a LIDAR 2 attached to the vehicle, a road sign having a surface perpendicular to the direction of advancement of the travelling road of the vehicle or a surface perpendicular to the width direction of the travelling road. The posture of the attachment of the LIDAR 2 to the vehicle is estimated on the basis of the longitudinal direction inclination of the road sign as calculated from the target measurement point group Ptag and with reference to a coordinate system based on installation information of the LIDAR 2.
Description
The present invention relates to a technique for estimating the attitude of a measuring device.
Conventionally, there have been known techniques for estimating the position of a vehicle based on measurement data from a measurement device such as a radar or a camera. For example, Patent Literature 1 discloses a technique for estimating the self-position by matching the output of a measurement sensor with position information of features registered in advance on a map. Patent Literature 2 discloses a vehicle position estimation technique using a Kalman filter.
The data obtained from a measurement device are values in a coordinate system based on the measurement device and depend on the attitude of the measurement device with respect to the vehicle, so they need to be converted into values in a coordinate system based on the vehicle. Therefore, when a deviation occurs in the attitude of the measurement device, it is necessary to accurately detect the deviation and reflect it in the data of the measurement device, or to correct the attitude of the measurement device.
The present invention has been made to solve the above problem, and its main object is to provide a posture estimating device capable of suitably estimating the mounting posture, on a moving body, of a measurement device attached to the moving body.
The invention described in the claims is a posture estimating device comprising: acquisition means for acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and estimation means for estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with reference to a coordinate system based on installation information of the measurement device.
The invention described in the claims is also a control method executed by a posture estimating device, comprising: an acquisition step of acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and an estimation step of estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with reference to a coordinate system based on installation information of the measurement device.
The invention described in the claims is also a program executed by a computer, the program causing the computer to function as: acquisition means for acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and estimation means for estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with reference to a coordinate system based on installation information of the measurement device.
According to a preferred embodiment of the present invention, the posture estimating device comprises: acquisition means for acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path; and estimation means for estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with reference to a coordinate system based on installation information of the measurement device. According to this aspect, the posture estimating device can suitably estimate the mounting posture of the measurement device on the moving body with reference to a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path.
In one aspect of the posture estimating device, the acquisition means recognizes the position of the feature based on feature information regarding the feature, and thereby acquires, as the measurement point group of the feature, the measurement point group measured by the measurement device at that position. According to this aspect, the posture estimating device can grasp the position of a feature having a plane perpendicular to the traveling direction of the traveling path or a plane perpendicular to the width direction of the traveling path, and suitably acquire the measurement point group of that feature.
In another aspect of the posture estimating device, the estimation means estimates the roll-direction posture of the measurement device based on the inclination calculated from the measurement point group of a feature having a plane perpendicular to the traveling direction of the traveling path, and estimates the pitch-direction posture of the measurement device based on the inclination calculated from the measurement point group of a feature having a plane perpendicular to the width direction of the traveling path. According to this aspect, the posture estimating device can suitably estimate the roll-direction posture and the pitch-direction posture of the measurement device.
In a preferred example, the feature is a road sign provided on the traveling path.
In another aspect of the posture estimating device, the estimation means extracts an outer edge point group forming the edge of the measurement point group, and calculates the inclination of the feature based on the outer edge point group. According to this aspect, the posture estimating device can extract the outer edge point group representing the shape of the feature and suitably calculate the inclination of the feature.
In another aspect of the posture estimating device, the vertical plane is rectangular, and the estimation means calculates the inclination of the feature by calculating, from the outer edge point group, the inclination corresponding to each side of the rectangle. According to this aspect, the posture estimating device can suitably calculate the inclination of the feature from the outer edge point group.
In another aspect of the posture estimating device, the inclination of the feature is calculated by weighting the inclination corresponding to each side based on the number of points forming each side. According to this aspect, the posture estimating device can calculate the inclination of the feature from the outer edge point group with higher accuracy.
According to another preferred embodiment of the present invention, there is provided a control method executed by a posture estimation device, including: an acquisition step of acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the travel path of the moving body or a plane perpendicular to the width direction of the travel path; and an estimation step of estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with respect to a coordinate system based on installation information of the measurement device. By executing this control method, the posture estimation device can appropriately estimate the mounting posture of the measurement device on the moving body.
According to another preferred embodiment of the present invention, there is provided a program executed by a computer, the program causing the computer to function as: acquisition means for acquiring a measurement point group obtained by measuring, with a measurement device attached to a moving body, a feature having a plane perpendicular to the traveling direction of the travel path of the moving body or a plane perpendicular to the width direction of the travel path; and estimation means for estimating the mounting posture of the measurement device on the moving body based on the inclination of the vertical plane, calculated from the measurement point group, with respect to a coordinate system based on installation information of the measurement device. By executing this program, the computer can appropriately estimate the mounting posture of the measurement device on the moving body. Preferably, the program is stored in a storage medium.
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
[Schematic configuration]
FIG. 1 is a schematic configuration diagram of a vehicle system according to the present embodiment. The vehicle system shown in FIG. 1 includes an on-vehicle device 1 that performs control related to driving support of the vehicle, and a sensor group including a lidar (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2, a gyro sensor 3, an acceleration sensor 4, and a GPS receiver 5.
The on-vehicle device 1 is electrically connected to the sensor group including the lidar 2, the gyro sensor 3, the acceleration sensor 4, and the GPS receiver 5, and acquires their output data. It also stores a map database (DB) 10 holding road data and feature information on features provided near roads. Based on this output data and the map DB 10, the on-vehicle device 1 estimates the position of the vehicle, and performs control related to driving support of the vehicle, such as autonomous driving control, based on the position estimation result. The on-vehicle device 1 also estimates the attitude of the lidar 2 based on the output of the lidar 2 and the like, and, based on this estimation result, performs processing such as correcting each measurement value of the point cloud data output by the lidar 2. The on-vehicle device 1 is an example of the "posture estimation device" in the present invention.
The lidar 2 emits pulsed laser light over a predetermined angular range in the horizontal and vertical directions, thereby discretely measuring the distance to objects in the environment and generating three-dimensional point cloud information indicating the positions of those objects. In this case, the lidar 2 includes an emitting unit that emits laser light while changing the emission direction, a light receiving unit that receives the reflected light (scattered light) of the emitted laser light, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit. The scan data is generated based on the emission direction corresponding to the laser light received by the light receiving unit and the distance, specified from the light reception signal, to the object in that emission direction, and is supplied to the on-vehicle device 1. The lidar 2 is an example of the "measurement device" in the present invention.
FIG. 2 is a block diagram showing the functional configuration of the on-vehicle device 1. The on-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These components are interconnected via a bus line.
The interface 11 acquires output data from the sensors such as the lidar 2, the gyro sensor 3, the acceleration sensor 4, and the GPS receiver 5, and supplies the data to the control unit 15. The interface 11 also supplies signals related to vehicle travel control generated by the control unit 15 to the electronic control unit (ECU) of the vehicle.
The storage unit 12 stores programs executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processing. In the present embodiment, the storage unit 12 holds the map DB 10 and lidar installation information IL. The map DB 10 is a database including, for example, road data, facility data, and feature data around roads. The road data includes lane network data for route search, road shape data, traffic regulation data, and the like. The feature data includes information (for example, position information and type information) on signboards such as road signs, road markings such as stop lines, road division lines such as white lines, and structures along roads. The feature data may also include high-accuracy point cloud information of features for use in vehicle position estimation. The map DB may additionally store various other data required for position estimation.
The lidar installation information IL is information on the attitude and position of the lidar 2 relative to the vehicle at a certain reference time (for example, immediately after alignment adjustment of the lidar 2, when no attitude deviation has occurred). In the present embodiment, the attitude of the lidar 2 and the like is represented by a roll angle, a pitch angle, and a yaw angle (that is, Euler angles). The lidar installation information IL may be updated based on the estimation result when the attitude estimation processing of the lidar 2 described later is executed.
The input unit 14 is a button, a touch panel, a remote controller, a voice input device, or the like operated by the user, and accepts inputs such as specifying a destination for route search and turning autonomous driving on and off. The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
The control unit 15 includes a CPU that executes programs and controls the entire on-vehicle device 1. The control unit 15 estimates the own-vehicle position based on the output signals of the sensors supplied from the interface 11 and the map DB 10, and performs control related to driving support of the vehicle, including autonomous driving control, based on the estimation result. When using the output data of the lidar 2, the control unit 15 converts the measurement data output by the lidar 2 from a coordinate system based on the lidar 2 to a coordinate system based on the vehicle (hereinafter, the "reference coordinate system"), using the attitude and position included in the installation information recorded in the lidar installation information IL as a reference. Furthermore, in the present embodiment, the control unit 15 estimates the current attitude (that is, at the processing reference time) of the lidar 2 relative to the vehicle, calculates the amount of change from the attitude recorded in the lidar installation information IL, and corrects the measurement data output by the lidar 2 based on this amount of change. In this way, even when the attitude of the lidar 2 has shifted, the control unit 15 corrects the measurement data output by the lidar 2 so that it is not affected by the shift. The control unit 15 is an example of the "computer" that executes the program in the present invention.
[Coordinate system conversion]
The three-dimensional coordinates indicated by each measurement point of the three-dimensional point cloud data acquired by the lidar 2 are expressed in a coordinate system based on the position and attitude of the lidar 2 (also called the "lidar coordinate system"), and must be converted into a coordinate system based on the position and attitude of the vehicle (also called the "vehicle coordinate system"). The conversion between the lidar coordinate system and the vehicle coordinate system is described here.
FIG. 3 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system expressed in two-dimensional coordinates. Here, the vehicle coordinate system has its origin at the center of the vehicle, a coordinate axis "xb" along the traveling direction of the vehicle, and a coordinate axis "yb" along the lateral direction of the vehicle. The lidar coordinate system has a coordinate axis "xL" along the front direction of the lidar 2 (see arrow A2) and a coordinate axis "yL" along the lateral direction of the lidar 2.
Here, when the yaw angle of the lidar 2 with respect to the vehicle coordinate system is "Lψ" and the position of the lidar 2 is [Lx, Ly]T, a measurement point [xb(k), yb(k)]T at time "k" seen from the vehicle coordinate system is converted into coordinates [xL(k), yL(k)]T in the lidar coordinate system by the following equation (1), using the rotation matrix "Cψ".
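A minimal sketch of the two-dimensional transform of equation (1), assuming the common form in which the point is first translated to the lidar position and then rotated by the yaw angle Lψ; the function name and argument order are illustrative, not taken from the patent:

```python
import math

def vehicle_to_lidar_2d(xb, yb, Lx, Ly, Lpsi):
    """Map a vehicle-frame point (xb, yb) into the 2D lidar frame.

    Lx, Ly: lidar position in the vehicle frame; Lpsi: yaw angle in radians.
    Translate to the lidar origin, then apply the rotation matrix C_psi.
    """
    dx, dy = xb - Lx, yb - Ly
    c, s = math.cos(Lpsi), math.sin(Lpsi)
    xL = c * dx + s * dy    # first row of C_psi
    yL = -s * dx + c * dy   # second row of C_psi
    return xL, yL
```

With zero yaw the transform reduces to a pure translation, and a point at the lidar position maps to the lidar origin.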
When the roll angle of the lidar 2 with respect to the vehicle coordinate system is "Lφ", the pitch angle is "Lθ", the yaw angle is "Lψ", and the positions of the lidar 2 on the coordinate axes xb, yb, and zb are "Lx", "Ly", and "Lz" respectively, a measurement point [xb(k), yb(k), zb(k)]T at time "k" seen from the vehicle coordinate system is converted into coordinates [xL(k), yL(k), zL(k)]T in the lidar coordinate system by the following equation (3), using the direction cosine matrix "C" expressed by the rotation matrices "Cφ", "Cθ", and "Cψ" corresponding to roll, pitch, and yaw.
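The direction cosine matrix of equation (3) can be sketched as follows. The exact sign conventions and composition order are not reproducible from this excerpt, so this sketch assumes the common aerospace convention C = Cφ·Cθ·Cψ with right-handed elementary rotations; names are illustrative:

```python
import math

def rotation_matrices(phi, theta, psi):
    """Elementary rotation matrices C_phi (roll), C_theta (pitch), C_psi (yaw)."""
    c, s = math.cos, math.sin
    C_phi = [[1, 0, 0],
             [0, c(phi), s(phi)],
             [0, -s(phi), c(phi)]]
    C_theta = [[c(theta), 0, -s(theta)],
               [0, 1, 0],
               [s(theta), 0, c(theta)]]
    C_psi = [[c(psi), s(psi), 0],
             [-s(psi), c(psi), 0],
             [0, 0, 1]]
    return C_phi, C_theta, C_psi

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def vehicle_to_lidar_3d(p_b, L_pos, L_phi, L_theta, L_psi):
    """Map a vehicle-frame point p_b into the lidar frame using the
    direction cosine matrix C = C_phi * C_theta * C_psi (assumed order)."""
    C_phi, C_theta, C_psi = rotation_matrices(L_phi, L_theta, L_psi)
    C = matmul(matmul(C_phi, C_theta), C_psi)
    d = [p_b[i] - L_pos[i] for i in range(3)]
    return [sum(C[i][j] * d[j] for j in range(3)) for i in range(3)]
```

With all three angles zero the matrix reduces to the identity, recovering a pure translation.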
[Estimation of roll angle]
First, the method of estimating the roll angle Lφ of the lidar 2 is described. The on-vehicle device 1 treats a rectangular road sign having a plane perpendicular to the traveling direction of the travel path (that is, facing the travel path head-on) as the feature to be detected by the lidar 2 (also called the "detection target feature Ftag"), and estimates the roll angle Lφ of the lidar 2 by calculating the inclination of the road sign based on the measurement point group measured by the lidar 2.
FIG. 5 is a bird's-eye view of the surroundings of the vehicle when the vehicle travels near a road sign 22 serving as the detection target feature Ftag for estimating the roll angle of the lidar 2. In FIG. 5, a dashed circle 20 indicates the range within which the lidar 2 can detect features (also called the "lidar detection range"), and a dashed frame 21 indicates the detection window for detecting the road sign 22, which is the detection target feature Ftag.
In this case, by referring to the map DB 10, the on-vehicle device 1 recognizes that the rectangular road sign 22 having a plane perpendicular to the traveling direction of the travel path exists within the lidar detection range, and sets the detection window indicated by the dashed frame 21 based on the feature data for the road sign 22. Here, the on-vehicle device 1, for example, stores in advance the type information of road signs to be used as detection target features Ftag, and determines, by referring to the feature data in the map DB 10, whether a road sign of a type indicated by that information exists within the lidar detection range. When the feature data in the map DB 10 includes information on the size and/or shape of the feature, the on-vehicle device 1 may determine the size and/or shape of the detection window based on the size and/or shape information of the road sign 22 serving as the detection target feature Ftag. The on-vehicle device 1 then extracts, from the measurement points measured by the lidar 2, the measurement points existing within the set detection window (also called the "target measurement point group Ptag") as the measurement point group of the road sign 22.
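The extraction of the target measurement point group Ptag from the detection window can be sketched as follows, assuming an axis-aligned box window whose center and size come from the map feature data; the function and parameter names are illustrative:

```python
def points_in_window(points, center, size):
    """Keep the measurement points inside an axis-aligned detection window.

    points: iterable of (x, y, z) measurement points;
    center, size: window center and edge lengths, also (x, y, z) tuples.
    """
    half = [s / 2.0 for s in size]
    return [p for p in points
            if all(abs(p[i] - center[i]) <= half[i] for i in range(3))]
```

Points outside any of the three window extents are discarded, leaving only candidates for the sign's measurement point group.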
FIGS. 6(A) to 6(C) show the positional relationship between the target measurement point group Ptag and the scanning lines 23 of the laser light from the lidar 2 (that is, lines connecting the irradiated points of the laser light in time series) when the measured surface of the road sign 22 is viewed from the front. For ease of explanation, the initial attitude angles Lφ, Lθ, and Lψ of the lidar 2 are taken to be 0 degrees. FIG. 6(A) shows the relationship between the target measurement point group Ptag and the scanning lines 23 of the lidar 2 when no deviation in the roll direction has occurred in the lidar 2, while FIGS. 6(B) and 6(C) show the relationship when a roll-direction deviation ΔLφ has occurred. FIG. 6(B) shows the relationship in the yb-zb plane of the vehicle coordinate system, and FIG. 6(C) shows the relationship in the yb'-zb' plane of the reference coordinate system. For convenience of explanation, it is assumed here that, when no attitude deviation of the lidar 2 has occurred, the longitudinal direction of the road sign 22 is parallel to the yb axis and the vehicle is on a flat road.
As shown in FIG. 6(A), when no deviation has occurred in the roll angle of the lidar 2, the lidar 2 scans parallel to the lateral direction (that is, the horizontal or longitudinal direction) of the road sign 22. In this case, the number of points in the vertical direction of the target measurement point group Ptag is the same (six in FIG. 6(A)) regardless of the position in the horizontal direction.
On the other hand, when a deviation has occurred in the roll angle of the lidar 2, the scanning lines are inclined with respect to the road sign 22. Here, the longitudinal direction of the road sign 22 is not inclined with respect to the vehicle, while the scanning lines of the lidar 2 are; therefore, when observed in the vehicle coordinate system, as shown in FIG. 6(B), the yb axis and the longitudinal direction of the road sign 22 are parallel, and the scanning lines of the lidar 2 are inclined with respect to the yb axis. When observed in the reference coordinate system, as shown in FIG. 6(C), the yb' axis and the scanning lines of the lidar 2 are parallel, and the longitudinal direction of the road sign 22 is inclined with respect to the yb' axis.
Next, the processing after extraction of the target measurement point group Ptag is described. First, the on-vehicle device 1 compares the received light intensity of each measurement point of the target measurement point group Ptag against a predetermined threshold, and excludes from the target measurement point group Ptag any measurement point whose received light intensity is at or below the threshold. The on-vehicle device 1 then extracts, from the target measurement point group Ptag after this intensity-based exclusion, the point group forming the outer edge (also called the "outer edge point group Pout"). For example, the on-vehicle device 1 extracts, as the outer edge point group Pout, those measurement points of the target measurement point group Ptag that have no adjacent measurement point in at least one of the up, down, left, and right directions.
FIG. 7(A) shows the target measurement point group Ptag in the lidar coordinate system, and FIG. 7(B) shows the target measurement point group Ptag after excluding the measurement points whose received light intensity is at or below the predetermined threshold. FIG. 7(C) shows the outer edge point group Pout extracted from the target measurement point group Ptag shown in FIG. 7(B). As shown in FIGS. 7(A) and 7(B), among the measurement points in FIG. 7(A), those at which only part of the laser light was reflected (and which are therefore partially missing) have a received light intensity at or below the threshold and are excluded from the target measurement point group Ptag. As shown in FIGS. 7(B) and 7(C), the measurement points having no adjacent measurement point in at least one of the up, down, left, and right directions are extracted as the outer edge point group Pout.
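The neighbor-based rule for extracting the outer edge point group Pout can be sketched as follows, assuming the remaining measurement points are indexed by their integer (row, column) position on the scan grid; the names are illustrative:

```python
def extract_outer_edge(points):
    """From measurement points indexed by (row, col) scan-grid position,
    keep those missing an adjacent point in at least one of the up, down,
    left, or right directions."""
    pts = set(points)

    def interior(r, c):
        # A point is interior only if all four 4-neighbors are present.
        return all((r + dr, c + dc) in pts
                   for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)))

    return {p for p in pts if not interior(*p)}
```

For a fully populated rectangular block of points, this keeps exactly the boundary ring and drops the interior.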
Next, the on-vehicle device 1 calculates the inclination of the longitudinal direction of the road sign 22 from the outer edge point group Pout. Since the outer edge point group Pout forms a quadrangle, the points are classified according to the side of the quadrangle to which they belong, and the slope of an approximate straight line for each side is calculated from the points classified to that side by a regression analysis method such as the least squares method.
FIG. 8(A) shows the outer edge points Pout forming the upper side of the quadrangle, with a straight line fitted by the least squares method. FIG. 8(B) shows the same for the points forming the bottom side, FIG. 8(C) for the points forming the left side, and FIG. 8(D) for the points forming the right side.
In this case, the on-vehicle device 1 calculates each of the straight lines shown in FIGS. 8(A) to 8(D) and the inclinations "φ1" to "φ4" of these lines with respect to the yb' axis. The on-vehicle device 1 then calculates the average of the inclinations φ1 to φ4 as the inclination "φ" of the road sign 22, as shown in the following equation (5).
Preferably, when determining the inclination φ, the on-vehicle device 1 may weight the inclinations φ1 to φ4 based on the number of outer edge points Pout forming each side of the quadrangle. In this case, letting "n1" be the number of outer edge points Pout forming the upper side, "n2" the number forming the bottom side, "n3" the number forming the left side, and "n4" the number forming the right side, the on-vehicle device 1 calculates the inclination φ by the weighted average of the following equation (6).
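Equations (5) and (6) reduce to a plain and a count-weighted mean of the four per-side inclinations; a minimal sketch, with illustrative names:

```python
def feature_tilt(phis, counts=None):
    """Average the per-side tilt angles phi_1..phi_4 of the sign's quadrangle.

    With counts=None this is the plain mean of equation (5); supplying the
    per-side point counts n_1..n_4 gives the weighted mean of equation (6),
    so sides supported by more edge points contribute more.
    """
    if counts is None:
        return sum(phis) / len(phis)
    return sum(n * p for p, n in zip(phis, counts)) / sum(counts)
```

A side measured with more points is fitted more reliably, which is why the weighted form is preferred.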
Preferably, taking into account the possibility that the road sign 22 serving as the detection target feature Ftag is itself inclined from the horizontal, as well as measurement errors, the on-vehicle device 1 performs the above calculation of the roll angle Lφ of the lidar 2 for many road signs serving as detection target features Ftag and averages the results. In this way, the on-vehicle device 1 can appropriately calculate a reliable roll-direction inclination Lφ of the lidar 2.
Further, since the roll angle φ0 of the vehicle body can be regarded as 0° on average, lengthening the averaging time eliminates the need to acquire the vehicle-body roll angle φ0. For example, when a sufficiently large number N of inclinations φ have been acquired, the on-vehicle device 1 may determine the roll angle Lφ of the lidar 2 by the following equation (10).
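A minimal sketch of the long-run averaging of equation (10), under the stated assumption that the vehicle-body roll angle averages to zero so the mean of the N observed sign tilts converges to the lidar roll angle Lφ; the name is illustrative:

```python
def estimate_roll_angle(tilts):
    """Equation (10) sketch: average the N sign inclinations phi measured
    over a long run to obtain the lidar roll angle L_phi."""
    return sum(tilts) / len(tilts)
```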
FIG. 9 is a flowchart showing the procedure of the roll angle estimation processing.
First, the on-vehicle device 1 refers to the map DB 10 and sets a detection window for a detection target feature Ftag around the vehicle (step S101). In this case, the on-vehicle device 1 selects, as the detection target feature Ftag, a rectangular road sign having a plane perpendicular to the traveling direction of the travel path, and identifies from the map DB 10 such a road sign existing within a predetermined distance from the current position. The on-vehicle device 1 then sets the detection window by acquiring the position information and the like of the identified road sign from the map DB 10.
Next, the on-vehicle device 1 acquires the measurement points of the lidar 2 within the detection window set in step S101 as the point group of the detection target feature Ftag (that is, the target measurement point group Ptag) (step S102). The on-vehicle device 1 then extracts, from the target measurement point group Ptag acquired in step S102, the outer edge point group Pout forming the outer edge of the point group (step S103).
The on-vehicle device 1 then calculates the lateral inclination φ of the outer edge point group Pout extracted in step S103 (step S104). In this case, as described with reference to FIGS. 8(A) to 8(D), the on-vehicle device 1 determines the inclinations φ1 to φ4 of the straight lines corresponding to the sides of the quadrangle formed by the outer edge point group Pout from the points belonging to each side, and calculates the inclination φ using one of equations (5) to (8).
Thereafter, the on-vehicle device 1 calculates the roll angle Lφ of the lidar 2 by averaging the inclinations φ and/or taking the difference from the vehicle-body roll angle φ0 (step S105). In the averaging processing described above, the on-vehicle device 1 obtains a plurality of inclinations φ by executing the processing of steps S101 to S104 for a plurality of detection target features Ftag, and calculates the average of the obtained inclinations φ as the roll angle Lφ.
[Estimation of pitch angle]
Next, the method of estimating the pitch angle Lθ of the lidar 2 is described. The on-vehicle device 1 treats a rectangular road sign having a plane perpendicular to the width direction of the travel path (that is, a plane facing sideways relative to the travel path) as the detection target feature Ftag, and estimates the pitch angle of the lidar 2 by calculating the inclination of the road sign based on the measurement point group output by the lidar 2.
次に、ライダ2のピッチ角Lθの推定方法について説明する。車載機1は、走行路の幅方向に対して垂直面(即ち走行路に対して横向きの面)を有する方形状の道路標識を検出対象地物Ftagとみなし、当該道路標識の傾きをライダ2が出力する計測点群にもとづき算出することで、ライダ2のピッチ角を推定する。 [Estimation of pitch angle]
Next, a description method for estimating the pitch angle L theta rider 2. The in-
FIG. 10 is a bird's-eye view of the surroundings of the vehicle when the vehicle travels near a road sign 24 serving as the detection target feature Ftag for estimating the pitch angle of the lidar 2. In FIG. 10, a broken-line circle 20 indicates the lidar detection range within which the lidar 2 can detect features, and a broken-line frame 21 indicates the detection window set for the road sign 24, which is the detection target feature Ftag.
In this case, by referring to the map DB 10, the on-vehicle device 1 recognizes that the rectangular road sign 24, which has a surface perpendicular to the width direction of the traveling road, exists within the lidar detection range indicated by the broken-line circle 20, and sets the detection window indicated by the broken-line frame 21 based on the position information and the like relating to the road sign 24. In this case, the on-vehicle device 1 stores in advance, for example, type information of road signs to be used as the detection target feature Ftag, and determines, with reference to the feature data in the map DB 10, whether a road sign of the type indicated by the type information exists within the lidar detection range. The on-vehicle device 1 then extracts, from the measurement point group output by the lidar 2, the measurement points existing within the set detection window as the target measurement point group Ptag.
Next, the on-vehicle device 1 estimates the pitch angle of the lidar 2 by executing, on the extracted target measurement point group Ptag, the same processing as in the roll angle estimation method.
Specifically, the on-vehicle device 1 first compares the received light intensity of each measurement point of the target measurement point group Ptag with a predetermined threshold, and excludes from the target measurement point group Ptag the measurement points whose received light intensity is equal to or less than the threshold. Then, the on-vehicle device 1 extracts, from the target measurement point group Ptag after the exclusion based on the received light intensity, the outer edge point group Pout that forms the outer edge. Next, the on-vehicle device 1 calculates the slopes θ1 to θ4 of the approximate straight lines of the sides of the quadrangle formed by the outer edge point group Pout by a regression analysis method such as the least-squares method. The on-vehicle device 1 then calculates the tilt θ of the road sign 22 by averaging the slopes θ1 to θ4. Note that, as in equations (6) to (8), the on-vehicle device 1 may calculate the tilt θ by a weighted average that takes into account at least one of the numbers n1 to n4 of points constituting the respective sides of the quadrangle and the vertical interval IV and horizontal interval IH.
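As a sketch of the side-fitting step described above, the least-squares slope of one side and a point-count-weighted combination of the four side slopes could look as follows. This assumes numpy and uses only the point counts n1 to n4 as weights; the exact weighting of equations (6) to (8), which may also use the vertical and horizontal intervals, is not reproduced here, and the function names are illustrative.

```python
import numpy as np

def side_slope(points):
    """Least-squares slope angle (radians) of one side of the quadrangle.
    points: (N, 2) array of coordinates of the outer edge points that
    belong to the side."""
    x, y = points[:, 0], points[:, 1]
    slope, _intercept = np.polyfit(x, y, 1)  # fit y = slope * x + intercept
    return float(np.arctan(slope))

def weighted_tilt(slopes, counts):
    """Combine the per-side slope angles theta1..theta4, weighting each
    side by the number of points n1..n4 that supported its fit."""
    slopes = np.asarray(slopes, dtype=float)
    counts = np.asarray(counts, dtype=float)
    return float(np.sum(slopes * counts) / np.sum(counts))
```

A side supported by more measurement points thus contributes more to the final tilt, which mirrors the intent of the weighted averaging described in the embodiment.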
In addition, the on-vehicle device 1 calculates the pitch angle θ0 of the vehicle body based on the outputs of the gyro sensor 3 and the acceleration sensor 4, and calculates the pitch angle Lθ of the lidar 2 by taking the difference from the tilt θ. In this case, the on-vehicle device 1 calculates the tilt Lθ of the lidar 2 in the pitch direction based on the following equation (11).
FIG. 11 is a flowchart showing the procedure of the pitch angle estimation processing.
First, the on-vehicle device 1 refers to the map DB 10 and sets a detection window for a detection target feature Ftag around the vehicle (step S201). In this case, the on-vehicle device 1 selects, as the detection target feature Ftag, a rectangular road sign having a surface perpendicular to the width direction of the traveling road, and identifies from the map DB 10 such a road sign existing within a predetermined distance from the current position. The on-vehicle device 1 then sets the detection window by acquiring the position information and the like relating to the identified road sign from the map DB 10.
Next, the on-vehicle device 1 acquires the measurement point group of the lidar 2 within the detection window set in step S201 as the point group of the detection target feature Ftag (that is, the target measurement point group Ptag) (step S202). Then, the on-vehicle device 1 extracts, from the point group of the detection target feature Ftag acquired in step S202, the outer edge point group Pout that forms the outer edge of the point group (step S203).
Then, the on-vehicle device 1 calculates the longitudinal tilt θ of the outer edge point group Pout extracted in step S203 (step S204). After that, the on-vehicle device 1 calculates the tilt Lθ of the lidar 2 in the pitch direction by averaging the tilt θ and/or taking the difference from the pitch angle θ0 of the vehicle body (step S205). In the averaging process described above, the on-vehicle device 1 calculates a plurality of tilts θ by executing the processing of steps S201 to S204 on a plurality of target measurement point groups Ptag, and calculates their average as the tilt Lθ.
[Estimation of yaw angle]
Next, a method for estimating the yaw angle Lψ of the lidar 2 will be described. The on-vehicle device 1 regards a lane marking such as a white line as the detection target feature Ftag, and estimates the yaw angle Lψ of the lidar 2 by calculating the direction of the center line of the lane marking based on the measurement point group measured by the lidar 2.
FIG. 12 is a bird's-eye view of the surroundings of the vehicle when the vehicle travels along lane markings serving as the detection target features Ftag for estimating the yaw angle of the lidar 2. In this example, a continuous line 30, which is a continuous lane marking, exists on the left side of the vehicle, and a broken line 31, which is an intermittent lane marking, exists on the right side of the vehicle. An arrow 25 indicates the reference direction in the yaw direction when no yaw misalignment has occurred in the lidar 2; it is the direction obtained by rotating the coordinate axis xL of the lidar 2 by the yaw angle Lψ of the lidar 2, and coincides with the coordinate axis xb along the traveling direction of the vehicle. That is, as long as no misalignment has occurred, the yaw angle Lψ of the lidar 2 indicates the yaw angle regarded as the traveling direction of the vehicle (also referred to as the "yaw direction reference angle"). On the other hand, an arrow 26 indicates the yaw direction reference angle when a yaw misalignment of ΔLψ has occurred in the lidar 2; unless the yaw direction reference angle is corrected, it no longer coincides with the traveling direction of the vehicle. The yaw direction reference angle is stored in advance in, for example, the storage unit 12.
In this case, for example, the map DB 10 contains lane marking information indicating the discrete coordinate positions of the continuous line 30 and the discrete coordinate positions of the broken line 31. The on-vehicle device 1 refers to this lane marking information, extracts, in each of the left-front, left-rear, right-front, and right-rear directions of the vehicle, the coordinate position closest to a position a predetermined distance (for example, 5 m) away from the vehicle, and sets a rectangular area centered on each extracted coordinate position as a detection window, as indicated by broken-line frames 21A to 21D.
Next, the on-vehicle device 1 extracts, from the measurement point group measured by the lidar 2, the points that lie on the road surface within a detection window and have a reflection intensity equal to or higher than a predetermined threshold, as the target measurement point group Ptag. The on-vehicle device 1 then obtains the center point of the target measurement point group Ptag for each scanning line, and calculates, for each detection window, the straight line passing through these center points (see arrows 27A to 27D) as the center line of the lane marking.
FIG. 13(A) is a diagram in which the center points in the detection window of the broken-line frame 21A are indicated by circles, and FIG. 13(B) is a diagram in which the center points in the detection window of the broken-line frame 21B are indicated by circles. In FIGS. 13(A) and 13(B), the scanning lines 28 of the laser light of the lidar 2 are also shown. Since the laser light of the lidar 2 incident on the lane marking is emitted obliquely downward, the intervals between the scanning lines become shorter the closer they are to the vehicle.
As shown in FIGS. 13(A) and 13(B), the on-vehicle device 1 calculates a center point for each scanning line 28. The center point may be the position obtained by averaging the coordinate positions indicated by the measurement points on each scanning line 28, may be the midpoint between the leftmost and rightmost measurement points on each scanning line 28, or may be the measurement point located in the middle of each scanning line 28. The on-vehicle device 1 then calculates, from the calculated center points, the center line of the lane marking (see arrows 27A and 27B) for each detection window by a regression analysis method such as the least-squares method.
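A minimal sketch of this per-scan-line processing, assuming each scanning line's points are already grouped into small (x, y) arrays in road-surface coordinates (numpy assumed; function names are illustrative, and the mean-position variant of the center point is the one implemented here):

```python
import numpy as np

def scanline_centers(scanlines):
    """Center point of each scanning line, here the mean of the
    coordinate positions of its measurement points (one of the three
    options described above)."""
    return np.array([pts.mean(axis=0) for pts in scanlines])

def center_line_angle(centers):
    """Least-squares line through the per-scan-line center points;
    returns its direction angle (radians) in the road plane."""
    x, y = centers[:, 0], centers[:, 1]
    slope, _intercept = np.polyfit(x, y, 1)
    return float(np.arctan(slope))
```

The returned angle corresponds to the direction of one arrow 27A to 27D and is later compared with the yaw direction reference angle.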
Then, the on-vehicle device 1 calculates, as tilts ψ1 to ψ4, the angles formed between the center line of the lane marking calculated for each detection window and the yaw direction reference angle (see the arrow 26 in FIG. 12). FIGS. 14(A) to 14(D) are diagrams showing the tilts ψ1 to ψ4, respectively. As shown in FIGS. 14(A) to 14(D), the tilts ψ1 to ψ4 correspond to the angles formed between the arrows 27A to 27D indicating the center lines of the lane markings and the arrow 26 indicating the yaw direction reference angle.
Then, the on-vehicle device 1 calculates the tilt ψ by averaging the tilts ψ1 to ψ4, as shown in the following equation (13).
Next, a method for calculating the yaw angle Lψ of the lidar 2 from the tilt ψ will be described. Since lane markings such as white lines are drawn along the road, the yaw angle of the vehicle body can be regarded as substantially constant (that is, parallel to the traveling road). Therefore, unlike the roll angle and pitch angle estimation methods, the on-vehicle device 1 calculates the tilt ψ directly as the yaw misalignment ΔLψ of the lidar 2 (that is, ΔLψ = ψ). In normal driving, however, lane changes and wandering of the vehicle body within the lane occur. The on-vehicle device 1 therefore preferably executes the processing of calculating the tilt of the center line for many lane markings and averages the results. In this way, the on-vehicle device 1 can suitably calculate a reliable yaw misalignment ΔLψ of the lidar 2. For example, when N tilts ψ have been acquired, the yaw angle Lψ of the lidar 2 is expressed by the following equation (15).
FIG. 15 is a flowchart showing the procedure of the yaw angle estimation processing.
First, the on-vehicle device 1 refers to the map DB 10 and sets one or more detection windows for a lane marking (step S301). Then, the on-vehicle device 1 acquires the measurement point group of the lidar 2 within the detection windows set in step S301 as the point group of the lane marking (that is, the target measurement point group Ptag) (step S302).
Then, the on-vehicle device 1 calculates the center line of the lane marking within each of the set detection windows (step S303). In this case, the on-vehicle device 1 obtains the center point of the target measurement point group Ptag for each scanning line of the lidar 2, and calculates the center line of the lane marking for each detection window from the center points based on the least-squares method or the like. The on-vehicle device 1 then calculates the angle ψ between the yaw direction reference angle stored in advance in the storage unit 12 and the center line of the lane marking (step S304). In this case, when two or more detection windows have been set in step S301, the on-vehicle device 1 obtains the angle between the yaw direction reference angle and the center line of the lane marking for each detection window, and calculates the above-described angle ψ by averaging these angles or by weighted-averaging them according to the number of center points.
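The combination of per-window angles in step S304 can be sketched as an optionally weighted mean; the unweighted branch corresponds to plain averaging, the weighted branch to weighting each window by its number of center points. The function name is illustrative and this is a sketch, not the embodiment's implementation.

```python
def combine_window_angles(angles, center_counts=None):
    """Step S304 sketch: merge the angles between the marking center
    line and the yaw direction reference angle over the detection
    windows, optionally weighted by each window's number of center
    points."""
    if center_counts is None:
        return sum(angles) / len(angles)  # plain average
    total = sum(center_counts)
    # weighted average: windows with more center points count for more
    return sum(a * n for a, n in zip(angles, center_counts)) / total
```

Weighting by the number of center points gives windows with denser scan coverage (those closer to the vehicle) more influence on the resulting angle ψ.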
Then, the on-vehicle device 1 averages the plurality of angles ψ calculated by executing steps S301 to S304 on different lane markings, and calculates the yaw angle Lψ of the lidar 2 based on equation (15) (step S305).
Here, a specific example of processing based on the estimated values of the roll angle, the pitch angle, and the yaw angle described above will be described. FIG. 16 is a flowchart showing a specific example of the processing based on the estimated values of the roll angle, the pitch angle, and the yaw angle.
First, the on-vehicle device 1 determines whether any of the roll angle Lφ, the pitch angle Lθ, and the yaw angle Lψ of the lidar 2 has been calculated by executing the processing of the flowchart of FIG. 9, FIG. 11, or FIG. 15 (step S401). When any of the roll angle Lφ, the pitch angle Lθ, and the yaw angle Lψ of the lidar 2 has been calculated (step S401; Yes), the on-vehicle device 1 determines whether the calculated angle has changed by a predetermined angle or more from the angle recorded in the lidar installation information IL (step S402). This threshold is used to determine whether the measurement data of the lidar 2 can continue to be used after the correction processing of the measurement data of the lidar 2 in step S404 described later, and is set in advance based on, for example, experiments.
When the calculated angle has changed by the predetermined angle or more from the angle recorded in the lidar installation information IL (step S402; Yes), the on-vehicle device 1 stops using the output data of the target lidar 2 (that is, its use for obstacle detection, own-vehicle position estimation, and the like), and causes the information output unit 16 to output a warning indicating that the alignment of the target lidar 2 needs to be adjusted again (step S403). This reliably suppresses a decrease in safety and the like that would result from using the measurement data of a lidar 2 whose attitude or position has shifted significantly due to an accident or the like.
On the other hand, when the calculated angle has not changed by the predetermined angle or more from the angle recorded in the lidar installation information IL (step S402; No), the on-vehicle device 1 corrects each measurement value of the point cloud data output by the lidar 2 based on the amount of change of the calculated angle from the angle recorded in the lidar installation information IL (step S404). In this case, the on-vehicle device 1 stores, for example, a map indicating the correction amount of the measurement value with respect to the amount of change, and corrects the measurement value by referring to the map or the like. Alternatively, the measurement value may be corrected by using a predetermined proportion of the amount of change as the correction amount.
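The branch structure of steps S402 to S404 can be sketched as follows; this is a simplified illustration of the decision flow of FIG. 16, with an illustrative function name, and it returns the deviation rather than performing the actual map-based correction.

```python
def handle_estimated_angle(estimated, recorded, threshold):
    """Sketch of steps S402-S404 in FIG. 16: if the estimated mounting
    angle deviates from the angle recorded in the installation
    information by the threshold or more, stop using the lidar output
    and warn that re-alignment is needed; otherwise return the
    deviation to use when correcting the point cloud data."""
    deviation = estimated - recorded
    if abs(deviation) >= threshold:
        return "warn", deviation  # S403: stop use, request re-alignment
    return "correct", deviation   # S404: correct measurement values
```

The threshold thus separates misalignments small enough to compensate in software from those that require a physical re-alignment of the sensor.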
As described above, the on-vehicle device 1 in the present embodiment acquires the target measurement point group Ptag obtained by measuring, with the lidar 2 attached to the vehicle, a road sign having a surface perpendicular to the traveling direction of the traveling road of the vehicle or a surface perpendicular to the width direction of the traveling road. The on-vehicle device 1 then estimates the mounting attitude of the lidar 2 on the vehicle based on the longitudinal tilt of the road sign, calculated from the target measurement point group Ptag, with reference to the coordinate system based on the installation information of the lidar 2. In this way, the on-vehicle device 1 can accurately estimate the mounting attitude of the lidar 2 on the vehicle in the roll direction and the pitch direction.
As also described above, the on-vehicle device 1 in the present embodiment acquires the measurement point group obtained by measuring a lane marking with the lidar 2 attached to the vehicle, and calculates the center line of the lane marking along the traveling direction of the vehicle based on the measurement point group. The on-vehicle device 1 then estimates the mounting attitude of the lidar 2 on the vehicle based on the direction of the center line of the lane marking with respect to the yaw direction reference angle used as a reference by the lidar 2. In this way, the on-vehicle device 1 can suitably estimate the mounting attitude of the lidar 2 on the vehicle in the yaw direction.
[Modifications]
Hereinafter, modifications suitable for the embodiment will be described. The following modifications may be applied to the embodiment in combination.
(Modification 1)
In the description of FIG. 13 and step S303 of FIG. 15, the on-vehicle device 1 calculates the center line of the lane marking by calculating the center points in the width direction of the lane marking within the detection window. Instead, the on-vehicle device 1 may extract the right end point or the left end point of each scanning line within the detection window and calculate a straight line passing through the right edge or the left edge of the lane marking. In this way as well, the on-vehicle device 1 can suitably identify a line parallel to the traveling road and calculate the angle ψ.
(Modification 2)
In the yaw angle estimation processing described with reference to FIGS. 12 to 15, the on-vehicle device 1 regards a lane marking as the detection target feature Ftag. Instead of, or in addition to, this, the on-vehicle device 1 may regard a curbstone or the like as the detection target feature Ftag. In this way, the on-vehicle device 1 may execute the yaw angle estimation processing described with reference to FIGS. 12 to 15 by regarding, as the detection target feature Ftag, not only a lane marking but any feature whose measured surface is parallel to the traveling road surface and which is formed along the traveling road surface.
(Modification 3)
In step S404 of FIG. 16, instead of correcting each measurement value of the point cloud data output by the lidar 2, the on-vehicle device 1 may convert each measurement value of the point cloud data output by the lidar 2 into the vehicle coordinate system based on the roll angle, pitch angle, and yaw angle calculated by the processing of the flowchart of FIG. 9, FIG. 11, or FIG. 15.
In this case, the on-vehicle device 1 may use the calculated roll angle Lφ, pitch angle Lθ, and yaw angle Lψ to convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system into the vehicle body coordinate system based on equation (4), and may execute own-vehicle position estimation, automatic driving control, and the like based on the converted data.
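Equation (4) itself is not reproduced in this excerpt, so the following sketch assumes the common yaw-pitch-roll (Z-Y-X) Euler convention and a translational offset for the sensor mounting position; the convention and the function names are assumptions for illustration, not necessarily those of the embodiment.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler rotation built from the estimated lidar angles
    (assumed convention; the exact form of Eq. (4) is not shown here)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def lidar_to_body(points, roll, pitch, yaw, offset):
    """Transform lidar-frame points (N, 3) into the vehicle body frame
    using the estimated mounting angles and the mounting offset."""
    R = rotation_matrix(roll, pitch, yaw)
    return points @ R.T + np.asarray(offset)
```

With zero angles the transform reduces to a pure translation, so any residual misalignment shows up directly as a nonzero rotation applied to every point.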
In another example, when each lidar 2 is provided with an adjustment mechanism such as an actuator for correcting the attitude of the lidar 2, the on-vehicle device 1 may, instead of the processing of step S404 in FIG. 16, perform control for driving the adjustment mechanism so as to correct the attitude of the lidar 2 by the amount of deviation from the angle recorded in the lidar installation information IL.
(Modification 4)
The configuration of the vehicle system shown in FIG. 1 is an example, and the configuration of a vehicle system to which the present invention can be applied is not limited to the configuration shown in FIG. 1. For example, instead of the vehicle system having the on-vehicle device 1, an electronic control unit of the vehicle may execute the processing shown in FIG. 9, FIG. 11, FIG. 15, FIG. 16, and the like. In this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control unit of the vehicle is configured to be able to receive the output data of various sensors such as the lidar 2.
(Modification 5)
The vehicle system may include a plurality of lidars 2. In this case, the on-vehicle device 1 estimates the roll angle, the pitch angle, and the yaw angle of each lidar 2 by executing the processing of the flowcharts of FIGS. 9, 11, and 15 for each lidar 2.
DESCRIPTION OF REFERENCE SIGNS
1 On-vehicle device
2 Lidar
3 Gyro sensor
4 Acceleration sensor
5 GPS receiver
10 Map DB
Claims (10)
- 移動体の走行路の進行方向に対して垂直面又は前記走行路の幅方向に対して垂直面を有する地物を、前記移動体に取り付けられた計測装置により計測した計測点群を取得する取得手段と、
前記計測点群から算出される、前記計測装置の設置情報に基づく座標系を基準とした前記垂直面の傾きに基づいて、前記計測装置の前記移動体への取り付け姿勢を推定する推定手段と、
を有する姿勢推定装置。 Acquisition of a feature having a plane perpendicular to the traveling direction of the traveling path of the moving body or a plane perpendicular to the width direction of the traveling path, and acquiring a measurement point group measured by a measuring device attached to the moving body. Means,
Estimating means for estimating the mounting posture of the measuring device on the moving body, based on the inclination of the vertical plane based on a coordinate system based on the installation information of the measuring device, calculated from the measurement point group,
A posture estimating device having: - 前記取得手段は、前記地物に関する地物情報に基づき前記地物の位置を認識することで、当該位置において前記計測装置が計測した計測点群を前記地物の計測点群として取得する請求項1に記載の姿勢推定装置。 The said acquisition means acquires the measurement point group which the measuring device measured at the said position as the measurement point group of the said feature by recognizing the position of the said feature based on the feature information regarding the said feature. The attitude estimation device according to claim 1.
- 前記推定手段は、
前記走行路の進行方向に対して垂直面を有する地物の計測点群から算出される前記傾きに基づいて、前記計測装置のロール方向の姿勢を推定し、
前記走行路の幅方向に対して垂直面を有する地物の計測点群から算出される前記傾きに基づいて、前記計測装置のピッチ方向の姿勢を推定する請求項1または2に記載の姿勢推定装置。 The estimating means includes:
Based on the inclination calculated from a measurement point group of a feature having a vertical plane with respect to the traveling direction of the traveling path, the attitude of the measurement device in the roll direction is estimated,
The attitude estimation according to claim 1, wherein the attitude of the measurement device in a pitch direction is estimated based on the inclination calculated from a group of measurement points of a feature having a plane perpendicular to the width direction of the travel path. apparatus. - 前記地物は、前記走行路に設けられた道路標識である請求項1~3のいずれか一項に記載の姿勢推定装置。 The attitude estimation device according to any one of claims 1 to 3, wherein the feature is a road sign provided on the travel path.
- 前記推定手段は、前記計測点群の縁を形成する外縁点群を抽出し、当該外縁点群に基づいて前記地物の傾きを算出する請求項1~4のいずれか一項に記載の姿勢推定装置。 The attitude according to any one of claims 1 to 4, wherein the estimating unit extracts an outer edge point group forming an edge of the measurement point group, and calculates the inclination of the feature based on the outer edge point group. Estimation device.
- 前記垂直面は矩形であり、
前記推定手段は、前記外側点群から前記矩形の各辺に対応する傾きを算出することで、前記地物の傾きを算出する請求項5に記載の姿勢推定装置。 The vertical plane is rectangular,
The posture estimating apparatus according to claim 5, wherein the estimating unit calculates the inclination of the feature by calculating an inclination corresponding to each side of the rectangle from the outside point group. - 前記各辺の各々を構成する点群の数及び間隔の少なくとも一方に基づき、前記各辺に対応する傾きの重み付けを行うことで、前記地物の傾きを算出する請求項6に記載の姿勢推定装置。 7. The pose estimation according to claim 6, wherein the slope of the feature is calculated by weighting the slope corresponding to each side based on at least one of the number and interval of the point groups constituting each of the sides. apparatus.
- A control method executed by a posture estimation device, the control method comprising: an acquisition step of acquiring a measurement point group obtained by a measurement device attached to a moving body measuring a feature having a plane perpendicular to the traveling direction of the travel path of the moving body or a plane perpendicular to the width direction of the travel path; and an estimation step of estimating the mounting posture of the measurement device on the moving body on the basis of an inclination of the perpendicular plane calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- A program executed by a computer, the program causing the computer to function as: an acquisition means for acquiring a measurement point group obtained by a measurement device attached to a moving body measuring a feature having a plane perpendicular to the traveling direction of the travel path of the moving body or a plane perpendicular to the width direction of the travel path; and an estimation means for estimating the mounting posture of the measurement device on the moving body on the basis of an inclination of the perpendicular plane calculated from the measurement point group with reference to a coordinate system based on installation information of the measurement device.
- A storage medium storing the program according to claim 9.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018163723 | 2018-08-31 | ||
JP2018-163723 | 2018-08-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020045057A1 true WO2020045057A1 (en) | 2020-03-05 |
Family
ID=69644253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/031632 WO2020045057A1 (en) | 2018-08-31 | 2019-08-09 | Posture estimation device, control method, program, and storage medium |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020045057A1 (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007085937A (en) * | 2005-09-22 | 2007-04-05 | Toyota Motor Corp | Image ranging device and method therefor |
JP2009204532A (en) * | 2008-02-28 | 2009-09-10 | Aisin Seiki Co Ltd | Calibration device and calibration method of range image sensor |
JP2011191239A (en) * | 2010-03-16 | 2011-09-29 | Mazda Motor Corp | Mobile object position detecting device |
JP2012083157A (en) * | 2010-10-08 | 2012-04-26 | Mitsubishi Electric Corp | Outdoor feature detection system, program for the same, and record media of program for the same |
JP2013115540A (en) * | 2011-11-28 | 2013-06-10 | Clarion Co Ltd | On-vehicle camera system, and calibration method and program for same |
JP2016045150A (en) * | 2014-08-26 | 2016-04-04 | 株式会社トプコン | Point group position data processing device, point group position data processing system, point group position data processing method, and program |
WO2017159382A1 (en) * | 2016-03-16 | 2017-09-21 | ソニー株式会社 | Signal processing device and signal processing method |
US20170328992A1 (en) * | 2016-05-11 | 2017-11-16 | Samsung Electronics Co., Ltd. | Distance sensor, and calibration method performed by device and system including the distance sensor |
WO2019082700A1 (en) * | 2017-10-26 | 2019-05-02 | パイオニア株式会社 | Control device, control method, program, and storage medium |
2019
- 2019-08-09: WO PCT/JP2019/031632 patent/WO2020045057A1/en, active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2020032986A (en) | Posture estimation device, control method, program and storage medium | |
WO2018181974A1 (en) | Determination device, determination method, and program | |
WO2011013586A1 (en) | Road shape recognition device | |
JP6806891B2 (en) | Information processing equipment, control methods, programs and storage media | |
WO2021112074A1 (en) | Information processing device, control method, program, and storage medium | |
JP7155284B2 (en) | Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium | |
JP6980010B2 (en) | Self-position estimator, control method, program and storage medium | |
JP2023075184A (en) | Output device, control method, program, and storage medium | |
US12085653B2 (en) | Position estimation device, estimation device, control method, program and storage media | |
JP2021181995A (en) | Self-position estimation device | |
JP2022176322A (en) | Self-position estimation device, control method, program, and storage medium | |
JP2013036856A (en) | Driving support apparatus | |
JP2023076673A (en) | Information processing device, control method, program and storage medium | |
WO2018212302A1 (en) | Self-position estimation device, control method, program, and storage medium | |
JP2020046411A (en) | Data structure, storage device, terminal device, server device, control method, program, and storage medium | |
WO2020045057A1 (en) | Posture estimation device, control method, program, and storage medium | |
JP2024150627A (en) | Attitude estimation device, control method, program, and storage medium | |
WO2018212290A1 (en) | Information processing device, control method, program and storage medium | |
US20240053440A1 (en) | Self-position estimation device, self-position estimation method, program, and recording medium | |
WO2019124279A1 (en) | Information processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19855981; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19855981; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |