WO2018179892A1 - In-vehicle device, station-side device, and calibration method - Google Patents

In-vehicle device, station-side device, and calibration method

Info

Publication number
WO2018179892A1
WO2018179892A1 (PCT/JP2018/004434)
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
section
original data
laser
vehicle
Prior art date
Application number
PCT/JP2018/004434
Other languages
English (en)
Japanese (ja)
Inventor
Mitsunobu Yoshida (吉田 光伸)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to JP2019508698A (JP6808019B2)
Priority to TW107110045A (TW201836891A)
Publication of WO2018179892A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 - Interpretation of pictures
    • G01C11/06 - Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/28 - Special adaptation for recording picture point data, e.g. for profiles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 - Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C7/00 - Tracing profiles
    • G01C7/02 - Tracing profiles of land surfaces
    • G01C7/04 - Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating

Definitions

  • This invention relates to a mobile mapping system (hereinafter referred to as MMS).
  • In particular, the present invention relates to an in-vehicle device that includes a self-position estimation device, a camera, and a laser scanner, is mounted on a vehicle, and measures the shape of a road and its surroundings.
  • Systems such as MMS, which measure the shape of a road and its surroundings by mounting a self-position estimation device, a camera, and a laser scanner on a measurement vehicle and driving the measurement vehicle, are in widespread use.
  • In an MMS measurement vehicle, the mounting position or mounting posture of the camera and the laser scanner changes due to aging or minor collisions, resulting in degraded measurement accuracy.
  • Therefore, the MMS measurement vehicle undergoes a periodic inspection, for example once a year, and calibration is performed to ensure accuracy (see, for example, Patent Documents 1 and 2).
  • Calibration is a process of estimating the current mounting position and mounting posture of the camera and laser scanner.
  • In the calibration, the measurement vehicle performs a specified run in a dedicated field provided in a factory and having survey points, measures measurement targets, and calibration is performed using the resulting measurement data.
  • Hereinafter, the travel of the measurement vehicle for estimating the mounting position and mounting posture of the camera and the laser scanner is referred to as calibration travel, and the operation of estimating the mounting position and mounting posture of the camera and the laser scanner is referred to as calibration work.
  • the camera and laser scanner mounting positions and mounting postures obtained in the calibration operation are referred to as a camera calibration position and a camera calibration angle, a laser scanner calibration position and a laser scanner calibration angle, respectively.
  • Calibration is performed once a year as described above to ensure accuracy, as well as when the measurement vehicle is shipped. In that case, in order to absorb the error due to GPS (Global Positioning System) positioning, the same course in the dedicated field is measured three times each, traveling a total of six times, and the average camera calibration angle and laser scanner calibration angle are calculated.
  • An object of the present invention is to provide a means for performing calibration travel efficiently.
  • The in-vehicle device of this invention is an in-vehicle device mounted on a vehicle that includes a positioning receiver, a laser scanner that collects original data from which three-dimensional point cloud data of the region through which the vehicle passes is generated, and a storage device.
  • The in-vehicle device includes an original data collection unit that causes the laser scanner to collect section original data, which is the original data, when the vehicle travels through a calibration section, that is, a section provided on a public road, having a plurality of surveyed survey points, for calibrating a laser calibration parameter, which is a setting value for the laser scanner that is necessary for generating the three-dimensional point cloud data and is the setting value to be calibrated;
  • a positioning unit that generates position information using a positioning signal received by the positioning receiver from a positioning satellite;
  • an association unit that associates the position information of the collection period of the section original data with the section original data and stores them in the storage device.
  • According to this configuration, a public road on which a measurement vehicle frequently travels can be used as a calibration section. This saves the trouble of moving the measurement vehicle to a factory with a dedicated field and improves efficiency. In addition, no special travel in the factory is needed, so the manpower required for travel can be reduced. Furthermore, if the calibration section is a public road on which the measurement vehicle travels frequently, multiple runs occur naturally, so accuracy degradation or an abnormality of a laser scanner or the like can be detected more quickly.
  • FIG. 1 is a diagram schematically illustrating the dedicated field in the factory according to the first embodiment.
  • FIG. 2 is a diagram of the first embodiment showing an image of a comparison between the three-dimensional point cloud data D(c) and the reference three-dimensional point cloud data D(0).
  • FIG. 3 is an external view showing the measurement vehicle 10 according to the first embodiment.
  • FIG. 4 is a configuration diagram of the in-vehicle device 100 according to the first embodiment.
  • FIG. 5 is a configuration diagram of the station-side device 200 according to the first embodiment.
  • FIG. 6 is a diagram illustrating the flow of data to the calibration center 20 according to the first embodiment.
  • FIG. 7 is a flowchart for explaining the calibration method according to the first embodiment.
  • FIG. 8 is a diagram schematically showing a place set as a calibration section according to the first embodiment.
  • FIG. 9 is a diagram schematically showing a preferable place to set as a calibration section according to the first embodiment.
  • FIG. 10 is a flowchart of the operation of the in-vehicle device 100 according to the first embodiment.
  • FIG. 11 is a flowchart of the operation of the station-side device 200 according to the first embodiment.
  • FIG. 12 is a diagram showing the station-side quality information according to the first embodiment.
  • FIG. 13 is a diagram showing a state in which the station-side quality information is associated with the section original data according to the first embodiment.
  • FIG. 14 is a configuration diagram of the in-vehicle device 100 according to the second embodiment, and FIG. 15 is a configuration diagram of the in-vehicle device 100 according to the third embodiment.
  • FIG. 16 is a diagram supplementing FIG. 1.
  • FIG. 17 is a diagram supplementing FIG. 2.
  • Embodiment 1. The first embodiment will be described below.
  • One feature of the first embodiment is that a calibration section for calibration travel is set on a public road and the measurement vehicle 10 travels through that calibration section, which solves the problems that arise when a dedicated field for calibration travel is provided in a factory.
  • The calibration section is a section provided on a public road for calibrating the laser calibration parameters, and is also a section provided on the public road for calibrating the camera calibration parameters.
  • The calibration work calibrates the laser calibration parameters for each laser scanner attached to the measurement vehicle 10 and the camera calibration parameters for each camera attached to the measurement vehicle 10.
  • The way the laser calibration parameters and the camera calibration parameters are configured conforms to the method described in Japanese Patent Application Laid-Open No. 2010-175423.
  • The laser calibration parameters are the laser scanner mounting position (X_L, Y_L, Z_L) and the laser scanner mounting posture, which is expressed as a roll angle, a pitch angle, and a yaw angle.
  • The camera calibration parameters are the camera mounting position (X_C, Y_C, Z_C) and the camera mounting posture, likewise expressed as a roll angle, a pitch angle, and a yaw angle.
  • FIG. 1 is a diagram schematically showing a dedicated field in a factory.
  • FIG. 16 shows a diagram corresponding to FIG. 1, drawn on the basis of FIG. 1.
  • The measuring device mounted on the MMS measurement vehicle 10 uses the dedicated field to correct differences in measurement accuracy in the data acquired by the sensors it uses, such as the laser scanners and the cameras.
  • The dedicated field has survey points 11, and reference three-dimensional point cloud data that has been correctly adjusted using the survey points 11 is known.
  • Three-dimensional point cloud data D is a collection of points having three-dimensional position information obtained from the measurement results of the laser scanner. The reference three-dimensional point cloud data is referred to as reference three-dimensional point cloud data D(0).
  • Calibration is performed so that the three-dimensional point cloud data D (i) obtained from the measurement data measured by running the measurement vehicle 10 in the calibration field matches the reference three-dimensional point cloud data D (0) .
  • the reference three-dimensional point cloud data D (0) is generated from the original data d1 (0) , the position information d2 (0) , and the laser calibration parameter d3 (0) .
  • The following Formula 1 shows this relationship.
  • F(d1(0), d2(0), d3(0)) = D(0)   (Formula 1)
  • where d1(0) is the original data, d2(0) is the position information, d3(0) is the laser calibration parameter, and F is the processing that generates the three-dimensional point cloud data D.
  • The original data d1(0) is measurement data obtained by the laser scanner mounted on the measurement vehicle 10, and is the data used to generate the reference three-dimensional point cloud data D(0).
  • the position information d2 (0) is information associated with the original data and indicating the position where the original data is measured.
  • the laser calibration parameter d3 (0) is a set value set for the laser scanner.
  • a subscript “(0)” is attached to the original data and position information for generating the reference three-dimensional point cloud data D (0) .
  • The original data d1(c) and the position information d2(c) are acquired by the calibration travel of the measurement vehicle 10. Since the laser calibration parameter d3 is a set value and not a measured value, it is here written as the laser calibration parameter d3(i). From the original data d1(c) measured during the calibration travel, the position information d2(c), and the laser calibration parameter d3(i), the three-dimensional point cloud data D(c) is generated in post-processing by the following Formula 2.
  • F(d1(c), d2(c), d3(i)) = D(c)   (Formula 2)
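  • As an illustration only, the following is a minimal sketch of the generation processing F in Formula 1 and Formula 2: each laser return in the original data d1 is transformed into a world-frame point using the vehicle pose taken from the position information d2 and the mounting position and posture given by the laser calibration parameter d3. The record layouts and helper names are assumptions for this example, not the implementation described in this publication.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Rotation matrix built from roll/pitch/yaw angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def generate_point_cloud(original_data, position_info, laser_params):
    """F(d1, d2, d3) -> D.

    original_data : list of (timestamp, range_m, scan_angle_rad) laser returns (d1).
    position_info : dict timestamp -> (vehicle_xyz, vehicle_rpy) from positioning (d2).
    laser_params  : ((X_L, Y_L, Z_L), (roll, pitch, yaw)) mounting values (d3).
    """
    mount_xyz, mount_rpy = laser_params
    r_mount = rotation_matrix(*mount_rpy)
    points = []
    for t, rng, ang in original_data:
        veh_xyz, veh_rpy = position_info[t]                   # vehicle pose at measurement time
        p_scanner = np.array([rng * np.cos(ang), rng * np.sin(ang), 0.0])
        p_vehicle = r_mount @ p_scanner + np.asarray(mount_xyz)                 # scanner frame -> vehicle frame
        p_world = rotation_matrix(*veh_rpy) @ p_vehicle + np.asarray(veh_xyz)   # vehicle frame -> world frame
        points.append(p_world)
    return np.array(points)  # three-dimensional point cloud data D
```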
  • Camera calibration parameter calibration work: in the case of calibrating the camera calibration parameters, the camera calibration parameters are calibrated so that the original data, which is the point cloud measured by the laser scanner, and the calibration image overlap when superimposed.
  • FIG. 2 is a diagram illustrating an image of a comparison between the three-dimensional point cloud data D(c) and the reference three-dimensional point cloud data D(0); FIG. 2 uses an image.
  • FIG. 17 shows a diagram corresponding to FIG. 2, drawn on the basis of FIG. 2.
  • FIG. 3 is an external view showing the measurement vehicle 10.
  • the measurement vehicle 10 is a vehicle equipped with a measurement unit 110 and an odometer 120.
  • the measurement unit 110 is installed on the top plate of the measurement vehicle 10, and the odometer 120 is installed on the axle of the measurement vehicle 10.
  • The measurement unit 110 includes six cameras 111A-F, four laser scanners 112A-D, three GPS receivers 113A-C (GPS: Global Positioning System), and one IMU 114 (IMU: Inertial Measurement Unit).
  • the cameras 111A-F are devices that image the surroundings of the measurement vehicle 10.
  • The camera 111A images the right front of the measurement vehicle 10, and the camera 111B images the left front of the measurement vehicle 10.
  • the camera 111C images the right side of the measuring vehicle 10, and the camera 111D images the left side of the measuring vehicle 10.
  • the camera 111E images the right rear of the measuring vehicle 10, and the camera 111F images the left rear of the measuring vehicle 10.
  • The laser scanners 112A-D are devices that emit laser light and measure the distance and direction to the measurement point irradiated by the laser light.
  • The laser scanner 112A measures the lower front side of the measurement vehicle 10, and the laser scanner 112B measures the upper rear side of the measurement vehicle 10.
  • The laser scanner 112C measures the upper front side of the measurement vehicle 10, and the laser scanner 112D measures the lower rear side of the measurement vehicle 10.
  • The GPS receivers 113A to 113C are devices that receive positioning signals from positioning satellites and perform positioning based on the reception results.
  • the IMU 114 is a device (gyro sensor and acceleration sensor) that measures angular velocities in three axial directions (X, Y, Z).
  • the odometer 120 is a device that measures the traveling speed of the measurement vehicle 10.
  • FIG. 4 is a configuration diagram of the in-vehicle device 100.
  • the in-vehicle device 100 is mounted on the measurement vehicle 10.
  • The in-vehicle device 100 includes the GPS receivers 113A-C, which serve as a positioning receiver 9113, a sampling device 9111 including the laser scanners, which collect, as the measurement vehicle 10 moves, the original data d1 from which the three-dimensional point cloud data D of the region through which the measurement vehicle 10 passes (the road on which the measurement vehicle 10 travels and the vicinity of the road) is generated, and a storage device 9140.
  • The sampling device 9111 includes the six cameras 111A-F, the four laser scanners 112A-D, the IMU 114, and the odometer 120.
  • the storage device 9140 includes a main storage device 141 and an auxiliary storage device 142. FIG. 4 will be described in detail.
  • the in-vehicle device 100 is a computer.
  • the in-vehicle device 100 includes a processor 130, a main storage device 141, an auxiliary storage device 142, an input device 150, and an input / output interface device 160 as hardware.
  • the processor 130 is connected to other hardware via a signal line and controls the other hardware.
  • The in-vehicle device 100 includes the six cameras 111A-F, the four laser scanners 112A-D, the odometer 120, the IMU 114, and the three GPS receivers 113A-C.
  • the processor 130 is an IC (Integrated Circuit) that performs arithmetic processing.
  • the processor 130 is a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or a GPU (Graphics Processing Unit), as specific examples.
  • the main storage device 141 is a volatile storage device that can be read and written. Specific examples of the main storage device 141 include an SRAM (Static Random Access Memory) and a DRAM (Dynamic Random Access Memory).
  • the auxiliary storage device 142 is a readable / writable nonvolatile storage device.
  • the auxiliary storage device 142 stores a program and other data for realizing the function of the in-vehicle device 100.
  • the auxiliary storage device 142 is, as a specific example, a magnetic disk device (Hard Disk Drive).
  • the auxiliary storage device 142 may be a storage device that uses a portable storage medium such as an optical disc, a compact disc, a Blu-ray (registered trademark) disc, or a DVD (Digital Versatile Disk).
  • the input device 150 is a device that inputs data to the in-vehicle device 100.
  • the input device 150 is a touch panel, for example.
  • Laser calibration parameters d3 and camera calibration parameters, which are set values, can be input from the input device 150.
  • Data input from the input device 150 is stored in the auxiliary storage device 142.
  • the input / output interface device 160 is an interface device for the processor 130 to communicate with the sampling device 9111, the GPS receiver 113A-C, and the input device 150.
  • the in-vehicle device 100 includes a storage control unit 131, an original data collection unit 132, a positioning unit 133, and an association unit 134 as functional elements.
  • the functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134 are realized by software.
  • the auxiliary storage device 142 stores programs for realizing the functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134. This program is read and executed by the processor 130. Thereby, the functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134 are realized.
  • the storage control unit 131 controls storage of data in the main storage device 141 and auxiliary storage device 142 or reading of data.
  • The functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134 are realized by a program, which is software. The program that realizes the functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134 may be provided stored in a computer-readable recording medium, or may be provided as a program product.
  • the in-vehicle device 100 may include a plurality of processors that replace the processor 130.
  • the plurality of processors share the functions of the storage control unit 131, the original data collection unit 132, the positioning unit 133, and the association unit 134.
  • Each processor is an IC that performs arithmetic processing in the same manner as the processor 130.
  • FIG. 5 is a configuration diagram of the station-side device 200 arranged in the calibration center 20.
  • FIG. 6 is a diagram showing a data flow to the calibration center 20.
  • the laser scanner measures the section original data d1 (k) in the calibration section
  • the GPS receiver measures the position information d2 (k) corresponding to the section original data d1 (k) .
  • the camera captures a calibration image in the calibration section. The calibration image is associated with the position information d2 (k) . It is assumed that the in-vehicle device 100 has a laser calibration parameter d3 (i) and a camera calibration parameter.
  • These data and the calibration image are sent to the calibration center 20.
  • The calibration center 20 performs the calibration processing for the laser scanner based on Formula 2 above.
  • the reference three-dimensional point cloud data D (0) is held by the station side device 200.
  • the camera is calibrated by the method described in the calibration of the dedicated field in the factory.
  • the station side device 200 is also a computer in the same manner as the in-vehicle device 100.
  • the station-side device 200 includes a processor 210, a station-side main storage device 220, a station-side auxiliary storage device 230, a station-side input / output interface device 240, and a station-side input device 250 as hardware.
  • the station-side device 200 includes a station-side point cloud data generation unit 211 and a station-side calibration unit 212 as functional elements.
  • the functions of the station-side point cloud data generation unit 211 and the station-side calibration unit 212 are realized by a program. Programs that realize the functions of the station-side point cloud data generation unit 211 and the station-side calibration unit 212 are stored in the station-side auxiliary storage device 230.
  • Programs that realize the functions of the station-side point cloud data generation unit 211 and the station-side calibration unit 212 may be provided by being stored in a computer-readable recording medium or may be provided as a program product.
  • the processor 130 illustrated in FIG. 4 and the processor 210 illustrated in FIG. 5 are also referred to as a processing circuit.
  • FIG. 7 is a flowchart illustrating the laser calibration parameter calibration method according to the first embodiment. The method for calibrating the laser calibration parameters will be described with reference to FIG. 7. It is assumed that the station-side device 200 holds a laser calibration parameter d3(i) as the current laser calibration parameter. The camera calibration parameters are calibrated by the method described above for the calibration in the dedicated field in the factory.
  • In step S11, one or more workers set a calibration section, which is a section including a plurality of surveyed survey points, on a public road.
  • The calibration section is set based on the travel frequency, which indicates how often the measurement vehicle 10 travels on a public road. For example, on a main road with a high travel frequency according to the travel history, a place that satisfies the conditions is designated as the "calibration section", surveying is performed to obtain true values, and the reference three-dimensional point cloud data D(0) is obtained from the true values.
  • Here, a true value means a value whose absolute position on the earth is known. The "place that satisfies the conditions" will be described later.
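  • As an aside, a minimal sketch (with an assumed travel-history format) of selecting candidate roads for the calibration section from the travel frequency:

```python
from collections import Counter

def candidate_sections(travel_history, min_runs=10):
    """travel_history: list of road-segment identifiers, one entry per traversal.
    Returns the segments traversed at least min_runs times, most frequent first."""
    counts = Counter(travel_history)
    return [segment for segment, n in counts.most_common() if n >= min_runs]
```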
  • In step S12, the laser scanner collects the section original data d1(k), which is the original data d1, as the measurement vehicle 10 travels in the calibration section.
  • In step S13, the in-vehicle device 100 generates position information using a positioning signal that the positioning receiver 9113 receives from a positioning satellite. Note that the collection of the section original data d1(k) in step S12 and the generation of the position information in step S13 are performed in parallel.
  • The measurement data measured by the measurement vehicle 10 traveling in the calibration section is sent to the calibration center 20, and the calibration center 20 calibrates the laser calibration parameters.
  • the measurement data sent to the calibration center 20 is at least section original data d1 (k) and position information d2 (k) .
  • If the station-side device 200 already holds the laser calibration parameter d3(i), the laser calibration parameter d3(i) need not be sent.
  • In the following, however, the laser calibration parameter d3(i) is also sent to the calibration center 20 unless otherwise specified. If the calibration processing were performed from the data of only one run, it would be affected by GPS error; therefore, the station-side device 200 performs calibration based on measurement data obtained from a plurality of runs. If there is a mechanism for sending measurement data to the calibration center 20, it is only necessary to determine from the travel route whether the "calibration section" has been traveled, so data for the calibration section need not be sent explicitly.
  • Since the station-side device 200 holds the date of the last calibration of the laser calibration parameters, it can calibrate the laser calibration parameters using the data accumulated after a certain period of time has elapsed.
  • When the three-dimensional point cloud data D(c) obtained from the measurement results of the calibration travel differs significantly from the reference three-dimensional point cloud data D(0), for example by more than the GPS error range, it is determined that the measurement vehicle 10 has acquired abnormal data. Furthermore, it is possible to determine whether the accuracy can be restored by calibrating the laser calibration parameters.
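  • A minimal sketch of such an abnormality check is shown below; it flags a run whose point cloud deviates from the reference three-dimensional point cloud data D(0) by more than an assumed GPS error range. The threshold and the nearest-neighbour criterion are assumptions, not the determination method claimed in this publication.

```python
import numpy as np
from scipy.spatial import cKDTree

def is_abnormal(d_c, d_0, gps_error_m=0.1):
    """d_c: (N, 3) point cloud from the calibration travel; d_0: (M, 3) reference cloud.
    Returns True when the mean nearest-neighbour distance exceeds the assumed GPS error range."""
    tree = cKDTree(d_0)
    dist, _ = tree.query(d_c)          # distance from each measured point to the reference cloud
    return float(np.mean(dist)) > gps_error_m
```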
  • In step S14, the in-vehicle device 100 associates the position information d2(k) of the collection period of the section original data d1(k) with the section original data d1(k).
  • In step S15, the in-vehicle device 100 or the station-side device 200 calibrates the laser calibration parameter d3(i) based on the section original data d1(k), the position information d2(k) associated with the section original data d1(k), and the laser calibration parameter d3(i).
  • FIG. 8 is a diagram schematically showing a place to be set as a calibration section.
  • Public roads through which many vehicles pass, such as major national roads or major expressways, are designated as "calibration sections". The following three points (a) to (c) are the conditions for designation.
  • (a) The area is open, so positioning satellites 300 are easy to receive.
  • (b) There are a plurality of non-moving targets such as road signs 51, utility poles, white lines 52, and bridge piers.
  • (c) The vehicle can travel not only in one direction but also in the opposite direction past a target such as a road sign 51.
  • Features that serve as targets for the measurement vehicle 10, such as the road signs 51 and the white lines 52, are surveyed in advance as survey points, the true values of the survey points are obtained, and a point cloud is created from the true values.
  • The reference three-dimensional point cloud data D(0) is created in advance from this point cloud.
  • The calibration section is provided on a public road based on the travel frequency, which indicates how often the measurement vehicle 10 travels on the public road. That is, a section of a specific public road on which the measurement vehicle 10 travels frequently is set as the calibration section. The travel frequency is determined from the travel history of the measurement vehicle 10.
  • the calibration section is registered in advance. In other words, what section of what public road is the calibration section is registered in advance.
  • the calibration section is registered in a management device that manages the in-vehicle device 100, for example.
  • the station-side device 200 is an example of a management device.
  • The plurality of survey points in the calibration section are also registered in association with the calibration section.
  • the measurement vehicle 10 starts measurement in the calibration section based on some opportunity, and holds measurement data from the laser scanner and the camera.
  • the trigger for starting measurement is as follows.
  • The storage device 9140 stores the position of the calibration section. By comparing the position of the calibration section with the position information generated by the positioning unit 133, the original data collection unit 132 determines whether the current position information generated by the positioning unit 133 indicates the position immediately before the calibration section. When the original data collection unit 132 determines that the current position information generated by the positioning unit 133 indicates the position immediately before the calibration section, that determination result is the trigger for starting measurement (a minimal sketch of this position comparison is given after the list of triggers below).
  • In that case, the original data collection unit 132 causes the laser scanner to start collecting the section original data d1(k) and causes the camera to start capturing calibration images. As a result, measurement is always performed in the calibration section.
  • a roadside device 400 that broadcasts a signal notifying the start of the calibration section is installed at a position where the calibration section starts.
  • the passenger of the measurement vehicle 10 may operate a switch for starting measurement.
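  • The position comparison mentioned above, that is, deciding whether the current position information is immediately before the registered calibration section, could look like the following minimal sketch. The coordinate format and the distance threshold are assumptions for illustration.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate planar distance in metres between two nearby latitude/longitude points."""
    metres_per_degree = 111_000.0
    dx = (lon2 - lon1) * metres_per_degree * math.cos(math.radians(lat1))
    dy = (lat2 - lat1) * metres_per_degree
    return math.hypot(dx, dy)

def should_start_measurement(current_pos, section_start, threshold_m=50.0):
    """current_pos, section_start: (latitude, longitude) tuples.
    Returns True when the vehicle is within threshold_m of the stored start of the calibration section."""
    return distance_m(*current_pos, *section_start) <= threshold_m
```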
  • the laser calibration parameter and the camera calibration parameter are calibrated using measurement data obtained by a plurality of measurements.
  • For example, the laser calibration parameters and the camera calibration parameters are calibrated using the measurement data accumulated over the periodic inspection interval.
  • The calibration center 20 notifies the user of the measurement vehicle 10 of the change request by e-mail and asks the user to update the data on the vehicle.
  • However, since the laser calibration parameters and the camera calibration parameters are data needed for post-processing, they do not necessarily have to be updated on the vehicle side.
  • FIG. 9 is a diagram schematically showing a place that is preferable as a calibration section. The place in FIG. 9 is more suitable as a calibration section than the place in FIG. 8 because, in addition to satisfying (a) to (c) above, the shape of the white lines 52 is complex and therefore easy to evaluate. It is also desirable to have multiple targets on both sides of the road.
  • FIG. 10 is a flowchart showing the operation of the in-vehicle device 100.
  • In step S21, the original data collection unit 132 causes the laser scanner to collect the section original data d1(k), which is the original data d1, when the measurement vehicle 10 travels in the calibration section.
  • a calibration section is a section having a plurality of survey points.
  • The "plurality of survey points" are a plurality of surveyed survey points provided on the public road in the section for calibrating the laser calibration parameter d3(i), which is a setting value for the laser scanner necessary for generating the three-dimensional point cloud data D and is the setting value to be calibrated.
  • the calibration section is also a section for calibrating camera calibration parameters.
  • the original data collection unit 132 causes the camera to take a calibration image when the laser scanner collects the section original data.
  • the “calibration image” is an image used for calibration of a camera calibration parameter which is a setting value for the camera and is a setting value to be calibrated.
  • In step S22, the positioning unit 133 generates position information using the positioning signal that the positioning receiver 9113 receives from the positioning satellites 300.
  • In step S23, the association unit 134 associates the position information d2(k) of the collection period of the section original data d1(k) with the section original data d1(k) and stores them in the storage device 9140. The association unit 134 also associates the position information d2(k) of the collection period of the section original data d1(k) with the calibration image and stores it in the storage device 9140.
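  • A minimal sketch (with assumed record layouts) of the association performed in step S23, attaching to each record of the section original data d1(k) the position fix from the collection period that is nearest in time:

```python
import bisect

def associate(section_original_data, position_fixes):
    """section_original_data: list of dicts, each with a 'timestamp' key (d1(k)).
    position_fixes: non-empty list of (timestamp, position) pairs sorted by time (d2(k)).
    Returns the records with the nearest position fix attached."""
    times = [t for t, _ in position_fixes]
    associated = []
    for record in section_original_data:
        i = bisect.bisect_left(times, record["timestamp"])
        # step back when the previous fix is closer in time (or when past the last fix)
        if i == len(times) or (i > 0 and record["timestamp"] - times[i - 1] <= times[i] - record["timestamp"]):
            i -= 1
        associated.append({**record, "position": position_fixes[i][1]})
    return associated
```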
  • After the storage in step S23, in the first embodiment, (A) the laser calibration parameter d3(i) held in advance by the in-vehicle device 100, (B) the position information d2(k) stored in step S23, (C) the section original data d1(k), (D) the camera calibration parameters held in advance by the in-vehicle device 100, and (E) the calibration image are sent to the calibration center 20.
  • Any sending method may be used; the data may be mailed or transmitted from the in-vehicle device 100 to the station-side device of the calibration center 20 via a network.
  • FIG. 11 is a flowchart showing the operation of the station-side device 200. With reference to FIG. 11, the operation in which the station-side device 200 calibrates the laser calibration parameters will be described.
  • In step S31, the station-side point cloud data generation unit 211 acquires the section original data d1(k) obtained from the in-vehicle device 100 and the position information d2(k) associated with the section original data d1(k) by the association unit 134, and generates three-dimensional point cloud data from them together with the laser calibration parameter d3(i).
  • The laser calibration parameter after calibration is denoted laser calibration parameter d3(i+1).
  • In step S33, the station-side calibration unit 212 generates station-side quality information by associating the calibration accuracy, based on the calibration result, with the laser calibration parameter d3(i+1) after calibration.
  • FIG. 12 is a diagram showing station-side quality information.
  • In FIG. 12, a number of items, including items 1 to 3, are also described. Further, the station-side calibration unit 212 associates the station-side quality information with the section original data.
  • FIG. 13 shows a state in which station-side quality information is associated with section original data.
  • The laser calibration parameters after calibration and the laser scanner calibration accuracy constitute the station-side quality information.
  • Data 1.1, data 1.2, and so on under laser scanner data 1 are section original data.
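  • The station-side quality information could be represented, for example, as follows; this is a minimal sketch whose field names are assumptions, not the format used by the station-side device 200.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class StationSideQualityInfo:
    # laser calibration parameter after calibration: ((X_L, Y_L, Z_L), (roll, pitch, yaw))
    laser_params_after_calibration: Tuple[Tuple[float, float, float], Tuple[float, float, float]]
    calibration_accuracy_m: float                       # accuracy of the calibration
    section_original_data_ids: List[str] = field(default_factory=list)  # e.g. "data 1.1", "data 1.2"
```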
  • The station-side calibration unit 212 calibrates the camera calibration parameters so that the original data, which is the point cloud measured by the laser scanner, and the calibration image overlap when superimposed.
  • the in-vehicle device 100 acquires and stores the laser calibration parameter d3 (i + 1) calibrated by the station-side device 200 from the station-side device 200.
  • the association unit 134 associates the original data d1 collected by the laser scanner with the laser calibration parameter d3 (i + 1) after calibration.
  • Thereafter, the laser calibration parameter d3(i+1), the position information d2(k+1), and the section original data d1(k+1) are sent to the calibration center 20. Note that it is not essential for the in-vehicle device 100 to hold the laser calibration parameter d3(i+1).
  • If the station-side device 200 is configured so that the correspondence between the laser calibration parameter d3(i+1), the position information d2(k+1), and the section original data d1(k+1) is known, a configuration in which only the station-side device 200 holds the laser calibration parameters may be used.
  • Likewise, if a post-processing device is configured so that the correspondence between the laser calibration parameter d3(i+1), the position information d2(k+1), and the section original data d1(k+1) is known, a configuration in which only the post-processing device holds the laser calibration parameters is also possible.
  • the in-vehicle device 100 acquires and stores the camera calibration parameters calibrated by the station side device 200 from the station side device 200.
  • the association unit 134 associates the image collected by the camera with the camera calibration parameter after calibration.
  • Thereafter, the camera calibration parameters and the calibration image currently held in the in-vehicle device 100 are sent to the calibration center 20. As with the laser calibration parameters, it is not essential for the in-vehicle device 100 to hold the camera calibration parameters.
  • According to the first embodiment, the following effects (1), (2), and (3) are mainly obtained.
  • the measurement vehicle is not limited to a measurement-dedicated vehicle, and may be a vehicle in which the in-vehicle device 100 is mounted on a freight vehicle, a courier vehicle, a bus, or the like.
  • If the calibration section is a public road on which the measurement vehicle travels frequently, multiple runs occur naturally, so accuracy degradation or an abnormality of the laser scanner and the camera can be detected more quickly.
  • the following effects are further obtained.
  • When a plurality of measurement vehicles travel in the "calibration section", comparison between the measurement vehicles becomes possible, and a failure of the sampling device, or an early symptom of a failure, can be foreseen from differences in the data.
  • For example: the reception sensitivity is low, the fix rate is decreasing, the reflection intensity of the laser differs, or the colors of the camera images differ. Such failures or early symptoms of failure can be predicted not only through calibration but also by comparing the data with data obtained from other vehicles.
  • Embodiment 2. The second embodiment will be described with reference to FIG. 14.
  • In the first embodiment, the laser calibration parameter d3(i), the position information d2(k), and the section original data d1(k) are sent to the calibration center 20, and the laser calibration parameter d3(i) is calibrated at the calibration center 20.
  • As for the camera calibration parameters, the camera calibration parameters and the calibration image are sent to the calibration center 20, and the camera calibration parameters are calibrated at the calibration center 20.
  • In the second embodiment, in contrast, the in-vehicle device 100 itself calibrates the laser calibration parameter d3(i) and the camera calibration parameters.
  • FIG. 14 is a configuration diagram of the in-vehicle device 100 according to the second embodiment.
  • Compared with FIG. 4, the in-vehicle device 100 further includes a point cloud data generation unit 135 and a calibration unit 136.
  • The functions of the point cloud data generation unit 135 and the calibration unit 136 are realized by a program, in the same manner as the storage control unit 131 through the association unit 134 described in the first embodiment.
  • the auxiliary storage device 142 stores reference three-dimensional point cloud data D (0) of the calibration section.
  • The point cloud data generation unit 135 generates three-dimensional point cloud data D(i) using the section original data d1(k) stored in step S23, the position information d2(k) associated with the section original data d1(k) by the association unit 134, and the laser calibration parameter d3(i).
  • Steps S21 to S23 are the same as those in FIG. 10.
  • The calibration unit 136 compares the generated three-dimensional point cloud data D(i) with the reference three-dimensional point cloud data D(0) and calibrates the laser calibration parameter d3(i) based on the comparison result.
  • By calibrating the laser calibration parameter d3(i), the calibration unit 136 generates the laser calibration parameter d3(i+1) after calibration.
  • The calibration unit 136 generates quality information by associating the calibration accuracy with the laser calibration parameter d3(i+1) after calibration. This is the same as in FIG. 12 and FIG. 13.
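  • A minimal sketch of how such a calibration could be carried out: regenerate the point cloud under candidate laser calibration parameters and minimise its mean distance to the reference three-dimensional point cloud data D(0). This is a generic local search under assumed data layouts (it reuses the generate_point_cloud sketch above), not the procedure claimed in this publication.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def calibrate_laser_params(d1_k, d2_k, d3_i, d_0, generate_point_cloud):
    """Returns d3(i+1) as a flat array (X_L, Y_L, Z_L, roll, pitch, yaw)."""
    tree = cKDTree(d_0)                                  # reference cloud D(0)

    def cost(params):
        mount_xyz, mount_rpy = params[:3], params[3:]
        d_i = generate_point_cloud(d1_k, d2_k, (mount_xyz, mount_rpy))
        dist, _ = tree.query(d_i)
        return float(np.mean(dist))                      # mean distance to the reference cloud

    x0 = np.concatenate([d3_i[0], d3_i[1]])              # current parameter d3(i) as start value
    result = minimize(cost, x0, method="Nelder-Mead")
    return result.x                                      # laser calibration parameter d3(i+1)
```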
  • In the second embodiment, the laser calibration parameter d3(i), the position information d2(k), the section original data d1(k), the camera calibration parameters, and the calibration images need not be sent to the calibration center 20.
  • Embodiment 3. The third embodiment will be described with reference to FIG. 15.
  • As in the second embodiment, the in-vehicle device 100 itself calibrates the laser calibration parameter d3(i) and the camera calibration parameters.
  • In the third embodiment, the in-vehicle device 100 further generates a three-dimensional map using the laser calibration parameter d3(i+1) after calibration.
  • FIG. 15 is a configuration diagram of the in-vehicle device 100 according to the third embodiment.
  • Compared with FIG. 14, the in-vehicle device 100 further includes a map generation unit 137.
  • The function of the map generation unit 137 is realized by a program, in the same manner as the storage control unit 131 through the association unit 134 described in the first embodiment.
  • The point cloud data generation unit 135 generates three-dimensional point cloud data D(i+1) based on the laser calibration parameter d3(i+1) after calibration, the position information d2 generated by the positioning unit 133, and the original data d1 collected by the original data collection unit 132.
  • The generated three-dimensional point cloud data D(i+1) is not for calibration but is three-dimensional point cloud data generated in normal operation.
  • the map generation unit 137 generates a three-dimensional map using the three-dimensional point cloud data D (i + 1) generated using the laser calibration parameter d3 (i + 1) .
  • By running a plurality of measurement vehicles 10 in different regions, a three-dimensional map of each region can be generated in each measurement vehicle 10. Furthermore, a wide-range three-dimensional map can be generated by transmitting the three-dimensional map from each measurement vehicle 10 to a center station and integrating the three-dimensional maps at the center station (see the sketch below).
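  • As an illustration of the integration at the center station, the following minimal sketch (assuming each vehicle's map arrives as an array of points in a common world frame) merges the per-vehicle maps and thins duplicate points with a voxel grid:

```python
import numpy as np

def integrate_maps(vehicle_maps, voxel_size=0.2):
    """vehicle_maps: list of (N_i, 3) arrays of 3D points in a common world frame.
    Concatenates the maps and keeps one point per voxel to remove duplicates."""
    merged = np.vstack(vehicle_maps)
    voxel_keys = np.floor(merged / voxel_size).astype(np.int64)
    _, keep = np.unique(voxel_keys, axis=0, return_index=True)
    return merged[np.sort(keep)]
```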

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to an in-vehicle device (100) provided with an original data collection unit (132), a positioning unit (133), and an association unit (134). The original data collection unit (132) causes a laser scanner to collect section original data when a measurement vehicle (10) travels through a calibration section, which is a section provided on a public road for calibrating laser calibration parameters and which has a plurality of surveyed points. The positioning unit (133) generates position information using positioning signals received from positioning satellites by GPS receivers (113A-C). The association unit (134) associates the position information for the collection period of the section original data with the section original data and stores them in a storage device (9140).
PCT/JP2018/004434 2017-03-29 2018-02-08 In-vehicle device, station-side device, and calibration method WO2018179892A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019508698A JP6808019B2 (ja) 2017-03-29 2018-02-08 In-vehicle device, station-side device, and calibration method
TW107110045A TW201836891A (zh) 2017-03-29 2018-03-23 In-vehicle device, station-side device, and calibration method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-064936 2017-03-29
JP2017064936 2017-03-29

Publications (1)

Publication Number Publication Date
WO2018179892A1 true WO2018179892A1 (fr) 2018-10-04

Family

ID=63674660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/004434 WO2018179892A1 (fr) 2017-03-29 2018-02-08 In-vehicle device, station-side device, and calibration method

Country Status (3)

Country Link
JP (1) JP6808019B2 (fr)
TW (1) TW201836891A (fr)
WO (1) WO2018179892A1 (fr)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016045150A (ja) * 2014-08-26 2016-04-04 Topcon Corporation Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DOUTERLOIGNE KOEN: "Automatic detection of a one dimensional ranging pole for robust external camera calibration in mobile mapping", ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, vol. 86, 17 October 2013 (2013-10-17) - December 2013 (2013-12-01), pages 111 - 123, XP055613144 *
ZHAO HUIJING: "An efficient extrinsic calibration of a multiple laser scanners and cameras' sensor system on a mobile platform", PROCEEDINGS OF THE 2007 IEEE INTELLIGENT VEHICLES SYMPOSIUM, 15 June 2007 (2007-06-15), pages 422 - 427, XP031126981 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020064011A (ja) * 2018-10-18 2020-04-23 Hitachi Construction Machinery Co., Ltd. Laser scanner calibration method and transport machine
JP7138538B2 (ja) 2022-09-16 Hitachi Construction Machinery Co., Ltd. Laser scanner calibration method and transport machine
JP2020067402A (ja) * 2018-10-25 2020-04-30 Denso Corporation Sensor calibration method and sensor calibration device
IT201900018290A1 (it) * 2019-10-09 2021-04-09 Ets S R L Detection system for detecting defects along a structure
JP7386091B2 (ja) 2020-02-14 2023-11-24 Hitachi, Ltd. Object monitoring device and vehicle control system
CN111348029A (zh) * 2020-03-16 2020-06-30 Jilin University Method for determining optimal values of hybrid electric vehicle calibration parameters considering operating conditions
CN111348029B (zh) 2020-03-16 2021-04-06 Jilin University Method for determining optimal values of hybrid electric vehicle calibration parameters considering operating conditions

Also Published As

Publication number Publication date
TW201836891A (zh) 2018-10-16
JPWO2018179892A1 (ja) 2019-11-07
JP6808019B2 (ja) 2021-01-06

Similar Documents

Publication Publication Date Title
WO2018179892A1 (fr) In-vehicle device, station-side device, and calibration method
US11113966B2 (en) Vehicular information systems and methods
US11860626B2 (en) Vehicle sensor verification and calibration
US10495762B2 (en) Non-line-of-sight (NLoS) satellite detection at a vehicle using a camera
US20220172487A1 (en) Method for detecting an operating capability of an environment sensor, control unit and vehicle
US20200218906A1 (en) Robust lane association by projecting 2-d image into 3-d world using map information
US20100165105A1 (en) Vehicle-installed image processing apparatus and eye point conversion information generation method
JPWO2018025341A1 (ja) Road condition diagnosis system
JP2018180772A (ja) Object detection device
JP2016058912A (ja) In-vehicle camera diagnostic device
EP3842735A1 (fr) Position coordinate estimation device, position coordinate estimation method, and program
US20220390619A1 (en) Perception data based gnss multipath identification and correction
CN114670840A (zh) Blind spot estimation device, vehicle travel system, and blind spot estimation method
JPWO2018155149A1 (ja) Sensor information providing device, sensor information collecting device, sensor information collecting system, sensor information providing method, sensor information collecting method, and computer program
US20210398425A1 (en) Vehicular information systems and methods
CN111103584A (zh) 用于获知车辆的环境中的物体的高度信息的设备和方法
US10163345B2 (en) Method and device for providing an event message indicative of an imminent event for a vehicle
US20220244407A1 (en) Method for Generating a Three-Dimensional Environment Model Using GNSS Measurements
CN111693043B (zh) Map data processing method and device
CN116022198A (zh) Train positioning method, apparatus, device, and storage medium in a subway tunnel
KR101689810B1 (ko) GPS system capable of OBD connection and data synchronization method using the same
JP2021070993A (ja) Infrastructure inspection system and infrastructure inspection method
US20210240196A1 (en) Positioning apparatus, recording medium, and positioning method
CN114739402B (zh) Fusion positioning method, medium, and electronic device
WO2024018726A1 (fr) Program, method, system, road map, and road map creation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777786

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019508698

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777786

Country of ref document: EP

Kind code of ref document: A1