WO2016157428A1 - Measurement device, measurement method, and program - Google Patents
Measurement device, measurement method, and program
- Publication number
- WO2016157428A1 (PCT/JP2015/060182)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement data
- dimensional measurement
- dimensional
- moving body
- current time
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
Definitions
- the present invention relates to a technique for acquiring three-dimensional measurement data using a two-dimensional laser scanner mounted on a moving body.
- 3D laser scanners are still expensive and difficult to obtain, so it is difficult to mount them on ordinary vehicles.
- As an alternative, a relatively inexpensive and readily available 2D laser scanner can be installed on the vehicle at an angle, and the 2D point cloud data measured while traveling can be combined on the basis of the vehicle's position and orientation to generate three-dimensional point cloud data.
- MMS (Mobile Mapping System)
- Patent Document 1 describes technology related to the MMS described above.
- In Patent Document 1, all measured data are accumulated, and the three-dimensional point cloud data is generated by post-processing.
- When generating three-dimensional point cloud data in real time using a two-dimensional laser scanner, accumulating all of the acquired point cloud data as in Patent Document 1 makes the amount of data enormous, and it is difficult to process within the limited resources of an ordinary vehicle.
- An object of the present invention is to provide a measurement system that can generate three-dimensional point cloud data with high positional accuracy using a two-dimensional laser scanner.
- The invention according to claim 1 is a measurement device comprising: a first acquisition unit that acquires the amount of movement of a moving body; a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body; a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
- The invention according to claim 5 is a measurement method executed by a measurement device comprising: a first acquisition unit that acquires the amount of movement of a moving body; and a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body. The method comprises: a first conversion step of converting the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion step of converting, based on the movement amount acquired by the first acquisition unit, the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining step of combining the three-dimensional measurement data referenced to the current time generated by the first conversion step with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
- The invention according to claim 6 is a program executed by a measurement device comprising a first acquisition unit that acquires the amount of movement of a moving body, a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body, and a computer. The program causes the computer to function as: a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
- FIG. 1 schematically shows the measurement method according to an embodiment.
- FIG. 2 shows the sensors used in the measurement system of the embodiment. FIG. 3 illustrates the coordinate systems of the vehicle and the sensor. FIG. 4 is a block diagram showing the configuration of the measurement system. FIG. 5 illustrates two-dimensional point cloud data and three-dimensional point cloud data. FIG. 6 shows the position and attitude of the vehicle.
- FIG. 8 schematically shows the measurement method according to a modification.
- In a preferred embodiment, the measurement device includes: a first acquisition unit that acquires the amount of movement of the moving body; and a second acquisition unit that is attached to the moving body and acquires two-dimensional measurement data referenced to the current position of the moving body;
- a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body;
- and a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time earlier into converted three-dimensional measurement data referenced to the current time.
- the above measuring device is mounted on a vehicle or other moving body, and the amount of movement is acquired by the first acquisition unit.
- two-dimensional measurement data ahead of the moving body is acquired by the second acquisition unit.
- The first conversion unit converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body.
- three-dimensional measurement data is generated by the movement of the moving body.
- the second conversion unit converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time.
- That is, converted three-dimensional measurement data referenced to the current time is obtained from the three-dimensional measurement data generated at a past time.
- the measurement apparatus can generate comprehensive three-dimensional measurement data based on the current time from the two-dimensional measurement data.
- One aspect of the measurement apparatus includes a deletion unit that deletes three-dimensional measurement data corresponding to a position behind the current position of the moving body from the converted three-dimensional measurement data generated by the second conversion unit.
- the amount of data stored in the measurement apparatus can be reduced.
- the first acquisition unit acquires the movement amount by SLAM based on outputs of a speed sensor, an angular velocity sensor, and an environment measurement sensor mounted on the moving body.
- the amount of movement of the moving body can be acquired with high accuracy, so that three-dimensional measurement data with high positional accuracy can be generated.
- In another aspect, the combining unit outputs the comprehensive three-dimensional measurement data to an object recognition unit that recognizes objects in front of the moving body. Thereby, objects in front of the moving body can be recognized.
- A preferred embodiment of the measurement method is executed by a measurement device comprising: a first acquisition unit that acquires the amount of movement of a moving body; and a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body.
- The method includes a first conversion step of converting the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit, and a second conversion step of converting, based on the acquired movement amount, the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time,
- and a combining step of combining the three-dimensional measurement data referenced to the current time generated by the first conversion step with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body. According to this measurement method, comprehensive three-dimensional measurement data referenced to the current time can be generated from the two-dimensional measurement data.
- A preferred embodiment of the program is executed by a measurement device comprising a first acquisition unit that acquires the amount of movement of a moving body, a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body, and a computer.
- The program causes the computer to function as: a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time;
- and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
- This program can be stored and handled in a storage medium.
- FIG. 1 schematically shows a three-dimensional point cloud measurement method according to this embodiment.
- the two-dimensional laser scanner 5 is attached to the vehicle at an angle, and three-dimensional point cloud data is generated by combining the two-dimensional point cloud data on the basis of the position and orientation of the vehicle.
- the point cloud data behind the host vehicle is deleted and only the point cloud data located in front of the host vehicle is retained, thereby generating local three-dimensional point cloud data referenced to the host vehicle position. This suppresses the increase in data volume and, accordingly, the amount of computation in three-dimensional point cloud processing such as object recognition.
- the relative movement amount of the vehicle is calculated with high accuracy by using SLAM (Simultaneous Localization and Mapping) technology for estimating the position and orientation of the vehicle.
- FIG. 2 shows a sensor used in the measurement system according to the embodiment.
- the measurement system is mounted on the vehicle 1 as a moving body.
- the vehicle 1 is equipped with a vehicle speed sensor 2, a gyro sensor 3, an environment measurement sensor 4, and a two-dimensional laser scanner 5 for measuring a three-dimensional point group.
- the vehicle speed sensor 2 detects the speed of the vehicle 1 by measuring a vehicle speed pulse composed of a pulse signal generated along with the rotation of the vehicle wheel.
- the gyro sensor 3 is an example of an angular velocity sensor, detects the angular velocity of the vehicle when the direction of the vehicle is changed, and outputs angular velocity data and relative azimuth data.
- the environmental measurement sensor 4 is used for self-position estimation, and a camera, a laser scanner, etc. can be used.
- a two-dimensional laser scanner separate from the two-dimensional laser scanner 5 is horizontally attached to the bumper portion of the vehicle 1 as the environmental measurement sensor 4.
- the two-dimensional laser scanner 5 is installed in the vehicle 1 at an angle rather than horizontal. In this embodiment, it is assumed that the two-dimensional laser scanner 5 is attached to the vehicle 1 at an elevation angle of 20 degrees with the front direction of the vehicle 1 as the front of the sensor.
- FIG. 3A shows a vehicle coordinate system.
- the front direction of the vehicle 1 is the X axis
- the left direction is the Y axis
- the vertical direction is the Z axis
- the origin is the center of gravity of the vehicle.
- the X axis corresponds to the roll axis, and the amount of rotation around the X axis is indicated by "φ".
- the Y axis corresponds to the pitch axis, and the amount of rotation around the Y axis is indicated by "θ".
- the Z axis corresponds to the yaw axis, and the amount of rotation around the Z axis is indicated by "ψ".
- ψ indicates the angle of the traveling direction of the vehicle 1.
- FIG. 3B shows a coordinate system of the two-dimensional laser scanner 5 (hereinafter referred to as “sensor coordinate system”).
- the scan plane is the X S Y S plane defined by the X S axis and the Y S axis
- the sensor front direction is the X S axis
- the left direction is the Y S axis
- the direction perpendicular to the scan plane is the Z S axis.
- FIG. 4 is a block diagram illustrating a functional configuration of the measurement system.
- the measurement system includes a SLAM unit 11, a three-dimensional conversion unit 12, a coordinate conversion unit 13, a point group deletion unit 14, and a point group combination unit 15 in addition to the sensors shown in FIG. 2.
- the two-dimensional laser scanner 5 outputs the measurement data z t at the time t on the two-dimensional plane defined by the mounting angle with respect to the vehicle 1 to the three-dimensional conversion unit 12.
- the three-dimensional conversion unit 12 converts the measurement data z t acquired from the two-dimensional laser scanner 5 into three-dimensional point group data q t and outputs it to the point group coupling unit 15.
- FIG. 5A shows an example of two-dimensional point group data output from the two-dimensional laser scanner 5.
- the two-dimensional point cloud data is given as a set of the distance r and the scan angle θ from the two-dimensional laser scanner 5 to each measurement point on the scan plane. The three-dimensional conversion unit 12 therefore converts the two-dimensional point cloud data (r k, θ k) acquired from the two-dimensional laser scanner 5 into the position (x S,k, y S,k, z S,k) in the sensor coordinate system (X S Y S Z S coordinate system) according to the following equation (1): x S,k = r k cos θ k, y S,k = r k sin θ k, z S,k = 0.
- z S,k is the height of the scan plane of the two-dimensional laser scanner 5, and is zero here because the reference of the two-dimensional laser scanner 5 is the scan plane.
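As a sketch, the conversion of equation (1) from polar scanner returns to sensor-frame coordinates might look like this in Python (the function name and data layout are illustrative, not from the patent):

```python
import math

def scan_to_sensor_frame(scan):
    """Convert 2D scanner returns (r_k, theta_k) to sensor-frame
    coordinates (x_S, y_S, z_S) per equation (1); z_S is 0 because
    the scanner's reference is its own scan plane."""
    return [(r * math.cos(theta), r * math.sin(theta), 0.0)
            for r, theta in scan]
```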
- FIG. 5B shows an example of three-dimensional point cloud data.
- the three-dimensional point cloud data is given as a set of coordinates (x, y, z) of each measurement point in the vehicle coordinate system (XYZ coordinate system) referenced to the position of the center of gravity of the vehicle 1. Therefore, based on the mounting position and mounting angles (x ls, y ls, z ls, φ ls, θ ls, ψ ls) of the two-dimensional laser scanner shown in FIG. 3,
- the three-dimensional conversion unit 12 converts the position (x S,k, y S,k, z S,k) in the sensor coordinate system (X S Y S Z S coordinate system) obtained by equation (1) into the coordinates (x k, y k, z k) in the vehicle coordinate system.
- the three-dimensional conversion unit 12 generates the three-dimensional point cloud data q t by converting all measurement points constituting the measurement data z t in this way.
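A hedged sketch of this sensor-to-vehicle transformation, assuming the mounting angles are applied as a Z-Y-X (yaw, pitch, roll) rotation followed by a translation; the patent does not specify the rotation order, so this convention is an assumption:

```python
import math

def sensor_to_vehicle(points_s, mount):
    """Transform sensor-frame points into the vehicle frame using the
    scanner's mounting position (x, y, z) and mounting angles
    (roll phi, pitch theta, yaw psi), applied as R = Rz @ Ry @ Rx."""
    x0, y0, z0, phi, theta, psi = mount
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cp, sp = math.cos(psi), math.sin(psi)
    # Rows of R = Rz(psi) @ Ry(theta) @ Rx(phi)
    r = [
        [cp * ct, cp * st * sf - sp * cf, cp * st * cf + sp * sf],
        [sp * ct, sp * st * sf + cp * cf, sp * st * cf - cp * sf],
        [-st,     ct * sf,                ct * cf],
    ]
    out = []
    for xs, ys, zs in points_s:
        out.append((
            r[0][0] * xs + r[0][1] * ys + r[0][2] * zs + x0,
            r[1][0] * xs + r[1][1] * ys + r[1][2] * zs + y0,
            r[2][0] * xs + r[2][1] * ys + r[2][2] * zs + z0,
        ))
    return out
```

For the scanner of this embodiment, the mounting angles would encode the 20-degree elevation; the exact sign depends on the pitch convention chosen.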
- the vehicle speed sensor 2 detects the speed v t of the vehicle 1 at time t and outputs it to the SLAM unit 11.
- the gyro sensor 3 detects the angular velocity ω t at time t and outputs it to the SLAM unit 11.
- the environmental measurement sensor 4 outputs measurement data z t slam to the SLAM unit 11.
- the SLAM unit 11 estimates the position and attitude of the vehicle 1 by SLAM, using the speed v t output from the vehicle speed sensor 2, the angular velocity ω t output from the gyro sensor 3, and the measurement data z t slam output from the environment measurement sensor 4.
- the SLAM unit 11 assumes that the vehicle 1 travels on a two-dimensional plane as shown in FIG. 6, and estimates the position and attitude (x v, y v, ψ v) of the vehicle 1 in the world coordinate system (X W Y W coordinate system).
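The movement amount used by the coordinate conversion unit can be derived from two successive SLAM poses. A minimal sketch (the pose layout and the choice of expressing the delta in the previous vehicle frame are assumptions, not stated in the patent):

```python
import math

def movement_amount(pose_prev, pose_now):
    """Relative motion (dx, dy, dpsi) of the vehicle between two SLAM
    poses (x, y, psi) in the world frame, expressed in the vehicle
    frame at the previous time (t-1)."""
    x0, y0, p0 = pose_prev
    x1, y1, p1 = pose_now
    dxw, dyw = x1 - x0, y1 - y0
    # Rotate the world-frame displacement into the previous vehicle frame.
    dx = math.cos(p0) * dxw + math.sin(p0) * dyw
    dy = -math.sin(p0) * dxw + math.cos(p0) * dyw
    return dx, dy, p1 - p0
```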
- the coordinate conversion unit 13 performs coordinate conversion of the three-dimensional point cloud P t-1 generated at the previous time (t-1) based on the movement amount (Δx v, Δy v, Δψ v), converting it into a point cloud P′ t-1 referenced to the position of the center of gravity of the vehicle at the current time t.
- the point cloud P t-1 before conversion is expressed as the set of points {p 1, ..., p N}, where N is the number of measurement points per scan.
- the coordinate conversion unit 13 applies the coordinate conversion based on the movement amount to each point p i included in the point cloud P t-1, obtaining the point cloud P′ t-1.
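The per-point coordinate conversion can be sketched as a planar rigid-body transformation. The function below assumes the movement amount (Δx, Δy, Δψ) is expressed in the vehicle frame at time t-1, as in the sketch above:

```python
import math

def to_current_frame(points, dx, dy, dpsi):
    """Re-express points measured in the vehicle frame at time t-1 in
    the vehicle frame at time t, given the planar motion (dx, dy, dpsi)
    of the vehicle between the two times."""
    c, s = math.cos(dpsi), math.sin(dpsi)
    out = []
    for x, y, z in points:
        tx, ty = x - dx, y - dy          # remove the translation
        out.append((c * tx + s * ty,      # rotate by -dpsi
                    -s * tx + c * ty,
                    z))                   # planar motion: z unchanged
    return out
```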
- the point cloud deletion unit 14 defines the points behind the center of gravity of the vehicle 1, that is, the points whose X coordinate is negative, as point cloud A, and defines the point cloud obtained by removing point cloud A from P′ t-1 as point cloud P″ t-1.
- the point cloud combining unit 15 combines the point cloud P″ t-1 and the point cloud q t, thereby generating the three-dimensional point cloud data P t referenced to the center-of-gravity position of the vehicle at the current time t.
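A minimal sketch of the rear-point deletion and combination steps; the X >= 0 retention rule follows the description above, and the data layout is illustrative:

```python
def update_point_cloud(p_prev_in_current, q_t):
    """Drop points behind the vehicle (negative X in the vehicle frame),
    then merge with the cloud q_t measured at the current time."""
    kept = [p for p in p_prev_in_current if p[0] >= 0.0]  # remove cloud A
    return kept + list(q_t)
```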
- FIG. 7 is a diagram schematically showing the three-dimensional point cloud data P t generated by the measurement system.
- the position of the vehicle 1 at the previous time (t ⁇ 1) is denoted by O t ⁇ 1
- the current position at the current time t is denoted by O t .
- the vehicle 1 has moved by a movement amount (Δx w, Δy w, Δψ w) from the previous time (t-1) to the current time t.
- of the three-dimensional point cloud P t-1 generated at the previous time (t-1), the point cloud A belonging to the rear area, that is, the points behind the current position of the vehicle 1 in the X-axis direction (points whose X coordinate is negative), is deleted by the point cloud deletion unit 14.
- the measurement data acquired at the current time t by the two-dimensional laser scanner 5 is converted by the three-dimensional conversion unit 12 into the three-dimensional point cloud q t.
- the point cloud combining unit 15 combines the three-dimensional point cloud q t acquired at the current time t with the three-dimensional point cloud P t-1 after the rear points have been deleted (that is, P″ t-1), generating the latest three-dimensional point cloud data P t.
- The latest three-dimensional point cloud data P t thus obtained represents the 3D point cloud in front of the vehicle 1 (where the X coordinate is positive).
- While the vehicle 1 is moving, the measurement system repeatedly executes the above processing and keeps updating the three-dimensional point cloud data P t. The measurement system can therefore always hold the three-dimensional point cloud data P t ahead of the current position of the vehicle 1.
- the three-dimensional point cloud data P t obtained in this way is sent to the object recognition unit and used, for example, for recognition of road signs.
- point cloud processing such as object recognition can be performed in real time even with limited memory and calculation resources of a general vehicle.
- Since the movement amount of the vehicle can be acquired with high accuracy by using the SLAM technology, highly accurate three-dimensional point cloud data can be generated.
- the point group deletion unit 14 deletes the backward point group data, that is, the point group data whose X coordinate is negative, based on the position of the vehicle 1 at the current time t.
- the point cloud deletion unit 14 may also delete the point cloud data above the vehicle 1. Specifically, as shown in FIG. 8, a plane PL extending obliquely forward from the current position of the vehicle 1 is defined, and the point cloud deletion unit 14 may delete the point cloud data located behind the plane PL (on the negative X side). The amount of point cloud data held in the measurement system can thereby be further reduced.
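This plane-based variant can be sketched as a filter against a forward-rising plane z = z0 + slope * x; both parameters are illustrative, as the patent does not give the plane's equation:

```python
def keep_in_front_of_plane(points, slope, z0=0.0):
    """Variant deletion: keep only points ahead of the vehicle and
    below the oblique plane PL given by z = z0 + slope * x; points on
    the rear/upper side of PL are discarded. 'slope' and 'z0' are
    illustrative parameters, not values from the patent."""
    return [(x, y, z) for x, y, z in points
            if x >= 0.0 and z <= z0 + slope * x]
```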
- the reference position O of the vehicle 1 is the center of gravity of the vehicle 1, but the application of the present invention is not limited to this.
- the reference position O of the vehicle 1 may be set as a driver position, a mounting position of a two-dimensional laser scanner, or the like.
- the above embodiment assumes that the vehicle 1 travels on a flat surface (flat road), but if an attitude detection sensor that detects the attitude (roll, pitch, etc.) is mounted on the vehicle 1, the measurement system can correct the three-dimensional point cloud data P t it outputs using the output of the attitude detection sensor, improving the accuracy of the generated three-dimensional point cloud data.
- the present invention can be used for a measuring device mounted on a vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Provided is a measurement device which is installed in a moving body such as a vehicle and which by means of a SLAM unit acquires amounts of movement of the moving body. Additionally, two-dimensional measurement data of the front of the moving body is acquired by a two-dimensional laser scanner. A three-dimensional transformation unit transforms two-dimensional measurement data to three-dimensional measurement data, on the basis of the attachment location and attachment angle of the two-dimensional laser scanner relative to the moving body. A coordinate transformation unit transforms three-dimensional measurement data having as a benchmark a point in time earlier by a prescribed time duration than the current point in time, into transformed three-dimensional measurement data having the current point in time as the benchmark. The three-dimensional measurement data generated by the three-dimensional transformation unit and having the current point in time as the benchmark is combined with the transformed three-dimensional measurement data to generate synthesized three-dimensional measurement data that corresponds to the current location of the moving body and the current point in time.
Description
The present invention relates to a technique for acquiring three-dimensional measurement data using a two-dimensional laser scanner mounted on a moving body.
As automation technology for automobile driving advances, it becomes necessary to perform self-position estimation with a high accuracy of several tens of centimeters to several centimeters in order to recognize, for example, which lane of which road the vehicle is currently traveling in. Many of the self-driving cars under research and development in recent years achieve such high-accuracy self-position estimation by matching a high-precision three-dimensional map created in advance with point cloud data acquired by a three-dimensional laser scanner mounted on the vehicle.
Toward the realization of advanced driving support systems and autonomous vehicles, the development of sophisticated map data is underway, including not only conventional road map data but also the various data groups necessary for automated driving and driving support, such as three-dimensional point cloud data, high-accuracy lane position information, and high-accuracy road sign position information.
In the future, it is expected that even general vehicles will measure three-dimensional point clouds and perform high-accuracy self-position estimation by collating the point cloud data itself, or object recognition results derived from it, with such sophisticated maps.
Three-dimensional laser scanners are still expensive and difficult to obtain, so it is difficult to mount them on ordinary vehicles. As an alternative, there is a method in which a relatively inexpensive and readily available two-dimensional laser scanner is installed on the vehicle at an angle, and the two-dimensional point cloud data measured while traveling is combined on the basis of the vehicle's position and orientation to generate three-dimensional point cloud data.
An example of a three-dimensional point cloud measurement system using a two-dimensional laser scanner is MMS (Mobile Mapping System). MMS is a mobile measurement system in which equipment such as a laser scanner, camera, GPS, and IMU is mounted on a vehicle, and highly accurate three-dimensional point cloud data is generated by post-processing the data collected while traveling.
For example, Patent Document 1 describes technology related to the MMS described above. In Patent Document 1, all measured data are accumulated, and the three-dimensional point cloud data is generated by post-processing.
When generating three-dimensional point cloud data in real time using a two-dimensional laser scanner, accumulating all of the acquired point cloud data as in Patent Document 1 makes the amount of data enormous, and it is difficult to process within the limited resources of an ordinary vehicle.
In addition, in order to generate a three-dimensional point cloud in real time, the position and attitude of the vehicle must also be estimated in real time. If, for example, conventional autonomous navigation is used for vehicle position estimation, the position estimation accuracy is poor, so it is difficult to construct highly accurate three-dimensional point cloud data from two-dimensional point cloud data.
The above is one example of the problems to be solved by the present invention. An object of the present invention is to provide a measurement system that can generate three-dimensional point cloud data with high positional accuracy using a two-dimensional laser scanner.
The invention according to claim 1 is a measurement device comprising: a first acquisition unit that acquires the amount of movement of a moving body; a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body; a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
The invention according to claim 5 is a measurement method executed by a measurement device comprising: a first acquisition unit that acquires the amount of movement of a moving body; and a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body. The method comprises: a first conversion step of converting the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion step of converting, based on the movement amount acquired by the first acquisition unit, the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining step of combining the three-dimensional measurement data referenced to the current time generated by the first conversion step with the converted three-dimensional measurement data to generate comprehensive three-dimensional measurement data corresponding to the current position and current time of the moving body.
The invention according to claim 6 is a program executed by a measuring device comprising a first acquisition unit that acquires the amount of movement of a moving body, a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body, and a computer. The program causes the computer to function as: a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position of the moving body at the current time.
In a preferred embodiment of the present invention, the measuring device comprises: a first acquisition unit that acquires the amount of movement of a moving body; a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body; a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position of the moving body at the current time.
The above measuring device is mounted on a vehicle or other moving body, and the first acquisition unit acquires its amount of movement. The second acquisition unit acquires two-dimensional measurement data ahead of the moving body. The first conversion unit converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; three-dimensional measurement data is thus generated as the moving body moves. The second conversion unit converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time, so that converted data referenced to the current time is obtained from three-dimensional measurement data generated at a past time. The three-dimensional measurement data referenced to the current time generated by the first conversion unit is then combined with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position of the moving body at the current time. In this way, the measuring device can generate, from two-dimensional measurement data, total three-dimensional measurement data referenced to the current time.
One aspect of the measuring device includes a deletion unit that deletes, from the converted three-dimensional measurement data generated by the second conversion unit, three-dimensional measurement data corresponding to positions behind the current position of the moving body. Because the three-dimensional measurement data behind the current position of the moving body is removed, the amount of data held in the measuring device can be reduced.
In another aspect of the measuring device, the first acquisition unit acquires the movement amount by SLAM based on the outputs of a speed sensor, an angular velocity sensor, and an environment measurement sensor mounted on the moving body. Because SLAM provides the movement amount of the moving body with high accuracy, three-dimensional measurement data with high positional accuracy can be generated.
In another aspect of the measuring device, the combining unit outputs the total three-dimensional measurement data to an object recognition unit that recognizes objects ahead of the moving body, enabling objects in front of the moving body to be recognized.
In another embodiment of the present invention, a measurement method is executed by a measuring device comprising a first acquisition unit that acquires the amount of movement of a moving body and a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body. The method comprises: a first conversion step of converting the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion step of converting, based on the acquired movement amount, the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining step of combining the three-dimensional measurement data referenced to the current time generated in the first conversion step with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position of the moving body at the current time. According to this measurement method, total three-dimensional measurement data referenced to the current time can be generated from two-dimensional measurement data.
In another preferred embodiment of the present invention, a program is executed by a measuring device comprising a first acquisition unit that acquires the amount of movement of a moving body, a second acquisition unit that is attached to the moving body and acquires forward two-dimensional measurement data referenced to the current position of the moving body, and a computer. The program causes the computer to function as: a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body; a second conversion unit that, based on the movement amount acquired by the first acquisition unit, converts the three-dimensional measurement data referenced to a time a predetermined time before the current time into converted three-dimensional measurement data referenced to the current time; and a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position of the moving body at the current time. By executing this program on a computer, total three-dimensional measurement data referenced to the current time can be generated from two-dimensional measurement data. The program can be stored and handled on a storage medium.
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
[Overview]
This embodiment relates to a mobile three-dimensional point cloud measurement method in which a two-dimensional laser scanner mounted on a vehicle measures while the vehicle travels, and three-dimensional point cloud data is generated from the acquired two-dimensional point cloud data. FIG. 1 schematically shows the three-dimensional point cloud measurement approach of this embodiment.
Specifically, as shown in FIG. 1, the two-dimensional laser scanner 5 is mounted on the vehicle at an angle, and three-dimensional point cloud data is generated by combining the two-dimensional point cloud data based on the vehicle's position and attitude. However, accumulating all of the acquired point cloud data would make the data volume enormous and processing difficult with the limited resources of an ordinary vehicle. Therefore, point cloud data behind the vehicle is deleted and only point cloud data ahead of the vehicle is retained, producing local three-dimensional point cloud data referenced to the vehicle's position. This suppresses growth in the data volume and, correspondingly, growth in the computational cost of three-dimensional point cloud processing such as object recognition.
In addition, by using SLAM (Simultaneous Localization and Mapping) to estimate the vehicle's position and attitude, the relative movement amount of the vehicle is calculated with high accuracy, allowing highly accurate three-dimensional point cloud data to be constructed.
[Measurement System]
FIG. 2 shows the sensors used in the measurement system of the embodiment. The measurement system is mounted on a vehicle 1 as the moving body. Specifically, the vehicle 1 is equipped with a vehicle speed sensor 2, a gyro sensor 3, an environment measurement sensor 4, and a two-dimensional laser scanner 5 for three-dimensional point cloud measurement.
The vehicle speed sensor 2 detects the speed of the vehicle 1 by measuring vehicle speed pulses, a pulse signal generated as the vehicle's wheels rotate. The gyro sensor 3, an example of an angular velocity sensor, detects the angular velocity of the vehicle as it changes direction and outputs angular velocity data and relative heading data.
The environment measurement sensor 4 is used for self-position estimation; a camera, a laser scanner, or the like can be used. In this embodiment, a two-dimensional laser scanner separate from the two-dimensional laser scanner 5 is mounted horizontally on the bumper of the vehicle 1 as the environment measurement sensor 4.
The two-dimensional laser scanner 5 is mounted on the vehicle 1 at an angle rather than horizontally. In this embodiment, the two-dimensional laser scanner 5 is attached with its sensor front facing the forward direction of the vehicle 1, at an elevation angle of 20 degrees.
Next, the coordinate systems used in this embodiment will be described. FIG. 3(A) shows the vehicle coordinate system: the forward direction of the vehicle 1 is the X axis, the left direction the Y axis, and the vertically upward direction the Z axis, with the origin at the vehicle's center of gravity. The X axis corresponds to the roll axis, and rotation about it is denoted φ. The Y axis corresponds to the pitch axis, with rotation denoted θ. The Z axis corresponds to the yaw axis, with rotation denoted ψ; ψ indicates the heading angle of the vehicle 1.
FIG. 3(B) shows the coordinate system of the two-dimensional laser scanner 5 (hereinafter, the "sensor coordinate system"). The scan plane is the XS-YS plane spanned by the XS and YS axes: the forward direction of the sensor is the XS axis, the left direction the YS axis, and the direction perpendicular to the scan plane the ZS axis.
Next, the configuration of the measurement system will be described. FIG. 4 is a block diagram showing the functional configuration of the measurement system. As illustrated, the measurement system includes, in addition to the sensors shown in FIG. 2, a SLAM unit 11, a three-dimensional conversion unit 12, a coordinate conversion unit 13, a point cloud deletion unit 14, and a point cloud combining unit 15.
The two-dimensional laser scanner 5 outputs to the three-dimensional conversion unit 12 the measurement data zt taken at time t on the two-dimensional plane defined by its mounting angle relative to the vehicle 1. The three-dimensional conversion unit 12 converts the measurement data zt acquired from the two-dimensional laser scanner 5 into three-dimensional point cloud data qt and outputs it to the point cloud combining unit 15.
FIG. 5(A) shows an example of the two-dimensional point cloud data output by the two-dimensional laser scanner 5. The data is given as a set of pairs of the distance r from the scanner to each measurement point on the scan plane and the scan angle θ. The three-dimensional conversion unit 12 therefore converts each acquired two-dimensional point (rk, θk) into a position (xS,k, yS,k, zS,k) in the sensor coordinate system (XS-YS-ZS coordinate system) using equation (1).
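Equation (1) is the usual polar-to-Cartesian conversion on the scan plane. A minimal sketch of that step (the function name is illustrative, not from the text):

```python
import math

def scan_to_sensor_frame(r, theta):
    # One scan point (range r, scan angle theta) in sensor coordinates.
    # The scan plane is the XS-YS plane, so the ZS component is always 0.
    return (r * math.cos(theta), r * math.sin(theta), 0.0)
```

For example, a point measured at range 2 m straight ahead of the sensor maps to (2, 0, 0) in the sensor frame.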
FIG. 5(B) shows an example of the three-dimensional point cloud data, given as a set of coordinates (x, y, z) of each measurement point in the vehicle coordinate system (XYZ coordinate system) referenced to the center of gravity of the vehicle 1. Based on the mounting position and mounting angle (xls, yls, zls, φls, θls, ψls) of the two-dimensional laser scanner shown in FIG. 5(B), the three-dimensional conversion unit 12 converts each position (xS,k, yS,k, zS,k) obtained in the sensor coordinate system by equation (1) into coordinates (xk, yk, zk) using equation (2).
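Equation (2) applies the scanner's mounting rotation and then its mounting translation. A sketch under the assumption of a yaw-pitch-roll (Rz·Ry·Rx) rotation order, which the text does not state explicitly:

```python
import math

def sensor_to_vehicle(p_s, mount):
    # mount = (x_ls, y_ls, z_ls, phi_ls, theta_ls, psi_ls): attachment
    # position and roll/pitch/yaw attachment angles of the scanner.
    # Rotation order R = Rz(psi) @ Ry(theta) @ Rx(phi) is an assumed convention.
    x_ls, y_ls, z_ls, phi, theta, psi = mount
    cph, sph = math.cos(phi), math.sin(phi)
    cth, sth = math.cos(theta), math.sin(theta)
    cps, sps = math.cos(psi), math.sin(psi)
    R = [
        [cps * cth, cps * sth * sph - sps * cph, cps * sth * cph + sps * sph],
        [sps * cth, sps * sth * sph + cps * cph, sps * sth * cph - cps * sph],
        [-sth,      cth * sph,                   cth * cph],
    ]
    xs, ys, zs = p_s
    # Rotate into the vehicle orientation, then add the mounting offset.
    return tuple(R[i][0] * xs + R[i][1] * ys + R[i][2] * zs + t
                 for i, t in enumerate((x_ls, y_ls, z_ls)))
```

With zero mounting angles the transform reduces to a pure translation by the mounting position.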
Meanwhile, the vehicle speed sensor 2 detects the speed vt of the vehicle 1 at time t and outputs it to the SLAM unit 11. The gyro sensor 3 detects the angular velocity ωt at time t and outputs it to the SLAM unit 11. The environment measurement sensor 4 outputs its measurement data zt_slam to the SLAM unit 11.
The SLAM unit 11 estimates the position and attitude of the vehicle 1 by SLAM, using the speed vt output by the vehicle speed sensor 2, the angular velocity ωt output by the gyro sensor 3, and the measurement data zt_slam output by the environment measurement sensor 4. Assuming that the vehicle 1 travels on a two-dimensional plane as shown in FIG. 6, the SLAM unit 11 estimates the position and attitude (xv, yv, ψv) of the vehicle 1 in the external coordinate system (XW-YW coordinate system).
The SLAM unit then calculates the relative movement amount (Δxv, Δyv, Δψv) from the previous time (t-1) and outputs it to the coordinate conversion unit 13.
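A sketch of this step; the text leaves implicit in which frame the translation (Δxv, Δyv) is expressed, so the sketch assumes the previous vehicle frame, which is what the subsequent point-cloud transformation uses:

```python
import math

def relative_motion(pose_prev, pose_cur):
    # Poses are SLAM estimates (x_v, y_v, psi_v) in the external XW-YW frame.
    x0, y0, p0 = pose_prev
    x1, y1, p1 = pose_cur
    wx, wy = x1 - x0, y1 - y0
    c, s = math.cos(p0), math.sin(p0)
    dx = c * wx + s * wy      # world-frame delta rotated into the previous
    dy = -s * wx + c * wy     # vehicle frame (assumed convention)
    dpsi = math.atan2(math.sin(p1 - p0), math.cos(p1 - p0))  # wrap to (-pi, pi]
    return dx, dy, dpsi
```

A vehicle heading along YW (ψ = π/2) that advances 1 m in YW thus reports a forward motion of Δxv = 1.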
Next, the coordinate conversion unit 13 applies a coordinate transformation to the three-dimensional point cloud Pt-1 generated at the previous time (t-1), based on the movement amount (Δxv, Δyv, Δψv), converting it into a point cloud P't-1 referenced to the position of the vehicle's center of gravity at the current time t. Specifically, each point of the point cloud Pt-1 before conversion is translated and rotated by the inverse of the vehicle's motion.
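The conversion of Pt-1 into P't-1 can be sketched as undoing the vehicle's planar motion for each point, a formulation assumed here to be consistent with the 2D-motion model of FIG. 6:

```python
import math

def to_current_frame(points, dx, dy, dpsi):
    # Points are (x, y, z) in the previous vehicle frame; (dx, dy, dpsi) is
    # the vehicle's planar motion between the two times. Undo the
    # translation, then rotate by -dpsi about the vertical Z axis.
    c, s = math.cos(dpsi), math.sin(dpsi)
    out = []
    for x, y, z in points:
        tx, ty = x - dx, y - dy
        out.append((c * tx + s * ty, -s * tx + c * ty, z))
    return out
```

For instance, after the vehicle drives 1 m forward, a static point that was 5 m ahead is re-expressed as 4 m ahead.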
Next, the point cloud deletion unit 14 takes the points now behind the center of gravity of the vehicle 1, i.e., the points whose X coordinate is negative, as point cloud A. The point cloud obtained by removing point cloud A from P't-1 is denoted P''t-1.
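A minimal sketch of the deletion rule (in the vehicle frame the X axis points forward):

```python
def drop_points_behind(points):
    # Vehicle-frame X axis points forward; points with negative X lie
    # behind the current vehicle position (point cloud A) and are removed.
    return [p for p in points if p[0] >= 0.0]
```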
As illustrated, the three-dimensional point cloud Pt-1 generated at the previous time (t-1) contains point cloud A belonging to the rear area, i.e., points behind the current position of the vehicle 1 in the X-axis direction (points with negative X coordinates); these rear points are deleted by the point cloud deletion unit 14.
Meanwhile, the measurement data acquired at the current time t by the two-dimensional laser scanner 5 is converted into three-dimensional data by the three-dimensional conversion unit 12, yielding the three-dimensional point cloud qt. The point cloud combining unit 15 then combines the three-dimensional point cloud Pt-1 from which the rear points have been deleted (i.e., P''t-1) with the three-dimensional point cloud qt acquired at the current time t, generating the latest three-dimensional point cloud data Pt. The resulting Pt represents the three-dimensional point cloud ahead of the vehicle 1 (positive X coordinates).
The measurement system repeats the above processing while the vehicle 1 is moving, continually updating the three-dimensional point cloud data Pt. The measurement system thus always holds three-dimensional point cloud data Pt ahead of the current position of the vehicle 1. The resulting three-dimensional point cloud data Pt is sent, for example, to an object recognition unit and used for recognizing road signs and the like.
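The per-cycle update described above (re-express the previous cloud, delete rear points, merge the new scan) can be sketched as one self-contained step:

```python
import math

def update_cloud(P_prev, q_t, dx, dy, dpsi):
    # One cycle: re-express P_{t-1} in the current vehicle frame, drop the
    # points now behind the vehicle (negative X), then merge the new scan
    # q_t, giving P_t referenced to the current position and time.
    c, s = math.cos(dpsi), math.sin(dpsi)
    moved = [(c * (x - dx) + s * (y - dy),
              -s * (x - dx) + c * (y - dy), z) for x, y, z in P_prev]
    kept = [p for p in moved if p[0] >= 0.0]
    return kept + list(q_t)
```

Calling this once per scanner frame keeps the stored cloud bounded, since every point the vehicle passes is discarded on the cycle in which it falls behind.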
[Effects]
As described above, according to this embodiment, a three-dimensional point cloud measurement system can be built at low cost by using a two-dimensional laser scanner, which is relatively inexpensive and easy to obtain.
Furthermore, rather than accumulating all point cloud data acquired in the past, data behind the vehicle is discarded and only data ahead of the vehicle is used, generating local three-dimensional point cloud data referenced to the vehicle's position. Because this suppresses growth in the data volume, three-dimensional point cloud processing such as object recognition can be performed in real time even with the limited memory and computing resources of an ordinary vehicle.
In addition, since the movement amount of the vehicle can be obtained with high accuracy using SLAM, highly accurate three-dimensional point cloud data can be generated.
[Modifications]
In the above embodiment, the point cloud deletion unit 14 deletes point cloud data behind the vehicle 1's position at the current time t, i.e., point cloud data with negative X coordinates. In practice, however, point cloud data above the vehicle 1 is rarely used in the subsequent processing of the generated three-dimensional point cloud data Pt. The point cloud deletion unit 14 may therefore also delete point cloud data above the vehicle 1. Specifically, as shown in FIG. 8, a plane PL extending obliquely forward from the current position of the vehicle 1 may be defined, and the point cloud deletion unit 14 may delete point cloud data located behind this plane PL (on its negative-X side). This further reduces the amount of point cloud data held in the measurement system.
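A sketch of this plane-based variant as a half-space test; the particular plane normal used below (tilted forward and downward) is an illustrative choice, since the text does not specify PL's orientation numerically:

```python
def drop_behind_plane(points, normal, offset=0.0):
    # Keep a point p only if it lies on the forward side of the plane PL:
    # dot(normal, p) >= offset. A normal tilted forward and downward also
    # trims points high above the vehicle, as described for FIG. 8.
    nx, ny, nz = normal
    return [p for p in points if nx * p[0] + ny * p[1] + nz * p[2] >= offset]
```

With normal (1, 0, -1), a point far ahead at modest height is kept, while points behind the vehicle or directly overhead are discarded.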
In the above embodiment, the reference position O of the vehicle 1 is its center of gravity, but the application of the present invention is not limited to this. For example, the reference position O of the vehicle 1 may be the driver's position, the mounting position of the two-dimensional laser scanner, or the like.
The above embodiment assumes that the vehicle 1 travels on a flat surface (a flat road). When the vehicle 1 is equipped with an attitude sensor that detects its attitude (roll, pitch, etc.), the accuracy of the generated three-dimensional point cloud data can be improved by correcting the three-dimensional point cloud data Pt output by the measurement system with the attitude sensor's output.
The present invention can be used for a measuring device mounted on a vehicle.
Description of Reference Numerals
1 Vehicle
2 Vehicle speed sensor
3 Gyro sensor
4 Environment measurement sensor
5 Two-dimensional laser scanner
11 SLAM unit
12 Three-dimensional conversion unit
13 Coordinate conversion unit
14 Point cloud deletion unit
15 Point cloud combining unit
Claims (7)
- 移動体の移動量を取得する第1取得部と、
前記移動体に取り付けられ、前記移動体の現在位置を基準とした前方の2次元計測データを取得する第2取得部と、
前記第2取得部の前記移動体に対する取付位置及び取付角度に基づいて、前記2次元計測データを3次元計測データに変換する第1変換部と、
前記第1取得部が取得した移動量に基づいて、現在時刻より所定時間前の時刻を基準とした前記3次元計測データを、現在時刻を基準とした変換3次元計測データに変換する第2変換部と、
前記第1変換部が生成した現在時刻を基準とする3次元計測データと、前記変換3次元計測データとを結合して、前記移動体の現在位置及び現在時刻に対応する総合3次元計測データを生成する結合部と、
を備えることを特徴とする計測装置。 A first acquisition unit for acquiring a moving amount of the moving body;
A second acquisition unit that is attached to the mobile body and acquires forward two-dimensional measurement data based on a current position of the mobile body;
A first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on an attachment position and an attachment angle of the second acquisition unit with respect to the moving body;
A second conversion for converting the three-dimensional measurement data based on a time before a current time into a converted three-dimensional measurement data based on the current time based on the movement amount acquired by the first acquisition unit And
By combining the three-dimensional measurement data based on the current time generated by the first conversion unit and the converted three-dimensional measurement data, total three-dimensional measurement data corresponding to the current position and the current time of the mobile object is obtained. A coupling part to be generated;
A measuring device comprising: - 前記第2変換部が生成した前記変換3次元計測データから、前記移動体の現在位置よりも後方の位置に対応する3次元計測データを削除する削除部を備えることを特徴とする請求項1に記載の計測装置。 The apparatus according to claim 1, further comprising a deletion unit that deletes, from the converted three-dimensional measurement data generated by the second conversion unit, three-dimensional measurement data corresponding to a position behind the current position of the moving body. The measuring device described.
- 前記第1取得部は、前記移動体に搭載された速度センサ、角速度センサ、及び、環境計測センサの出力に基づいてSLAMにより前記移動量を取得することを特徴とする請求項1又は2に記載の計測装置。 The said 1st acquisition part acquires the said moving amount by SLAM based on the output of the speed sensor, angular velocity sensor, and environmental measurement sensor which are mounted in the said mobile body, The Claim 1 or 2 characterized by the above-mentioned. Measuring device.
- 前記結合部は、前記移動体の前方にある物体を認識する物体認識部に前記総合3次元計測データを出力することを特徴とする請求項1乃至3のいずれか一項に記載の計測装置。 4. The measuring apparatus according to claim 1, wherein the coupling unit outputs the comprehensive three-dimensional measurement data to an object recognition unit that recognizes an object in front of the moving body.
- A measurement method executed by a measuring device comprising a first acquisition unit that acquires a movement amount of a moving body, and a second acquisition unit that is attached to the moving body and acquires two-dimensional measurement data of the area ahead, referenced to the current position of the moving body, the method comprising:
a first conversion step of converting the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body;
a second conversion step of converting, based on the movement amount acquired by the first acquisition unit, the three-dimensional measurement data referenced to a time a predetermined period before the current time into converted three-dimensional measurement data referenced to the current time; and
a combining step of combining the three-dimensional measurement data referenced to the current time generated in the first conversion step with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position and current time of the moving body.
- A program executed by a measuring device comprising a first acquisition unit that acquires a movement amount of a moving body, a second acquisition unit that is attached to the moving body and acquires two-dimensional measurement data of the area ahead, referenced to the current position of the moving body, and a computer, the program causing the computer to function as:
a first conversion unit that converts the two-dimensional measurement data into three-dimensional measurement data based on the attachment position and attachment angle of the second acquisition unit with respect to the moving body;
a second conversion unit that converts, based on the movement amount acquired by the first acquisition unit, the three-dimensional measurement data referenced to a time a predetermined period before the current time into converted three-dimensional measurement data referenced to the current time; and
a combining unit that combines the three-dimensional measurement data referenced to the current time generated by the first conversion unit with the converted three-dimensional measurement data to generate total three-dimensional measurement data corresponding to the current position and current time of the moving body.
- A storage medium storing the program according to claim 7.
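The second conversion, combining, and deletion steps of the method and program claims can be sketched together. The names and the planar-motion model (forward/left translation `dx`, `dy` and heading change `dyaw` since the previous scan) are illustrative assumptions; the patent does not specify this parameterization.

```python
import math

def shift_to_current_frame(points, dx, dy, dyaw):
    """Re-express 3D points measured at the previous vehicle pose in the
    current pose's frame, assuming planar ego-motion since that pose."""
    c, s = math.cos(dyaw), math.sin(dyaw)
    shifted = []
    for x, y, z in points:
        # Remove the translation, then undo the heading change.
        tx, ty = x - dx, y - dy
        shifted.append((c * tx + s * ty, -s * tx + c * ty, z))
    return shifted

def combine(current_points, past_points, dx, dy, dyaw, keep_ahead=0.0):
    """Merge the latest scan with re-referenced past scans, deleting
    points that have fallen behind the vehicle (x < keep_ahead)."""
    moved = shift_to_current_frame(past_points, dx, dy, dyaw)
    return current_points + [p for p in moved if p[0] >= keep_ahead]
```

The `keep_ahead` threshold plays the role of the deletion unit in claim 2: as the vehicle advances, previously accumulated points that drop behind the current position are pruned, so the total data stays focused on the area ahead.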
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/060182 WO2016157428A1 (en) | 2015-03-31 | 2015-03-31 | Measurement device, measurement method, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/060182 WO2016157428A1 (en) | 2015-03-31 | 2015-03-31 | Measurement device, measurement method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016157428A1 (en) | 2016-10-06 |
Family
ID=57004850
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/060182 WO2016157428A1 (en) | 2015-03-31 | 2015-03-31 | Measurement device, measurement method, and program |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016157428A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107179086A (en) * | 2017-05-24 | 2017-09-19 | 北京数字绿土科技有限公司 | A kind of drafting method based on laser radar, apparatus and system |
CN108534789A (en) * | 2017-12-27 | 2018-09-14 | 达闼科技(北京)有限公司 | A kind of multipath elements of a fix unified approach, electronic equipment and readable storage medium storing program for executing |
CN109115176A (en) * | 2018-09-05 | 2019-01-01 | 上海华测导航技术股份有限公司 | A kind of three-dimensional laser scanning system of movable type |
WO2020071416A1 (en) * | 2018-10-02 | 2020-04-09 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020071414A1 (en) * | 2018-10-02 | 2020-04-09 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN111367138A (en) * | 2020-04-14 | 2020-07-03 | 长春理工大学 | Novel laser scanning projection device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63186308A (en) * | 1987-01-28 | 1988-08-01 | Hitachi Ltd | Method and device for guiding mobile object |
JP2012242967A (en) * | 2011-05-17 | 2012-12-10 | Fujitsu Ltd | Map processing method, program and robot system |
WO2014132509A1 (en) * | 2013-02-27 | 2014-09-04 | Sharp Corporation | Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method |
JP2014202601A (en) * | 2013-04-04 | 2014-10-27 | IHI Aerospace Co., Ltd. | Plant position measurement instrument and method |
- 2015-03-31: WO PCT/JP2015/060182 patent WO2016157428A1 (en), active, Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63186308A (en) * | 1987-01-28 | 1988-08-01 | Hitachi Ltd | Method and device for guiding mobile object |
JP2012242967A (en) * | 2011-05-17 | 2012-12-10 | Fujitsu Ltd | Map processing method, program and robot system |
WO2014132509A1 (en) * | 2013-02-27 | 2014-09-04 | Sharp Corporation | Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method |
JP2014202601A (en) * | 2013-04-04 | 2014-10-27 | IHI Aerospace Co., Ltd. | Plant position measurement instrument and method |
Non-Patent Citations (1)
Title |
---|
TAKUMI NAKAMOTO ET AL.: "3-D Map Generation in a Dynamic Environment by a Mobile Robot Equipped with Laser Range Finders", IEICE TECHNICAL REPORT, vol. 106, no. 114, 29 June 2006 (2006-06-29), pages 25 - 30 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107179086A (en) * | 2017-05-24 | 2017-09-19 | 北京数字绿土科技有限公司 | A kind of drafting method based on laser radar, apparatus and system |
CN107179086B (en) * | 2017-05-24 | 2020-04-24 | 北京数字绿土科技有限公司 | Drawing method, device and system based on laser radar |
CN108534789A (en) * | 2017-12-27 | 2018-09-14 | 达闼科技(北京)有限公司 | A kind of multipath elements of a fix unified approach, electronic equipment and readable storage medium storing program for executing |
CN109115176A (en) * | 2018-09-05 | 2019-01-01 | 上海华测导航技术股份有限公司 | A kind of three-dimensional laser scanning system of movable type |
CN109115176B (en) * | 2018-09-05 | 2021-07-06 | 上海华测导航技术股份有限公司 | Movable three-dimensional laser scanning system |
WO2020071416A1 (en) * | 2018-10-02 | 2020-04-09 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020071414A1 (en) * | 2018-10-02 | 2020-04-09 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2020071414A1 (en) * | 2018-10-02 | 2021-09-02 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7358376B2 (en) | 2018-10-02 | 2023-10-10 | Panasonic Intellectual Property Corporation of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN111367138A (en) * | 2020-04-14 | 2020-07-03 | 长春理工大学 | Novel laser scanning projection device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7073315B2 (en) | Vehicles, vehicle positioning systems, and vehicle positioning methods | |
EP3542182B1 (en) | Methods and systems for vehicle environment map generation and updating | |
WO2016157428A1 (en) | Measurement device, measurement method, and program | |
US9846043B2 (en) | Map creation apparatus, map creation method, and computer-readable recording medium | |
US11138465B2 (en) | Systems and methods for transforming coordinates between distorted and undistorted coordinate systems | |
CN110345937A (en) | Appearance localization method and system are determined in a kind of navigation based on two dimensional code | |
CN106441275A (en) | Method and device for updating planned path of robot | |
KR102056147B1 (en) | Registration method of distance data and 3D scan data for autonomous vehicle and method thereof | |
CN112189225A (en) | Lane line information detection apparatus, method, and computer-readable recording medium storing computer program programmed to execute the method | |
JP2016157197A (en) | Self-position estimation device, self-position estimation method, and program | |
US20240053475A1 (en) | Method, apparatus, and system for vibration measurement for sensor bracket and movable device | |
JP2022027593A (en) | Positioning method and device for movable equipment, and movable equipment | |
CN110458885B (en) | Positioning system and mobile terminal based on stroke perception and vision fusion | |
US11315269B2 (en) | System and method for generating a point cloud that includes surface normal information | |
JP2023164553A (en) | Position estimation device, estimation device, control method, program and storage medium | |
Li et al. | Pitch angle estimation using a Vehicle-Mounted monocular camera for range measurement | |
US11461944B2 (en) | Region clipping method and recording medium storing region clipping program | |
US11607999B2 (en) | Method and apparatus for invisible vehicle underbody view | |
JP7227849B2 (en) | Trajectory generator | |
US20240200953A1 (en) | Vision based cooperative vehicle localization system and method for gps-denied environments | |
US20220334259A1 (en) | Information processing apparatus, information processing method, and program | |
US11238292B2 (en) | Systems and methods for determining the direction of an object in an image | |
US12078490B2 (en) | 3D odometry in 6D space with roadmodel 2D manifold | |
KR20200145410A (en) | Apparatus and method for obtaining location information for camera of vehicle | |
Van Hamme et al. | Robust monocular visual odometry by uncertainty voting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15887581 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15887581 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: JP |