WO2019093316A1 - Moving body positioning device and calibration method therefor - Google Patents

Moving body positioning device and calibration method therefor

Info

Publication number
WO2019093316A1
WO2019093316A1 PCT/JP2018/041163 JP2018041163W WO2019093316A1 WO 2019093316 A1 WO2019093316 A1 WO 2019093316A1 JP 2018041163 W JP2018041163 W JP 2018041163W WO 2019093316 A1 WO2019093316 A1 WO 2019093316A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
marker
orientation
orientation information
marker image
Prior art date
Application number
PCT/JP2018/041163
Other languages
English (en)
Japanese (ja)
Inventor
恵佑 渡邊
一敏 佐藤
裕人 吉村
田口 信幸
義勝 五百竹
柿本 英司
Original Assignee
国立研究開発法人宇宙航空研究開発機構
株式会社デンソー
日立造船株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 国立研究開発法人宇宙航空研究開発機構, 株式会社デンソー, 日立造船株式会社
Publication of WO2019093316A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 - Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments

Definitions

  • the present invention relates to a mobile object positioning apparatus that can be applied to automatic travel of a vehicle and the like, and a method of calibrating the same.
  • Satellite positioning is widely used in moving bodies including automobiles, agricultural machines, construction machines, and the like. However, satellite positioning cannot be performed in places such as tunnels, underpasses, and cities with many high-rise buildings.
  • In such cases, the position is estimated by dead reckoning navigation using sensors such as an IMU (Inertial Measurement Unit) employing a gyro or the like, a vehicle speed meter, or a barometer.
  • In conventional approaches, the attitude (azimuth angle) of the moving body can only be estimated indirectly from the path travelled between two points. For this reason, although it is possible to correct the output of dead reckoning navigation (the position and azimuth angle of the moving body), the accuracy is insufficient, for example, to calibrate sensors such as the IMU, speedometer, and barometer themselves.
  • An object of the present invention is therefore to provide a positioning device capable of maintaining the output of dead reckoning navigation with high accuracy in situations where satellite positioning is not possible, such as inside a tunnel, and a calibration method for the positioning device.
  • A positioning device according to the present invention comprises: a dead reckoning unit that generates position and orientation information by dead reckoning; an imaging unit configured to capture a marker image having at least two reference points and a variable moiré pattern portion in which a microlens array and a mark row, in which a plurality of marks are arranged at a pitch similar to the pitch between the plurality of lenses forming the microlens array, generate a moiré interference fringe pattern; and a calibration unit that calibrates the position and orientation information generated by the dead reckoning unit based on the marker image captured by the imaging unit.
  • According to the positioning device of the present invention, it is possible to maintain the output of dead reckoning navigation with high accuracy in situations where satellite positioning is not possible, such as inside a tunnel.
  • As the satellite positioning method, any GNSS (Global Navigation Satellite System) method may be used, for example the real-time kinematic (RTK) method, methods using a single carrier, the precise point positioning (PPP) method, or the PPP ambiguity resolution (PPP-AR) method.
  • In the present embodiment, a plurality of markers 40 shown in FIG. 2 are installed in advance in the tunnel at predetermined intervals, and the accurate position and attitude of the vehicle are measured using the markers 40 by the method described below. In addition, calibration of sensors such as the IMU, which has conventionally been performed with the vehicle stopped, can be performed while moving.
  • FIG. 1 shows a schematic block diagram of a positioning device 10 according to the present embodiment.
  • the positioning device 10 includes a GNSS unit 11, an IMU unit 12, a marker positioning unit 13, a control unit 14, a storage unit 15, a calibration unit 16, and an input / output unit 17.
  • the GNSS unit 11 functioning as a satellite positioning unit includes a GNSS antenna (not shown), measures the position of the GNSS antenna by satellite positioning, and outputs position information.
  • The IMU unit 12 functioning as a dead reckoning navigation unit includes a motion sensor (not shown) for detecting the 3-axis acceleration and attitude of the positioning device 10, a velocity sensor (not shown) for detecting the velocity of the moving body, and a sensor (not shown) for detecting the altitude of the positioning device 10.
  • When the moving body enters a tunnel that the signals from the positioning satellites cannot reach and positioning by the GNSS unit 11 cannot be performed, the IMU unit 12 detects the attitude of the moving body, receives the position information of the moving body from the GNSS unit 11 or the control unit 14, and estimates the position of the moving body by dead reckoning.
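  • For illustration only, the following minimal sketch (in Python, with invented function and variable names) shows the kind of position propagation that dead reckoning performs: the last known position is advanced using the measured heading and speed. It is a simplified planar model, not the actual processing of the IMU unit 12.

```python
import math

def propagate_position(x, y, heading_rad, speed_mps, dt_s):
    """Advance a planar position estimate by dead reckoning.

    x, y        -- last known position (e.g. handed over from the GNSS unit), metres
    heading_rad -- heading from the attitude sensor, radians
    speed_mps   -- speed from the velocity sensor, metres per second
    dt_s        -- time step, seconds
    """
    x += speed_mps * dt_s * math.cos(heading_rad)
    y += speed_mps * dt_s * math.sin(heading_rad)
    return x, y

# Example: travelling at 20 m/s along the x axis for one second.
print(propagate_position(0.0, 0.0, 0.0, 20.0, 1.0))  # -> (20.0, 0.0)
```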
  • The marker positioning unit 13 includes a position and orientation calculation unit 131, an angle calculation unit 132, and a position and orientation correction unit 133, and corrects the position and orientation information output by the IMU unit 12 using the marker 40 shown in FIG. 2. The marker positioning unit 13 also includes a camera 134 as an imaging unit, attached to the front of the vehicle to capture an image of the marker 40, and an image processing apparatus (not shown) for processing the image captured by the camera 134.
  • the position and orientation calculation unit 131 calculates relative position and orientation information of the camera 134, that is, the moving object with respect to the marker 40, based on the image of the marker 40 captured by the camera 134.
  • the angle calculation unit 132 calculates a line-of-sight angle which is an angle formed by the camera 134, that is, the moving body with respect to the marker 40, based on the image of the marker 40 captured by the camera 134.
  • The position and orientation correction unit 133 corrects the relative position and orientation information of the moving body with respect to the marker 40 calculated by the position and orientation calculation unit 131, using the line-of-sight angle calculated by the angle calculation unit 132.
  • the control unit 14 executes various calculations and controls in addition to calculating the position and orientation of the moving object based on the information from the GNSS unit 11, the IMU unit 12, and the marker positioning unit 13.
  • The storage unit 15 stores, in association with each other, identification information for specifying each marker 40 from an image of its ID portion 41 and information on the position and orientation of each marker 40 measured in advance in the earth coordinate system.
  • the storage unit 15 is also used as a temporary storage unit in various operations.
  • the calibration unit 16 calibrates various sensors included in the IMU unit 12 based on the position and orientation information generated by the marker positioning unit 13.
  • The input/output unit 17 outputs the position and orientation information of the moving body calculated by the control unit 14 to a control unit (not shown) and a display unit (not shown) of the moving body, and receives control instructions for the control unit 14 as input.
  • FIG. 2 is a schematic plan view of the marker 40 used in the present embodiment.
  • As the marker 40, for example, the LentiMark described in "LentiMark: a visual marker for high-accuracy posture estimation using a lenticular lens" (IEICE Transactions D, Vol. J95-D, No. 8, pp. 1522-2529) can be used.
  • The marker 40 includes an ID portion 41 at the center that displays marker identification information, an X-axis variable moiré pattern portion 42 provided along the right side of the ID portion 41, a Y-axis variable moiré pattern portion 43 provided along the upper side, and reference points 46a to 46d at the four corners.
  • In the ID portion 41, for example, unique ID information identifying the marker 40 and accurate position and orientation information of the marker 40 are encoded and embedded in the figure. That is, the shape of the ID portion 41 differs for each marker 40; when the ID portion 41 is imaged by the camera 134 and the image is processed, the marker positioning unit 13 can recognize each marker 40 and identify its installed position coordinates and its position and orientation information GP41. Known tools that can be used for this identification include ARToolKit, ARTag, CyberCode, and ARToolKitPlus. A two-dimensional barcode such as a QR code (registered trademark) can also be used for the ID portion 41.
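  • Purely as an illustration of the idea of embedding an ID together with pose information, the sketch below serialises and parses a hypothetical payload; the field names and the use of JSON are assumptions, not the encoding actually used in the ID portion 41 or by the tools listed above.

```python
import json

def encode_marker_payload(marker_id, x, y, z, roll, pitch, yaw):
    """Serialise a hypothetical marker payload: unique ID plus surveyed pose."""
    return json.dumps({"id": marker_id,
                       "position": [x, y, z],
                       "orientation": [roll, pitch, yaw]})

def decode_marker_payload(payload):
    """Parse the payload back into ID and pose (GP41-style information)."""
    data = json.loads(payload)
    return data["id"], data["position"], data["orientation"]

payload = encode_marker_payload("MARKER-0001", 1000.0, 200.0, 35.0, 0.0, 0.0, 1.57)
print(decode_marker_payload(payload))
```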
  • The reference points 46a to 46d disposed at the four corners of the marker 40 are used to measure the distance from the moving body observing the marker 40 to the marker 40 and the relative position and orientation of the moving body with respect to the marker 40. Since the diameter of each reference point and the distances between the reference points are known in advance, measuring the size of, and distances between, the reference points in the image captured by the camera 134 makes it possible to determine the distance from the moving body to the marker 40 and the displacement angle with respect to the normal of the marker 40, that is, the relative position and orientation information of the moving body.
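  • As an illustration of how known reference-point geometry yields distance, the sketch below applies the standard pinhole-camera relation (distance = focal length × real size / imaged size). The focal length and the numeric values are assumptions for the example, not values from the disclosure.

```python
def distance_from_reference_points(known_spacing_m, pixel_spacing_px, focal_length_px):
    """Pinhole-camera estimate of the camera-to-marker distance:
    distance = focal_length * real_spacing / imaged_spacing."""
    return focal_length_px * known_spacing_m / pixel_spacing_px

# Example (assumed values): reference points 0.4 m apart imaged 80 px apart
# with a focal length of 1200 px.
print(distance_from_reference_points(0.4, 80.0, 1200.0))  # -> 6.0 (metres)
```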
  • In each variable moiré pattern portion, a lenticular lens is placed over a black-and-white striped pattern, and the axial direction of each cylindrical lens constituting the lenticular lens is parallel to the stripes.
  • In the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43, the pitch of the cylindrical lenses and the pitch of the striped pattern are made slightly different from each other, so that black stripes (called "black peaks") 42a and 43a appear when the portions are viewed from the front. The position of a black peak changes greatly when the X-axis variable moiré pattern portion 42 or the Y-axis variable moiré pattern portion 43 is rotated even slightly around an axis parallel to the cylindrical lenses.
  • The sensitivity of rotation-angle detection of a variable moiré pattern portion can be adjusted by changing the pitch of the cylindrical lenses and the pitch of the stripes.
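  • The following sketch illustrates, with a simplified first-order model that is not taken from the disclosure, why a small difference between lens pitch and stripe pitch makes the black peak highly sensitive to rotation: the moiré effect magnifies the small lateral shift of the stripe seen under each lens.

```python
import math

def black_peak_shift_mm(theta_deg, lens_pitch_mm, stripe_pitch_mm, lens_thickness_mm):
    """First-order estimate of the observed black-peak displacement.

    The stripe seen under each cylindrical lens shifts laterally by roughly
    t * tan(theta); the moiré effect magnifies that shift by approximately
    p_stripe / |p_stripe - p_lens|, so a small pitch difference gives a
    large, easily detectable peak motion.
    """
    magnification = stripe_pitch_mm / abs(stripe_pitch_mm - lens_pitch_mm)
    shift = lens_thickness_mm * math.tan(math.radians(theta_deg))
    return magnification * shift

# Example: 1 degree of rotation with 1.00 mm lenses over 1.02 mm stripes.
print(black_peak_shift_mm(1.0, 1.00, 1.02, 1.5))  # roughly 1.3 mm of peak motion
```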
  • the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43 are described in detail with reference to FIGS. 3 to 8 of Japanese Patent No. 5842248 and the related specification.
  • As the viewing angle changes, the black peak 42a appearing in the X-axis variable moiré pattern portion 42 moves in the vertical direction. That is, the X-axis variable moiré pattern portion 42 can detect the line-of-sight angle of the moving body about the X axis with reference to the normal of the marker 40.
  • Similarly, the black peak 43a appearing in the Y-axis variable moiré pattern portion 43 moves in the left-right direction, so the Y-axis variable moiré pattern portion 43 can detect the line-of-sight angle of the moving body about the Y axis with reference to the normal of the marker 40.
  • In the present embodiment, the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43 are each provided at one place, but in addition to these, it is also possible to add variable moiré pattern portions for angle detection with lower sensitivity.
  • the angle of the line of sight relative to the normal to the marker 40 can be determined with high accuracy by determining the rotation angles about the horizontal and vertical axes.
  • The relationship between the angle of the line of sight from the front when observing the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43, that is, the angle formed between the line of sight and the normals of the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43, and the positions of the black peaks 42a and 43a can be measured in advance and stored in the storage unit 15.
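  • Given such a pre-measured relationship, recovering the line-of-sight angle from an observed black-peak position amounts to a table lookup with interpolation. The sketch below uses invented calibration samples purely for illustration; it is not the procedure of the angle calculation unit 132.

```python
import numpy as np

# Hypothetical calibration table measured in advance:
# black-peak position (normalised 0..1 across the pattern) vs line-of-sight angle.
peak_positions = np.array([0.10, 0.30, 0.50, 0.70, 0.90])
angles_deg     = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])

def line_of_sight_angle(peak_position):
    """Interpolate the stored calibration to get the line-of-sight angle."""
    return float(np.interp(peak_position, peak_positions, angles_deg))

print(line_of_sight_angle(0.62))  # -> 6.0 degrees
```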
  • FIG. 3 is a flowchart showing the dead reckoning process of the positioning device 10 mounted on a moving body traveling in a tunnel.
  • When the moving body enters the tunnel, positioning by the GNSS unit 11 is stopped, the latest position information GP11 already measured by the GNSS unit 11 is handed over to the IMU unit 12, and dead reckoning navigation is started (step S11).
  • the camera 134 captures an image of the marker 40 (step S12).
  • The positional relationship between the moving body and the camera 134 attached to the moving body is measured in advance and stored in the storage unit 15 as a correction value.
  • The marker positioning unit 13 identifies the unique ID of the marker 40 from the image information of the ID portion 41 of the captured marker 40, and reads out from the storage unit 15 the position and orientation information GP15 in the earth coordinate system associated with that unique ID and stored in advance (step S13). Note that, instead of reading out the position and orientation information GP15 associated with the unique ID stored in advance in the storage unit 15, a position information providing server (not shown) may be accessed via a network (not shown) to read out the position and orientation information GP15 associated with the unique ID. Further, instead of identifying the unique ID, the position and orientation information GP41 of the marker 40 embedded in the image information may be acquired.
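  • The lookup of GP15 described above can be thought of as a keyed store mapping each unique marker ID to its surveyed pose, with an optional fallback to a position information providing server. The sketch below is only a schematic of that data flow; the types, IDs, and the server function are illustrative assumptions.

```python
from typing import Optional, Tuple

# (x, y, z) in the earth coordinate system plus (roll, pitch, yaw), surveyed in advance.
Pose = Tuple[float, float, float, float, float, float]

local_marker_db = {
    "MARKER-0001": (1000.0, 200.0, 35.0, 0.0, 0.0, 1.57),
}

def fetch_pose_from_server(marker_id: str) -> Optional[Pose]:
    """Placeholder for a request to a position information providing server."""
    return None  # network access omitted in this sketch

def marker_pose(marker_id: str) -> Optional[Pose]:
    """Return GP15 for the marker: local storage first, then the server."""
    pose = local_marker_db.get(marker_id)
    return pose if pose is not None else fetch_pose_from_server(marker_id)

print(marker_pose("MARKER-0001"))
```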
  • The position and orientation calculation unit 131 calculates the relative position and orientation information CP131 of the moving body with respect to the marker 40 based on the four reference points 46a to 46d included in the image information of the marker 40 captured in step S12 (step S14).
  • The angle calculation unit 132 analyzes the X-axis variable moiré pattern portion 42 and the Y-axis variable moiré pattern portion 43 included in the image information of the marker 40 and, from the positions of the black peaks 42a and 43a appearing due to the moiré patterns, calculates the line-of-sight angle θX about the X axis and the line-of-sight angle θY about the Y axis that the straight line connecting the center of the marker 40 and the camera 134 forms with respect to the normal of the marker 40 (step S15).
  • The position and orientation correction unit 133 corrects the relative position and orientation information CP131 of the moving body with respect to the marker 40 determined in step S14 using θX and θY calculated in step S15 (step S16), calculates the position and orientation information GP133 of the moving body in the earth coordinate system based on the position and orientation information GP15 or GP41 of the marker 40 acquired in step S13, and outputs it from the input/output unit 17 (step S17). The operations from step S11 to step S17 are repeated until the radio waves of the navigation satellites can be received again and positioning by the GNSS unit 11 becomes possible.
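  • Conceptually, step S17 composes the marker's surveyed pose in the earth coordinate system with the corrected relative pose of the moving body. The sketch below shows that composition for a simplified planar (x, y, heading) case; the reduction to two dimensions is an assumption made only for illustration.

```python
import math

def compose_pose(marker_x, marker_y, marker_yaw, rel_x, rel_y, rel_yaw):
    """Transform a pose expressed in the marker frame into the earth frame.

    (marker_x, marker_y, marker_yaw) -- surveyed marker pose (GP15 or GP41)
    (rel_x, rel_y, rel_yaw)          -- corrected relative pose of the moving body
    """
    c, s = math.cos(marker_yaw), math.sin(marker_yaw)
    earth_x = marker_x + c * rel_x - s * rel_y
    earth_y = marker_y + s * rel_x + c * rel_y
    earth_yaw = marker_yaw + rel_yaw
    return earth_x, earth_y, earth_yaw

# Example: relative pose of the moving body in the marker frame is (-10 m, 0 m, 0 rad).
print(compose_pose(500.0, 100.0, 0.0, -10.0, 0.0, 0.0))  # -> (490.0, 100.0, 0.0)
```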
  • FIG. 4 is a flowchart showing a calibration process of the IMU unit 12 according to the present embodiment.
  • The calibration unit 16 measures the elapsed time T since calibration, based on the time T0 at which the IMU unit 12 was last calibrated (step S21). If the elapsed time T since calibration exceeds a predetermined threshold TX (step S22; YES), the position and orientation information GP133 corrected in step S17 of FIG. 3 is compared with the position and orientation information GP12 calculated by the IMU unit 12 at the same time as the imaging in step S12 of FIG. 3 (step S23). If the difference Δ0 between the position and orientation information GP12 and the position and orientation information GP133 is equal to or greater than a predetermined threshold ΔX (step S24; YES), the output values of the sensors and the like of the IMU unit 12 are calibrated so that the position and orientation information GP12 of the IMU unit 12 coincides with the position and orientation information GP133 corrected in step S17 (step S25). If the elapsed time T is very small (step S22; NO), or if the difference Δ0 between the position and orientation information is smaller than the threshold ΔX (step S24; NO), the process ends without performing calibration.
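  • The decision flow of steps S21 to S25 (an elapsed-time gate followed by a difference gate) can be summarised as in the sketch below; the threshold values and the position-only difference metric are placeholders, not values from the disclosure.

```python
def should_calibrate(elapsed_s, pose_imu, pose_marker,
                     t_threshold_s=600.0, delta_threshold_m=0.5):
    """Mirror the flow of steps S21-S25: calibrate only if enough time has
    passed since the last calibration AND the IMU pose has drifted from the
    marker-derived pose by more than a threshold."""
    if elapsed_s <= t_threshold_s:          # step S22; NO
        return False
    dx = pose_imu[0] - pose_marker[0]
    dy = pose_imu[1] - pose_marker[1]
    delta0 = (dx * dx + dy * dy) ** 0.5     # simplified position-only difference
    return delta0 >= delta_threshold_m      # step S24

print(should_calibrate(900.0, (100.2, 50.0), (100.9, 50.0)))  # -> True
```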
  • Markers 40 are installed at appropriate intervals in the tunnel, for example every 100 meters. If the marker positioning unit 13 performs the positioning operation each time the moving body passes a marker, the accurate position and orientation of the moving body can be known continuously, and since the output GP12 of the IMU unit 12 can be corrected each time, the accumulation of error in the output GP12 of the IMU unit 12 can be greatly suppressed.
  • The information GP15 on the position and orientation of each marker 40 measured in advance may be stored on a server on a network (not shown) instead of in the storage unit 15 of the positioning device 10, and the server may be accessed via wireless communication to download the information GP15 as needed. The information GP15 and GP133 may also be shared via wireless communication with a plurality of other vehicles traveling nearby. The difference Δ0 between the position and orientation information GP12 of the vehicle determined by the IMU unit 12 and the position and orientation information GP133 of the vehicle determined using the marker may also be shared among a plurality of vehicles via the network.
  • Satellite positioning may also be impossible in urban areas lined with low-rise and high-rise buildings; the present invention can be applied in place of satellite positioning in any location where the marker 40 can be installed under such circumstances.
  • When positioning by the GNSS unit 11 becomes possible again, the corrected position and orientation information GP133 of the IMU unit 12 can be supplied to the GNSS unit 11 and used for estimating the positioning parameters of the PPP method or the PPP-AR method, whose initialization processing generally takes time. Since the supplied position and orientation information GP133 is accurate, the time required for the initialization processing of the PPP method or the PPP-AR method can be greatly reduced. In this way, if markers are used inside the tunnel and positioning using GNSS is resumed when exiting the tunnel, positioning can be performed seamlessly inside and outside the tunnel.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention concerns: a positioning device capable of correcting the output of dead reckoning navigation with high accuracy when positioning by satellites is not possible, such as inside a tunnel; and a calibration method for the positioning device. A microlens array and a mark row, in which a plurality of marks are arranged at a pitch similar to the pitch between the plurality of lenses forming the microlens array, are used to measure the position and orientation of a moving body, using a marker image provided with at least two reference points and a variable moiré pattern portion formed so as to generate a moiré interference fringe pattern.
PCT/JP2018/041163 2017-11-07 2018-11-06 Dispositif de positionnement de corps en mouvement et son procédé d'étalonnage WO2019093316A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017214560A JP2019086390A (ja) 2017-11-07 2017-11-07 移動体の測位装置及びその較正方法
JP2017-214560 2017-11-07

Publications (1)

Publication Number Publication Date
WO2019093316A1 true WO2019093316A1 (fr) 2019-05-16

Family

ID=66437794

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041163 WO2019093316A1 (fr) 2017-11-07 2018-11-06 Dispositif de positionnement de corps en mouvement et son procédé d'étalonnage

Country Status (2)

Country Link
JP (1) JP2019086390A (fr)
WO (1) WO2019093316A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7388390B2 (ja) * 2021-04-14 2023-11-29 トヨタ自動車株式会社 位置情報取得システム、位置情報取得方法
CN118382787A (zh) * 2021-12-24 2024-07-23 日立安斯泰莫株式会社 车载电子控制装置以及位置推定方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0385413A (ja) * 1989-08-30 1991-04-10 Okuma Mach Works Ltd 平均化回折モアレ位置検出器
JP2011129126A (ja) * 2009-12-17 2011-06-30 Deere & Co ランドマーク識別のための自動標識付け
JP2013517483A (ja) * 2010-01-18 2013-05-16 クアルコム,インコーポレイテッド オブジェクトを使用した慣性航法システムの整合および較正
JP2014508284A (ja) * 2011-01-11 2014-04-03 クゥアルコム・インコーポレイテッド 画像処理に基づくカメラベースの位置特定およびナビゲーション

Also Published As

Publication number Publication date
JP2019086390A (ja) 2019-06-06

Similar Documents

Publication Publication Date Title
US9134339B2 (en) Directed registration of three-dimensional scan measurements using a sensor unit
US10788830B2 (en) Systems and methods for determining a vehicle position
US7895761B2 (en) Measurement method and measuring device for use in measurement systems
US11227168B2 (en) Robust lane association by projecting 2-D image into 3-D world using map information
US6931322B2 (en) Method for correcting position error in navigation system
CA3066341A1 (fr) Procede et dispositif de correction de donnees de carte
KR101632225B1 (ko) 측지 측량 시스템 및 다수의 타겟 추적 기능을 갖는 방법
US20080319664A1 (en) Navigation aid
US8538671B2 (en) Apparatus and method for detecting position and orientation of mobile object
US10495456B2 (en) Method for calibrating a detection device, and detection device
CN108759834B (zh) 一种基于全局视觉的定位方法
US8199316B2 (en) Device and method for tracking the movement of a tool of a handling unit
CN113710988A (zh) 用于识别环境传感器的功能能力的方法、控制仪和车辆
US20110308309A1 (en) Method for wheel suspension measurement and a device for measuring the wheel suspension geometry of a vehicle
US20150192657A1 (en) Method for determining a position of a vehicle, and a vehicle
JP2007506109A (ja) 携帯型測定装置の空間位置の決定方法とシステム
CN108759815A (zh) 一种用于全局视觉定位方法中的信息融合组合导航方法
WO2019093316A1 (fr) Dispositif de positionnement de corps en mouvement et son procédé d'étalonnage
KR20200119092A (ko) 차량 및 차량의 위치 검출 방법
JP5122693B1 (ja) 車載測量システム
US20170046953A1 (en) Method and apparatus for determining direction of the beginning of vehicle movement
CN108955683A (zh) 基于全局视觉的定位方法
CN113063441A (zh) 里程计累计推算误差的数据源纠正方法及装置
KR20180137904A (ko) 차량 주행 정보의 보정 장치 및 방법
JP2021017073A (ja) 位置推定装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18876458

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18876458

Country of ref document: EP

Kind code of ref document: A1