JP6925603B2 - Attitude control system for moving objects - Google Patents

Attitude control system for moving objects

Info

Publication number
JP6925603B2
Authority
JP
Japan
Prior art keywords
attitude control
marker
control system
moving body
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2016200639A
Other languages
Japanese (ja)
Other versions
JP2018063512A (en)
Inventor
賢哉 金田
領 此村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hongo Aerospace Inc
Original Assignee
Hongo Aerospace Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hongo Aerospace Inc filed Critical Hongo Aerospace Inc
Priority to JP2016200639A priority Critical patent/JP6925603B2/en
Priority to PCT/JP2017/037041 priority patent/WO2018070486A1/en
Publication of JP2018063512A publication Critical patent/JP2018063512A/en
Application granted granted Critical
Publication of JP6925603B2 publication Critical patent/JP6925603B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00: Constructional aspects of UAVs
    • B64U20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Description

The present invention relates to an attitude control system for a moving body. More specifically, the present invention relates to an autonomous attitude control system for a small unmanned aerial vehicle (drone) capable of moving in three-dimensional space as the moving body.

In recent years, small unmanned aerial vehicles, typically drones, have come into use in a wide range of fields. In general, a drone receives radio waves from GPS satellites with an on-board GPS sensor to calculate position data, and additionally calculates in-flight motion data (speed, acceleration, etc.) and attitude data (inclination, etc.) with an on-board IMU (inertial measurement unit); the position data and flight data are transmitted to the user's control device. Referring to these data, the user pilots the drone by transmitting flight control commands such as movement commands and attitude control commands.

Regarding drone flight control, a conventional approach is a motion capture system in which reflective markers are installed on the ground and a high-speed infrared camera is mounted on the drone. There are also systems that use a monocular camera, a stereo camera, a wide-range measuring instrument, or the like in place of the high-speed infrared camera.

The present inventors are aware of the following non-patent documents as prior art concerning attitude control of small unmanned aerial vehicles.

D. Mellinger and V. Kumar, “Minimum snap trajectory generation and control for quadrotors,” in IEEE International Conference on Robotics and Automation, May 2011, pp. 2520-2525
M. Hehn and R. D'Andrea, “Quadcopter trajectory and control,” in Proceedings of the IFAC World Congress, 2011
Michael, Nathan, et al. “The GRASP multiple micro-UAV testbed.” Robotics & Automation Magazine, IEEE 17.3 (2010): 56-65
Heng, Lionel; Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M., “Autonomous obstacle avoidance and maneuvering on a vision-guided MAV using on-board processing,” in Robotics and Automation (ICRA), 2011 IEEE International Conference on, pp. 2472-2477, 9-13 May 2011
Stephan Weiss, Markus W. Achtelik, Simon Lynen, Margarita Chli, Roland Siegwart, “Real-time onboard visual-inertial state estimation and self-calibration on MAVs in unknown environments.” Robotics and Automation (ICRA), 2012 IEEE International Conference on, p. 957, May 2012
Shaojie Shen; Michael, N.; Kumar, V., “Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs,” in Robotics and Automation (ICRA), 2015 IEEE International Conference on, pp. 5303-5310, 26-30 May 2015
S. Grzonka, G. Grisetti, and W. Burgard, “Towards a navigation system for autonomous indoor flying,” in IEEE International Conference on Robotics and Automation, 2009
G. Xu, Y. Zhang, S. Ji, Y. Cheng, Y. Tian, “Research on computer vision-based for UAV autonomous landing on a ship,” Pattern Recognition Letters, Vol. 30(6), 2009, pp. 600-605
Lange, S.; Sunderhauf, N.; Protzel, P., “A vision based onboard approach for landing and position control of an autonomous multirotor UAV in GPS-denied environments,” in Advanced Robotics, 2009. ICAR 2009. International Conference on, pp. 1-6, 22-26 June 2009
Lugo, Jacobo Jimenez, and Andreas Zell, “Framework for autonomous on-board navigation with the AR.Drone.” Journal of Intelligent and Robotic Systems 73.1-4 (2014): 401-412
Troiani, Chiara, Stefano Al Zanati, and Alessio Martinelli. “A 3 points vision based approach for MAV localization in GPS-denied environments.” Mobile Robots (ECMR), 2013 European Conference on. IEEE, 2013
Vogt, Sebastian, et al. “Single camera tracking of marker clusters: Multiparameter cluster optimization and experimental verification.” Proceedings of the 1st International Symposium on Mixed and Augmented Reality. IEEE Computer Society, 2002
Nikolic, Janosch, et al. “A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM.” Robotics and Automation (ICRA), 2014 IEEE International Conference on. IEEE, 2014
Zhou, Guyue, et al. “On-board inertia-assisted visual odometer on an embedded system.” Robotics and Automation (ICRA), 2014 IEEE International Conference on. IEEE, 2014
Chen, Qian, Haiyuan Wu, and Toshikazu Wada. “Camera calibration with two arbitrary coplanar circles.” Computer Vision - ECCV 2004. Springer Berlin Heidelberg, 2004. 521-532
Elqursh, Ali, and Ahmed Elgammal. “Line-based relative pose estimation.” Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on. IEEE, 2011
Zhang, Zhengyou. “A flexible new technique for camera calibration.” Pattern Analysis and Machine Intelligence, IEEE Transactions on 22.11 (2000): 1330-1334
E. Rosten, R. Porter, and T. Drummond. “Faster and better: A machine learning approach to corner detection.” IEEE Trans. Pattern Analysis and Machine Intelligence, 32: 105-119, 2010
Harris, Chris, and Mike Stephens. “A combined corner and edge detector.” Alvey Vision Conference. Vol. 15. 1988
Konomura, Ryo, and Koichi Hori. “Phoenix: Zynq 7000 based quadcopter robot.” ReConFigurable Computing and FPGAs (ReConFig), 2014 International Conference on. IEEE, 2014

The conventionally proposed attitude control systems, such as motion capture systems using reflective markers and a high-speed infrared camera, have a relatively complicated configuration and relatively large attitude control errors. In addition, attitude control hardware mounted on a small unmanned aerial vehicle is subject to strict weight limits, so weight reduction is required.

The present inventors therefore propose an attitude control system for a small unmanned aerial vehicle with a novel and simple configuration that differs from conventional attitude control methods.

Specifically, an object of the present invention is to provide an autonomous attitude control system for a moving body (for example, a small unmanned aerial vehicle) with a novel and simple configuration.

A further object of the present invention is to provide a marker with a novel and simple configuration suitable for use in an autonomous attitude control system for a moving body (for example, a small unmanned aerial vehicle).

In one aspect of the invention, an attitude control system for a moving body according to the present invention comprises a monocular camera and an attitude control device mounted on the moving body, and a three-dimensional marker installed at a predetermined location; the attitude control device autonomously controls the attitude of the moving body on the basis of image data of the three-dimensional marker captured by the monocular camera.

Further, in the above attitude control system, the moving body may be a small unmanned aerial vehicle.

Further, in the above attitude control system, the three-dimensional marker may have a shape comprising at least one plane and another plane or point not contained in that plane.

Further, in the above attitude control system, the attitude control device may perform three-dimensional attitude control by converting into coordinates, based on the difference between positions known in advance and positions at the time of measurement, the image data of one plane of the three-dimensional marker captured by the monocular camera and of another plane or point not contained in that plane.

Further, in the above attitude control system, the attitude control device may acquire an image of the three-dimensional marker from the monocular camera, convert the acquired image into a scalar image, obtain the contours of colored objects from the scalar image, detect the marker and each vertex at sub-pixel level from the contours and send them to a marker management unit, perform an initial pose estimate using the vertices of the planar base, and perform iterative pose estimation through bundle adjustment using all vertices of the solid.

Further, in the above attitude control system, the three-dimensional marker may be a marker shaped as a cone, a triangular pyramid, a quadrangular pyramid, or another polygonal pyramid.

Further, in the above attitude control system, the three-dimensional marker may have a quadrangular pyramid shape, and the marker may be colored asymmetrically by changing the color of at least one of its four triangular faces.

Further, in the above attitude control system, the three-dimensional marker may be a marker in which a plurality of light-emitting elements are arranged so as to represent a three-dimensional shape.

Further, in the above attitude control system, the moving body may be an industrial robot.

Further, in the above attitude control system, the moving body may be a cart for transferring articles that travels on a straight rail.

Further, in one aspect of the invention, a marker according to the present invention is a three-dimensional marker used in an autonomous attitude control system for a moving body that can move in three dimensions.

Further, the three-dimensional marker is a marker having a shape comprising at least one plane and another plane or point not contained in that plane.

According to the present invention, an autonomous attitude control system for a moving body (for example, a small unmanned aerial vehicle) with a novel and simple configuration can be provided.

Further, according to the present invention, a marker with a novel and simple configuration suitable for use in an autonomous attitude control system for a moving body (for example, a small unmanned aerial vehicle) can be provided.

FIG. 1A illustrates an overview of the autonomous attitude control system for a drone according to the present embodiment.
FIG. 1B illustrates 6-DoF (six degrees of freedom) attitude control of the drone.
FIG. 2A is a block diagram of the attitude control device mounted on the drone in the autonomous attitude control system shown in FIG. 1A.
FIG. 2B shows an example of the marker used in the autonomous attitude control system shown in FIG. 1A.
FIG. 3 shows the flow of 6-DoF attitude measurement according to the present embodiment.
FIG. 4A shows, for comparison of the two-dimensional and three-dimensional markers, 16 points on the two-dimensional marker after a small translation t_x.
FIG. 4B shows 16 points on the three-dimensional marker after a small translation t_x.
FIG. 4C shows 16 points on the two-dimensional marker after a small rotation θ_y.
FIG. 4D shows 16 points on the three-dimensional marker after a small rotation θ_y.

Hereinafter, an embodiment of the attitude control system for a moving body according to the present invention will be described with reference to the accompanying drawings, taking as an example of the moving body a small unmanned aerial vehicle (drone) capable of moving in three-dimensional space. In the drawings, the same reference numerals denote the same elements, and duplicate descriptions are omitted.

[Overview of drone attitude control]
FIG. 1A illustrates an overview of the autonomous attitude control system for a drone according to the present embodiment. A feature of this embodiment is that the autonomous attitude control system of the drone has a simple configuration: the drone 2 controls its own attitude autonomously while photographing, with the camera 20 mounted on the drone 2, a marker 6 installed on the ground in advance.

For autonomous attitude control, the configuration is simple: it does not require the elements shown with broken lines in FIG. 1A, such as data from the GPS satellite 1 or data from the control device 4 of the user 3.

FIG. 1B illustrates 6-DoF (six degrees of freedom) attitude control of the drone. In general, the degrees of freedom of motion an object can take in three-dimensional space are called 6-DoF (six degrees of freedom): the object can move forward or backward, up or down, and left or right (in other words, translate along each of the three orthogonal coordinate axes), and can rotate about each of those axes. For the rotations, an aircraft such as the drone 2 uses the terms pitching, yawing, and rolling. Pitching is rotation about the drone's left-right axis (so-called nose up and down). Yawing is rotation about the drone's vertical axis (in the so-called horizontal plane). Rolling is rotation (or tilting) about the drone's front-rear axis.
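As a minimal illustration of how the three rotation angles compose into a single attitude, the sketch below builds a rotation matrix from roll, pitch, and yaw with NumPy. The axis assignment and composition order are assumptions chosen for illustration (conventions vary, and the embodiment later refers to Z-X-Y Euler angles); the patent itself does not prescribe this code.

```python
import numpy as np

def rot_x(a):  # roll: rotation about the front-rear (x) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):  # pitch: rotation about the left-right (y) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):  # yaw: rotation about the vertical (z) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def attitude(roll, pitch, yaw):
    # One Z-X-Y style composition (yaw applied first, then roll, then pitch);
    # the exact order used by the embodiment is an assumption here.
    return rot_y(pitch) @ rot_x(roll) @ rot_z(yaw)

R = attitude(np.deg2rad(2.0), np.deg2rad(-1.5), np.deg2rad(30.0))
print(np.round(R, 3))
```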

[Autonomous attitude control system]
(Hardware)
FIG. 2A is a block diagram of the attitude control device 8 mounted on the drone 2 in the autonomous attitude control system shown in FIG. 1A. The attitude control device 8 comprises a monocular camera 20, a single CPU 16, and a memory 18 mounted on a substrate 10. These may also be implemented with an FPGA.

The triaxial gyro sensor 14 is mounted in order to verify the effect of the present embodiment and is not an essential element of the embodiment. However, the triaxial gyro sensor 14 may be mounted if desired.

Conventionally, an accelerometer has been indispensable for measuring the roll and pitch angles of the drone 2 with respect to the direction of gravity. With the 6-DoF attitude measurement of the present embodiment, however, stable flight attitude and position control of the drone became possible without equipping the attitude control device 8 with an accelerometer.

The monocular (single-lens) camera 20 is a camera with a single lens. Normally, a binocular (stereo) camera with two lenses is required for distance measurement and 3D photography because parallax is used. In the present embodiment, attitude control with a monocular camera became possible by devising the marker and the attitude measurement algorithm described later.

The memory 18 has a storage area in which the attitude control algorithm is written and a work area for the arithmetic processing of the CPU 16.

In addition, the drone 2 is also equipped with a motion capture device in order to obtain reference data when measuring the effect of the attitude measurement of this embodiment.

(Marker shape, design, and installation location)
The marker 6 used in this embodiment is a "three-dimensional marker". In this application, a "three-dimensional marker" means a marker whose shape comprises at least one plane and another plane or point not contained in that plane. Typically, it is a marker with a three-dimensional shape.

Preferably, it is a quadrangular pyramid marker 6 as shown in FIG. 2B. The quadrangular pyramid marker 6 is a combination of the plane through vertices 6a, 6b, 6c, and 6d, the plane through 6a, 6b, and 6e, the plane through 6b, 6c, and 6e, the plane through 6c, 6d, and 6e, and the plane through 6d, 6a, and 6e. Viewed differently, the quadrangular pyramid marker 6 is a shape that combines the plane through vertices 6a, 6b, 6c, and 6d with the point 6e.
The marker is not limited to a quadrangular pyramid, however; a cone or another polygonal pyramid may be used.

A non-coplanar arrangement may also be expressed with light-emitting elements such as LEDs.

The number of markers is not limited to one; two or more may be used. Accordingly, two planar markers may be installed on two non-coplanar planes, and the two planar markers taken together may serve as a "three-dimensional marker".

With a quadrangular pyramid marker, the yaw orientation can be distinguished by changing the color of at least one of the four triangular faces so as to make the marker asymmetric.

The marker may be installed anywhere: on the ground, in the air, and so on. As long as its position is known in real time, it may also be mounted on a vehicle and move over time; it can be installed on a vehicle and the drone can follow it.

In the course of developing the marker, a planar shape (two-dimensional shape) and a solid shape (three-dimensional shape) were compared with regard to the marker's shape. In the end, a three-dimensional shape, preferably a quadrangular pyramid, was adopted because it yields a smaller pose estimation error. The reason is explained later.

(6-DoF algorithm)
The essence of this 6-DoF algorithm is, for example, to convert into coordinates the five vertices that define a quadrangular pyramid marker (the four corners of the square base and the apex), on the basis of the difference between their positions known in advance and their positions at the time of measurement. Here, even with the monocular (single-lens) camera 20, which captures two-dimensional images, three-dimensional attitude control becomes possible from the behavior of the apex within the four corners of the square base of the pyramid.

First, the pose notation and method used in this embodiment are given. The vector u_i = [u_i, v_i]^T is the i-th position in image coordinates (0 ≤ u_i < H and 0 ≤ v_i ≤ V). X_i ∈ R^3 is a point in world coordinates. T = [t_x, t_y, t_z]^T ∈ R^3 and R ∈ SO(3) are the relative camera translation and rotation. t_z is defined along the initial camera direction, and θ = [θ_x, θ_y, θ_z] denotes the Z-X-Y Euler angles of R. The L2 criterion for the reprojection error of a single image scene can then be defined as follows.

[The equations of the published patent appear only as images in the original document and are not transcribed here.]
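Since the image-only equations cannot be recovered from the text, the following LaTeX block sketches the single-scene reprojection-error objective in its usual form under the definitions above; this reconstruction is an assumption, not a transcription of the patent's equations. K denotes the camera intrinsic matrix and N the number of marker vertices.

```latex
% Assumed form of the L2 reprojection error for one image of the marker
E(R, T) \;=\; \sum_{i=1}^{N}
  \bigl\lVert\, \mathbf{u}_i - \pi\!\bigl(K\,(R\,\mathbf{X}_i + T)\bigr) \bigr\rVert_2^{2},
\qquad
\pi\!\bigl([x,\,y,\,z]^{\mathsf{T}}\bigr) \;=\; \bigl[\,x/z,\; y/z\,\bigr]^{\mathsf{T}},
\qquad
(R^{*},\,T^{*}) \;=\; \operatorname*{arg\,min}_{R \in SO(3),\; T \in \mathbb{R}^{3}} E(R, T)
```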

In the course of developing the marker, a planar shape (two-dimensional shape) and a solid shape (three-dimensional shape) were compared with regard to the marker's shape. In the end, a "three-dimensional shape", preferably a quadrangular pyramid, was adopted because it yields a smaller pose estimation error. The reason is briefly explained below.

A simple example illustrates the difference between a coplanar marker (two-dimensional marker) and a non-coplanar marker (three-dimensional marker) as photographed by the camera. FIGS. 4A to 4D show the 16 points of the marker when the camera is moved in two ways: a small translation t_x (FIGS. 4A and 4B) and a small rotation θ_y (FIGS. 4C and 4D). With the two-dimensional marker, only a slight difference is seen between the translation and the rotation. With the three-dimensional marker, however, there is one point (◎) whose depth differs from that of the other 15 coplanar points, so the difference between the change due to the translation t_x and that due to the rotation θ_y is clearly visible.
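This ambiguity can be reproduced numerically. The toy sketch below projects a 4 × 4 grid of marker points through a pinhole model, once after a small translation and once after a small rotation, and compares the resulting pixel shifts; the grid spacing, camera parameters, and motion magnitudes are assumptions chosen only to make the effect visible.

```python
import numpy as np

f, cx, cy = 600.0, 320.0, 240.0  # assumed pinhole intrinsics

def project(P):
    """Project Nx3 camera-frame points (metres) to pixel coordinates."""
    return np.stack([f * P[:, 0] / P[:, 2] + cx,
                     f * P[:, 1] / P[:, 2] + cy], axis=1)

# 4 x 4 grid of marker points placed 1 m in front of the camera
g = np.linspace(-0.095, 0.095, 4)
X, Y = np.meshgrid(g, g)
planar = np.stack([X.ravel(), Y.ravel(), np.full(16, 1.0)], axis=1)

non_planar = planar.copy()
non_planar[5, 2] -= 0.12   # one interior point 12 cm closer to the camera (the "apex")

def translated(P, tx=0.02):        # small sideways translation (metres)
    Q = P.copy()
    Q[:, 0] += tx
    return Q

def rotated(P, a=0.02):            # small rotation (radians) about the y axis
    Ry = np.array([[np.cos(a), 0.0, np.sin(a)],
                   [0.0,       1.0, 0.0      ],
                   [-np.sin(a), 0.0, np.cos(a)]])
    return P @ Ry.T

for name, P in [("planar marker", planar), ("non-planar marker", non_planar)]:
    base = project(P)
    shift_t = project(translated(P)) - base
    shift_r = project(rotated(P)) - base
    # If the two motions produce nearly identical pixel shifts, a single image
    # cannot distinguish them; a larger gap means they are separable.
    gap = np.linalg.norm(shift_t - shift_r, axis=1).max()
    print(f"{name}: max pixel difference between translation and rotation = {gap:.2f} px")
```

With these assumed values the planar grid gives a sub-pixel gap between the two motions, while the single out-of-plane point enlarges the gap by roughly an order of magnitude, which is the behavior the figures illustrate.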

(6-DoF attitude measurement flow)
FIG. 3 shows the flow of the 6-DoF attitude measurement according to the present embodiment. This procedure is executed by the CPU 16 according to the algorithm written in the memory 18 in advance.

In step S01, an image containing the marker 6 is acquired from the monocular camera 20.

In step S02, the acquired image is converted into a scalar image.

In step S03, the contours of colored objects are obtained from the scalar image and sent to the CPU 16.

In step S04, the CPU 16 detects the marker 6 and each of the vertices 6a to 6e at sub-pixel level from the contours of the colored objects.

In step S05, the marker 6 and the vertices 6a to 6e are sent to the marker management unit.

In step S06, an initial pose estimate is computed using the vertices 6a to 6d of the planar base of the marker 6.

In step S07, iterative pose estimation is performed through bundle adjustment using the vertices 6a to 6e of the solid marker 6.
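A compact sketch of steps S01 to S07 is given below. The threshold values, the camera intrinsics, and the use of OpenCV's contour, corner-refinement, and PnP routines are assumptions made for illustration; they stand in for the embodiment's marker management unit and bundle adjustment rather than reproducing them, and the correspondence between detected corners and the model vertices is assumed to be already resolved.

```python
import numpy as np
import cv2

K = np.array([[600.0, 0, 320.0], [0, 600.0, 240.0], [0, 0, 1.0]])  # assumed intrinsics
DIST = np.zeros(5)
OBJ = np.array([[-0.095, -0.095, 0], [0.095, -0.095, 0], [0.095, 0.095, 0],
                [-0.095, 0.095, 0], [0, 0, 0.12]], dtype=np.float64)  # vertices 6a..6e [m]

def measure_pose(frame_bgr):
    # S01-S02: acquire the image and reduce it to a single-channel (scalar) image
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # S03: extract contours of the marker regions (simple Otsu threshold assumed)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None

    # S04: take the largest contour as the marker and refine its corners to sub-pixel level
    marker = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(marker, 0.02 * cv2.arcLength(marker, True), True)
    if len(approx) < 5:
        return None
    pts = approx.reshape(-1, 1, 2).astype(np.float32)[:5]
    pts = cv2.cornerSubPix(gray, pts, (5, 5), (-1, -1),
                           (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01))
    corners = pts.reshape(-1, 2).astype(np.float64)

    # S05-S06: hand the vertices to the pose stage; initial estimate from the
    # four coplanar base vertices (correspondence with OBJ assumed established)
    ok, rvec, tvec = cv2.solvePnP(OBJ[:4], corners[:4], K, DIST,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None

    # S07: refine iteratively using all five vertices, including the apex
    ok, rvec, tvec = cv2.solvePnP(OBJ, corners, K, DIST,
                                  rvec=rvec, tvec=tvec, useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else None
```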

[Verification experiment: flight control with a quadcopter]
In the description of the 6-DoF algorithm it was explained that the pose is measured from the five vertices of the quadrangular pyramid; in the following experiment, verification was performed with 16 points including these five. In the verification experiment, a square was chosen as the coplanar shape and the quadrangular pyramid shown in FIG. 2B as the three-dimensional shape.

The two types of markers were placed on the ground and fixed to it for the pose estimation. The size of the planar marker was 19 cm × 19 cm × 0.3 cm, and the size of the three-dimensional marker was 19 cm × 19 cm × 12 cm.
Ground-truth 6-DoF pose data for the verification were obtained with a motion capture device.

The present inventors moved the small unmanned aerial vehicle using a single marker and recorded the results of the 6-DoF pose estimation. With the planar marker, large white noise appeared in the estimates of θ_x, θ_y, t_x, and t_y. With the three-dimensional marker, however, this white noise was greatly reduced.

A comparison of the 6-DoF pose estimation with one and with two three-dimensional markers was also carried out. In this experiment, the positional relationship between the two markers (a distance of 28 cm) was registered in the attitude control device in advance. It was confirmed that with two three-dimensional markers the white noise decreased further and the pose could be estimated accurately.

[Advantages and effects of the present embodiment]
(1) Pose estimation and control using a monocular camera and a three-dimensional marker enabled autonomous flight of a moving body (a small unmanned aerial vehicle) that can move freely in three-dimensional space.
(2) In pose estimation using a monocular camera on such a moving body, the three-dimensional marker was demonstrated to be superior to the planar marker.

[Modifications]
(1) The embodiment of the present invention has been described taking a small unmanned aerial vehicle as an example of the moving body; the attitude control of the present invention, however, is not limited to this. As a moving body, a small unmanned aerial vehicle moves freely in three-dimensional space (the X, Y, and Z directions).

The attitude control of the present invention can also be applied to an industrial robot moving in two-dimensional space. In this case, the height direction (Z direction) from the floor or the like can be measured and controlled accurately by other means, so the Z-direction control is omitted and the control becomes two-dimensional (X and Y directions).

Furthermore, the attitude control of the present invention can also be applied to a body moving in one-dimensional space, for example a cart for transferring articles that travels on a straight rail in a factory. In this case, the Y- and Z-direction control is omitted and the control becomes one-dimensional (X direction).

(2) By linking the marker with GPS, the expression of the relative position between the robot and the marker can also be converted into the GPS coordinate system.
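As a sketch of such a conversion, the snippet below turns a marker-relative east/north/up offset into latitude, longitude, and altitude using a flat-earth approximation around the marker's surveyed GPS position; the marker coordinates and the approximation itself are assumptions for illustration, not part of the patent.

```python
import math

EARTH_RADIUS = 6_378_137.0  # WGS-84 equatorial radius [m]

def marker_relative_to_gps(marker_lat, marker_lon, marker_alt, east, north, up):
    """Convert an ENU offset from the marker (metres) to GPS coordinates.

    Flat-earth approximation: adequate for the short ranges at which the
    marker is observed, not for long distances.
    """
    dlat = north / EARTH_RADIUS
    dlon = east / (EARTH_RADIUS * math.cos(math.radians(marker_lat)))
    return (marker_lat + math.degrees(dlat),
            marker_lon + math.degrees(dlon),
            marker_alt + up)

# Example: marker surveyed at an assumed position; the drone is estimated to be
# 3 m east, 1.5 m north, and 2 m above the marker.
print(marker_relative_to_gps(35.6895, 139.6917, 40.0, 3.0, 1.5, 2.0))
```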

1: GPS satellite; 2: small unmanned aerial vehicle (drone); 3: drone user; 4: drone control device; 6: marker (three-dimensional marker); 8: attitude control device; 10: substrate; 14: triaxial gyro sensor; 16: CPU; 18: memory; 20: monocular (single-lens) camera

Claims (6)

1. An attitude control system for a moving body capable of moving in three-dimensional space, comprising:
a camera and an attitude control device mounted on the moving body; and
a three-dimensional marker that has the shape of a pyramid consisting of an N-sided polygonal base (N being an integer of 3 or more) and N lateral faces, and that is installed in a convex manner at a predetermined location with the base of the pyramid facing that location,
wherein the attitude control device controls the attitude of the moving body on the basis of the behavior of the apex region of the pyramid of the three-dimensional marker relative to the base region of the pyramid of the three-dimensional marker in each item of image data, containing the region of the three-dimensional marker, captured sequentially by the camera during flight of the moving body,
each item of image data is data of a two-dimensional image obtained by the camera at a respective one of times that differ along the time axis, and
as the attitude control, the position of the apex region of the pyramid of the three-dimensional marker at a given time is computed on the basis of the image data at that time and the image data at a time earlier than that time, and the attitude of the moving body is controlled on the basis of the transition of the position of the apex region of the pyramid of the three-dimensional marker at each of the times, obtained as a result of the computation at each of the times.
2. The attitude control system for a moving body according to claim 1, comprising a plurality of the three-dimensional markers,
wherein the attitude control device controls the attitude of the moving body on the basis of the behavior of the apex region of the pyramid of each of the plurality of three-dimensional markers relative to the base region of the pyramid of each of the plurality of three-dimensional markers in each item of image data containing the regions of the plurality of three-dimensional markers, and
as the attitude control, the position of the apex region of the pyramid of each of the plurality of three-dimensional markers at a given time is computed on the basis of the image data at that time and the image data at a time earlier than that time, and the attitude of the moving body is controlled.
3. The attitude control system for a moving body according to claim 2, wherein the moving body is a small unmanned aerial vehicle.
4. The attitude control system for a moving body according to any one of claims 1 to 3, wherein the attitude control device performs three-dimensional attitude control by converting into coordinates, based on the difference between positions known in advance and positions at the time of measurement, the image data of the base of the three-dimensional marker captured by the camera and of the apex not contained in that plane.
5. The attitude control system for a moving body according to claim 4, wherein the attitude control device acquires an image of the three-dimensional marker from the camera, converts the acquired image into a scalar image, obtains the contours of colored objects from the scalar image, detects the three-dimensional marker and each vertex at sub-pixel level from the contours, sends them to a marker management unit, performs an initial pose estimate using the vertices of the planar base, and performs iterative pose estimation through bundle adjustment using all vertices of the solid.
6. The attitude control system for a moving body according to any one of claims 1 to 5, wherein the three-dimensional marker is a marker in which a plurality of light-emitting elements are arranged so as to represent a three-dimensional shape.
JP2016200639A 2016-10-12 2016-10-12 Attitude control system for moving objects Active JP6925603B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2016200639A JP6925603B2 (en) 2016-10-12 2016-10-12 Attitude control system for moving objects
PCT/JP2017/037041 WO2018070486A1 (en) 2016-10-12 2017-10-12 Attitude control system for mobile body, and three-dimensional marker

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2016200639A JP6925603B2 (en) 2016-10-12 2016-10-12 Attitude control system for moving objects

Publications (2)

Publication Number Publication Date
JP2018063512A JP2018063512A (en) 2018-04-19
JP6925603B2 true JP6925603B2 (en) 2021-08-25

Family

ID=61906429

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016200639A Active JP6925603B2 (en) 2016-10-12 2016-10-12 Attitude control system for moving objects

Country Status (2)

Country Link
JP (1) JP6925603B2 (en)
WO (1) WO2018070486A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109739254B (en) * 2018-11-20 2021-11-09 国网浙江省电力有限公司信息通信分公司 Unmanned aerial vehicle adopting visual image positioning in power inspection and positioning method thereof
WO2021102797A1 (en) * 2019-11-28 2021-06-03 深圳市大疆创新科技有限公司 Gimbal control method, control device, and control system
WO2021181481A1 (en) * 2020-03-09 2021-09-16 三菱電機ビルテクノサービス株式会社 Water meter reading device
CN112558629A (en) * 2020-11-30 2021-03-26 广西电网有限责任公司电力科学研究院 System and method for realizing unmanned aerial vehicle inspection task
KR102585428B1 (en) * 2021-07-26 2023-10-11 주식회사 제이슨랩 An automatic landing system to guide the drone to land precisely at the landing site

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03170289A (en) * 1989-11-30 1991-07-23 Nec Corp Target grip
JP2000293696A (en) * 1999-04-07 2000-10-20 Matsushita Electric Ind Co Ltd Picture recognizing device
US7437226B2 (en) * 2003-08-20 2008-10-14 Samsung Electronics Co., Ltd. Method of constructing artificial mark for autonomous driving, apparatus and method of determining position of intelligent system using artificial mark and intelligent system employing the same
JP5085230B2 (en) * 2007-08-24 2012-11-28 パナソニック株式会社 Mobile system
JP2009217798A (en) * 2008-02-14 2009-09-24 Seiko Epson Corp Contour detection method, contour detection device, and contour detection program
JP2010249628A (en) * 2009-04-15 2010-11-04 Toyota Industries Corp Position detector for movable body and method for detecting position of movable body using camera
JP2012014262A (en) * 2010-06-29 2012-01-19 Yaskawa Electric Corp Vehicle moving system and landmark
JP6227993B2 (en) * 2013-12-12 2017-11-08 株式会社Ihi Robot remote control system and method

Also Published As

Publication number Publication date
JP2018063512A (en) 2018-04-19
WO2018070486A1 (en) 2018-04-19

Similar Documents

Publication Publication Date Title
JP6925603B2 (en) Attitude control system for moving objects
US10475209B2 (en) Camera calibration
Nguyen et al. Robust target-relative localization with ultra-wideband ranging and communication
Bähnemann et al. A decentralized multi-agent unmanned aerial system to search, pick up, and relocate objects
TWI827649B (en) Apparatuses, systems and methods for vslam scale estimation
Loianno et al. Cooperative localization and mapping of MAVs using RGB-D sensors
CN110077595B (en) Automatic landing and recovery system of unmanned autonomous aircraft under complex dynamic bump condition
Mohta et al. Vision-based control of a quadrotor for perching on lines
Paul et al. Landing of a multirotor aerial vehicle on an uneven surface using multiple on-board manipulators
Loianno et al. A swarm of flying smartphones
Yang et al. Visual SLAM for autonomous MAVs with dual cameras
Konomura et al. FPGA-based 6-DoF pose estimation with a monocular camera using non co-planer marker and application on micro quadcopter
JP6473188B2 (en) Method, apparatus and program for generating depth map
Miller et al. Optical Flow as a navigation means for UAV
Hinzmann et al. Robust map generation for fixed-wing UAVs with low-cost highly-oblique monocular cameras
Razinkova et al. Tracking a moving ground object using quadcopter UAV in a presence of noise
Baek et al. Position tracking system using single RGB-D Camera for evaluation of multi-rotor UAV control and self-localization
Shastry et al. Autonomous detection and tracking of a high-speed ground vehicle using a quadrotor UAV
Gomez-Balderas et al. Vision-based autonomous hovering for a miniature quad-rotor
Ajmera et al. Autonomous visual tracking and landing of a quadrotor on a moving platform
Azrad et al. Quadrotor uav indoor localization using embedded stereo camera
Qi et al. An Autonomous Pose Estimation Method of MAV Based on Monocular Camera and Visual Markers
Haecker et al. An experimental study of visual flight trajectory tracking and pose prediction for the automatic computer control of a miniature airship
TaoZhang UAV 3D mapping with RGB-D camera
Li et al. Off-board visual odometry and control of an ultralight quadrotor mav

Legal Events

Date Code Title Description
RD03 Notification of appointment of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7423

Effective date: 20170706

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20191007

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20200825

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20201026

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20201222

A601 Written request for extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A601

Effective date: 20210222

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20210420

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20210629

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20210728

R150 Certificate of patent or registration of utility model

Ref document number: 6925603

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150