WO2019209169A1 - Precise positioning system - Google Patents

Precise positioning system

Info

Publication number
WO2019209169A1
Authority
WO
WIPO (PCT)
Prior art keywords
landmark
sub
pose
camera
positioning system
Prior art date
Application number
PCT/SG2018/050205
Other languages
English (en)
Inventor
Qinghua Xia
Original Assignee
Unitech Mechatronics Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Unitech Mechatronics Pte Ltd filed Critical Unitech Mechatronics Pte Ltd
Priority to PCT/SG2018/050205 priority Critical patent/WO2019209169A1/fr
Priority to CN201880092763.XA priority patent/CN112074706A/zh
Publication of WO2019209169A1 publication Critical patent/WO2019209169A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Definitions

  • the present disclosure describes an imaging panel, a camera, an inertial measurement unit, an altimeter, and an MCU that together form a precise positioning system for obtaining its pose with respect to passive or projected landmarks.
  • a precise positioning system is essential for navigation of a mobile robot such as unmanned aerial vehicle (UAV) or unmanned ground vehicle (UGV).
  • UAV unmanned aerial vehicle
  • UGV unmanned ground vehicle
  • positioning and orientation accuracy is crucial.
  • Positioning systems employing ultrasonic sensors, infrared sensors, laser range finders, wireless beacons, and vision exist in the market.
  • a problem associated with ultrasonic or infrared sensor based positioning systems is that they can only provide position information, not orientation information. In order to navigate, a mobile robot needs an additional sensor to obtain orientation information.
  • a problem associated with laser range finder based positioning systems is that the calculated position accuracy may drop in some dynamic environments; in some scenarios the system cannot obtain its own position at all.
  • vSLAM visual simultaneous localization and mapping approach
  • Wireless based positioning systems suffer from uncertainties of non-line-of-sight conditions and radio multi-path issue, which affect position accuracy.
  • Patent WO 2004/015369 A2 discloses a tracking, autocalibration, and map-building system with artificial landmarks on the ceiling as one of the positioning methods.
  • Patent CN 102419178 A discloses a mobile robot positioning system and methods based on irradiating infrared landmarks on the ceiling.
  • Patent CN 102135429 A discloses an indoor positioning system based on passive landmarks on the ceiling.
  • Patent WO 2008/013355 A1 discloses a system and method for calculating location using a combination of odometer and irradiating infrared landmarks.
  • These systems employ cameras to capture either passive or irradiating artificial landmarks images.
  • these landmarks do not contain sub-landmarks with different sizes and patterns to facilitate image recognition from different distances.
  • the effective image recognition distance of a passive or irradiating artificial landmark is shorter than that of a projected landmark.
  • a precise positioning system that can achieve a position accuracy of less than a few millimetres consists of an imaging panel, a camera, an inertial measurement unit, an altimeter, and an MCU.
  • either a passive or projected landmark consists of sub-landmarks combining big, medium, and small 2D codes, with groups of solid and hollow circles and squares around the 2D codes.
  • an imaging panel is positioned within focal range of the camera.
  • the central area of the imaging panel is made of light filtering material that filters out the unwanted light spectrum; the camera can see passive landmarks directly through the filter and obtain its pose with respect to the landmarks.
  • the remaining area of the imaging panel is made of diffusion material used to capture the landmark image projected onto it.
  • the imaging panel, camera, and MCU can be mounted on a mobile robot to obtain its global pose information while navigating.
  • an altimeter on the mobile robot can be used to get its altitude information, while the inertial measurement unit, together with the mobile robot’s odometer, can be employed to estimate its location when no information can be obtained from either landmarks or projected landmarks.
  • either passive or projected landmarks can be put on a pallet to facilitate precise alignment between the pallet and a mobile forklift for manipulation purpose.
  • directional RFID tags can be put on a pallet or cabinet for a mobile forklift to know its rough pose, and then perform precise manipulation task with the help of either passive or projected landmarks on it.
  • landmarks or landmark projectors can be put on a person's jacket for a mobile robot to follow.
  • a UAV equipped with an imaging panel, a camera, an inertial measurement unit, light detection and ranging sensors, an infrared projector, an altimeter, and an MCU can navigate around a building.
  • building luminaires can be used as reference positions for UAV navigation.
  • Landmarks and landmark projectors can be mounted next to luminaires for UAV localization purposes.
  • a projector projects landmark downwards for a camera on the UAV to capture.
  • an altimeter on the UAV is used to get the UAV's altitude information, while the inertial measurement unit, together with light detection and ranging sensors, can be employed to estimate its pose when position and orientation information is not available from luminaires, landmarks, or projected landmarks.
  • an infrared projector onboard the UAV is used to project light toward a luminaire to trigger its motion sensor and to judge the luminaire's working condition based on the brightness level variation.
  • FIG. 1 shows passive landmark, landmark projector, imaging panel and camera
  • FIG. 2 shows imaging panel and camera
  • FIG. 3 shows image processing procedure of the QR code image projected onto panel
  • FIG. 4 shows pose of projected QR code with respect to camera frame
  • FIG. 5 shows a scenario when landmark arrays are projected onto the floor
  • FIG. 6 shows construction of a UAV mounted with the positioning system
  • FIG. 7 shows UAV fleet for material delivery
  • FIG. 8 shows a mobile forklift for material handling
  • FIG. 9 shows a type of landmark construction
  • FIG. 10 shows a type of sub-landmark construction with big and small QR codes
  • FIG. 11 shows a type of sub-landmark construction with medium and small QR codes
  • FIG. 12 shows a type of sub-landmark construction with solid and hollow circles
  • FIG. 13 shows a type of sub-landmark construction with solid and hollow squares
  • FIG. 14 shows coordinate frames of components in the positioning system
  • FIG. 15 shows concept of QR code projection
  • FIG. 16 shows relationship between the ceiling and projected landmark in X direction
  • FIG. 17 shows relationship between the ceiling and projected landmark in Y direction
  • FIG. 18 shows construction of a pallet with landmarks
  • FIG. 19 illustrates pallet lifting using an autonomous mobile forklift
  • FIG. 20 shows a pallet with directional RFID tags
  • FIG. 21 shows a cabinet with directional RFID tags
  • FIG. 22 shows a robot following a person carrying a passive or projected landmark
  • FIG. 23 shows UAV navigating in a building with landmarks as reference
  • FIG. 24 shows UAV navigating with projected landmark
  • FIG. 25 shows UAV projecting infrared light to trigger motion sensor of a luminaire
  • projector 100 projects a landmark onto imaging panel 102, which is located at the focal range of camera 101.
  • Imaging panel 102 consists of optical filtering material in its central area and diffusion material over the rest of its area.
  • Camera 101 can either capture the passive landmark image 103 through light filter 104, or capture the landmark image projected onto the diffusion area of imaging panel 102.
  • a landmark may be in the form of a 2D code, or another image pattern that can be recognized and processed by an MCU.
  • one or more landmark images may be projected onto panel 102; camera 101 captures the images and transmits them to the MCU for processing.
  • FIG. 3 illustrates the MCU's image processing procedure for the projected QR code on the imaging panel.
  • the captured image on the panel is first converted into a black-and-white image, and then the three finder squares of the QR code are identified. Based on the three identified squares, local coordinate frame O_q is assigned to the QR code, and the coordinates of the four corners D1, D2, D3, D4 can be obtained.
  • the content of the QR code, which is unique and represents its relative location with respect to landmark frame O_m, can also be obtained.
  • the local pose of the image with respect to camera frame O_c can be obtained. Once the landmarks are fixed in place, their global poses are known. With this information, the camera's global pose in the XY plane can be obtained.
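The procedure above (identify the three finder squares, assign local frame O_q, extract the corners) can be sketched in a few lines. This is our own minimal illustration, not the patent's implementation; the function names are ours, and it assumes an undistorted, top-down view with the finder squares at the top-left, top-right, and bottom-left corners:

```python
import numpy as np

def qr_local_frame(tl, tr, bl):
    """Assign local frame O_q to a QR code from its three finder-square
    centres (pixel coordinates): origin at the top-left square, yaw taken
    from the top-left -> top-right direction."""
    tl, tr, bl = (np.asarray(p, float) for p in (tl, tr, bl))
    x_axis = tr - tl                        # code's +X axis in the image
    yaw = np.arctan2(x_axis[1], x_axis[0])  # rotation w.r.t. image X axis
    return tl, yaw

def fourth_corner(tl, tr, bl):
    """Estimate the fourth corner (D4) by parallelogram completion,
    assuming an affine (undistorted) view of the code."""
    tl, tr, bl = (np.asarray(p, float) for p in (tl, tr, bl))
    return tr + bl - tl
```

For an axis-aligned code with finder centres at (0, 0), (10, 0), and (0, 10), the yaw is 0 and the estimated fourth corner lands at (10, 10).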
  • FIG. 5 shows the scenario where an array of landmarks is projected onto the floor for positioning.
  • imaging panel 102 and camera 101 can be mounted on a UAV.
  • when the UAV flies within the array of projected landmarks as shown in FIG. 5, one or more landmark images will be projected onto the panel.
  • global pose of the UAV can be obtained for navigation purpose.
  • An altimeter on the UAV can be used to get its height information with respect to ground.
  • the UAV's global pose in three-dimensional space can be determined.
  • the inertial measurement unit, together with the mobile robot's odometer, which may be in the form of a visual odometer, can be employed to estimate its pose during periods when no information is available from the landmarks.
  • a UAV fleet can be deployed indoors for speedy point-to-point material delivery, as illustrated in FIG. 7.
  • the imaging panel, camera, and MCU can also be put on a UGV such as an autonomous forklift for localization, navigation, and precise manipulation.
  • FIG. 9 shows a type of landmark construction with combination of big, medium, and small QR codes, plus solid and hollow squares and circles.
  • FIG. 10 shows a type of sub-landmark that forms part of a landmark shown in FIG. 9.
  • a big QR code is nested with a smaller QR code inside, surrounded by four small QR codes near the four corners of the big QR code. In the horizontal and vertical directions, solid or hollow circles and squares are placed between the four small QR codes.
  • FIG. 11 shows a type of sub-landmark that forms part of a landmark shown in FIG. 9.
  • Four medium QR codes arranged in an array of two rows by two columns are also surrounded by four small QR codes near the four corners of the array. In the horizontal and vertical directions, solid or hollow circles and squares are placed between the four small QR codes.
  • FIG. 12 shows a type of sub-landmark with eight solid or hollow circles, with a solid circle representing "1" and a hollow circle representing "0".
  • the combination of solid and hollow circles represents the position of the sub-landmark in the X direction of the landmark frame. For example, counting from left to right, seven hollow circles followed by one solid circle represent binary "00000001", indicating its unique position.
  • FIG. 13 shows another type of sub-landmark with eight solid or hollow squares, with a solid square representing "1" and a hollow square representing "0".
  • the combination of solid and hollow squares represents the location of the sub-landmark in the Y direction of the landmark frame. For example, counting from bottom to top, six hollow squares followed by two solid squares represent binary "00000011", indicating its unique position.
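The encoding in FIG. 12 and FIG. 13 amounts to reading the eight marks as one byte, first mark read as the most significant bit. A minimal sketch (the function name is ours, not the patent's):

```python
def decode_sub_landmark(marks):
    """Decode the position index encoded by eight solid/hollow marks.

    `marks` is a sequence of booleans read left-to-right (X code) or
    bottom-to-top (Y code); True = solid = "1", False = hollow = "0".
    The first mark read is the most significant bit.
    """
    value = 0
    for solid in marks:
        value = (value << 1) | int(solid)
    return value
```

Seven hollow circles followed by one solid circle decode to 1 ("00000001"); six hollow squares followed by two solid squares decode to 3 ("00000011").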
  • FIG. 14 shows the assignment of the world, robot, camera, panel, and landmark coordinate frames O_w, O_r, O_c, O_p, and O_m. All coordinate systems are right-handed: the direction of the z-axis is determined by pointing the fingers of the right hand along the positive x-axis and curling them toward the positive y-axis.
  • the world frame O_w is fixed at a location; the robot frame O_r is attached to the mobile robot; the origin of the camera frame O_c is placed at the centre of its focal lens and attached to the mobile robot; the origin of the panel frame O_p is placed at the top centre of the imaging panel and also attached to the mobile robot; the landmark frame O_m can be on the ceiling.
  • the pose of an object with respect to a reference frame 0 can be represented by a 4x4 homogeneous transformation matrix T_0.
  • the upper-left 3x3 submatrix of the 4x4 matrix T_0 represents the relative orientation of the object with respect to the reference frame 0.
  • the upper-right 3x1 vector represents the object's position with respect to the same frame.
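The matrix structure described above can be written down directly. A small numpy sketch (the helper names are ours) building T from a 3x3 rotation R and a 3x1 position p:

```python
import numpy as np

def make_pose(R, p):
    """Build a 4x4 homogeneous transformation matrix: the 3x3 rotation R
    is the upper-left block, the 3x1 position p the upper-right column,
    and the bottom row is [0, 0, 0, 1]."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.ravel(p)
    return T

def rot_z(theta):
    """Right-handed rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
```

Composing two such matrices with `@` chains the corresponding frame changes, which is what the derivation below relies on.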
  • the homogeneous transformation matrices of the landmark frame with respect to the world frame, each sub-landmark pose with respect to the landmark frame, the robot frame with respect to the world frame, the camera frame with respect to the robot frame, the panel frame with respect to the camera frame, and each sub-landmark pose with respect to the panel frame are denoted as T_w,m, T_m,q, T_w,r, T_r,c, T_c,p, and T_p,q respectively.
  • the objective of the positioning system is to obtain the pose of a mobile robot with respect to the world coordinate frame, T_w,r.
  • since the camera is attached to a mobile robot, its pose with respect to the robot frame is known and represented as T_r,c.
  • FIG. 15 illustrates the relationship between the QR code landmark on the ceiling and its projection onto the imaging panel.
  • the projected QR code image is captured by the camera and then processed by the MCU to get its four corners' coordinates expressed in pixels.
  • the four corners are denoted as D1, D2, D3, and D4.
  • D1 and D2's positions in the camera frame are expressed as [x_c,D1, y_c,D1, 0]^T and [x_c,D2, y_c,D2, 0]^T; from these, their positions on the imaging panel with respect to the camera frame can be obtained.
  • FIG. 16 and FIG. 17 show the relationship between a ceiling QR code in landmark frame O_m and its projected QR code. Based on the corner positions expressed in the camera frame, the position of ceiling QR code corner d1 expressed in frame O_p can be obtained.
  • the angle θ_(d1,d2) of the ceiling QR code with respect to the X axis of frame O_p can be obtained; thus the homogeneous transformation matrix of the ceiling QR code with respect to frame O_p can be expressed accordingly.
  • the pose of the robot with respect to the world frame can then be obtained by composing these transformations.
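The derivation above closes the transform chain: the sub-landmark's known global pose T_w,m T_m,q must equal the pose measured through the robot, T_w,r T_r,c T_c,p T_p,q, so T_w,r can be isolated. A sketch of this composition in numpy (our own notation, assuming exact measurements and 4x4 homogeneous matrices):

```python
import numpy as np

def robot_world_pose(T_wm, T_mq, T_rc, T_cp, T_pq):
    """Solve T_w,r from the chain
       T_w,m @ T_m,q = T_w,r @ (T_r,c @ T_c,p @ T_p,q)."""
    T_wq = T_wm @ T_mq            # sub-landmark pose in the world frame
    T_rq = T_rc @ T_cp @ T_pq     # sub-landmark pose in the robot frame
    return T_wq @ np.linalg.inv(T_rq)
```

If every transform except T_w,r is known, a single recognized sub-landmark fixes the robot's world pose.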
  • when a sub-landmark shown in FIG. 12 with eight solid or hollow circles is projected onto the imaging panel, its pose T_p,q can also be obtained following a derivation procedure similar to that described above, and the pose of the robot with respect to the world frame T_w,r can be obtained. If only part of the sub-landmark's eight solid or hollow circles is projected onto the imaging panel, the exact pose of the sub-landmark T_p,q cannot be obtained, but its orientation can still be obtained; thus the orientation of the robot with respect to the world frame can be obtained.
  • FIG. 18 shows construction of a pallet with either passive or projected landmarks put at the legs of a pallet.
  • the poses of the left, middle, and right landmarks on the pallet with respect to a mobile forklift frame can be obtained as T_r,ql, T_r,qm, and T_r,qr.
  • the forklift can use the parameters to align with the pallet properly and perform precise pallet lifting task.
  • FIG. 20 shows a pallet mounted with directional RFID tags, with one RFID tag restricting the tag reading zone to the horizontal plane only, and three RFID tags restricting the tag reading zones to the left, middle, and right vertical planes only.
  • An autonomous forklift equipped with an RFID reader will know the pallet's rough pose based on the readings from the RFID tags on the pallet. Combining this with either passive or projected landmarks on the pallet, the forklift can identify the pallet's rough pose first and then perform a precise pallet lifting task.
  • FIG. 21 shows a cabinet mounted with directional RFID tags, with one RFID tag restricting the tag reading zone to the horizontal plane only, and three RFID tags restricting the tag reading zones to the left, middle, and right vertical planes only.
  • An autonomous forklift equipped with an RFID reader will know the cabinet’s rough pose based on the readings from the RFID tags on the cabinet.
  • FIG. 22 shows a concept in which a person wears either passive or projected landmarks; a robot with the positioning system calculates its pose with respect to the landmarks and follows the person in front of it.
  • FIG. 23 illustrates a UAV navigating in a building with luminaires, passive landmarks, and landmark projectors as reference locations.
  • FIG. 24 shows a scenario when a UAV is obtaining its pose with respect to the projector’s landmark based on the projected landmark on the UAV’s imaging panel, and navigating in the building to carry out inspection and surveillance tasks.
  • FIG. 25 illustrates a scenario when a UAV carrying infrared projector is projecting infrared light to trigger the motion sensor of a luminaire.
  • the onboard camera can be used to detect whether a luminaire's brightness level changes after triggering. If the brightness level is adjusted up, the luminaire is working well; if not, the faulty luminaire needs to be replaced.
  • the UAV can detect and record the working condition to facilitate building lighting maintenance.
  • the UAV can also carry out building surveillance work together with the onboard inertial measurement unit, light detection and ranging sensors, and an altimeter.
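The brightness-variation check described above can be sketched as a simple before/after comparison of mean image intensity. The threshold value and the mean-intensity criterion are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def luminaire_ok(frame_before, frame_after, min_rise=30.0):
    """Judge a luminaire's working condition after its motion sensor is
    triggered: a healthy luminaire brightens, so the mean image intensity
    should rise by at least `min_rise` grey levels."""
    before = float(np.mean(frame_before))
    after = float(np.mean(frame_after))
    return (after - before) >= min_rise
```

In practice the two frames would be cropped to the luminaire's region of interest before averaging, so that unrelated scene changes do not skew the result.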

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention concerns a precise positioning system that can achieve a position accuracy of less than a few millimetres, composed of a camera, an MCU, an imaging panel whose central area is made of optical filtering material with the remaining area made of diffusion material, an inertial measurement unit, and an altimeter. The camera captures a passive or projected landmark image and obtains its pose with respect to the landmark frame. If passive landmarks and landmark projectors are fixed at certain locations, their global poses are known. With known relationships between the positioning system and the camera, landmark, and world frames, the global pose of the positioning system with respect to the world frame can be obtained. An altimeter can be used to obtain altitude information. With the altitude information and the global pose in the XY plane, the system's global pose in three-dimensional space can be determined.
PCT/SG2018/050205 2018-04-28 2018-04-28 Système de positionnement précis WO2019209169A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/SG2018/050205 WO2019209169A1 (fr) 2018-04-28 2018-04-28 Système de positionnement précis
CN201880092763.XA CN112074706A (zh) 2018-04-28 2018-04-28 精确定位系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SG2018/050205 WO2019209169A1 (fr) 2018-04-28 2018-04-28 Système de positionnement précis

Publications (1)

Publication Number Publication Date
WO2019209169A1 true WO2019209169A1 (fr) 2019-10-31

Family

ID=68294489

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2018/050205 WO2019209169A1 (fr) 2018-04-28 2018-04-28 Système de positionnement précis

Country Status (2)

Country Link
CN (1) CN112074706A (fr)
WO (1) WO2019209169A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112875578A (zh) * 2020-12-28 2021-06-01 深圳市易艾得尔智慧科技有限公司 一种无人叉车控制系统
JP2022007511A (ja) * 2020-06-26 2022-01-13 株式会社豊田自動織機 認識装置、認識方法、及びマーカ
JP7466813B1 (ja) 2023-04-07 2024-04-12 三菱電機株式会社 自動接続機構、自律走行車、および自動接続方法

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2004015369A2 (fr) * 2002-08-09 2004-02-19 Intersense, Inc. Systeme de suivi, d'etalonnage automatique et d'elaboration de plan
JP2004328496A (ja) * 2003-04-25 2004-11-18 Toshiba Corp 画像処理方法
CN102135429A (zh) * 2010-12-29 2011-07-27 东南大学 一种基于视觉的机器人室内定位导航方法
CN102419178A (zh) * 2011-09-05 2012-04-18 中国科学院自动化研究所 基于红外路标的移动机器人定位系统和方法
US20150153639A1 (en) * 2012-03-02 2015-06-04 Mitsubishi Paper Mills Limited Transmission type screen
CN105184343A (zh) * 2015-08-06 2015-12-23 吴永 一种复合条码
US20160078335A1 (en) * 2014-09-15 2016-03-17 Ebay Inc. Combining a qr code and an image
CN107450540A (zh) * 2017-08-04 2017-12-08 山东大学 基于红外路标的室内移动机器人导航系统及方法

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US7903048B2 (en) * 2004-06-18 2011-03-08 Pioneer Corporation Information display apparatus and navigation apparatus
CN201548685U (zh) * 2009-11-26 2010-08-11 山东大学 顶棚投影器辅助导航装置
CN104641315B (zh) * 2012-07-19 2017-06-30 优泰机电有限公司 3d触觉感应设备
CN104766309A (zh) * 2015-03-19 2015-07-08 江苏国典艺术品保真科技有限公司 一种平面特征点导航定位方法与装置

Non-Patent Citations (2)

Title
COLIOS C. I. ET AL.: "A framework for visual landmark identification based on projective and point-permutation invariant vectors", ROBOTICS AND AUTONOMOUS SYSTEMS, vol. 35, no. 1, 30 April 2001 (2001-04-30), pages 37 - 51, XP004231364, [retrieved on 20180626], DOI: 10.1016/S0921-8890(00)00129-9 *
TIAN H.: "QR Code and Its Applications on Robot Self-localization", A THESIS IN PATTERN RECOGNITION AND INTELLIGENCE SYSTEM, 30 June 2014 (2014-06-30), pages 1 - 72 *

Cited By (5)

Publication number Priority date Publication date Assignee Title
JP2022007511A (ja) * 2020-06-26 2022-01-13 株式会社豊田自動織機 認識装置、認識方法、及びマーカ
JP7351265B2 (ja) 2020-06-26 2023-09-27 株式会社豊田自動織機 認識装置及び認識方法
CN112875578A (zh) * 2020-12-28 2021-06-01 深圳市易艾得尔智慧科技有限公司 一种无人叉车控制系统
CN112875578B (zh) * 2020-12-28 2024-05-07 深圳鹏鲲智科技术有限公司 一种无人叉车控制系统
JP7466813B1 (ja) 2023-04-07 2024-04-12 三菱電機株式会社 自動接続機構、自律走行車、および自動接続方法

Also Published As

Publication number Publication date
CN112074706A (zh) 2020-12-11

Similar Documents

Publication Publication Date Title
US10930015B2 (en) Method and system for calibrating multiple cameras
CN108571971B (zh) 一种agv视觉定位系统及方法
US11448762B2 (en) Range finder for determining at least one geometric information
TWI827649B (zh) 用於vslam比例估計的設備、系統和方法
CN102419178B (zh) 基于红外路标的移动机器人定位系统和方法
CN107687855B (zh) 机器人定位方法、装置及机器人
US20140267703A1 (en) Method and Apparatus of Mapping Landmark Position and Orientation
CN110009682B (zh) 一种基于单目视觉的目标识别定位方法
US11614743B2 (en) System and method for navigating a sensor-equipped mobile platform through an environment to a destination
Khazetdinov et al. Embedded ArUco: a novel approach for high precision UAV landing
EP3113147B1 (fr) Dispositif de calcul de position propre et procédé de calcul de position propre
JP2014013146A5 (fr)
WO2019209169A1 (fr) Système de positionnement précis
CN114415736B (zh) 一种无人机多阶段视觉精准降落方法和装置
CN106370160A (zh) 一种机器人室内定位系统和方法
CN107436422A (zh) 一种基于红外线灯立体阵列的机器人定位方法
CN113390426A (zh) 定位方法、装置、自移动设备和存储介质
CN106403926B (zh) 一种定位方法和系统
JP2010078466A (ja) マーカ自動登録方法及びシステム
JP5874252B2 (ja) 対象物との相対位置計測方法と装置
CN100582653C (zh) 一种采用多束光确定位置姿态的系统和方法
Aminzadeh et al. Implementation and performance evaluation of optical flow navigation system under specific conditions for a flying robot
Mutka et al. A low cost vision based localization system using fiducial markers
Cucchiara et al. Efficient Stereo Vision for Obstacle Detection and AGV Navigation.
Karakaya et al. A hybrid indoor localization system based on infra-red imaging and odometry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18916399

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18916399

Country of ref document: EP

Kind code of ref document: A1