WO2015083875A1 - Method and mobile system for estimating a camera location through particle generation and selection - Google Patents

Method and mobile system for estimating a camera location through particle generation and selection

Info

Publication number
WO2015083875A1
WO2015083875A1 PCT/KR2013/012102
Authority
WO
WIPO (PCT)
Prior art keywords
camera
estimating
reliability
position estimation
estimated
Prior art date
Application number
PCT/KR2013/012102
Other languages
English (en)
Korean (ko)
Inventor
김정호
최병호
황영배
배주한
Original Assignee
전자부품연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전자부품연구원
Publication of WO2015083875A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Definitions

  • the present invention relates to camera position estimation and, more particularly, to a method for calculating the position of a mobile system such as a camera, or a robot or vehicle equipped with a camera.
  • the probabilistic filtering technique for camera position estimation is based on a particle filter that predicts the current camera pose from a motion model and then updates the predicted pose from observed data.
  • the present invention has been made to solve the above problems. An object of the present invention is to provide a method for estimating camera position through particle generation and selection that ensures the accuracy of the current position without seriously reducing the processing speed, and a mobile system applying this method.
  • a camera position estimation method for achieving the above object includes: a first estimation step of estimating the position of the camera by a global position estimation technique; a second estimation step of estimating the position of the camera, starting from the position estimated in the first estimation step, by a sequential position estimation technique; a step of calculating the reliability of the position estimated in the second estimation step; and a step of re-performing from the second estimation step when the calculated reliability is greater than or equal to a reference.
  • the method may further include a step of performing again from the first estimation step when the reliability is below the reference.
  • the first estimating step may estimate the position of the camera by using corresponding points found by feature point matching between the previous image and the current image.
  • the second estimation step may include: generating motion particles of the camera; and selecting a predetermined number of motion particles from among the generated motion particles.
  • the selection step may include: calculating weights of the motion particles; and selecting the predetermined number of motion particles with reference to the weights.
  • the weight may be the number of observations for which the absolute value of the difference between an observation in the current image and the corresponding observation in the previous image is less than a reference.
  • the sum of the weights may be calculated as the reliability.
  • a mobile system includes: a camera for generating images by shooting; and a processor that estimates the position of the camera using a global position estimation technique, estimates the position of the camera from the estimated position using a sequential position estimation technique, and, if the reliability of the estimated position is at or above a reference, estimates the position of the camera again using the sequential position estimation technique.
  • accordingly, the global position estimation technique and the sequential position estimation technique are used adaptively based on the reliability, which makes it possible to ensure the accuracy of the current position without seriously reducing the position estimation speed.
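  The adaptive scheme above can be sketched as a small control loop. Everything in this sketch (the function names, the scalar pose, and the stand-in estimators) is illustrative and not from the patent; it only shows how the reliability score drives the switch between the two techniques:

```python
import random

def global_estimate(frame):
    """Stand-in for step S110: global position estimation.
    For the sketch, we pretend it recovers the true pose exactly."""
    return frame["true_pose"]

def sequential_estimate(pose, frame, n_particles):
    """Stand-in for steps S120-S150: perturb the previous pose with
    motion particles, weight each by agreement with the observation,
    and report the sum of weights as the reliability."""
    particles = [pose + random.gauss(0.0, 0.1) for _ in range(n_particles)]
    weights = [1.0 if abs(p - frame["true_pose"]) < 0.2 else 0.0
               for p in particles]
    best = min(particles, key=lambda p: abs(p - frame["true_pose"]))
    return best, sum(weights)

def run_localization(frames, threshold, n_particles=50):
    """Adaptive loop: global estimate first, then sequential tracking,
    re-initializing globally whenever reliability drops below threshold
    (the S160 branch of the flowchart)."""
    pose = global_estimate(frames[0])                  # S110
    trajectory = [pose]
    for frame in frames[1:]:
        pose, reliability = sequential_estimate(pose, frame, n_particles)  # S120-S150
        if reliability < threshold:                    # S160-N: re-initialize
            pose = global_estimate(frame)
        trajectory.append(pose)
    return trajectory
```

  The point of the sketch is the branch at the end of each iteration: tracking is cheap and runs every frame, while the global estimator is invoked only when the summed particle weights fall below the threshold.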
  • FIG. 1 is a flowchart provided for explaining a camera position estimation method according to an embodiment of the present invention
  • FIG. 2 is a diagram illustrating a camera motion particle generation result by the RANSAC algorithm
  • FIG. 3 is a view illustrating a camera position estimation result according to the method illustrated in FIG. 1, and
  • FIG. 4 is a block diagram of a robot system according to another embodiment of the present invention.
  • the camera position estimation method according to the present embodiment selectively performs a global position estimation technique and a sequential position tracking technique in order to prevent errors from accumulating in estimating the position of the camera from an image.
  • the accuracy of the camera position estimation is improved, which will be described in detail below.
  • a position of a camera is estimated by using a global position estimation technique (S110).
  • the global position estimation technique in step S110 is a technique of estimating the movement of the camera using the corresponding points found by the feature point matching between the previous image and the current image.
  • in step S110, other types of global position estimation techniques may also be used, such as techniques based on color matching instead of feature point matching.
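  As a concrete illustration of the feature-point-matching option, the sketch below matches descriptors between the previous and current images by nearest neighbour with a ratio test. The Euclidean distance and the 0.8 ratio are common choices, not values specified by the patent:

```python
import math

def match_features(desc_prev, desc_curr, ratio=0.8):
    """Match each descriptor in the previous image to its nearest
    neighbour in the current image, keeping only matches whose best
    distance is clearly better than the second best (ratio test)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    matches = []
    for i, d in enumerate(desc_prev):
        # rank current-image descriptors by distance to descriptor d
        ranked = sorted(range(len(desc_curr)), key=lambda j: dist(d, desc_curr[j]))
        best, second = ranked[0], ranked[1]
        if dist(d, desc_curr[best]) < ratio * dist(d, desc_curr[second]):
            matches.append((i, best))  # one corresponding-point pair
    return matches
```

  The returned index pairs are the "corresponding points" that the global estimation step would then feed into a motion solver.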
  • the position of the camera is estimated from the position estimated in step S110 by the image-based sequential position estimation technique (S120 to S140).
  • Camera motion particles are generated from the estimated position (S120).
  • each camera motion particle consists of a camera rotation and a camera translation.
  • the camera motion particle generation in step S120 may be performed by applying a sample acquisition method by a random SAmple consensus (RANSAC) algorithm.
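  One way to read this step (a sketch under assumptions, not the patent's exact procedure): each RANSAC draw of a minimal subset of correspondences yields one fitted motion hypothesis, and every hypothesis is kept as a camera motion particle. Here the motion model is simplified to a 2-D translation; the patent's particles carry a full rotation and translation:

```python
import random

def generate_motion_particles(prev_pts, curr_pts, n_particles=100, sample_size=2):
    """Generate motion particles RANSAC-style: each particle is the
    motion (here a 2-D translation) fitted to a random minimal subset
    of point correspondences between the previous and current images."""
    particles = []
    for _ in range(n_particles):
        idx = random.sample(range(len(prev_pts)), sample_size)
        # fit the translation as the mean displacement of the sampled pairs
        dx = sum(curr_pts[i][0] - prev_pts[i][0] for i in idx) / sample_size
        dy = sum(curr_pts[i][1] - prev_pts[i][1] for i in idx) / sample_size
        particles.append((dx, dy))
    return particles
```

  With noisy correspondences the particles would spread around the true motion, which is exactly the distribution FIG. 2 depicts for translation and rotation.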
  • FIG. 2 illustrates the previous image and the current image; the center of FIG. 2 illustrates the distribution of particles from the three-dimensional movement of the camera, and the right side of FIG. 2 illustrates the distribution of particles from the three-dimensional rotation of the camera.
  • a weight is calculated for the camera motion particles generated in step S120 (S130).
  • the weight w_i for the i-th camera motion particle can be calculated by Equation 1 below (reconstructed from the symbol definitions in the text):
  • Equation 1: w_i = Σ_l 1[ |z_l − π(x_i, m_l)| < τ_l ]
  • where z_l is the l-th observation in the current image, m_l is the three-dimensional coordinate from the previous image corresponding to z_l, x_i is the i-th camera motion particle, π(x_i, m_l) denotes m_l projected into the current image under the motion x_i, and τ_l is the degree of ambiguity in the image.
  • the weight may thus be regarded as the number of observations for which the absolute value of the difference between the observation in the current image and the corresponding observation from the previous image is less than the reference.
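  Reading Equation 1 as a count of observations whose residual falls below the ambiguity threshold, the weight of one particle can be sketched as follows (scalar observations for brevity; real observations would be 2-D image points):

```python
def particle_weight(observations, predictions, tau):
    """Weight of one motion particle (Equation 1): the number of
    current-image observations whose distance to the corresponding
    prediction (the previous-image point re-projected under the
    particle's motion) falls below the ambiguity threshold tau."""
    return sum(1 for z, p in zip(observations, predictions)
               if abs(z - p) < tau)
```

  A particle whose hypothesized motion explains many observations thus receives a high integer weight, and an outlier motion receives a weight near zero.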
  • when the weights have been calculated for all camera motion particles in step S130, only some of the camera motion particles generated in step S120 are selected based on the calculated weights (S140).
  • step S140 can be seen to function as the particle filter in the camera position estimation method according to the present embodiment. It should be noted that the number of camera motion particles selected in step S140 is not one but a plurality, fixed to N.
  • next, the reliability of the N camera motion particles selected in step S140 is calculated (S150).
  • the reliability C in step S150 is calculated as the sum of the weights of the selected N camera motion particles, as in Equation 2 below: C = w_1 + w_2 + … + w_N.
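  Steps S140 and S150 can be sketched together: keep the N best particles and sum their weights into the reliability C of Equation 2. The top-N-by-weight selection rule is an assumption made for the sketch; the patent only fixes the number selected to N:

```python
def select_and_score(particles, weights, n):
    """Select the N highest-weight particles (the particle-filter step
    S140) and compute reliability C as the sum of their weights
    (Equation 2, step S150)."""
    order = sorted(range(len(particles)), key=lambda i: weights[i], reverse=True)
    chosen = order[:n]
    selected = [particles[i] for i in chosen]
    reliability = sum(weights[i] for i in chosen)
    return selected, reliability
```

  Because the weights are counts of consistent observations, a large C means the surviving particles jointly explain the current image well, and a small C triggers re-initialization by global estimation.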
  • when the reliability C calculated in step S150 is equal to or greater than the threshold Th (S160-Y), sequential position estimation continues with the next image after the current image (S120 to S140).
  • the threshold Th may be set to a desired value, or may be set to be adaptively changed according to needs and circumstances.
  • while sequential position estimation continues, camera motion particles are regenerated from the camera motion particles selected in step S140 (S120), weights are recalculated for the regenerated camera motion particles (S130), and, based on the recalculated weights, only some of the regenerated camera motion particles are selected again (S140).
  • when the reliability C calculated in step S150 is less than the threshold Th (S160-N), processing of the next image after the current image starts over from step S110. That is, the position of the camera is estimated again by global position estimation, and sequential position estimation is then performed. This can be seen as initializing the position estimate and starting a new one.
  • FIG. 3 illustrates a camera position estimation result according to the method illustrated in FIG. 1.
  • the red line is the actual moving path of the camera
  • the blue line is the camera position estimation path.
  • the dark spots shown in FIG. 3 mark positions of low reliability, at which estimation was restarted from the global position estimation.
  • the robot system 200 includes a camera 210, a communication unit 220, a processor 230, a driver 240, and a storage unit 250.
  • the camera 210 is a means for generating images through photographing; it may be implemented as a camera with six degrees of freedom, and there is no limitation on its type or mounting location.
  • the driver 240 is a means for performing the movement of the robot system 200 and other functions.
  • the processor 230 estimates the position of the camera 210, that is, the position of the robot system 200, by executing the position estimation algorithm illustrated in FIG. 1 on the images generated by the camera 210.
  • the communication unit 220 is a means for wireless communication with the outside; it may transmit images captured by the camera 210, position information estimated by the processor 230, and other information to the outside.
  • the storage unit 250 stores the images generated by the camera 210, the position information estimated by the processor 230, and the like; the position estimation algorithm illustrated in FIG. 1 is also stored in it as a program.
  • the robot system 200 shown in FIG. 4 is just an example to which the position estimation method shown in FIG. 1 is applicable.
  • the position estimation method shown in FIG. 1 may be applied to other movable systems (eg, vehicles) in addition to the robot system 200 shown in FIG. 4.

Abstract

Disclosed are a method and a mobile system for estimating a camera location through particle generation and selection. The camera location estimation method according to an embodiment of the present invention estimates a location by adaptively using a global location estimation technique and a sequential location estimation technique based on reliability. Consequently, the accuracy of the current location can be guaranteed without appreciably reducing the estimation speed. Global location estimation is performed again when the reliability of the sequential location estimation drops, and thus the error-accumulation problem that can occur during sequential location estimation can be resolved.
PCT/KR2013/012102 2013-12-03 2013-12-24 Method and mobile system for estimating a camera location through particle generation and selection WO2015083875A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20130148932A KR101483549B1 (ko) 2013-12-03 2013-12-03 Camera position estimation method through particle generation and selection, and mobile system
KR10-2013-0148932 2013-12-03

Publications (1)

Publication Number Publication Date
WO2015083875A1 true WO2015083875A1 (fr) 2015-06-11

Family

ID=52590700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/012102 WO2015083875A1 (fr) 2013-12-03 2013-12-24 Method and mobile system for estimating a camera location through particle generation and selection

Country Status (2)

Country Link
KR (1) KR101483549B1 (fr)
WO (1) WO2015083875A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040167670A1 (en) * 2002-12-17 2004-08-26 Goncalves Luis Filipe Domingues Systems and methods for computing a relative pose for global localization in a visual simultaneous localization and mapping system
US20080065267A1 (en) * 2006-09-13 2008-03-13 Samsung Electronics Co., Ltd. Method, medium, and system estimating pose of mobile robots
US20100152945A1 (en) * 2008-12-17 2010-06-17 Samsung Electronics Co., Ltd. Apparatus and method of localization of mobile robot
WO2012093799A2 (fr) * 2011-01-04 2012-07-12 Chon Young-Ill Full-broadband health and safety management system, communication equipment using same, special site independent of a ubiquitous network, and method therefor
WO2012124852A1 (fr) * 2011-03-14 2012-09-20 (주)아이티엑스시큐리티 Stereo camera device capable of tracking the path of an object in a monitored area, and monitoring system and method using same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100966875B1 (ko) * 2006-09-26 2010-06-29 삼성전자주식회사 Method for determining the position of a robot using omnidirectional images

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110998472A (zh) * 2017-08-03 2020-04-10 日本电产新宝株式会社 Moving body and computer program
CN109323697A (zh) * 2018-11-13 2019-02-12 大连理工大学 Method for rapid particle convergence when an indoor robot starts from an arbitrary point
CN109323697B (zh) * 2018-11-13 2022-02-15 大连理工大学 Method for rapid particle convergence when an indoor robot starts from an arbitrary point
CN112767476A (zh) * 2020-12-08 2021-05-07 中国科学院深圳先进技术研究院 Rapid positioning system, method, and application
CN112767476B (zh) * 2020-12-08 2024-04-26 中国科学院深圳先进技术研究院 Rapid positioning system, method, and application

Also Published As

Publication number Publication date
KR101483549B1 (ko) 2015-01-16

Similar Documents

Publication Publication Date Title
CN107747941B Binocular vision positioning method, apparatus, and system
WO2017077925A1 Method and system for three-dimensional sensor pose estimation
US9129397B2 Human tracking method and apparatus using color histogram
WO2010005251A9 Multiple-object monitoring method, device, and storage medium
CN104778690A Multi-target positioning method based on a camera network
WO2022059955A1 Posture determination system based on radar point clouds
WO2015083875A1 Method and mobile system for estimating a camera location through particle generation and selection
CN110243390B Pose determination method and apparatus, and odometer
CN105374049B Multi-corner-point tracking method and apparatus based on sparse optical flow
WO2017099510A1 Method for segmenting a static scene on the basis of statistical image information, and related method
CN113012224B Positioning initialization method and related apparatus, device, and storage medium
CN109118532A Visual depth-of-field estimation method, apparatus, device, and storage medium
WO2020014864A1 Pose determination method and device, and computer-readable storage medium
CN111273701A Gimbal vision control system and control method
CN111815679B Trajectory prediction method for a space target during feature-point loss, based on a binocular camera
JP2014238409A Distance calculation device and distance calculation method
CN112955712A Target tracking method, device, and storage medium
CN106683113A Feature point tracking method and apparatus
WO2011162309A1 Object region extraction device, method, and program
CN113936042B Target tracking method and apparatus, and computer-readable storage medium
CN112802112B Visual positioning method, apparatus, server, and storage medium
WO2018131729A1 Method and system for detecting a moving object in an image using a single camera
CN114740854A Robot obstacle-avoidance control method and apparatus
CN110705334A Target tracking method, apparatus, device, and medium
CN115115530A Image deblurring method, apparatus, terminal device, and medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13898583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13898583

Country of ref document: EP

Kind code of ref document: A1