WO2012150150A1 - Method for the computer-assisted estimation of the position of an object - Google Patents

Method for the computer-assisted estimation of the position of an object

Info

Publication number
WO2012150150A1
Authority
WO
WIPO (PCT)
Prior art keywords
pose
image
determined
estimated
features
Prior art date
Application number
PCT/EP2012/057458
Other languages
German (de)
English (en)
Inventor
Wendelin Feiten
Thilo Grundmann
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO2012150150A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Definitions

  • The invention relates to a method for the computer-assisted estimation of the pose of an object.
  • In many technical applications, a machine-accurate localization of an object is required.
  • For example, a corresponding robot accurately manipulates its environment, e.g. to process or move certain objects.
  • A pose is to be understood as the combination of the position and orientation of an object. In the context of three-dimensional localization, a pose comprises a 3D position and a 3D orientation, and thus a total of six dimensions.
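  • The six pose dimensions can be represented, for example, as a single vector combining a 3D position with three orientation angles. The following sketch illustrates this layout (the Euler-angle convention and all numeric values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

# A 6-dimensional pose vector: 3D position plus 3D orientation.
# The roll/pitch/yaw convention is an assumption for illustration only.
pose = np.array([0.5, -0.2, 1.0,        # x, y, z position in metres
                 0.0, np.pi / 4, 0.1])  # roll, pitch, yaw in radians

position, orientation = pose[:3], pose[3:]
```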
  • Known methods for pose estimation evaluate images recorded with a corresponding camera system in order to identify the objects contained therein and to determine their pose. While such methods provide accurate results, they do not specify the uncertainty associated with the determined object pose. This information can, however, be helpful in certain applications: for example, if the uncertainty of the estimate is too high, the estimated pose can be discarded, or further data can be acquired in order to estimate the pose more accurately.
  • The object of the invention is therefore to provide a method for the computer-assisted pose estimation of an object which, in addition to an exact object pose, also specifies a measure of the uncertainty of the estimated object pose.
  • This object is achieved by the independent claims. Further developments of the invention are defined in the dependent claims.
  • The method according to the invention for the computer-aided pose estimation of an object uses a model database which contains a plurality of features of the object, each feature being assigned a local position on the object and a first covariance matrix for that local position.
  • The model database is regarded as given, and its generation is not part of the method according to the invention. Generating corresponding model databases for estimating the pose of an object is known from the prior art.
  • In a step a), features of the object which are extracted from one or more camera images of the object recorded with a camera system are compared with features of the object from the model database, whereby a hypothesis for an object pose in a reference coordinate system is obtained.
  • Methods for carrying out step a) are described in the prior art. In particular, such a method can be found in reference [1]. The entire disclosure content of this document is incorporated by reference into the content of the present application.
  • In a step b) of the method according to the invention, an error function depending on the object pose as a variable is minimized based on a sensor model of the camera system and the hypothesis for the object pose, whereby an estimated object pose is obtained.
  • The sensor model describes the association between a local position on the object (i.e. a position in a coordinate system fixed to the object) and an image position in the corresponding camera image, as a function of the object pose.
  • the sensor model defines a second covariance matrix for each image position in the respective camera image.
  • The error function describes an error measure between the image positions, determined via the sensor model, of the features of the object contained in the respective camera image and the image positions of those features measured with the camera system.
  • In a step c), a function which is implicitly defined by the minimization of the error function and which describes the dependence of the estimated object pose on the measured image positions of the features of the object is considered. A Jacobian matrix for this implicit function is determined using implicit differentiation, and from this, using the first and second covariance matrices, the probability distribution of the estimated object pose is determined.
  • In this way, a probability distribution for the estimated pose can be derived from the minimization of the error function using the known method of implicit differentiation.
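  • As a minimal numerical sketch of this step, consider a linear least-squares error function f_err(theta, z) = ||A·theta − z||² as a stand-in for the patent's error function. The minimiser theta*(z) is defined implicitly by the stationarity condition, and the implicit function theorem gives its Jacobian J = −H⁻¹B with H the Hessian in theta and B the mixed second derivative, through which the measurement covariance is propagated (the matrix A, its values, and Sigma_z are illustrative assumptions):

```python
import numpy as np

# Implicit differentiation for f_err(theta, z) = ||A @ theta - z||^2.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

H = 2.0 * A.T @ A           # Hessian of f_err w.r.t. theta
B = -2.0 * A.T              # mixed second derivative w.r.t. theta and z
J = -np.linalg.solve(H, B)  # Jacobian of the implicit minimiser theta*(z)

# Propagate the measurement covariance through the Jacobian.
Sigma_z = 0.01 * np.eye(3)       # (assumed) covariance of the measurements
Sigma_theta = J @ Sigma_z @ J.T  # covariance of the estimated parameters

# For this linear toy problem, J equals the pseudo-inverse of A.
assert np.allclose(J, np.linalg.pinv(A))
```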
  • The method according to the invention thus provides not only a highly accurate estimate of the object pose, but also, as additional information, a measure of the uncertainty of the estimated object pose in the form of a probability distribution.
  • In a preferred embodiment, the error function to be minimized is a sum of distance measures, a respective distance measure describing the difference between an image position, determined via the sensor model, of a respective feature of the object contained in the corresponding camera image and the image position of that feature measured via the camera system.
  • In a further embodiment, a third covariance matrix is determined from the first and second covariance matrices for each measured image position of a feature contained in the corresponding camera image.
  • This third covariance matrix is included in the error function.
  • In particular, the error function is a weighted sum of the distance measures, a distance measure depending on the third covariance matrix of its associated measured image position in such a way that the distance measure is weighted less, the larger the scattering described by the third covariance matrix is.
  • In a further variant, a covariance matrix for the probability distribution of the estimated object pose is determined from the third covariance matrices and the Jacobian matrix by means of matrix multiplication.
  • a block diagonal matrix is formed from all third covariance matrices and this matrix is multiplied by the Jacobian matrix and the transposed Jacobian matrix.
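  • This covariance assembly can be sketched as follows (the feature count, the individual 2x2 covariance values, and the randomly generated Jacobian are purely illustrative; only the shapes and the multiplication pattern matter):

```python
import numpy as np
from scipy.linalg import block_diag

# Each measured 2D image position k contributes a 2x2 "third" covariance
# matrix; stacking them block-diagonally and sandwiching the result between
# the Jacobian J and its transpose yields the 6x6 pose covariance.
rng = np.random.default_rng(0)
n_features = 4

third_covs = [np.eye(2) * (0.5 + 0.1 * k) for k in range(n_features)]
Sigma_z = block_diag(*third_covs)             # (2n x 2n) block-diagonal

J = rng.standard_normal((6, 2 * n_features))  # 6D pose, 2n measurements
Sigma_pose = J @ Sigma_z @ J.T                # 6x6 pose covariance
```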
  • the probability distribution of the estimated object pose is described by a Gaussian distribution with the estimated object pose as mean value.
  • In this case, the matrix described above, which was determined from the third covariance matrices and the Jacobian matrix by matrix multiplication, is used as the covariance matrix.
  • In a further embodiment, a sensor model based on a pinhole camera model is used.
  • The pinhole camera model for describing optical imaging is well known in the prior art and is described again in the detailed description.
  • This model is non-linear.
  • In this case, a linearization of the pinhole camera model is used to determine the third covariance matrix. The detailed description explains how such a linearization can be performed to determine the third covariance matrix.
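  • One common way to perform such a linearization is to take the Jacobian of the pinhole projection at the current 3D point and propagate the 3D position covariance through it. The sketch below uses the standard pinhole projection z = (f·x1/x3, f·x2/x3); the focal length, point coordinates, and covariance values are illustrative assumptions:

```python
import numpy as np

def project(x, f=500.0):
    # Standard pinhole projection of a camera-frame point (x1, x2, x3).
    return f * x[:2] / x[2]

def jacobian(x, f=500.0):
    # Analytic 2x3 Jacobian of the pinhole projection.
    x1, x2, x3 = x
    return np.array([[f / x3, 0.0,    -f * x1 / x3**2],
                     [0.0,    f / x3, -f * x2 / x3**2]])

x = np.array([0.1, -0.05, 2.0])
J_h = jacobian(x)

# Propagate an (assumed) first covariance of the local 3D position to a
# 2x2 image-position covariance via the linearized model.
Sigma_3d = np.diag([1e-4, 1e-4, 4e-4])
Sigma_img = J_h @ Sigma_3d @ J_h.T

# Sanity check against central finite differences.
eps = 1e-6
J_num = np.column_stack([(project(x + eps * e) - project(x - eps * e)) / (2 * eps)
                         for e in np.eye(3)])
assert np.allclose(J_h, J_num, atol=1e-5)
```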
  • Preferably, SIFT (scale-invariant feature transform) features are used as the features of the object.
  • In a further embodiment of the method according to the invention, object poses are drawn by sampling from the probability distribution of the estimated object pose determined in step c), and a histogram of the drawn object poses which satisfy a predetermined threshold criterion is determined, whereby a new probability distribution is obtained.
  • In this case, the threshold criterion depends on the value of the error function for the drawn object pose.
  • In particular, the threshold criterion is defined such that a drawn object pose satisfies the threshold criterion when the exponential function, with -1 times the value of the error function for the drawn object pose as its exponent, is greater than a predetermined threshold value.
  • the appropriate choice of the predetermined threshold lies within the scope of expert action.
  • In a further variant, an estimated object pose is rejected if the probability distribution determined in step c) and/or the new probability distribution obtained by sampling has a dispersion which is greater than a predetermined threshold value.
  • the appropriate choice of the predetermined threshold is within the scope of expert action.
  • The invention further relates to a device for the computer-aided pose estimation of an object, a model database being stored in the device which contains a plurality of features of the object, each feature being assigned a local position on the object and a first covariance matrix for that local position. The device comprises a camera system for recording camera images of the object as well as a computing unit, the computing unit being designed such that the method according to the invention, or one or more variants of it, can be carried out with this computing unit.
  • the invention further relates to a robot comprising the device according to the invention, wherein the robot, in operation, performs its movements using the object poses estimated by the device.
  • the invention further relates to a computer program product with a program code stored on a machine-readable carrier for carrying out the method according to the invention or one or more variants of the method according to the invention, when the program runs on a computer.
  • FIG. 2 is a flow chart depicting determination of an object position hypothesis based on an embodiment of the invention
  • FIG. 3 shows a schematic illustration which illustrates the estimation of an object position based on the minimization of an error function on the basis of an embodiment of the method according to the invention
  • Fig. 4 is a schematic representation which shows a probability distribution determined with an embodiment of the invention and a further probability distribution which is determined by sampling from the original probability distribution.
  • The pose estimation can be used, in particular, in a corresponding computing unit of a robot, in order to specify as exact a pose as possible for the objects which are then to be processed or handled by the robot as part of the execution of a task.
  • a probability distribution is also determined which takes into account the uncertainty of the estimation of the object pose.
  • If the method is used in a robot, then, for example in the case of a large scattering of the probability distribution, the use of the object pose as part of the action performed by the robot can be dispensed with, or the recording of additional 3D camera images can be initiated in order to determine a more accurate estimate of the object pose.
  • The method according to the invention is based on a so-called model-based object recognition method, with which the object pose, i.e. the position and orientation in space, of one or more objects to be recognized whose models are contained in a corresponding database can be estimated.
  • SIFT features have long been known from the prior art and are not explained in detail here. These features were determined for informative local positions on the object and are stored in the database. Methods for determining SIFT features of objects are known and are not the subject of the invention. Rather, the invention assumes that a corresponding model database already exists which contains a large number of features of at least one object whose pose is to be estimated.
  • One way to determine the SIFT features of an object is to place the object on a rotating disc and record it via a stereo camera system from a large number of different angles.
  • Known software can then be used to determine corresponding SIFT features in combination with associated local positions on the object (i.e. positions defined in a local coordinate system of the object).
  • FIG. 1 again illustrates the determination of the SIFT features on the basis of an object in the form of a juice bag.
  • The object O is reproduced as a point cloud in the left-hand part of Fig. 1. From this object, SIFT features are determined for a plurality of local positions x, which is indicated by the arrow P.
  • the object reproduced in the right-hand part of FIG. 1 indicates the positions at which SIFT features are determined on the object surface. It can be seen that there are more SIFT features in certain areas than in others.
  • A SIFT feature m is indicated by way of example; in the following, the SIFT features are denoted with the index k.
  • The 3D position x is determined with the per se known Bundler software (see reference [2]), and the corresponding covariance matrix is determined using the covariance of the minimum distance between the calculated 3D position of the SIFT feature and the corresponding lines of sight.
  • the descriptor d is the average of all v descriptors that contribute to the corresponding 3D point of the feature.
  • The list of lines of sight x_v consists of normalized vectors from the 3D positions of the corresponding features to the v camera poses taken into account in the determination of each feature and, together with the averaged scale, represents the range of viewing directions from which a corresponding SIFT feature can be detected.
  • The model uses only 3D positions with v > 5 lines of sight.
  • Based on a database of SIFT features of the object to be localized, an estimate of the pose of the object is then made via a pair of images from a 3D stereo camera system, the camera system recording the corresponding object.
  • the object localization takes place in several steps, which are illustrated in FIG.
  • In a first step S1, the SIFT features and their image positions z_kj are first extracted from the images of the two cameras j ∈ [L, R], the image positions being 2D locations in the form of pixel positions in the respective camera images.
  • In a second step S2, the 3D positions of the extracted features are determined by triangulation and stored together with the corresponding positions z_kj from the two images. Then, in step S3, it is determined which local descriptors d of the extracted features match well with descriptors from the database. The resulting SIFT features k from the database are then clustered into sets which are concentrated spatially and with respect to the corresponding object types; the clustering takes place in step S4. From this, finally, in step S5, a hypothesis for the object pose of the object recorded by the camera system is determined.
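  • The descriptor-matching step S3 is commonly implemented as a nearest-neighbour search over descriptor vectors; the sketch below uses Lowe's ratio test as the acceptance criterion (an illustrative choice, with synthetic 128-dimensional descriptors standing in for real SIFT data):

```python
import numpy as np

rng = np.random.default_rng(1)
db_desc = rng.standard_normal((50, 128))   # model-database descriptors
# Synthetic "image" descriptors: slightly perturbed copies of db rows.
img_desc = db_desc[[3, 17, 42]] + 0.01 * rng.standard_normal((3, 128))

matches = []
for i, d in enumerate(img_desc):
    dists = np.linalg.norm(db_desc - d, axis=1)
    best, second = np.argsort(dists)[:2]
    if dists[best] < 0.8 * dists[second]:  # Lowe's ratio test
        matches.append((i, int(best)))
```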
  • the method for object localization just described is known per se from the prior art and is described in detail in the document [1].
  • From the camera coordinates x_k^C = (x1, x2, x3)^T, the projected image coordinates z_kj in the corresponding camera image j ∈ [L, R] can be determined with a per se known standard pinhole camera model h_j as follows:

    z_kj = h_j(x_k^C) = (f / x3) · (x1, x2)^T   (2)
  • Here f denotes the focal length of the camera, known for the corresponding camera system.
  • T_r is the per se known homogeneous transformation from the local 3-dimensional position on the object into the camera coordinates of the left and right images of the 3D camera system. This homogeneous transformation depends on the object pose in a stationary world coordinate system.
  • The set z = (z_1, ..., z_K) denotes the totality of the image positions of the features measured with the camera system.
  • J_h denotes the Jacobian matrix (also called the function matrix or derivative matrix) of the perspective projection according to equation (2) with respect to the local 3D positions x_k.
  • d_k denotes the distance between an image position determined via the pinhole camera model according to equation (2) and the corresponding image position measured with the camera system.
  • Based on this, the above error function f_err is defined, which is a sum of the squares of the distances, weighted by the values of the (third) covariance matrices.
  • The minimization is solved numerically by methods known per se, e.g. the gradient descent method.
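  • The following sketch minimizes a covariance-weighted error function numerically with a quasi-Newton method (a stand-in for the gradient descent mentioned above); the 2D "pose", the identity sensor model, and all values are simplifying assumptions, chosen so that the minimiser has a closed form to check against:

```python
import numpy as np
from scipy.optimize import minimize

z_meas = [np.array([1.0, 2.0]), np.array([1.4, 1.6])]  # measured positions
covs = [np.diag([0.5, 0.3]), np.diag([0.1, 0.9])]      # "third" covariances
weights = [np.linalg.inv(c) for c in covs]

def f_err(theta):
    # Sum of squared distances weighted by inverse covariances, so that
    # more uncertain measurements contribute less to the error.
    return sum((theta - z) @ W @ (theta - z) for z, W in zip(z_meas, weights))

res = minimize(f_err, x0=np.zeros(2), method="BFGS")

# For this toy problem, the minimiser is the covariance-weighted mean.
W_sum = sum(weights)
expected = np.linalg.solve(W_sum, sum(W @ z for z, W in zip(z_meas, weights)))
assert np.allclose(res.x, expected, atol=1e-4)
```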
  • Fig. 3 again illustrates, in a schematic representation, the just described estimation of an object pose.
  • Fig. 3 shows an object O which has an object pose specified with respect to a stationary world coordinate system, designated WK in Fig. 3.
  • In Fig. 3, the camera system C, with which images of the object O are recorded in order to estimate its pose, is indicated schematically.
  • A corresponding SIFT feature m at a local position x of the object leads to a corresponding measured 2D position in the camera image B, which is only partially reproduced.
  • Via the sensor model, a 2D image position can also be calculated from the local position x.
  • This calculated position is designated z in FIG.
  • The distances d between these positions are now calculated for all SIFT features of the object in image B (see equation (6)), and from these, by means of the error function defined above in equation (7), the minimization problem indicated in Fig. 3 is solved.
  • The solution of the minimization problem defines the implicit function f(z).
  • This probability distribution is an essential result of the invention, because it also states how large the spread for an estimated object pose is. If the covariance matrix of the above probability distribution is very large, the estimated pose can, for example, be discarded, or further recordings can be initiated by the camera system in order to obtain a better estimate of the object pose.
  • FIG. 4 shows, by way of example, a diagram DI which, in the left-hand column C1 for the six coordinates of an object pose, represents the probability distribution p determined in accordance with equation (13) in the context of an embodiment of the invention.
  • A sampling is also carried out with this probability distribution p, in which a new, improved probability distribution for the object pose is determined by drawing samples based on the function p.
  • In this sampling, a drawn object pose is only included in a histogram, in accordance with a threshold criterion, if the value e^(-f_err) is greater than a suitably defined threshold value.
  • The probability distribution obtained on the basis of this histogram represents the new probability distribution.
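  • The sampling refinement described above can be sketched as follows: draw pose samples from the Gaussian estimate, keep only those whose weight e^(-f_err) exceeds the threshold, and histogram the survivors (a 2D toy with a quadratic stand-in for f_err; the threshold value is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
mean, cov = np.array([1.0, 2.0]), np.diag([0.04, 0.09])
cov_inv = np.linalg.inv(cov)

def f_err(theta):
    # Quadratic stand-in for the error function at a drawn pose.
    d = theta - mean
    return d @ cov_inv @ d

samples = rng.multivariate_normal(mean, cov, size=2000)
weights = np.exp(-np.array([f_err(s) for s in samples]))
kept = samples[weights > 0.1]          # threshold criterion e^(-f_err) > t

# The histogram of surviving samples approximates the new distribution.
hist, edges = np.histogram(kept[:, 0], bins=20)
```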

Abstract

The invention relates to a method for the computer-assisted estimation of the pose of an object, which uses a model database comprising features (m) of the object (O), a local position (x) on the object being associated with each feature (m). A hypothesis for an object pose is determined on the basis of camera images (B) of the object (O). A better estimate of the object pose is obtained on the basis of a sensor model of the camera system and the hypothesis for the object pose by minimizing an error function (f_err). Finally, a probability distribution (p) of the estimated object pose is determined by means of a function (f) which is implicitly defined by the minimization of the error function (f_err). An accurate estimate of an object pose is thus obtained together with a measure of the uncertainty of this estimate in the form of a probability distribution. The latter can be used to reject estimated object poses when the dispersion of the probability distribution is too great.
PCT/EP2012/057458 2011-05-05 2012-04-24 Procédé d'estimation assistée par ordinateur de la position d'un objet WO2012150150A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011075335.4 2011-05-05
DE102011075335A DE102011075335A1 (de) 2011-05-05 2011-05-05 Verfahren zur rechnergestützten Lageschätzung eines Objekts

Publications (1)

Publication Number Publication Date
WO2012150150A1 true WO2012150150A1 (fr) 2012-11-08

Family

ID=46027929

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/057458 WO2012150150A1 (fr) 2011-05-05 2012-04-24 Procédé d'estimation assistée par ordinateur de la position d'un objet

Country Status (2)

Country Link
DE (1) DE102011075335A1 (fr)
WO (1) WO2012150150A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104517291A (zh) * 2014-12-15 2015-04-15 大连理工大学 基于目标同轴圆特征的位姿测量方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014005181A1 (de) 2014-04-03 2015-10-08 Astrium Gmbh Positions- und Lagebestimmung von Objekten

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
GRUNDMANN T ET AL: "A probabilistic measurement model for local interest point based 6 DOF pose estimation", INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2010 IEEE/RSJ INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 18 October 2010 (2010-10-18), pages 4572 - 4577, XP031920199, ISBN: 978-1-4244-6674-0, DOI: 10.1109/IROS.2010.5649799 *
N. SNAVELY; S.M. SEITZ; R. SZELISKI: "Photo tourism: exploring photo collections in 3d", ACM TRANS. GRAPH., vol. 25, no. 3, 2006, pages 835 - 846
T. GRUNDMANN; R. EIDENBERGER; M. SCHNEIDER; M. FIEGERT; G. WICHERT: "Robust high precision 6d pose determination in complex environments for robotic manipulation", ICRA 2010 WORKSHOP: BEST PRACTICE IN 3D PERCEPTION AND MODELLING FOR MOBILE MANIPULATION, 2010
YOUNGROCK YOON ET AL: "A New Kalman-Filter-Based Framework for Fast and Accurate Visual Tracking of Rigid Objects", IEEE TRANSACTIONS ON ROBOTICS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 24, no. 5, 1 October 2008 (2008-10-01), pages 1238 - 1251, XP011332769, ISSN: 1552-3098, DOI: 10.1109/TRO.2008.2003281 *


Also Published As

Publication number Publication date
DE102011075335A1 (de) 2012-11-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12718943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12718943

Country of ref document: EP

Kind code of ref document: A1