WO2020061725A1 - Method and system for detecting and tracking objects in a workspace - Google Patents


Info

Publication number
WO2020061725A1
WO2020061725A1 (PCT/CN2018/107190)
Authority
WO
WIPO (PCT)
Prior art keywords: workspace, new object, objects, existing, obtaining
Prior art date
Application number
PCT/CN2018/107190
Other languages
English (en)
Inventor
Xiaoyu Ge
Aditya ARDIYA
Jianfeng YANG
Original Assignee
Shenzhen Dorabot Robotics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dorabot Robotics Co., Ltd.
Priority to PCT/CN2018/107190
Publication of WO2020061725A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features
    • G06V 10/62 - Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • G06V 2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/06 - Recognition of objects for industrial automation

Definitions

  • This invention relates to the storage and organization of multiple physical objects in a storage space, and in particular to the detection and tracking of those objects as they are moved into or organized within the workspace.
  • A logistics robot is usually equipped with one or more grippers to pick up and move items during a logistics operation in a workspace such as a warehouse or a sorting center.
  • The logistics robot needs to handle a wide array of different parts in a virtually infinite number of combinations; for this reason it is important for the logistics robot to see, move, and react to its environment, for example the items in the workspace.
  • a controller of the logistics operation (e.g. a central server or an administrator)
  • the present invention in one aspect, is a method of detecting and tracking objects in a workspace.
  • the method contains the steps of obtaining first characteristics of existing objects in the workspace and a new object, before the new object is placed in the workspace; estimating second characteristics of all objects including the existing objects and the new object in the workspace after the new object is placed in the workspace; and associating each one of the first characteristics to a corresponding one of the second characteristics.
  • each one of the first characteristics corresponds to one of the existing objects and the new object
  • each one of the second characteristics also corresponds to one of the existing objects and the new object.
  • each one of the first characteristics or each one of the second characteristics contains a pose of one said existing object or the new object.
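The three claimed steps (obtain first characteristics, estimate second characteristics after placement, associate the two sets) can be sketched as follows. This is only an illustrative outline; the function and parameter names are hypothetical, not taken from the patent:

```python
def detect_and_track(existing_first, new_first, estimate_second, associate):
    """Sketch of the claimed method: gather first characteristics, estimate
    second characteristics after placement, then associate the two sets."""
    # First characteristics: existing objects observed in the workspace,
    # plus the new object's precomputed characteristic (placement plan).
    first = existing_first + [new_first]
    # Second characteristics: estimated from observations made after the
    # new object has been placed in the workspace.
    second = estimate_second()
    # Associate each first characteristic with a corresponding second one.
    return associate(first, second)
```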
  • the step of obtaining the first characteristics further includes observing spatial features of the workspace before the new object is placed therein.
  • the step of obtaining the first characteristics further includes obtaining the first characteristic of the new object based on a precomputed placement plan.
  • the step of estimating the second characteristics further contains the steps of observing spatial features of the workspace after the new object is placed therein, segmenting the spatial features observed, and generating the second characteristics.
  • the spatial features include point clouds of the workspace.
  • the segmenting step is based on a region growing method.
  • the step of obtaining the surfaces is based on the RANdom SAmple Consensus (RANSAC) algorithm.
  • the associating step further contains the steps of determining an amount of effort required for a spatial change between the first characteristics and the second characteristics; optimizing the amount of effort; and linking each one of the first characteristics to a corresponding one of the second characteristics based on the results of the optimizing step.
  • the step of optimizing the amount of effort comprises minimizing the amount of effort.
  • the amount of effort contains a vector representing a displacement of one existing object or the new object, and a minimum translation vector thereof.
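Minimising the total effort over all candidate links is a classic assignment problem. In the sketch below, plain Euclidean displacement stands in for the patent's effort terms (displacement vector and minimum translation vector), and the matching is brute-forced over all permutations to keep the example dependency-free; a production system would use a polynomial-time matcher such as the Hungarian algorithm. All names are hypothetical:

```python
from itertools import permutations
import math

def associate_boxes(poses_before, poses_after):
    """Link each pre-placement pose to a post-placement pose by minimising
    the total 'effort' of the spatial change (Euclidean displacement here)."""
    n = len(poses_before)

    def effort(i, j):
        # Effort to explain box i ending up at observed pose j.
        return math.dist(poses_before[i], poses_after[j])

    best = min(permutations(range(n)),
               key=lambda perm: sum(effort(i, perm[i]) for i in range(n)))
    # Map each first characteristic to the index of its second characteristic.
    return {i: best[i] for i in range(n)}
```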
  • a system for detecting and tracking objects in a workspace contains a sensor adapted to detect spatial features in a workspace; and a controller.
  • the controller is adapted to perform the method as described above.
  • the algorithm can detect poses and identities of items based on the prior observations, and solve the data association of items based on the physical constraints imposed by the prior observations. This is beneficial for tracking the items through substantial spatial changes in the workspace.
  • the method provided by the invention makes realistic assumptions about the underlying domain and can work with raw point clouds, which makes it ready to be used as a perception module in most logistics loading systems.
  • Fig. 1 is a perspective view of a type of logistics loading system in conventional art.
  • Fig. 3 shows the workspace of a logistics system according to an embodiment of the invention, where a new object is to be placed in the workspace.
  • Fig. 4 shows the workspace in Fig. 3 after the new object has been placed in the workspace.
  • the controller may determine information about the environment from the sensors 106, 108 and in turn control the robotic arm 102 to pick and move boxes efficiently.
  • the controller may be on-board with the robotic arm 102, or a separate module fixed in the environment and communicating with the robotic arm via wired or wireless connection.
  • the robotic arm 102 can be mounted on a movable platform to allow the robotic arm 102 to move to different locations in the workspace. Examples of the movable platform include a conveying belt, and a moving cart with wheels.
  • The method in Fig. 2 is applicable to the exemplary workspace shown in Figs. 3-4, which will be used to elaborate the principles of the method.
  • Fig. 3 shows a number of existing boxes 20 stored in a workspace 22, and that a new box 24 is going to be placed into the workspace 22.
  • Both the existing boxes 20 and the new box 24 are physical objects that can be sensed by the controller 28 via the one or more sensors 26.
  • the method starts with Step 30 in Fig. 2, in which first characteristics of the existing boxes 20 in the workspace 22 and the new box 24 are obtained.
  • the first characteristics refer to the initial information of all existing boxes 20 in the workspace 22 before the new box 24 is placed in the workspace 22, as well as precomputed information of the new box 24.
  • the first characteristics B_before are defined as:
  • B_existing is the initial information of all existing boxes 20 in the workspace 22 before the new box 24 is placed in the workspace 22.
  • B_existing is obtained by the controller 28 in advance of the placement of the new box 24, for example by using a similar method as described herein at an earlier time.
  • the assembly B is represented by a collection of (p, id) pairs, where:
  • p is the pose of a box (which could be either an existing box 20 or the new box 24), and in particular a 6-tuple representing the location and orientation of the box.
  • id is a box identity which is a unique identifier from which detailed information (e.g., weight, load-bearing capacity) about a particular box can be obtained.
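A minimal sketch of one (p, id) entry in the assembly B, assuming a Python representation. The field names are illustrative; the patent only specifies a 6-tuple pose and a unique identifier:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Box:
    """One entry of the assembly B: a pose p and an identity id."""
    pose: Tuple[float, float, float, float, float, float]  # (x, y, z, roll, pitch, yaw)
    id: str  # unique identifier keyed to details such as weight, load capacity

# B_before is then simply a collection of such (p, id) entries:
b_before = [
    Box(pose=(0.0, 0.0, 0.0, 0.0, 0.0, 0.0), id="box-001"),
    Box(pose=(0.5, 0.0, 0.0, 0.0, 0.0, 0.0), id="box-002"),
]
```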
  • the pose of the new box 24 is given by a precomputed placement plan.
  • in Step 36, a region growing method is used to segment the point clouds of the observation O_after.
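Region growing clusters nearby points of a point cloud into segments. The sketch below is deliberately naive, growing regions purely by a distance threshold; real implementations also compare estimated surface normals and use a spatial index such as a k-d tree. All names and thresholds are illustrative, not from the patent:

```python
from collections import deque
import math

def region_grow(points, radius=0.2):
    """Naive region growing: any unvisited point within `radius` of a
    point already in the region joins that region (BFS expansion)."""
    unvisited = set(range(len(points)))
    segments = []
    while unvisited:
        seed = unvisited.pop()
        region, queue = [seed], deque([seed])
        while queue:
            i = queue.popleft()
            # Snapshot of neighbours still unassigned.
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= radius]
            for j in near:
                unvisited.discard(j)
                region.append(j)
                queue.append(j)
        segments.append(sorted(region))
    return segments
```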
  • in Step 38, surfaces are obtained from the segmentations resulting from Step 36 by using the RANdom SAmple Consensus (RANSAC) algorithm.
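RANSAC extracts a dominant plane by repeatedly sampling three points, forming the plane through them, and keeping the candidate with the most inliers. A stdlib-only sketch (iteration count, tolerance, and names are illustrative assumptions, not from the patent):

```python
import math
import random

def ransac_plane(points, iters=200, tol=0.05, seed=0):
    """Fit a dominant plane to a 3D point cloud with RANSAC and return
    the inlier points of the best plane found."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        u = [p2[k] - p1[k] for k in range(3)]
        v = [p3[k] - p1[k] for k in range(3)]
        # Cross product of the two edge vectors gives the plane normal.
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        norm = math.sqrt(sum(c * c for c in n))
        if norm < 1e-9:  # degenerate (collinear) sample, try again
            continue
        n = [c / norm for c in n]
        d = -sum(n[k] * p1[k] for k in range(3))
        # Inliers: points within `tol` of the candidate plane n·x + d = 0.
        inliers = [p for p in points
                   if abs(sum(n[k] * p[k] for k in range(3)) + d) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```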
  • in Step 40, the surfaces are combined to form the second characteristics B_after.
  • B_after has a similar data structure as the first characteristics B_before as defined in Eq. 2.
  • the pose of each box in B after is defined by the surfaces based on which the box is formed.
  • let h be the size of the vertical component (defined by the direction of gravity) of the displacement vector c, and d the size of the non-vertical component.
  • let s_j be the size of the minimum translation vector for a box j to move away from an intersection with another box; for a box i that intersects no other box, s_i = 0.
  • the components P, D, and S measure the amount of “effort” the underlying physical system needs to make to achieve the observed spatial change. The effort is analogous to the concept of energy in physics.
  • the method is able to link each one of the first characteristics to a corresponding one of the second characteristics.
  • each one of the existing boxes 20 as well as the new box 24 can be successfully tracked in the workspace 22.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manipulator (AREA)

Abstract

A method of detecting and tracking objects in a workspace is disclosed. The method comprises obtaining first characteristics of existing objects in the workspace and of a new object, before the new object is placed in the workspace; estimating second characteristics of all objects, including the existing objects and the new object, in the workspace after the new object is placed in the workspace; and associating each of the first characteristics with a corresponding one of the second characteristics. The method makes it possible to detect poses and identities of items on the basis of prior observations, and to solve the data association of items on the basis of the physical constraints imposed by the prior observations.
PCT/CN2018/107190 2018-09-25 2018-09-25 Method and system for detecting and tracking objects in a workspace WO2020061725A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/107190 WO2020061725A1 2018-09-25 2018-09-25 Method and system for detecting and tracking objects in a workspace

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/107190 WO2020061725A1 2018-09-25 2018-09-25 Method and system for detecting and tracking objects in a workspace

Publications (1)

Publication Number Publication Date
WO2020061725A1 2020-04-02

Family

ID=69949460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/107190 WO2020061725A1 (fr) 2018-09-25 2018-09-25 Procédé et système de détection et de suivi d'objets dans un espace de travail

Country Status (1)

Country Link
WO (1) WO2020061725A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102414696A (zh) * 2009-04-23 2012-04-11 皇家飞利浦电子股份有限公司 Object learning robot and method
WO2015194118A1 (fr) * 2014-06-16 2015-12-23 日本電気株式会社 Object management device, object management method, and recording medium storing an object management program
US20170220887A1 * 2016-01-29 2017-08-03 Pointivo, Inc. Systems and methods for extracting information about objects from scene information
US20170355078A1 * 2016-06-09 2017-12-14 Shmuel Ur Innovation Ltd. System, Method and Product for Utilizing Prediction Models of an Environment
CN108171748A (zh) * 2018-01-23 2018-06-15 哈工大机器人(合肥)国际创新研究院 Visual recognition and positioning method for robotic intelligent grasping applications


Similar Documents

Publication Publication Date Title
JP7145843B2 (ja) Training of robot manipulators
CN111693047B (zh) Visual navigation method for a micro UAV in highly dynamic scenes
US11836974B2 (en) Detecting boxes
US9393693B1 (en) Methods and systems for determining and modeling admissible gripper forces for robotic devices
US7966094B2 (en) Workpiece picking apparatus
JP5259286B2 (ja) Three-dimensional object recognition system and inventory system using the same
CN111602096A (zh) Multi-resolution scan matching with exclusion zones
DE102020104468A1 (de) Robot system with object identification and handling mechanism and method for its operation
US20010055063A1 (en) Position detection apparatus, position detection method and position detection program
JP2013217893A (ja) Model generation device, position and orientation estimation device, information processing device, model generation method, position and orientation estimation method, and information processing method
US10809739B2 (en) Method for controlling a robot and/or an autonomous driverless transport system
CN103582803A (zh) Method and apparatus for sharing map data associated with automated industrial vehicles
US20090310117A1 (en) Method for Detecting Objects With a Pivotable Sensor Device
US10852740B2 (en) Determining the orientation of flat reflectors during robot mapping
CN106647738A (zh) Docking path determination method and system for an automated guided vehicle, and automated guided vehicle
Hussein A review on vision-based control of flexible manipulators
US20220388175A1 (en) Object Association Using Machine Learning Models
WO2020137311A1 (fr) Positioning device and mobile object
Altuntaş et al. Comparison of 3-dimensional SLAM systems: RTAB-Map vs. Kintinuous
WO2020061725A1 (fr) Method and system for detecting and tracking objects in a workspace
Singh et al. Optimized 3D laser point cloud reconstruction by gradient descent technique
Giordano et al. 3D structure identification from image moments
Milstein et al. A method for fast encoder‐free mapping in unstructured environments
Sujan et al. Visually guided cooperative robot actions based on information quality
Cheng et al. A software architecture for low-resource autonomous mobile manipulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18935120

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18935120

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31.05.2022)
