WO2015121729A1 - Distance information acquisition method, distance information acquisition device, distance information acquisition program, and robot - Google Patents

Distance information acquisition method, distance information acquisition device, distance information acquisition program, and robot

Info

Publication number
WO2015121729A1
WO2015121729A1 (application PCT/IB2015/000123)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
distance information
vector
target
information acquisition
Prior art date
Application number
PCT/IB2015/000123
Other languages
English (en)
Inventor
Kiyohiro SOGEN
Original Assignee
Toyota Jidosha Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Jidosha Kabushiki Kaisha filed Critical Toyota Jidosha Kabushiki Kaisha
Publication of WO2015121729A1 publication Critical patent/WO2015121729A1/fr

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00: Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12: Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Definitions

  • the invention relates to a distance information acquisition method, a distance information acquisition device and a robot.
  • JP2002-342787A discloses a method in which three-dimensional information of a target is acquired by imaging the target with a multi-eye camera including individual cameras whose external parameters (a camera position and posture) and internal parameters (a focal length, a pixel pitch, etc.) are pre-calibrated, detecting corresponding points from a plurality of images thus obtained, acquiring distances to the corresponding points, in other words, distance information of the corresponding points, under the principle of triangulation, and performing a three-dimensional reconstruction.
  • With the multi-eye camera, it is sometimes the case that, when a target is captured and imaged with one of a plurality of cameras, the other cameras cannot image the target due to the existence of an obstacle or the like, making it impossible to acquire the distance information of the target. The same problem may occur when the distance information of the target is acquired using a camera and a distance sensor.
  • the invention provides a distance information acquisition method, a distance information acquisition device and a robot which are capable of reliably acquiring the distance information of a target.
  • A distance information acquisition method is a method for measuring a distance to a distance information acquisition target with a distance sensor, including: detecting a vector which indicates the target; measuring the distance by changing a measurement region of the distance sensor when distance information detected by the distance sensor does not exist at a point on the vector; detecting the distance information detected by the distance sensor at the point on the vector; and acquiring the distance information of the target based on the detected distance information.
  • A distance information acquisition device includes: a vector detecting unit configured to detect a vector which indicates a distance information acquisition target; a distance detecting unit configured to detect the distance information detected by a distance sensor at a point on the vector and to acquire the distance information of the target based on the detected distance information; and a sensor driving unit configured to drive the distance sensor, when the distance information detected by the distance sensor does not exist at the point on the vector, such that the distance sensor measures a distance by changing its measurement region.
  • A distance information acquisition program causes a computer to perform a process of: detecting a vector which indicates a distance information acquisition target; driving a distance sensor, when distance information detected by the distance sensor does not exist at a point on the vector, such that the distance sensor measures a distance by changing its measurement region; detecting the distance information detected by the distance sensor at the point on the vector; and acquiring the distance information of the target based on the detected distance information.
  • According to the invention, it is possible to provide a distance information acquisition method, a distance information acquisition device and a robot which are capable of reliably acquiring the distance information of a target.
  • FIG. 1A is a view showing a robot 11 provided therein with a distance information acquisition device according to a first embodiment and an operation environment thereof.
  • FIG. 1B is a view showing an HMI 21 according to the first embodiment.
  • FIG. 2A is a view showing the positional relationship between a visual field 14 and a measurement region 15 when the robot 11 according to the first embodiment can acquire the distance information.
  • FIG. 2B is a view showing the positional relationship between the visual field 14 and the measurement region 15 when the robot 11 according to the first embodiment cannot acquire the distance information.
  • FIG. 2C is a view showing the positional relationship between the visual field 14 and the measurement region 15 when the robot 11 according to the first embodiment has changed the measurement region 15.
  • FIG. 3 is a block diagram showing a schematic configuration of the robot 11 provided with a distance information acquisition device 100 according to the first embodiment.
  • FIG. 4 is a flowchart showing a routine of a distance information acquisition method implemented by the distance information acquisition device 100 according to the first embodiment.
  • FIG. 5 is a view showing an indication vector 34 detected by a vector detecting unit 101 according to the first embodiment.
  • FIG. 6 is a view showing a method for detecting the distance information with a distance detecting unit 102 according to the first embodiment.
  • FIG. 7 is a view showing the position and direction of a distance sensor 13 decided by a sensor driving unit 103 according to the first embodiment.
  • When it becomes difficult or impossible to acquire distance information with the distance sensor, the distance information acquisition device according to the first embodiment acquires the distance information by driving the distance sensor to change the measurement region of the distance sensor.
  • FIG. 1A is a view showing a robot 11 according to the first embodiment and an operation environment of the robot 11.
  • FIG. 1B is a view showing an HMI 21 which gives instructions to the robot 11 according to the first embodiment.
  • the robot 11 is provided therein with a distance information acquisition device (not shown) and is equipped in a head portion thereof with a camera 12 for displaying an HMI (Human Machine Interface) and a distance sensor 13.
  • the visual field 14 of the camera 12 and the measurement region 15 of the distance sensor 13 are oriented toward a remote controller 17 which is a target to be gripped by the robot 11.
  • a user gives instructions to the robot 11 by operating an HMI 21 as an operable display means.
  • the robot 11 performs an operation of gripping the remote controller 17 using an action unit 16 which is a moving mechanism or an arm/hand mechanism.
  • the distance information acquisition device executes a process relating to the distance information acquisition of the robot 11.
  • the HMI 21 displays an image of the remote controller 17 imaged by the camera 12. By tapping the displayed image of the remote controller 17, the user informs the robot 11 of the fact that the remote controller 17 is the target to be gripped.
  • the robot 11 grips the remote controller 17 by acquiring position information or three-dimensional information of the remote controller 17 based on the distance information of the distance sensor 13. At this time, the robot 11 may not acquire the distance information of the remote controller 17.
  • FIGS. 2A to 2C are views showing the positional relationship between the visual field 14 of the camera 12 and the measurement region 15 of the distance sensor 13 according to the first embodiment.
  • In FIGS. 2A to 2C, the situation where the remote controller 17 placed on a floor surface against a wall is being imaged by the camera 12 is observed in the horizontal direction (transverse direction) of the camera 12.
  • the distance sensor 13 acquires the distance information on the floor surface and a portion of the surface of the remote controller 17 existing within the measurement region 15 in a form of distance points 31.
  • the point of the displayed image of the HMI 21 tapped by the user is reflected as a user indication point 33 on a virtual image plane 32 of a perspective projection model.
  • By "perspective projection", it is meant that object points in a space are mapped onto a virtual image plane.
  • the virtual image plane refers to a virtual plane that exists at a position of focal length f in front of a view point O of a camera.
  • the technical terms such as “perspective projection”, “virtual image plane”, “view point” and “optical axis” to be described later are clearly described in Kanaya Kenichi, "Mathematical principle of camera calibration -How the best grid pattern was derived- Part 1 perspective transformation and focal length", Mathematical Sciences, Saiensu-sha Co., Ltd., October 1993, Vol. 31, No. 10, p.56-62.
  • FIG. 2A is a view showing the positional relationship between the visual field 14 and the measurement region 15 when the robot 11 according to the first embodiment can acquire the distance information.
  • the robot 11 can acquire the distance information.
  • FIG. 2B is a view showing the positional relationship between the visual field 14 and the measurement region 15 when the robot 11 according to the first embodiment cannot acquire the distance information. For some reason, the positional relationship between the visual field 14 and the measurement region 15 is changed from that shown in FIG. 2A. Thus, the distance point 31 does not exist on the indication vector 34 and the robot 11 cannot acquire the distance information.
  • FIG. 2C is a view showing the positional relationship between the visual field 14 and the measurement region 15 when the robot 11 according to the first embodiment has changed the measurement region 15. If a situation where the distance information cannot be acquired occurs as shown in FIG. 2B, the robot 11 changes the measurement region 15 by driving the distance sensor 13 and changing the position of the distance sensor 13. Then, the robot 11 scans the indication vector 34. If the distance point 31 appears on the indication vector 34, the robot 11 acquires the distance information of the distance point 31 as the distance information of the target indicated by the user. In this way, the distance information acquisition device according to the first embodiment can reliably acquire the distance information of the target.
  • FIG. 3 is a block diagram showing the schematic configuration of the robot 11 provided with the distance information acquisition device 100 according to the first embodiment.
  • the robot 11 includes a distance information acquisition device 100, a camera 12, a distance sensor 13, a three-dimensional information acquisition unit 110, an operation control unit 111, an action unit 16, and so forth.
  • the distance information acquisition device 100 includes a vector detecting unit 101, a distance detecting unit 102, a sensor driving unit 103, and so forth.
  • an HMI 21 is provided outside the robot 11, e.g., in the hand of a user.
  • the coordinate information of a tap position of the displayed image is inputted from the HMI 21 to the vector detecting unit 101.
  • the vector detecting unit 101 converts the inputted coordinate information to the coordinate information of the user indication point 33 of the virtual image plane 32. Further, the position information of the view point O is inputted from the camera 12 to the vector detecting unit 101.
  • the vector detecting unit 101 detects the vector passing through the view point O and the user indication point 33 as an indication vector having a starting point and a direction in a three-dimensional space and as a vector indicating a distance information acquisition target. Then, the vector detecting unit 101 outputs the detected vector to the distance detecting unit 102.
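The detection described above can be sketched with a pinhole-camera model. This is not code from the patent; the function name, the pixel-valued focal length, and the camera-frame convention (optical axis along +z) are illustrative assumptions.

```python
import math

def indication_vector(view_point, tap_px, image_center, focal_px):
    """Detect the indication vector: the ray through the camera view point O
    and the user indication point on the virtual image plane (pinhole model).
    Returns (origin, unit direction) in the camera coordinate frame."""
    u = tap_px[0] - image_center[0]            # horizontal offset from the principal point
    v = tap_px[1] - image_center[1]            # vertical offset from the principal point
    norm = math.sqrt(u * u + v * v + focal_px * focal_px)
    direction = (u / norm, v / norm, focal_px / norm)
    return tuple(view_point), direction

# A tap exactly at the image center indicates a target on the optical axis,
# so the unit direction comes out as (0, 0, 1).
origin, direction = indication_vector((0.0, 0.0, 0.0), (320, 240), (320, 240), 525.0)
```

The returned ray can then be handed to the distance detecting unit as the starting point and direction of the scan.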
  • the indication vector information is inputted from the vector detecting unit 101 to the distance detecting unit 102. Further, the distance information is inputted from the distance sensor 13 to the distance detecting unit 102.
  • the distance detecting unit 102 determines whether the distance point 31 exists on the indication vector 34. If the distance point 31 exists on the indication vector 34, the distance detecting unit 102 acquires the distance information of the target based on the distance information of the distance point 31 and the position information of the distance sensor 13. Then, the distance detecting unit 102 outputs the distance information of the target to the three-dimensional information acquisition unit 110. If the distance point 31 does not exist on the indication vector 34, the distance detecting unit 102 outputs a notice of nonexistence of the distance point 31 and the vector information to the sensor driving unit 103.
  • the sensor driving unit 103 drives the distance sensor 13 based on the vector information concurrently inputted.
  • the sensor driving unit 103 causes the distance sensor 13 to undergo displacement such as rotation or movement, thereby changing the measurement region 15 of the distance sensor 13 such that the distance sensor 13 can scan at least the indication vector 34.
  • the sensor driving unit 103 makes use of a distance sensor driving device (not shown) mounted to the robot.
  • the three-dimensional information acquisition unit 110 performs a three-dimensional reconstruction by receiving the distance information of the target from the distance detecting unit 102 multiple times, and outputs the three-dimensional information of the target to the operation control unit 111.
  • the operation control unit 111 receives the three-dimensional information of the target from the three-dimensional information acquisition unit 110 and instructs the action unit 16 to perform an operation such as, e.g., taking, pressing or pulling, with respect to the target based on the three-dimensional information. At this time, the operation control unit 111 may receive and use other information, e.g., a target image recognition result as well.
  • the operation control unit 111 controls the overall operation of the robot 11.
  • As the distance sensor 13, it is possible to use a stereo camera, a distance image sensor such as Kinect (registered trademark), a TOF (Time-of-Flight) laser sensor, or the like. If the distance sensor 13 is a stereo camera, one of its constituent cameras can also be used as the camera for HMI display.
  • As the distance sensor driving device, it is possible to use a movable platform fixed to a movable portion of the robot 11 or to the environment. The robot 11 is, for example, a mobile manipulator such as an HSR (Human Support Robot).
  • the respective components of the distance information acquisition device 100 can be realized by, for example, executing a program under the control of an arithmetic unit (not shown) of the distance information acquisition device 100 that is a computer. More specifically, the distance information acquisition device 100 can be realized by loading a program stored in a storage unit (not shown) onto a main storage unit (not shown) and executing the program under the control of the arithmetic unit. Further, the respective components can be realized not only by the software such as a program but also by the combination of hardware, firmware and software.
  • the aforementioned program is stored using different types of non-transitory computer readable media and can be supplied to a computer.
  • the non-transitory computer readable media include different types of tangible storage media. Examples of the non-transitory computer readable media include a magnetic recording medium (e.g., a flexible disk, a magnetic tape or a hard disk drive), a magneto-optical recording medium (e.g., a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-RW, and a semiconductor memory (e.g., a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, or a RAM (Random Access Memory)).
  • the program may be supplied to a computer by different types of transitory computer readable media.
  • Examples of the transitory computer readable media include an electrical signal, an optical signal and an electromagnetic wave.
  • the transitory computer readable media can supply the program to a computer via a wire communication path such as an electric wire, an optical fiber or the like, or a wireless communication path.
  • FIG. 4 is a flowchart showing a routine of a distance information acquisition method implemented by the distance information acquisition device 100 according to the first embodiment.
  • A user designates a target and instructs the robot of the designation (Step S10).
  • the remote controller 17 as the target is imaged by the HMI display camera 12 mounted to the robot 11.
  • An image of the target is inputted to the HMI 21 through communication and is displayed on the HMI 21.
  • a user indicates the target by tapping the displayed image of the HMI 21.
  • the vector detecting unit 101 detects the indication vector 34 (Step S20).
  • FIG. 5 is a view showing the indication vector 34 detected by the vector detecting unit 101 according to the first embodiment.
  • the view point O of the HMI display camera 12 can be found as three-dimensional coordinates based on the design information of the robot 11 and the self-position information of the robot 11. Further, the relationship between the two-dimensional coordinates P (x, y) of the user indication point 33 on the virtual image plane of the perspective projection model and the three-dimensional coordinates P (X, Y, Z) of the target 17 is expressed by equations 1 and 2 where f represents the focal length and D represents the distance from the camera 12 to the target 17. In this case, the distance D is unknown. However, it is possible to find the slope V of the indication vector 34 from equation 3. This makes it possible to calculate the three-dimensional vector based on the position of the view point O and the slope V.
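The equations referenced above did not survive extraction. Under the standard pinhole model the passage describes, with D taken as the depth of the target along the optical axis, they would read (a reconstruction, not the patent's original typography):

```latex
% Perspective projection of the target P(X, Y, Z) onto the virtual image
% plane, with focal length f and unknown depth D along the optical axis:
x = f\,\frac{X}{D} \tag{1}
\qquad
y = f\,\frac{Y}{D} \tag{2}
% Although D is unknown, the slope of the indication vector follows directly:
V = \left(\frac{x}{f},\; \frac{y}{f},\; 1\right) \tag{3}
```

Equation 3 is just equations 1 and 2 divided through by D and f: the direction of the ray from O through P is fully determined by the tapped image coordinates even though the depth D is not.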
  • FIG. 6 is a view showing a method for detecting the distance information with the distance detecting unit 102 according to the first embodiment.
  • the distance detecting unit 102 sets a ray (line segment) using the view point O of the camera as a starting point in a three-dimensional space and sets the measured distance point group as a sphere group having a specified size. It is possible to determine whether the distance point exists on the indication vector 34 or not by using a ray cast to determine whether there exists any sphere colliding with the ray or not.
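Treating each measured distance point as a small sphere reduces the ray-cast check described above to a ray-sphere intersection test. The sketch below is illustrative; the function names and the 2 cm default sphere radius are assumptions, not values from the patent.

```python
def ray_hits_sphere(origin, direction, center, radius):
    """Ray cast: does the half-line origin + t*direction (t >= 0) collide
    with the given sphere? `direction` is assumed to be a unit vector."""
    oc = [c - o for o, c in zip(origin, center)]      # origin -> sphere center
    t = sum(a * b for a, b in zip(oc, direction))     # projection onto the ray
    if t < 0.0:
        return False                                  # sphere lies behind the ray
    closest_sq = sum(a * a for a in oc) - t * t       # squared ray-to-center distance
    return closest_sq <= radius * radius

def distance_point_on_vector(origin, direction, distance_points, radius=0.02):
    """Give each measured distance point a sphere of the specified size and
    return the first one colliding with the indication vector, if any."""
    for p in distance_points:
        if ray_hits_sphere(origin, direction, p, radius):
            return p
    return None
```

If `distance_point_on_vector` returns `None`, no distance point 31 exists on the indication vector 34 and the sensor driving step is needed.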
  • the ray cast is described in, e.g., "Distance Measurement using Ray", [Online], [January 28, 2014 Search], Internet http://www10.atwiki.jp/bambooflow/pages/239.html.
  • the sensor driving unit 103 decides a position, a direction, and so forth of the distance sensor 13 in order to change the measurement region of the distance sensor 13 (Step S40).
  • FIG. 7 is a view showing the position and direction of the distance sensor 13 decided by the sensor driving unit 103 according to the first embodiment.
  • the sensor driving unit 103 samples the three-dimensional point existing on the indication vector 34 to select a gaze point 35 and sets a head angle of the robot 11 such that the optical axis of the distance sensor 13 faces toward the gaze point 35.
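The gaze-point sampling and head-angle computation above can be sketched as follows. The pan/tilt decomposition and the axis convention (z forward, x right, y down, positive tilt looking up) are illustrative assumptions; the patent only requires that the optical axis end up facing the gaze point 35.

```python
import math

def head_angles_toward(sensor_pos, gaze_point):
    """Pan (yaw) and tilt (pitch), in radians, that aim the sensor's optical
    axis from sensor_pos toward the gaze point. Assumed frame: z forward,
    x right, y down; positive tilt looks up."""
    dx = gaze_point[0] - sensor_pos[0]
    dy = gaze_point[1] - sensor_pos[1]
    dz = gaze_point[2] - sensor_pos[2]
    pan = math.atan2(dx, dz)                     # rotation about the vertical axis
    tilt = math.atan2(-dy, math.hypot(dx, dz))   # elevation above the horizontal
    return pan, tilt

def sample_gaze_points(origin, direction, depths):
    """Sample candidate gaze points at the given depths along the indication vector."""
    return [tuple(o + t * d for o, d in zip(origin, direction)) for t in depths]
```

A gaze point dead ahead yields zero pan and tilt; one directly to the right yields a quarter-turn pan.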
  • the distance sensor 13 measures the distance information based on the position and direction thus decided (Step S50). Then, the distance detecting unit 102 determines whether the distance point 31 exists on the indication vector 34 or not (Step S60). The determination method used at this time is the same as the ray cast determination method used in Step S30 and therefore will not be described. Then, if the distance point 31 exists on the indication vector 34 (if "YES" in Step S60), the distance detecting unit 102 acquires the three-dimensional coordinates of the target based on the distance information of the distance point 31 and the position information of the distance sensor 13, and notifies the acquired three-dimensional coordinates to the three-dimensional information acquisition unit 110 of the robot 11 (Step S70).
  • the operation control unit 111 controls the operation of the robot 11 based on the three-dimensional information and the like of the target outputted from the three-dimensional information acquisition unit 110.
  • the action unit 16 manipulates the target and applies a physical action to the target.
  • Prior to changing the measurement region of the distance sensor 13, if the distance detecting unit 102 determines that the distance point 31 exists on the indication vector 34 (if "YES" in Step S30), the distance detecting unit 102 notifies the three-dimensional coordinates of the target based on the distance information to the three-dimensional information acquisition unit 110 of the robot 11 (Step S70).
  • After changing the measurement region of the distance sensor 13, if the distance detecting unit 102 determines that the distance point 31 does not exist on the indication vector 34 (if "NO" in Step S60), the flow returns to the sensor operation deciding process (Step S40) or the distance measurement process (Step S50), where the detection of a distance point 31 existing on the indication vector 34 is continued.
  • the distance information acquisition method is a method for measuring a distance to a distance information acquisition target with a distance sensor.
  • the distance information acquisition method includes a step S20 of detecting a vector which designates the target, a step S50 of, if distance information detected by the distance sensor does not exist at a point on the vector, measuring a distance by changing a measurement region of the distance sensor, a step S60 of, if distance information detected by the distance sensor exists at a point on the vector, detecting the distance information, and a step S70 of acquiring distance information of the target based on the detected distance information.
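The steps listed above can be condensed into a small loop. Everything in this sketch, including the toy point clouds and the `near_axis` predicate standing in for the ray-cast check, is an illustrative assumption rather than the patent's implementation.

```python
def acquire(vector_origin, vector_dir, scans, on_vector):
    """Condensed S20..S70 loop: walk through successive measurement regions
    (`scans` holds one point cloud per sensor pose) until a distance point
    lies on the indication vector, then return it as the target's distance
    information. `on_vector` plays the role of the S30/S60 ray-cast check."""
    for cloud in scans:                  # each new cloud = a changed region (S40/S50)
        for point in cloud:
            if on_vector(vector_origin, vector_dir, point):
                return point             # S70: distance information acquired
    return None                          # no distance point ever appeared on the vector

# Toy stand-in predicate: "on" the +z ray means within 5 cm of the axis.
near_axis = lambda o, d, p: abs(p[0]) < 0.05 and abs(p[1]) < 0.05 and p[2] > 0
first_scan = [(1.0, 0.0, 2.0)]           # obstacle case: nothing on the vector
second_scan = [(0.01, 0.0, 1.5)]         # after driving the sensor: a hit
hit = acquire((0, 0, 0), (0, 0, 1), [first_scan, second_scan], near_axis)
```

The loop only terminates with `None` when every available sensor pose has been tried, which mirrors the flowchart's return to Step S40/S50.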
  • the distance to the distance information acquisition target is a distance from the view point of the camera to the distance information acquisition target.
  • the distance information acquisition method further includes a step S10 of imaging the target with a camera and detecting the target existing on a virtual image plane of the camera. It is preferred that, in the step S20 of detecting a vector, a vector passing through the view point of the camera and the target on the virtual image plane is detected as the vector.
  • the invention is not limited to the first embodiment described above but may be appropriately modified without departing from the spirit of the invention.
  • the camera 12 for the display of the HMI 21 is mounted to the robot 11.
  • the camera may be mounted to the HMI 21 or may be disposed in the environment. Even in this case, it is possible to detect the indication vector 34 from the camera by specifying the view point O of the camera with an indoor GPS or other methods.
  • the vector detecting unit 101 detects the indication vector 34 based on the information of the camera 12 for the display of the HMI 21.
  • the vector detecting unit 101 may detect the indication vector based on a gesture such as user's finger-pointing recognized using a technique such as skeleton tracking.
  • skeleton tracking the information such as a shoulder position, an elbow position, or a hand position is obtained as three-dimensional coordinates. It is therefore possible to calculate the vector extending from the shoulder position toward the hand position as the indication vector.
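The shoulder-to-hand calculation mentioned here is a plain vector normalization. The joint coordinates in the example are illustrative, not skeleton-tracker output.

```python
import math

def pointing_vector(shoulder, hand):
    """Indication vector inferred from skeleton tracking: it starts at the
    shoulder joint and extends through the hand joint (unit direction)."""
    d = [h - s for s, h in zip(shoulder, hand)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0.0:
        raise ValueError("shoulder and hand coincide; no direction defined")
    return tuple(shoulder), tuple(c / norm for c in d)

# Shoulder at 1.4 m height, hand 0.3 m to the side and 0.4 m forward.
origin, direction = pointing_vector((0.0, 1.4, 0.0), (0.3, 1.4, 0.4))
```

The resulting ray can be handed to the distance detecting unit in place of the camera-based indication vector 34.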
  • the skeleton tracking is described in, e.g., "Kinect for Windows (registered trademark) SDK Introduction 4: Skeleton Tracking and 3D Display", [Online], [January 28, 2014 Search], Internet http://d.hatena.ne.jp/astrobot/20110721/1311240780.
  • the sensor driving unit 103 sets the head angle of the robot such that the optical axis of the distance sensor 13 faces toward the gaze point.
  • the distance sensor 13 can sense the gaze point at the center of an image.
  • the image center or the measurement region center of the distance sensor 13 may face toward the gaze point.
  • the gaze point may be positioned near the center of the measurement region.
  • the measurement region is changed such that the point on the vector is positioned near the center of the measurement region.
  • In the distance information acquisition method, it is also preferred that, in the step S20 of detecting a vector, a vector extracted based on a user's gesture directly indicating the target is detected as the vector.

Abstract

The invention relates to a distance information acquisition method for measuring a distance to a distance information acquisition target with a distance sensor. The method includes the following steps: S20, detecting a vector which indicates the target; S50, measuring the distance by changing a measurement region of the distance sensor when distance information detected by the distance sensor does not exist at a point on the vector; S60, detecting the distance information detected by the distance sensor at the point on the vector; and S70, acquiring the distance information of the target based on the detected distance information.
PCT/IB2015/000123 2014-02-12 2015-02-09 Distance information acquisition method, distance information acquisition device, distance information acquisition program, and robot WO2015121729A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-024089 2014-02-12
JP2014024089A JP2015152338A (ja) 2014-02-12 2014-02-12 Distance information acquisition method, distance information acquisition device, and robot

Publications (1)

Publication Number Publication Date
WO2015121729A1 (fr) 2015-08-20

Family

ID=52774285

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/000123 WO2015121729A1 (fr) Distance information acquisition method, distance information acquisition device, distance information acquisition program, and robot

Country Status (2)

Country Link
JP (1) JP2015152338A (fr)
WO (1) WO2015121729A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997014015A1 (fr) * 1995-10-12 1997-04-17 Metronor A/S A system for point-by-point measuring of spatial coordinates
JP2002342787A (ja) 2001-05-22 2002-11-29 Minolta Co Ltd Method and device for generating a three-dimensional model, and computer program
WO2006024091A1 (fr) * 2004-08-30 2006-03-09 Commonwealth Scientific And Industrial Research Organisation A method for automated 3D imaging
WO2008106999A1 (fr) * 2007-03-08 2008-09-12 Trimble Ab Methods and instruments for estimating target motion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4221330B2 (ja) * 2004-05-07 2009-02-12 Nippon Telegraph and Telephone Corp. Interface method, apparatus, and program
JP5198078B2 (ja) * 2008-01-24 2013-05-15 Hitachi, Ltd. Measuring device and measuring method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Kinect for Windows (registered trademark) SDK Introduction 4: Skeleton Tracking and 3D Display", INTERNET, 28 January 2014 (2014-01-28), Retrieved from the Internet <URL:http://d.hatena.ne.jp/astrobot/20110721/1311240780>
KANAYA KENICHI: "Mathematical principle of camera calibration - How the best grid pattern was derived - Part 1: perspective transformation and focal length", Mathematical Sciences, SAIENSU-SHA CO., LTD., October 1993, vol. 31, no. 10, pages 56-62

Also Published As

Publication number Publication date
JP2015152338A (ja) 2015-08-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15712407

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15712407

Country of ref document: EP

Kind code of ref document: A1