JP2016527998A5 - Google Patents


Info

Publication number
JP2016527998A5
Authority
JP
Japan
Prior art keywords
orientation
identification element
visual identification
cameras
identifying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2016535393A
Other languages
Japanese (ja)
Other versions
JP2016527998A (en)
JP6441346B2 (en)
Filing date
Publication date
Priority claimed from EP13180985.7A external-priority patent/EP3001219B1/en
Application filed
Publication of JP2016527998A
Publication of JP2016527998A5
Application granted
Publication of JP6441346B2
Legal status: Active
Anticipated expiration

Links

Claims (15)

A method for determining the pose of an object, preferably a medical device, comprising the steps of:
a) providing an object, preferably a medical device, comprising at least one visual identification element and an orientation sensor for determining the absolute orientation and/or angular velocity of the object;
b) providing at least two cameras adapted to visually track the at least one visual identification element, each camera having a field of view and remaining stationary during the method, the at least two cameras defining a camera coordinate system;
c) acquiring image data of the fields of view of the at least two cameras while the object is moving;
d) acquiring orientation data provided by the orientation sensor while the object is moving;
e) calibrating the orientation sensor with respect to the camera coordinate system; and
f) analyzing the image data and the orientation data to determine the pose of the object while it is moving;
wherein:
f1) if at least part of the visual identification element is visible to both cameras well enough to determine the orientation of the object, the orientation of the object is determined on the basis of the orientation data and/or the image data;
f2) if the visual identification element is not visible to both cameras well enough to determine the orientation of the object, the orientation of the object is determined on the basis of the orientation data alone;
f3) if at least part of the visual identification element is visible to both cameras well enough to determine the position of the object, the position of the object is determined on the basis of the image data alone; and
f4) if at least part of the visual identification element is visible to both cameras well enough to identify that part and determine its position, the position of the object is determined on the basis of a combination of the image data and the orientation data as determined in step f1) or f2).
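Steps f1) to f4) amount to a per-frame case analysis over what the two cameras can see. The following sketch illustrates that decision logic; the data types, function names, and the preference for the image-based orientation estimate in step f1) are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]  # placeholder for an orientation/position


@dataclass
class Visibility:
    """How well both cameras can see the visual identification element."""
    enough_for_orientation: bool  # f1)/f2): orientation recoverable from images
    enough_for_position: bool     # f3): full position recoverable from images
    part_identifiable: bool       # f4): some sub-element identified and located


def fuse_pose(vis: Visibility,
              image_orientation: Optional[Vec3],
              image_position: Optional[Vec3],
              sensor_orientation: Vec3,
              partial_position: Optional[Vec3]) -> Tuple[Vec3, Optional[Vec3]]:
    """Return (orientation, position) following steps f1) to f4) of claim 1."""
    # f1): element visible well enough -> orientation from the orientation
    # data and/or the image data (here: prefer the image-based estimate).
    if vis.enough_for_orientation and image_orientation is not None:
        orientation = image_orientation
    # f2): otherwise the orientation data alone.
    else:
        orientation = sensor_orientation

    # f3): element visible well enough -> position from image data alone.
    if vis.enough_for_position and image_position is not None:
        position = image_position
    # f4): a locatable part plus the fused orientation gives the position.
    elif vis.part_identifiable and partial_position is not None:
        position = partial_position  # would be refined using `orientation`
    # Neither case applies: left to interpolation (claims 3 to 5).
    else:
        position = None

    return orientation, position
```

The key property of the scheme is graceful degradation: orientation tracking survives full occlusion of the marker (f2), and position tracking survives partial occlusion by leaning on the sensor-derived orientation (f4).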
Step f1) further comprises estimating the accuracy with which the orientation of the object can be determined from the orientation data alone and the accuracy with which it can be determined from the image data alone, and using, in step f1), the procedure that yields the higher accuracy.
The method according to claim 1.
If, during a first time interval, not even part of the visual identification element is visible to both cameras well enough to identify that part and determine its position, the position of the object is determined by interpolation.
The method according to claim 1 or 2.
The interpolation is based on the position of the object determined immediately before the first time interval and/or the position of the object determined immediately after the first time interval.
The method according to claim 3.
The interpolation is based on the velocity and/or acceleration of the object determined immediately before the first time interval and/or the velocity and/or acceleration of the object determined immediately after the first time interval.
The method according to claim 3 or 4.
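Claims 3 to 5 leave the interpolation scheme open. One natural realization that uses both the boundary positions (claim 4) and the boundary velocities (claim 5) is cubic Hermite interpolation across the occlusion gap; this is a sketch under that assumption, not a scheme the patent prescribes.

```python
def hermite_interpolate(t: float, t0: float, t1: float,
                        p0: float, p1: float,
                        v0: float, v1: float) -> float:
    """Cubic Hermite interpolation of one position coordinate over [t0, t1].

    p0, v0: position and velocity determined immediately before the gap;
    p1, v1: position and velocity determined immediately after the gap.
    Applied per axis, this bridges the first time interval of claim 3.
    """
    h = t1 - t0
    s = (t - t0) / h  # normalized time in [0, 1]
    # Standard Hermite basis polynomials.
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * p0 + h10 * h * v0 + h01 * p1 + h11 * h * v1
```

At the gap boundaries the interpolant reproduces the measured positions exactly; with zero boundary velocities it reduces to a smooth ease-in/ease-out blend between the two positions.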
The method further comprises determining the position and/or orientation of the visual identification element relative to the object, and/or determining the orientation of the orientation sensor relative to the object.
The method according to any one of claims 1 to 5.
The visual identification element comprises a plurality of sub-elements and/or portions that are distinguishable from one another and identifiable by the cameras, and the step of determining the position and/or orientation of the visual identification element relative to the object comprises identifying each of the sub-elements and/or portions and determining the position of each of them relative to the object.
The method according to claim 6.
The visual identification element comprises one of, or a combination of: three or more discrete marker elements, two or more barcodes, one or more 2D barcodes, a regular pattern, an irregular pattern, an arbitrary pattern, a geometric shape, a two- or three-dimensional surface of part of the object or of the entire object, active and/or passive markers, retroreflective markers, and active markers adapted to change their appearance over time, either with a predetermined period or aperiodically.
The method according to any one of claims 1 to 7.
The orientation sensor comprises a rate gyro and/or a compass.
The method according to any one of claims 1 to 8.
The step of calibrating the orientation sensor with respect to the camera coordinate system comprises: i) acquiring image data of the fields of view of the at least two cameras at a first time and determining the orientation of the object at the first time on the basis of the image data; ii) acquiring orientation data provided by the orientation sensor at the first time and determining the orientation of the object at the first time on the basis of the orientation data; and iii) calibrating the orientation sensor with respect to the camera coordinate system by relating to each other the orientations of the object determined in steps i) and ii).
The method according to any one of claims 1 to 9.
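Expressed with rotation matrices, step iii) reduces to solving for the fixed rotation that maps sensor-frame orientations into the camera coordinate system, using one simultaneous pair of measurements. The representation and helper names below are illustrative; the patent does not specify how the orientations are related to each other.

```python
def matmul(a, b):
    """Product of two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]


def transpose(a):
    """Transpose of a 3x3 matrix; for a rotation matrix, its inverse."""
    return [[a[j][i] for j in range(3)] for i in range(3)]


def calibrate(r_camera, r_sensor):
    """Step iii): rotation taking sensor-frame orientations into the camera frame.

    r_camera: orientation of the object at the first time from image data (step i);
    r_sensor: orientation of the object at the same time from the sensor (step ii).
    Both are 3x3 rotation matrices; the returned offset satisfies
    r_camera = matmul(offset, r_sensor).
    """
    return matmul(r_camera, transpose(r_sensor))


def sensor_to_camera(offset, r_sensor):
    """Apply the calibration to a later sensor reading (the f2) fallback path)."""
    return matmul(offset, r_sensor)
```

In practice the two measurements would be averaged over several frames to suppress noise, but one pair suffices to fix the offset.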
The step of calibrating the orientation sensor with respect to the camera coordinate system is performed at a plurality of points in time while the object is moving, and the pose at a given point in time is determined on the basis of the calibration closest in time to that point.
The method according to any one of claims 1 to 10.
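With repeated calibrations as in this claim, selecting the calibration for a given time point is a nearest-in-time lookup over the recorded calibration results. A minimal sketch, with illustrative names:

```python
from typing import List, Tuple, TypeVar

Offset = TypeVar("Offset")  # whatever a calibration yields, e.g. a rotation


def nearest_calibration(calibrations: List[Tuple[float, Offset]],
                        t: float) -> Offset:
    """Return the calibration whose timestamp is closest in time to t.

    calibrations: (timestamp, calibration-result) pairs recorded at several
    points in time while the object was moving; the pose at time t is then
    determined using the returned calibration.
    """
    timestamp, offset = min(calibrations, key=lambda c: abs(c[0] - t))
    return offset
```

Re-calibrating repeatedly and picking the temporally nearest result keeps the accumulated drift of the rate gyro bounded between camera-based fixes.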
Feedback is provided to the user comprising one of, or a combination of: an indication that a calibration according to step e) has been performed successfully, the current accuracy of determining the orientation from the image data, the current accuracy of determining the orientation from the orientation data, and an indication of when the next calibration will need to be performed in order to achieve a predetermined level of accuracy.
The method according to any one of claims 1 to 11.
The object is a handheld medical device, preferably an ultrasound probe.
The method according to any one of claims 1 to 12.
An ultrasound apparatus for ultrasound imaging, comprising an ultrasound probe, at least two cameras, and a processor, wherein the ultrasound probe comprises at least one visual identification element and an orientation sensor for determining the absolute orientation and/or angular velocity of the ultrasound probe, and wherein the processor is adapted to carry out method steps c) to f) of claim 1 and, optionally, the method steps of claims 2 to 7 and 9 to 11.
The visual identification element comprises one of, or a combination of: three or more discrete marker elements, two or more barcodes, one or more 2D barcodes, a regular pattern, an irregular pattern, an arbitrary pattern, a geometric shape, a two- or three-dimensional surface of part of the object or of the entire object, active and/or passive markers, retroreflective markers, and active markers adapted to change their appearance over time, either with a predetermined period or aperiodically.
The ultrasound apparatus according to claim 14.
JP2016535393A 2013-08-20 2014-07-31 Optical tracking Active JP6441346B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP13180985.7 2013-08-20
EP13180985.7A EP3001219B1 (en) 2013-08-20 2013-08-20 Optical tracking
PCT/EP2014/066505 WO2015024755A1 (en) 2013-08-20 2014-07-31 Optical tracking

Publications (3)

Publication Number Publication Date
JP2016527998A JP2016527998A (en) 2016-09-15
JP2016527998A5 JP2016527998A5 (en) 2017-08-31
JP6441346B2 JP6441346B2 (en) 2018-12-19

Family

ID=49035329

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016535393A Active JP6441346B2 (en) 2013-08-20 2014-07-31 Optical tracking

Country Status (8)

Country Link
US (1) US9613421B2 (en)
EP (1) EP3001219B1 (en)
JP (1) JP6441346B2 (en)
CN (1) CN105659107B (en)
AU (1) AU2014310841A1 (en)
CA (1) CA2921589C (en)
ES (1) ES2763912T3 (en)
WO (1) WO2015024755A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105361905B (en) * 2015-12-03 2019-02-01 汕头市超声仪器研究所有限公司 A kind of ultrasonic probe dynamic temperature control system based on pattern-recognition
GB2548341A (en) * 2016-03-10 2017-09-20 Moog Bv Movement tracking and simulation device and method
US20170273665A1 (en) * 2016-03-28 2017-09-28 Siemens Medical Solutions Usa, Inc. Pose Recovery of an Ultrasound Transducer
US10223798B2 (en) * 2016-05-27 2019-03-05 Intellijoint Surgical Inc. Systems and methods for tracker characterization and verification
KR101820682B1 (en) * 2016-08-09 2018-01-23 주식회사 고영테크놀러지 Marker for optical tracking, optical tracking system, and optical tracking method
WO2019064706A1 (en) * 2017-09-27 2019-04-04 富士フイルム株式会社 Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device
DE102019201526A1 (en) * 2019-02-06 2020-08-06 Ford Global Technologies, Llc Method and system for detecting and measuring the position of a component relative to a reference position and the displacement and rotation of a component moving relative to a reference system
KR102285007B1 (en) * 2019-06-21 2021-08-03 주식회사 데카사이트 Apparatus and method for providing ultrasound image using tracing position and pose of probe in ultrasound scanner
CN112568935B (en) * 2019-09-29 2024-06-25 中慧医学成像有限公司 Three-dimensional ultrasonic imaging method and system based on three-dimensional tracking camera

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2002222102A1 (en) 2000-11-28 2002-06-11 Roke Manor Research Limited. Optical tracking systems
US6491632B1 (en) * 2001-06-26 2002-12-10 Geoffrey L. Taylor Method and apparatus for photogrammetric orientation of ultrasound images
US7029477B2 (en) * 2002-12-20 2006-04-18 Zimmer Technology, Inc. Surgical instrument and positioning method
US6925339B2 (en) * 2003-02-04 2005-08-02 Zimmer Technology, Inc. Implant registration device for surgical navigation system
JP4914038B2 (en) * 2004-11-04 2012-04-11 キヤノン株式会社 Information processing method and apparatus
EP1866871A4 (en) * 2005-03-30 2012-01-04 Worcester Polytech Inst Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
JP4914039B2 (en) * 2005-07-27 2012-04-11 キヤノン株式会社 Information processing method and apparatus
JP4880264B2 (en) * 2005-08-24 2012-02-22 オリンパス株式会社 POSITION DETECTION DEVICE AND MEDICAL DEVICE POSITION DETECTION SYSTEM
US7677078B2 (en) * 2006-02-02 2010-03-16 Siemens Medical Solutions Usa, Inc. Line-based calibration of ultrasound transducer integrated with a pose sensor
CN101053518A (en) * 2006-04-12 2007-10-17 杨章民 Posture monitoring system
US7599789B2 (en) * 2006-05-24 2009-10-06 Raytheon Company Beacon-augmented pose estimation
CN101869484B (en) * 2009-04-24 2015-05-13 深圳迈瑞生物医疗电子股份有限公司 Medical diagnosis device having touch screen and control method thereof
FR2960082B1 (en) * 2010-05-17 2012-08-10 Commissariat Energie Atomique METHOD AND SYSTEM FOR MERGING DATA FROM IMAGE SENSORS AND MOTION OR POSITION SENSORS
CN102479386A (en) * 2010-11-24 2012-05-30 湘潭大学 Three-dimensional motion tracking method of upper half part of human body based on monocular video
US8448056B2 (en) * 2010-12-17 2013-05-21 Microsoft Corporation Validation analysis of human target
EP2656790A4 (en) * 2010-12-24 2017-07-05 Konica Minolta, Inc. Ultrasound image-generating apparatus and image-generating method
EP2716230A4 (en) * 2011-05-30 2014-10-29 Konica Minolta Inc Ultrasound image-generating apparatus and ultrasound image-generating method
JP5728372B2 (en) * 2011-11-30 2015-06-03 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
US9111351B2 (en) * 2011-12-15 2015-08-18 Sony Corporation Minimizing drift using depth camera images
WO2013134559A1 (en) * 2012-03-07 2013-09-12 Speir Technologies Inc. Methods and systems for tracking and guiding sensors and instruments
JP2014050589A (en) * 2012-09-07 2014-03-20 Furuno Electric Co Ltd Measuring apparatus
SG11201507613QA (en) * 2013-03-15 2015-10-29 Synaptive Medical Barbados Inc Intelligent positioning system and methods therefore
WO2014161574A1 (en) * 2013-04-03 2014-10-09 Brainlab Ag Method and device for determining the orientation of a co-ordinate system of an anatomical object in a global co-ordinate system
EP3089670A4 (en) * 2014-01-02 2017-10-11 Metritrack, Inc. System and method for tracking completeness of co-registered medical image data

Similar Documents

Publication Publication Date Title
JP2016527998A5 (en)
EP2825841B1 (en) Method, device and computer programme for extracting information about a staircase
JP2019067383A5 (en)
JP2016109630A5 (en)
JP2019500693A5 (en)
JP2017037554A5 (en)
JP2016525000A5 (en)
CN105659107B (en) For determining the method and ultrasonic equipment of the posture of object
JP2012021958A5 (en)
RU2013158008A (en) DETERMINING AND CALIBRATING THE NEEDLE LENGTH FOR THE NEEDLE GUIDING SYSTEM
RU2016149454A (en) SYSTEM AND METHOD FOR MEASURING DEFECTS IN FERROMAGNETIC MATERIALS
JP2018091656A5 (en)
EP2772815A3 (en) Mobile Robot and Method of Localization and Mapping of the Same
JP2012002761A5 (en) Position / orientation measuring apparatus, processing method thereof, program, robot system
JP2014046433A5 (en) Information processing system, apparatus, method, and program
JP2008116373A5 (en)
JP2015090298A5 (en)
JP2015040783A5 (en)
JP6746050B2 (en) Calibration device, calibration method, and calibration program
JP2014211404A (en) Motion capture method
JP2019501704A5 (en)
KR20190063967A (en) Method and apparatus for measuring position using stereo camera and 3D barcode
WO2018027451A1 (en) Flight positioning method and device
JP2015175831A5 (en)
RU2015133516A (en) SYSTEM AND METHOD FOR ASSESSING THE VOLUME OF MOVEMENTS OF A SUBJECT