JP2016527998A5 - - Google Patents
- Publication number
- JP2016527998A5 (application JP2016535393A)
- Authority
- JP
- Japan
- Prior art keywords
- orientation
- identification element
- visual identification
- cameras
- identifying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Claims (15)
1. A method for determining the pose of an object, preferably a medical device, the method comprising the steps of:
a) providing an object, preferably a medical device, comprising at least one visual identification element and an orientation sensor for determining the absolute orientation and/or angular velocity of the object;
b) providing at least two cameras adapted to visually track the at least one visual identification element, each camera having a field of view and remaining stationary during the method, the at least two cameras defining a camera coordinate system;
c) acquiring image data of the fields of view of the at least two cameras while the object is moving;
d) acquiring orientation data provided by the orientation sensor while the object is moving;
e) calibrating the orientation sensor with respect to the camera coordinate system; and
f) analyzing the image data and the orientation data to determine the pose of the object during its movement,
wherein:
f1) if at least part of the visual identification element is visible to both cameras well enough for the orientation of the object to be determined, the orientation of the object is determined on the basis of the orientation data and/or the image data;
f2) if the visual identification element is not visible to both cameras well enough for the orientation of the object to be determined, the orientation of the object is determined on the basis of the orientation data only;
f3) if at least part of the visual identification element is visible to both cameras well enough for the position of the object to be determined, the position of the object is determined on the basis of the image data only;
f4) if at least part of the visual identification element is visible to both cameras well enough for that part to be identified and its position determined, the position of the object is determined on the basis of a combination of the image data and the orientation determined according to step f1) or f2).
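The source-selection logic of steps f1) to f4) can be sketched as follows. This is a minimal illustration, not the patented implementation; the boolean flags (tracker visibility states) and the string labels are hypothetical names introduced here:

```python
def select_pose_sources(orient_visible, pos_visible, part_visible):
    """Decide which data source determines orientation and position,
    following steps f1)-f4) of claim 1.

    orient_visible: enough of the identification element is seen by
        both cameras to determine the object's orientation (f1/f2).
    pos_visible: enough is seen to determine the position (f3).
    part_visible: a part can be identified and located (f4).
    """
    # f1)/f2): orientation from image and/or sensor data, else sensor only
    orientation_src = "image_and_or_sensor" if orient_visible else "sensor_only"

    if pos_visible:
        position_src = "image_only"               # f3)
    elif part_visible:
        position_src = "image_plus_orientation"   # f4): one located part
        # plus the orientation from f1)/f2) fixes the remaining DOFs
    else:
        position_src = "interpolation"            # gap handling, claims 3-5
    return orientation_src, position_src
```

In a running tracker these flags would come from the stereo detection stage, e.g. from counting how many marker sub-elements both cameras currently resolve.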
2. The method of claim 1, further comprising estimating the accuracy with which the orientation of the object is determined on the basis of the orientation data only and the accuracy with which it is determined on the basis of the image data only, and using, in step f1), the procedure that yields the higher accuracy.
3. The method of claim 1 or 2, wherein, if during a first time interval no part of the visual identification element is visible to both cameras well enough for that part to be identified and its position determined, the position of the object is determined on the basis of interpolation.
4. The method of claim 3, wherein the interpolation is based on the position of the object determined immediately before the first time interval and/or the position of the object determined immediately after the first time interval.
5. The method of claim 3 or 4, wherein the interpolation is based on the velocity and/or acceleration of the object determined immediately before the first time interval and/or the velocity and/or acceleration of the object determined immediately after the first time interval.
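The interpolation of claims 3 to 5 uses the position and velocity observed just before and just after a tracking gap. One conventional way to combine exactly those four quantities is cubic Hermite interpolation; the sketch below is one possible realisation, not necessarily the one used in practice:

```python
def interpolate_position(t, t0, p0, v0, t1, p1, v1):
    """Cubic Hermite interpolation across a tracking gap [t0, t1].

    p0, v0: position/velocity (3-element lists) just before the gap.
    p1, v1: position/velocity just after the gap.
    Returns the interpolated 3D position at time t, matching both the
    boundary positions and the boundary velocities.
    """
    h = t1 - t0
    s = (t - t0) / h            # normalised time in [0, 1]
    # Hermite basis functions
    h00 = 2*s**3 - 3*s**2 + 1   # weights p0
    h10 = s**3 - 2*s**2 + s     # weights v0 (scaled by h)
    h01 = -2*s**3 + 3*s**2      # weights p1
    h11 = s**3 - s**2           # weights v1 (scaled by h)
    return [h00*p0[i] + h10*h*v0[i] + h01*p1[i] + h11*h*v1[i]
            for i in range(3)]
```

With constant velocity on both sides of the gap this reduces to straight-line motion, which is the behaviour one would expect from claims 4 and 5 together.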
6. The method of any one of claims 1 to 5, further comprising determining the position and/or orientation of the visual identification element relative to the object and/or determining the orientation of the orientation sensor relative to the object.
7. The method of claim 6, wherein the visual identification element comprises a plurality of sub-elements and/or portions that can be distinguished from each other and identified by the cameras, and wherein determining the position and/or orientation of the visual identification element relative to the object comprises identifying each of the sub-elements and/or portions and determining the position of each of the sub-elements and/or portions relative to the object.
8. The method of any one of claims 1 to 7, wherein the visual identification element comprises one or a combination of: three, four or more discrete marker elements; two, three or more barcodes; one, two or more 2D barcodes; a regular pattern; an irregular pattern; an arbitrary pattern; geometric shapes; the 2D or 3D surface of part of the object or of the entire object; active and/or passive markers; retro-reflective markers; and active markers adapted to change their appearance over time or in a predetermined cycle.
9. The method of any one of claims 1 to 8, wherein the orientation sensor comprises a rate gyro and/or a compass.
10. The method of any one of claims 1 to 9, wherein calibrating the orientation sensor with respect to the camera coordinate system comprises: i) acquiring image data of the fields of view of the at least two cameras at a first time and determining, on the basis of the image data, the orientation of the object at the first time; ii) acquiring the orientation data provided by the orientation sensor at the first time and determining, on the basis of the orientation data, the orientation of the object at the first time; and iii) calibrating the orientation sensor with respect to the camera coordinate system by associating with each other the orientations of the object determined in steps i) and ii).
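Steps i) to iii) amount to observing the same orientation simultaneously in both frames and solving for the fixed rotation between them. A minimal sketch, under the assumption (not stated in the claim) that both sources report 3x3 rotation matrices:

```python
import numpy as np

def calibrate(R_cam, R_sensor):
    """Step iii): associate the orientation seen in the image data
    (R_cam, in camera coordinates) with the simultaneous sensor
    reading (R_sensor, in the sensor's own reference frame) by
    computing the fixed offset rotation between the two frames.
    """
    # R_offset is defined so that R_offset @ R_sensor == R_cam
    return R_cam @ R_sensor.T

def sensor_to_camera(R_offset, R_sensor):
    """Apply the calibration to a later sensor reading, yielding the
    object's orientation in the camera coordinate system (as needed
    in step f2) when the cameras cannot see the marker)."""
    return R_offset @ R_sensor
```

Repeating `calibrate` at several points in time, and using the calibration closest in time to each pose query, gives the behaviour described in claim 11.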
11. The method of any one of claims 1 to 10, wherein the step of calibrating the orientation sensor with respect to the camera coordinate system is performed at a plurality of points in time while the object is moving, and wherein the pose at a given point in time is determined on the basis of the calibration of the object closest in time to that given point in time.
12. The method of any one of claims 1 to 11, wherein feedback is provided to the user, the feedback comprising one or a combination of: an indication that the calibration according to step e) has been performed successfully; the current accuracy of determining the orientation from the image data; the current accuracy of determining the orientation from the orientation data; and an indication of when a calibration with a predetermined level of accuracy has been achieved.
13. The method of any one of claims 1 to 12, wherein the object is a handheld medical device, preferably an ultrasound probe.
14. An ultrasound device for ultrasound imaging, comprising an ultrasound probe, at least two cameras, and a processor, wherein the ultrasound probe comprises at least one visual identification element and an orientation sensor for determining its absolute orientation and/or angular velocity, and wherein the processor is adapted to perform method steps c) to f) of claim 1 and, optionally, the method steps of claims 2 to 7 and claims 9 to 11.
15. The ultrasound device of claim 14, wherein the visual identification element comprises one or a combination of: three, four or more discrete marker elements; two, three or more barcodes; one, two or more 2D barcodes; a regular pattern; an irregular pattern; an arbitrary pattern; geometric shapes; the 2D or 3D surface of part of the object or of the entire object; active and/or passive markers; retro-reflective markers; and active markers adapted to change their appearance over time or in a predetermined cycle.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13180985.7 | 2013-08-20 | ||
EP13180985.7A EP3001219B1 (en) | 2013-08-20 | 2013-08-20 | Optical tracking |
PCT/EP2014/066505 WO2015024755A1 (en) | 2013-08-20 | 2014-07-31 | Optical tracking |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2016527998A (en) | 2016-09-15 |
JP2016527998A5 (en) | 2017-08-31 |
JP6441346B2 (en) | 2018-12-19 |
Family
ID=49035329
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2016535393A Active JP6441346B2 (en) | 2013-08-20 | 2014-07-31 | Optical tracking |
Country Status (8)
Country | Link |
---|---|
US (1) | US9613421B2 (en) |
EP (1) | EP3001219B1 (en) |
JP (1) | JP6441346B2 (en) |
CN (1) | CN105659107B (en) |
AU (1) | AU2014310841A1 (en) |
CA (1) | CA2921589C (en) |
ES (1) | ES2763912T3 (en) |
WO (1) | WO2015024755A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105361905B (en) * | 2015-12-03 | 2019-02-01 | 汕头市超声仪器研究所有限公司 | A kind of ultrasonic probe dynamic temperature control system based on pattern-recognition |
GB2548341A (en) * | 2016-03-10 | 2017-09-20 | Moog Bv | Movement tracking and simulation device and method |
US20170273665A1 (en) * | 2016-03-28 | 2017-09-28 | Siemens Medical Solutions Usa, Inc. | Pose Recovery of an Ultrasound Transducer |
US10223798B2 (en) * | 2016-05-27 | 2019-03-05 | Intellijoint Surgical Inc. | Systems and methods for tracker characterization and verification |
KR101820682B1 (en) * | 2016-08-09 | 2018-01-23 | 주식회사 고영테크놀러지 | Marker for optical tracking, optical tracking system, and optical tracking method |
WO2019064706A1 (en) * | 2017-09-27 | 2019-04-04 | 富士フイルム株式会社 | Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device |
DE102019201526A1 (en) * | 2019-02-06 | 2020-08-06 | Ford Global Technologies, Llc | Method and system for detecting and measuring the position of a component relative to a reference position and the displacement and rotation of a component moving relative to a reference system |
KR102285007B1 (en) * | 2019-06-21 | 2021-08-03 | 주식회사 데카사이트 | Apparatus and method for providing ultrasound image using tracing position and pose of probe in ultrasound scanner |
CN112568935B (en) * | 2019-09-29 | 2024-06-25 | 中慧医学成像有限公司 | Three-dimensional ultrasonic imaging method and system based on three-dimensional tracking camera |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002222102A1 (en) | 2000-11-28 | 2002-06-11 | Roke Manor Research Limited. | Optical tracking systems |
US6491632B1 (en) * | 2001-06-26 | 2002-12-10 | Geoffrey L. Taylor | Method and apparatus for photogrammetric orientation of ultrasound images |
US7029477B2 (en) * | 2002-12-20 | 2006-04-18 | Zimmer Technology, Inc. | Surgical instrument and positioning method |
US6925339B2 (en) * | 2003-02-04 | 2005-08-02 | Zimmer Technology, Inc. | Implant registration device for surgical navigation system |
JP4914038B2 (en) * | 2004-11-04 | 2012-04-11 | キヤノン株式会社 | Information processing method and apparatus |
EP1866871A4 (en) * | 2005-03-30 | 2012-01-04 | Worcester Polytech Inst | Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors |
JP4914039B2 (en) * | 2005-07-27 | 2012-04-11 | キヤノン株式会社 | Information processing method and apparatus |
JP4880264B2 (en) * | 2005-08-24 | 2012-02-22 | オリンパス株式会社 | POSITION DETECTION DEVICE AND MEDICAL DEVICE POSITION DETECTION SYSTEM |
US7677078B2 (en) * | 2006-02-02 | 2010-03-16 | Siemens Medical Solutions Usa, Inc. | Line-based calibration of ultrasound transducer integrated with a pose sensor |
CN101053518A (en) * | 2006-04-12 | 2007-10-17 | 杨章民 | Posture monitoring system |
US7599789B2 (en) * | 2006-05-24 | 2009-10-06 | Raytheon Company | Beacon-augmented pose estimation |
CN101869484B (en) * | 2009-04-24 | 2015-05-13 | 深圳迈瑞生物医疗电子股份有限公司 | Medical diagnosis device having touch screen and control method thereof |
FR2960082B1 (en) * | 2010-05-17 | 2012-08-10 | Commissariat Energie Atomique | METHOD AND SYSTEM FOR MERGING DATA FROM IMAGE SENSORS AND MOTION OR POSITION SENSORS |
CN102479386A (en) * | 2010-11-24 | 2012-05-30 | 湘潭大学 | Three-dimensional motion tracking method of upper half part of human body based on monocular video |
US8448056B2 (en) * | 2010-12-17 | 2013-05-21 | Microsoft Corporation | Validation analysis of human target |
EP2656790A4 (en) * | 2010-12-24 | 2017-07-05 | Konica Minolta, Inc. | Ultrasound image-generating apparatus and image-generating method |
EP2716230A4 (en) * | 2011-05-30 | 2014-10-29 | Konica Minolta Inc | Ultrasound image-generating apparatus and ultrasound image-generating method |
JP5728372B2 (en) * | 2011-11-30 | 2015-06-03 | キヤノン株式会社 | Information processing apparatus, information processing apparatus control method, and program |
US9111351B2 (en) * | 2011-12-15 | 2015-08-18 | Sony Corporation | Minimizing drift using depth camera images |
WO2013134559A1 (en) * | 2012-03-07 | 2013-09-12 | Speir Technologies Inc. | Methods and systems for tracking and guiding sensors and instruments |
JP2014050589A (en) * | 2012-09-07 | 2014-03-20 | Furuno Electric Co Ltd | Measuring apparatus |
SG11201507613QA (en) * | 2013-03-15 | 2015-10-29 | Synaptive Medical Barbados Inc | Intelligent positioning system and methods therefore |
WO2014161574A1 (en) * | 2013-04-03 | 2014-10-09 | Brainlab Ag | Method and device for determining the orientation of a co-ordinate system of an anatomical object in a global co-ordinate system |
EP3089670A4 (en) * | 2014-01-02 | 2017-10-11 | Metritrack, Inc. | System and method for tracking completeness of co-registered medical image data |
- 2013
- 2013-08-20 EP EP13180985.7A patent/EP3001219B1/en active Active
- 2013-08-20 ES ES13180985T patent/ES2763912T3/en active Active
- 2014
- 2014-07-31 AU AU2014310841A patent/AU2014310841A1/en not_active Abandoned
- 2014-07-31 CA CA2921589A patent/CA2921589C/en active Active
- 2014-07-31 JP JP2016535393A patent/JP6441346B2/en active Active
- 2014-07-31 CN CN201480057749.8A patent/CN105659107B/en active Active
- 2014-07-31 WO PCT/EP2014/066505 patent/WO2015024755A1/en active Application Filing
- 2014-08-19 US US14/463,212 patent/US9613421B2/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2016527998A5 (en) | ||
EP2825841B1 (en) | Method, device and computer programme for extracting information about a staircase | |
JP2019067383A5 (en) | ||
JP2016109630A5 (en) | ||
JP2019500693A5 (en) | ||
JP2017037554A5 (en) | ||
JP2016525000A5 (en) | ||
CN105659107B (en) | For determining the method and ultrasonic equipment of the posture of object | |
JP2012021958A5 (en) | ||
RU2013158008A (en) | DETERMINING AND CALIBRATING THE NEEDLE LENGTH FOR THE NEEDLE GUIDING SYSTEM | |
RU2016149454A (en) | SYSTEM AND METHOD FOR MEASURING DEFECTS IN FERROMAGNETIC MATERIALS | |
JP2018091656A5 (en) | ||
EP2772815A3 (en) | Mobile Robot and Method of Localization and Mapping of the Same | |
JP2012002761A5 (en) | Position / orientation measuring apparatus, processing method thereof, program, robot system | |
JP2014046433A5 (en) | Information processing system, apparatus, method, and program | |
JP2008116373A5 (en) | ||
JP2015090298A5 (en) | ||
JP2015040783A5 (en) | ||
JP6746050B2 (en) | Calibration device, calibration method, and calibration program | |
JP2014211404A (en) | Motion capture method | |
JP2019501704A5 (en) | ||
KR20190063967A (en) | Method and apparatus for measuring position using stereo camera and 3D barcode | |
WO2018027451A1 (en) | Flight positioning method and device | |
JP2015175831A5 (en) | ||
RU2015133516A (en) | SYSTEM AND METHOD FOR ASSESSING THE VOLUME OF MOVEMENTS OF A SUBJECT |