WO2016041855A1 - Ultrasound imaging apparatus - Google Patents

Ultrasound imaging apparatus

Info

Publication number
WO2016041855A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
view
patient
virtual
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2015/070806
Other languages
English (en)
French (fr)
Inventor
Frank Michael WEBER
Thomas Heiko STEHLE
Irina Waechter-Stehle
Juergen Weese
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Koninklijke Philips NV
Priority to JP2017514650A (published as JP2017527401A)
Priority to US15/510,103 (published as US20170251988A1)
Priority to EP15760481.0A (published as EP3193727A1)
Publication of WO2016041855A1

Classifications

    • A61B 6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/12: Arrangements for detecting or locating foreign bodies
    • A61B 6/46: Arrangements for interfacing with the operator or the patient
    • A61B 6/461: Displaying means of special interest
    • A61B 6/466: Displaying means of special interest adapted to display 3D data
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Clinical applications
    • A61B 8/0833: Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/085: Clinical applications for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/0883: Clinical applications for diagnosis of the heart
    • A61B 8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: Details of probe positioning involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467: Diagnostic devices characterised by special input means
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices involving processing of medical diagnostic data
    • A61B 8/523: Devices for generating planar views from image data in a user-selectable plane not corresponding to the acquisition plane
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection

Definitions

  • The present invention is based on the idea of acquiring ultrasound data of a patient by means of an ultrasound acquisition unit and of transforming the ultrasound data in the field of view as captured into ultrasound data in a virtual field of view corresponding to a position within the patient's body determined by the position determining unit.
  • The virtual field of view has a virtual viewing direction as seen from the position within the patient's body determined by the position determining unit, so that an internal view can be derived from the ultrasound data acquired by the ultrasound acquisition unit.
  • The position is a position of a catheter probe within the patient's body determined by the position determining unit.
  • The position determining unit is adapted to determine a position of a catheter probe within the patient's body as the position on the basis of which the virtual field of view is determined. This makes it possible to precisely determine a position of interest in the patient's body by means of a catheter, while the use of an expensive catheter ultrasound echo probe can be omitted.
  • The ultrasound data comprises a plurality of voxels, each including an ultrasound measurement value.
  • The transformation unit is adapted to transform the ultrasound measurement values of the voxels in the field of view to voxels of the virtual field of view.
  • The position within the patient's body can be determined in order to define the virtual field of view and thereby simulate the acquisition of ultrasound data by means of a catheter ultrasound echo probe.
  • The position on the basis of which the virtual field of view is determined can be defined by determining the position of a real catheter probe within the patient's body, e.g. by means of a tracking unit or within the ultrasound image or an X-ray image. Alternatively, the position can be determined on the basis of the anatomical context in the patient's body, by segmenting the ultrasound data and determining organs within the patient's body on the basis of the segmentation data, or by means of a combination of the catheter tracking and the anatomical context.
  • Fig. 1 shows a schematic illustration of an ultrasound imaging apparatus 10 according to one embodiment.
  • The ultrasound imaging apparatus 10 is applied to inspect a volume of an anatomical site, in particular an anatomical site of a patient 12.
  • The ultrasound imaging apparatus comprises an ultrasound acquisition unit 14, in particular an ultrasound probe 14, having at least one transducer array including a multitude of transducer elements for transmitting and receiving ultrasound waves.
  • The transducer elements are preferably arranged in a 2D array for providing 3D ultrasound image data.
  • The ultrasound acquisition unit 14 acquires ultrasound data in a field of view 16 within the patient's body and provides corresponding 3D ultrasound data.
  • The ultrasound imaging apparatus 10 in general comprises an image processing apparatus 18 for evaluating the ultrasound data received from the ultrasound acquisition unit 14 and for transforming the ultrasound data in the field of view 16 to a virtual field of view 20, as described in the following.
  • The image processing apparatus 18 further comprises a transformation unit 30 for transforming the ultrasound data in the field of view 16 to transformed ultrasound data in the virtual field of view.
  • The transformed ultrasound data is provided to a display unit 32 for displaying the transformed ultrasound data in the virtual field of view 20.
  • The transformation unit 30 receives the ultrasound data as a 3D array of voxels, each including an ultrasound measurement value, and transforms the voxels of the field of view 16 into voxels of the virtual field of view 20 in the virtual viewing direction 28, so that the transformed ultrasound data can be provided and displayed on the display unit 32 as if it had been acquired by an ultrasound probe located at the position 26 and directed in the virtual viewing direction 28 (a minimal resampling sketch is given after this list).
  • The ultrasound imaging apparatus 10 may further comprise a segmentation unit 36 connected to the image evaluation unit 22 and to the position determining unit 24, wherein the segmentation unit 36 provides segmentation data on the basis of the ultrasound data and determines anatomical features within the field of view 16.
  • The position determining unit 24 can identify, on the basis of the segmentation data, different anatomical features and/or organs within the field of view 16 and determine the virtual field of view 20 on the basis of the segmentation data. This makes it possible to automatically define the virtual field of view 20 in the direction of a certain anatomical feature to be examined, or such that it corresponds to the usual field of view of a catheter ultrasound probe during corresponding catheter examinations.
  • Fig. 3 shows an embodiment of the determination of the position 26 and the virtual viewing direction 28.
  • The segmentation unit 36 segments different organs in the ultrasound data 42 captured by the ultrasound acquisition unit 14 and provides segmentation data 44 of the different organs or anatomical features of the patient 12.
  • The position determining unit 24 determines the position 26 and the virtual viewing direction 28 on the basis of the segmentation data 44 and the correspondingly identified organs and/or anatomical features, so that the organs or anatomical features of interest lie within the virtual field of view 20, or virtual cone, and are correspondingly displayed in the transformed ultrasound data on the display unit 32 (see the direction-from-segmentation sketch after this list).
  • The organs and/or anatomical features of interest can thus be displayed automatically as if a catheter including an ultrasound echo probe were located at the position 26 and directed in the virtual viewing direction 28 to scan the respective organs and/or anatomical features.
  • The embodiments of Figs. 2 and 3 can be combined so that the position 26 and the virtual viewing direction 28 are determined on the basis of both the identified position of the catheter probe 40 and the segmentation data 44 provided by the segmentation unit 36.
  • The position 26 can be determined on the basis of the detected position of the catheter probe 40 and the virtual viewing direction 28 can be determined on the basis of the segmentation data 44, so that the relevant organs and/or anatomical features can be displayed automatically from the position of the catheter probe 40.
  • Fig. 4 shows ultrasound data in the field of view 16 and transformed ultrasound data in the virtual field of view 20 as transformed by the transformation unit 30.
  • Fig. 4a shows the ultrasound data 42 captured by the ultrasound acquisition unit 14 in the field of view 16, including the position 26, the virtual viewing direction 28 and the virtual field of view 20.
  • The ultrasound data 42 is transformed into the transformed ultrasound data 46 shown in Fig. 4b.
  • The transformed ultrasound data 46 is displayed in the virtual field of view 20, seen from the position 26 in the virtual viewing direction 28, as if it had been captured from the position 26 within the patient's body 12.
  • Fig. 5 shows a schematic block diagram of an ultrasound imaging method, generally denoted by 50, for providing ultrasound images of the patient 12.
  • The method 50 starts with acquiring 3D ultrasound data from the patient 12 by means of the ultrasound acquisition unit 14, as shown at step 52.
  • The ultrasound data 42 may be formed as a transthoracic echocardiogram (TTE) or as a transesophageal echocardiogram (TEE) of the patient 12.
  • The ultrasound data 42 can be provided to the position determining unit 24 as shown at step 54; additionally or alternatively, the ultrasound data 42 can be provided to the segmentation unit 36 as shown at 56.
  • The X-ray unit 34 acquires X-ray data as shown at 58 and provides the X-ray data to the position determining unit 24 as shown at 54.
  • A user input is provided by means of the input device 38 as shown at 72, and the position determining unit 24 is adapted to determine the position 26 on the basis of the user input as shown at 74 and the virtual viewing direction on the basis of the user input as shown at 76.
  • The transformation unit 30 transforms the ultrasound data 42 in the field of view 16 into the transformed ultrasound data 46 in the virtual field of view 20, as shown at 78, and provides the transformed ultrasound data 46 to the display unit 32 for displaying the transformed ultrasound data 46 in the virtual field of view 20 as if it had been acquired from the position 26 within the patient's body 12 (an end-to-end sketch of these steps follows this list).
  • The transformed ultrasound data 46 is provided to the display unit 32 for displaying the transformed ultrasound data as shown at 80.
  • A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
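
The voxel transformation performed by the transformation unit 30 can be illustrated with a short numerical sketch. The following Python/NumPy code is a minimal, hypothetical example, not the patent's implementation: it assumes the acquired ultrasound data 42 is available as a regular 3D voxel grid with known spacing and origin, and the function name, the cone parameterisation and the default opening angle, depth and sampling densities are illustrative assumptions. It resamples the acquired field of view 16 into a cone-shaped virtual field of view 20 anchored at the position 26 and oriented along the virtual viewing direction 28.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def transform_to_virtual_fov(volume, spacing, origin, position, direction,
                             opening_deg=60.0, depth_mm=80.0,
                             n_r=128, n_theta=96, n_phi=96):
    """Resample a 3D ultrasound volume into a cone-shaped virtual field of view.

    volume    -- (nx, ny, nz) array of ultrasound measurement values (one per voxel)
    spacing   -- voxel size in mm per axis, e.g. (0.5, 0.5, 0.5)
    origin    -- world coordinates (mm) of voxel (0, 0, 0)
    position  -- virtual probe position (the position 26), world coordinates in mm
    direction -- virtual viewing direction (the direction 28); any length, normalised here
    Returns an (n_r, n_theta, n_phi) array sampled over depth x lateral x elevation angle.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)

    # Orthonormal basis (u, v, d) spanning the virtual cone around the viewing direction.
    helper = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(d, u)

    half = np.deg2rad(opening_deg) / 2.0
    r = np.linspace(0.0, depth_mm, n_r)           # sampling depth along each ray
    theta = np.linspace(-half, half, n_theta)     # lateral opening angle
    phi = np.linspace(-half, half, n_phi)         # elevational opening angle
    R, T, P = np.meshgrid(r, theta, phi, indexing="ij")

    # Unit ray direction for every (theta, phi) pair, then world-space sample points.
    rays = ((np.cos(T) * np.cos(P))[..., None] * d
            + (np.sin(T) * np.cos(P))[..., None] * u
            + np.sin(P)[..., None] * v)
    points = np.asarray(position, dtype=float) + R[..., None] * rays

    # World coordinates -> fractional voxel indices of the acquired field of view 16.
    index = (points - np.asarray(origin, dtype=float)) / np.asarray(spacing, dtype=float)

    # Trilinear interpolation; samples outside the acquired data are filled with 0.
    return map_coordinates(volume, [index[..., 0], index[..., 1], index[..., 2]],
                           order=1, mode="constant", cval=0.0)
```

The returned array can then be rendered as if it had been acquired by an ultrasound probe located at the position 26 and directed along the virtual viewing direction 28.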
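
For the segmentation-based determination of the virtual viewing direction 28 (Fig. 3), one simple strategy is to point the view from the chosen position 26 toward the centre of mass of the segmented target structure. The sketch below assumes, hypothetically, that the segmentation data 44 is available as a binary voxel mask; the function name and the centroid heuristic are illustrative choices, not taken from the patent text.

```python
import numpy as np

def viewing_direction_from_segmentation(target_mask, spacing, origin, position):
    """Return a unit vector pointing from `position` toward the segmented target.

    target_mask -- (nx, ny, nz) boolean array, True where the organ of interest was segmented
    spacing     -- voxel size in mm per axis
    origin      -- world coordinates (mm) of voxel (0, 0, 0)
    position    -- virtual probe position (the position 26) in world coordinates (mm)
    """
    voxels = np.argwhere(target_mask)             # indices of all segmented voxels
    if voxels.size == 0:
        raise ValueError("segmentation mask is empty")
    # World-space centre of mass of the segmented organ or anatomical feature.
    centroid = (np.asarray(origin, dtype=float)
                + voxels.mean(axis=0) * np.asarray(spacing, dtype=float))
    direction = centroid - np.asarray(position, dtype=float)
    return direction / np.linalg.norm(direction)
```

When catheter tracking and segmentation are combined (Figs. 2 and 3), the tracked catheter tip can supply the position argument while this heuristic supplies the viewing direction.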
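
Finally, the steps of method 50 (acquisition 52, determination of the position 26 and viewing direction 28, transformation 78 and display 80) can be strung together roughly as follows. This driver builds on the two hypothetical helpers above; the synthetic test volume and the hard-coded catheter position stand in for real TTE/TEE data and a real tracking result, so it only illustrates the data flow.

```python
import numpy as np
import matplotlib.pyplot as plt

# Step 52: acquire 3D ultrasound data (here: a synthetic volume with one bright, blob-like "organ").
spacing, origin = np.array([1.0, 1.0, 1.0]), np.array([0.0, 0.0, 0.0])
x, y, z = np.mgrid[0:128, 0:128, 0:128]
volume = np.exp(-((x - 90) ** 2 + (y - 64) ** 2 + (z - 64) ** 2) / (2 * 12.0 ** 2))
target_mask = volume > 0.5                      # stand-in for the segmentation data 44

# Steps 54-76: position 26 (e.g. a tracked catheter tip) and virtual viewing direction 28.
position = np.array([30.0, 64.0, 64.0])         # hypothetical catheter tip, world mm
direction = viewing_direction_from_segmentation(target_mask, spacing, origin, position)

# Step 78: transform the acquired field of view 16 into the virtual field of view 20.
virtual = transform_to_virtual_fov(volume, spacing, origin, position, direction)

# Step 80: display one plane of the virtual field of view (depth versus lateral angle).
plt.imshow(virtual[:, :, virtual.shape[2] // 2], cmap="gray", aspect="auto")
plt.xlabel("lateral angle sample")
plt.ylabel("depth sample")
plt.title("Transformed ultrasound data in the virtual field of view (sketch)")
plt.show()
```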

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Cardiology (AREA)
  • Quality & Reliability (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
PCT/EP2015/070806 (priority 2014-09-18, filed 2015-09-11): Ultrasound imaging apparatus, WO2016041855A1 (en), Ceased

Priority Applications (3)

Application Number | Priority Date | Filing Date | Title
JP2017514650A (JP2017527401A, ja) | 2014-09-18 | 2015-09-11 | Ultrasound imaging apparatus
US15/510,103 (US20170251988A1, en) | 2014-09-18 | 2015-09-11 | Ultrasound imaging apparatus
EP15760481.0A (EP3193727A1, en) | 2014-09-18 | 2015-09-11 | Ultrasound imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14185262 2014-09-18
EP14185262.4 2014-09-18

Publications (1)

Publication Number Publication Date
WO2016041855A1 true WO2016041855A1 (en) 2016-03-24

Family

ID=51570323

Family Applications (1)

Application Number | Priority Date | Filing Date | Title | Status
PCT/EP2015/070806 (WO2016041855A1, en) | 2014-09-18 | 2015-09-11 | Ultrasound imaging apparatus | Ceased

Country Status (4)

Country | Link
US (1) | US20170251988A1 (en)
EP (1) | EP3193727A1 (en)
JP (1) | JP2017527401A (ja)
WO (1) | WO2016041855A1 (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9452276B2 (en) 2011-10-14 2016-09-27 Intuitive Surgical Operations, Inc. Catheter with removable vision probe
US9387048B2 (en) 2011-10-14 2016-07-12 Intuitive Surgical Operations, Inc. Catheter sensor systems
US20130303944A1 (en) 2012-05-14 2013-11-14 Intuitive Surgical Operations, Inc. Off-axis electromagnetic sensor
US10238837B2 (en) 2011-10-14 2019-03-26 Intuitive Surgical Operations, Inc. Catheters with control modes for interchangeable probes
US20140148673A1 (en) 2012-11-28 2014-05-29 Hansen Medical, Inc. Method of anchoring pullwire directly articulatable region in catheter
CN111166274B (zh) 2013-10-24 2025-01-28 奥瑞斯健康公司 机器人辅助腔内外科手术系统及相关方法
EP2923669B1 (en) 2014-03-24 2017-06-28 Hansen Medical, Inc. Systems and devices for catheter driving instinctiveness
US9737371B2 (en) 2014-09-30 2017-08-22 Auris Surgical Robotics, Inc. Configurable robotic surgical system with virtual rail and flexible endoscope
US10314463B2 (en) 2014-10-24 2019-06-11 Auris Health, Inc. Automated endoscope calibration
US10143526B2 (en) * 2015-11-30 2018-12-04 Auris Health, Inc. Robot-assisted driving systems and methods
US9931025B1 (en) 2016-09-30 2018-04-03 Auris Surgical Robotics, Inc. Automated calibration of endoscopes with pull wires
US10244926B2 (en) 2016-12-28 2019-04-02 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
WO2018208994A1 (en) 2017-05-12 2018-11-15 Auris Health, Inc. Biopsy apparatus and system
US10299870B2 (en) 2017-06-28 2019-05-28 Auris Health, Inc. Instrument insertion compensation
US10426559B2 (en) 2017-06-30 2019-10-01 Auris Health, Inc. Systems and methods for medical instrument compression compensation
WO2019033098A2 (en) * 2017-08-11 2019-02-14 Elucid Bioimaging Inc. QUANTITATIVE MEDICAL IMAGING REPORT
US10145747B1 (en) 2017-10-10 2018-12-04 Auris Health, Inc. Detection of undesirable forces on a surgical robotic arm
CN110831536B (zh) 2017-12-06 2021-09-07 奥瑞斯健康公司 用于针对非命令器械滚转进行校正的系统和方法
AU2018384820B2 (en) 2017-12-14 2024-07-04 Auris Health, Inc. System and method for estimating instrument location
US10765303B2 (en) 2018-02-13 2020-09-08 Auris Health, Inc. System and method for driving medical instrument
KR20250130720A (ko) 2018-09-28 2025-09-02 아우리스 헬스, 인코포레이티드 의료 기구를 도킹시키기 위한 시스템 및 방법
WO2020069404A1 (en) 2018-09-28 2020-04-02 Auris Health, Inc. Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures
EP3711673A1 (en) 2019-03-18 2020-09-23 Koninklijke Philips N.V. Methods and systems for adjusting the field of view of an ultrasound probe
WO2021137072A1 (en) 2019-12-31 2021-07-08 Auris Health, Inc. Anatomical feature identification and targeting
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
JP7497440B2 (ja) 2019-12-31 2024-06-10 オーリス ヘルス インコーポレイテッド 経皮的アクセスのための位置合わせインターフェース
US11737663B2 (en) 2020-03-30 2023-08-29 Auris Health, Inc. Target anatomical feature localization

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090268955A1 (en) * 2008-04-23 2009-10-29 Aditya Koolwal Systems, Methods and Devices for Correlating Reference Locations Using Image Data
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20130223702A1 (en) 2012-02-22 2013-08-29 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US20140187919A1 (en) * 2011-04-21 2014-07-03 Koninklijke Philips N.V. Mpr slice selection for visualization of catheter in three-dimensional ultrasound

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4709419B2 (ja) * 2001-04-24 2011-06-22 Toshiba Corporation Small-diameter probe type ultrasound diagnostic apparatus
JP4377646B2 (ja) * 2003-10-08 2009-12-02 Toshiba Corporation Diagnostic imaging apparatus, image display apparatus, and three-dimensional image display method
WO2006038188A2 (en) * 2004-10-07 2006-04-13 Koninklijke Philips Electronics N.V. Method and system for maintaining consistent anatomic views in displayed image data
US7713210B2 (en) * 2004-11-23 2010-05-11 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for localizing an ultrasound catheter
JP4653542B2 (ja) * 2005-04-06 2011-03-16 Toshiba Corporation Image processing apparatus
JP2012506283A (ja) * 2008-10-22 2012-03-15 Koninklijke Philips Electronics N.V. Three-dimensional ultrasound imaging
US8858436B2 (en) * 2008-11-12 2014-10-14 Sonosite, Inc. Systems and methods to identify interventional instruments
US8852103B2 (en) * 2011-10-17 2014-10-07 Butterfly Network, Inc. Transmissive imaging and related apparatus and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090268955A1 (en) * 2008-04-23 2009-10-29 Aditya Koolwal Systems, Methods and Devices for Correlating Reference Locations Using Image Data
US8270694B2 (en) 2008-04-23 2012-09-18 Aditya Koolwal Systems, methods and devices for correlating reference locations using image data
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20140187919A1 (en) * 2011-04-21 2014-07-03 Koninklijke Philips N.V. Mpr slice selection for visualization of catheter in three-dimensional ultrasound
US20130223702A1 (en) 2012-02-22 2013-08-29 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3193727A1 *

Also Published As

Publication number Publication date
JP2017527401A (ja) 2017-09-21
EP3193727A1 (en) 2017-07-26
US20170251988A1 (en) 2017-09-07

Similar Documents

Publication Publication Date Title
US20170251988A1 (en) Ultrasound imaging apparatus
US11100645B2 (en) Computer-aided diagnosis apparatus and computer-aided diagnosis method
JP6430498B2 (ja) System and method for mapping ultrasound shear wave elastography measurements
CN105407811B (zh) Method and system for 3D acquisition of ultrasound images
EP3393366B1 (en) Ultrasound imaging apparatus and ultrasound imaging method for inspecting a volume of subject
CN101681504A (zh) System and method for fusing real-time ultrasound images with pre-acquired medical images
US11844654B2 (en) Mid-procedure view change for ultrasound diagnostics
JP5797364B1 (ja) Ultrasound observation apparatus, method of operating the ultrasound observation apparatus, and operation program for the ultrasound observation apparatus
US10685451B2 (en) Method and apparatus for image registration
US20200305837A1 (en) System and method for guided ultrasound imaging
CN107106128B (zh) Ultrasound imaging apparatus and method for segmenting an anatomical target
KR102278893B1 (ko) Medical image processing apparatus and medical image registration method using the same
JP7497424B2 (ja) Ultrasound guidance dynamic mode switching
US20180214129A1 (en) Medical imaging apparatus
CN107527379A (zh) Medical image diagnostic apparatus and medical image processing apparatus
US20200245970A1 (en) Prescriptive guidance for ultrasound diagnostics
CN103919571B (zh) Ultrasound image segmentation
US8724878B2 (en) Ultrasound image segmentation
US20120078101A1 (en) Ultrasound system for displaying slice of object and method thereof
CN107690312A (zh) Ultrasound imaging apparatus
KR102185724B1 (ko) Method and apparatus for displaying, in a medical image, a point whose position has been corrected according to the caliper type used for measuring an object
CN116113363A (zh) Method for registering an ultrasound image to an anatomical map
Li et al. Real-time volumetric free-hand ultrasound imaging for large-sized organs: A study of imaging the whole spine
Maslebu et al. Using computer aided system to determine the maximum depth of visualization of B-Mode diagnostic ultrasound image
KR20160086126A (ko) Ultrasound diagnosis method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15760481

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15510103

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2017514650

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015760481

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015760481

Country of ref document: EP