WO2016064921A1 - Automatic detection of regions of interest in 3d space - Google Patents


Info

Publication number: WO2016064921A1
Application number: PCT/US2015/056522 (US2015056522W)
Authority: WO
Grant status: Application
Prior art keywords: fiducial, processor, region, image, wire
Other languages: French (fr)
Inventor: Yanhui GUO
Original Assignee: MedSight Tech Corp.


Classifications

    • G06K9/52: Extraction of features or characteristics of the image by deriving mathematical or geometrical properties from the whole image
    • G06T7/11: Region-based segmentation
    • G06T7/13: Edge detection
    • G06T7/194: Segmentation involving foreground-background segmentation
    • G06T7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06K2009/3225: Special marks for positioning
    • G06K2209/05: Recognition of patterns in medical or anatomical images
    • G06T2207/10064: Fluorescence image
    • G06T2207/10081: Computed x-ray tomography [CT]
    • G06T2207/10088: Magnetic resonance imaging [MRI]
    • G06T2207/10116: X-ray image
    • G06T2207/10132: Ultrasound image
    • G06T2207/30021: Catheter; Guide wire
    • G06T2207/30172: Centreline of tubular or elongated structure
    • G06T2207/30204: Marker

Abstract

A medical imaging system including an imaging device configured to scan a region of interest to acquire an image including a plurality of pixels of the region of interest, a display output, and a processing unit. The processing unit is configured to receive the image including the plurality of pixels of the region of interest, enhance the image, segment fiducial candidate regions in a binary volume, determine a candidate centerline for each of the fiducial candidate regions in the binary volume, identify a true fiducial region from the fiducial candidate regions based on at least one of curvature, dimensions, and intensity present in the fiducial candidate regions, recover a fiducial component from the true fiducial region, and send an estimated fiducial volume of the recovered fiducial component to the display output.

Description

AUTOMATIC DETECTION OF REGIONS OF INTEREST IN 3D

SPACE

CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Patent Application No. 62/066,231, titled "Automatic Detection of Regions of Interest in 3D Space," filed October 20, 2014, which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] Intraoperative guidance has recently become a growing area within surgery and medicine. Multiple technologies have been tested to support physicians (e.g., surgeons, interventional specialists, etc.) in the task of real-time localization of particular lesions. Some of these technologies include intraoperative mapping, sentinel lymph node biopsy, and ongoing studies related to the use of intraoperative fluorescence and other targets for guidance.

[0003] Proper guidance during procedures is of prime importance for any type of intervention. For instance, the accurate diagnosis of a breast lesion relies on an appropriate biopsy sample, which in turn relies on the accuracy of localization achieved. If the wrong tissue sample is obtained, pathological results will not show the correct pathology and the patient may be treated improperly. For this reason, many attempts to optimize surgical localization have been made, and currently the wire localization technique is by far the most favored one for breast conservation surgeries in early-stage lesions. This technique consists of introducing a wire through the skin, into the center of the lesion (possible cancer), using ultrasound guidance. Afterwards, the patient goes to a surgical suite with the wire in place. During the surgical procedure, the surgeon makes a skin incision and tries to follow the wire into the deeper structures and remove the area around the tip of the wire. Because the constant movement of the breast tissue forces a constantly changing, swirled position of the wire, and because the operator can only see the part of the wire that is outside the skin, about 30% of these procedures result in incomplete resections. All of these patients must undergo a repeat surgery, as well as other potential treatments, with a resultant higher cost of treatment and a worse overall prognosis. These unwanted results could be avoided if surgeons were provided with an improved technique for localization.

SUMMARY

[0004] One embodiment relates to a medical imaging system. The medical imaging system includes an imaging device configured to scan a region of interest to acquire an image including a plurality of pixels of the region of interest, a display output, and a processing unit. The processing unit is configured to receive the image including the plurality of pixels of the region of interest, enhance the image, segment fiducial candidate regions in a binary volume, determine a candidate centerline for each of the fiducial candidate regions in the binary volume, identify a true fiducial region from the fiducial candidate regions based on at least one of curvature, dimensions, and intensity present in the fiducial candidate regions, recover a fiducial component from the true fiducial region, and send an estimated fiducial volume of the recovered fiducial component to the display output.

[0005] Another embodiment relates to a method for use with a medical imaging system and automatic detection of objects. The method for use with a medical imaging system and automatic detection of objects includes loading an image including a plurality of pixels of a region of interest to a processor; enhancing the image with the processor; segmenting, by the processor, fiducial candidate regions in a binary volume; determining, by the processor, a candidate centerline for each of the fiducial candidate regions in the binary volume; identifying, by the processor, a true fiducial region from the fiducial candidate regions based on at least one of curvature, dimensions, and intensity present in the fiducial candidate regions; recovering, by the processor, a fiducial component from the true fiducial region; and displaying, by a display device, the image with an estimated fiducial volume of the recovered fiducial component.

[0006] Another embodiment relates to a method for use with a medical imaging system and automatic detection of a wire. The method for use with a medical imaging system and automatic detection of a wire includes loading an image including a plurality of pixels of a region of interest to a processor; segmenting, by the processor, wire candidate regions in a binary volume; determining, by the processor, a candidate centerline for each of the wire candidate regions in the binary volume; identifying, by the processor, a true wire region from the wire candidate regions; recovering, by the processor, the wire from the true wire region; and displaying, by a display device, the image with an estimated wire volume of the recovered wire. The processor provides a visualization of at least one of the wire component and the region of interest throughout a procedure in real-time.
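For illustration only, the receive, enhance, segment, and identify sequence described in these embodiments can be sketched in Python. This is a hypothetical simplification, not the claimed method: Gaussian smoothing stands in for the Hessian-based enhancement described later, the mean-plus-two-sigma threshold is an arbitrary stand-in for the adaptive threshold, and bounding-box elongation is a crude proxy for the curvature, dimension, and intensity criteria.

```python
import numpy as np
from scipy import ndimage

def detect_wire(volume, smooth_sigma=1.0):
    """Illustrative pipeline: enhance, segment, label, pick most wire-like region."""
    # 1. Enhance: Gaussian smoothing as a stand-in for the Hessian filter.
    enhanced = ndimage.gaussian_filter(volume.astype(float), smooth_sigma)
    # 2. Segment: a simple global threshold yields the binary candidate volume.
    threshold = enhanced.mean() + 2 * enhanced.std()
    binary = enhanced > threshold
    # 3. Label connected fiducial candidate regions.
    labels, n = ndimage.label(binary)
    # 4. Identify the true wire region: here, the most elongated component.
    best_label, best_elongation = 0, 0.0
    for lab in range(1, n + 1):
        coords = np.argwhere(labels == lab)
        extents = coords.max(axis=0) - coords.min(axis=0) + 1
        # Ratio of the longest bounding-box axis to the mean of the other two.
        elongation = extents.max() / max(np.sort(extents)[:-1].mean(), 1)
        if elongation > best_elongation:
            best_label, best_elongation = lab, elongation
    return labels == best_label

# Synthetic 40-voxel cube with a bright wire along one axis and a blob distractor.
vol = np.zeros((40, 40, 40))
vol[5:35, 20, 20] = 10.0      # wire-like structure
vol[10:14, 5:9, 5:9] = 10.0   # compact blob (false positive candidate)
mask = detect_wire(vol)
print(mask[20, 20, 20], mask[12, 7, 7])  # True False
```

The elongation heuristic is only a placeholder; the disclosure identifies the true fiducial via its centerline curvature, as detailed in regard to FIG. 3.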

[0007] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description. Other systems, methods, features and/or advantages will be or may become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features and/or advantages be included within this description and be protected by the accompanying claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] The components in the drawings are not necessarily to scale relative to each other. Like reference numerals designate corresponding parts throughout the several views.

[0009] FIG. 1 is a schematic block diagram of a computing device, according to an example embodiment.

[0010] FIG. 2A is a perspective view of a fiducial, according to an example embodiment.

[0011] FIG. 2B is an illustration of the fiducial of FIG. 2A implanted within tissue near a region of interest, according to an example embodiment.

[0012] FIG. 2C is an illustration of scanning tissue to locate the fiducial and the region of interest, according to an example embodiment.

[0013] FIG. 2D is an illustration of a three-dimensional reconstruction of the scanned tissue with the fiducial, according to an example embodiment.

[0014] FIG. 3 is a flow diagram illustrating example operations for performing a fiducial detection algorithm, according to an example embodiment.

DETAILED DESCRIPTION

[0015] In the following detailed description, reference is made to the accompanying drawings, which form a part thereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.

[0016] Referring to the Figures generally, a solution to achieve intraoperative guidance is disclosed, utilizing software that automatically recognizes, in ultrasound images, fiducials (e.g., wire, clips, internal/external markings, etc.) positioned at or around an area or region of interest. Once the area is recognized, the image may be reconstructed in three dimensions (3D) in a real-time fashion. Thereby, the user may have an optimal visualization of the area of interest throughout the procedure.

[0017] The image may provide a 2D or 3D visualization of an object, for example. Example imaging modalities that provide 2D or 3D visualizations include, but are not limited to, ultrasound imaging, photoacoustic imaging, magnetic resonance imaging (MRI), computed tomography (CT) imaging, fluoroscopic imaging, x-ray imaging, fluorescence imaging, and nuclear scan imaging. In the examples provided below, the image is a 3D ultrasound image, for example, acquired by a 3D ultrasound system. 3D ultrasound systems are portable, relatively inexpensive, and do not subject a patient to ionizing radiation, which provides advantages over CT scans (radiation exposure) and MRIs (relatively large systems and long reconstruction times) for real-time image guidance. However, this disclosure contemplates using images acquired by any imaging modality that provides a 2D or 3D visualization, as mentioned above. Additionally, the object of interest may be a lesion region of interest such as a tumor, lesion, organ, or other tissue of interest, for example. In the examples provided below, the image is an image of breast tissue of a subject and the object is a tumor. However, this disclosure contemplates using images of other tissue and objects other than tumors. The subject or patient described herein may be a human or non-human mammal of any age.

[0018] When the logical operations described herein are implemented in software, the process may execute on any type of computing architecture or platform. For example, referring to FIG. 1, an example computing device upon which embodiments of the invention may be implemented is illustrated. In particular, at least one processing device described herein may be a computing device, such as computing device 10 shown in FIG. 1. The computing device 10 may include a bus or other communication mechanism for communicating information among various components of the computing device 10. In its most basic configuration, computing device 10 typically includes at least one processing unit 16 and a system memory 14. Depending on the exact configuration and type of computing device, the system memory 14 may be volatile (such as random access memory (RAM)), non-volatile (such as read-only memory (ROM), flash memory, etc.), or some combination of the two. This most basic configuration is illustrated in FIG. 1 by dashed line 12. The processing unit 16 may be a standard programmable processor that performs arithmetic and logic operations necessary for operation of the computing device 10.

[0019] The computing device 10 may have additional features/functionality. For example, the computing device 10 may include additional storage such as a removable storage 18 and a non-removable storage 20 including, but not limited to, magnetic or optical disks or tapes. The computing device 10 may also contain network connection(s) 26 that allow the device to communicate with other devices. The computing device 10 may also have input device(s) 24 such as a keyboard, mouse, touch screen, imaging devices (e.g., ultrasound, MRI, etc.), and the like. Output device(s) 22 such as a display, speakers, printer, etc. may also be included. The additional devices may be connected to the bus in order to facilitate communication of data among the components of the computing device 10.

[0020] The processing unit 16 may be configured to execute program code encoded in tangible, computer-readable media. Computer-readable media refers to any media that is capable of providing data that causes the computing device 10 (i.e., a machine) to operate in a particular fashion. Various computer-readable media may be utilized to provide instructions to the processing unit 16 for execution. Common forms of computer-readable media include, for example, magnetic media, optical media, physical media, memory chips or cartridges, a carrier wave, or any other medium from which a computer can read. Example computer-readable media may include, but are not limited to, volatile media, non-volatile media, and transmission media. Volatile and non-volatile media may be implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data, and common forms are discussed in detail below. Transmission media may include coaxial cables, copper wires, and/or fiber optic cables, as well as acoustic or light waves, such as those generated during radio-wave and infra-red data communication. Example tangible, computer-readable recording media include, but are not limited to, an integrated circuit (e.g., field-programmable gate array or application-specific IC), a hard disk, an optical disk, a magneto-optical disk, a floppy disk, a magnetic tape, a holographic storage medium, a solid-state device, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices.

[0021] In an example implementation, the processing unit 16 may execute program code stored in the system memory 14. For example, the bus may carry data to the system memory 14, from which the processing unit 16 receives and executes instructions. The data received by the system memory 14 may optionally be stored on the removable storage 18 or the non-removable storage 20 before or after execution by the processing unit 16.

[0022] The computing device 10 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computing device 10 and includes both volatile and non-volatile media, removable and non-removable media. Computer storage media include volatile and non-volatile, and removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. The system memory 14, the removable storage 18, and the non-removable storage 20 are all examples of computer storage media. Computer storage media include, but are not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing device 10. Any such computer storage media may be part of the computing device 10.

[0023] It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination thereof. Thus, the methods and apparatuses of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computing device, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the presently disclosed subject matter, e.g., through the use of an application programming interface (API), reusable controls, or the like. Such programs may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.

[0024] As shown in FIG. 1, the system memory 14 includes various modules for completing the activities described herein. More particularly, the system memory 14 includes an imaging module 30, a fiducial module 32, a critical structure module 34, a display module 36, and a user interface module 38. The modules of the system memory 14 may be configured to facilitate automatic detection of fiducials (e.g., a wire, a clip, etc.) surgically implanted or disposed within tissue of a patient. While various modules with particular functionality are shown in FIG. 1, it should be understood that the computing device 10 and the system memory 14 may include any number of modules for completing the functions described herein. For example, the activities of multiple modules may be combined as a single module, additional modules with additional functionality may be included, etc. Further, it should be understood that the computing device 10 may further control other activity beyond the scope of the present disclosure.

[0025] According to the exemplary embodiment shown in FIGS. 2A-2C, a fiducial 50 may be disposed internally within tissue 42 (e.g., breast, abdomen, etc.) of a subject, shown as patient 60. As shown in FIGS. 2A-2C, the fiducial 50 includes a t-shaped wire. In other embodiments, the fiducial 50 includes a differently shaped wire, a clip, a needle, an internal marking, and/or an external marking. As shown in FIGS. 2B-2C, the fiducial 50 is disposed internally within the tissue 42 of the patient 60 (i.e., invisible to the eye). In other embodiments, the fiducial 50 at least partially protrudes out of the tissue 42 (i.e., is at least partially visible to the eye).

[0026] As shown in FIGS. 2B-2C, the fiducial 50 may be positioned at least partially within the tissue 42 of the patient 60 by a surgeon, doctor, radiologist, and/or another specialist such that at least a tip 52 of the fiducial 50 is positioned near and/or within a region of interest (ROI) 40 of the tissue 42. The fiducial 50 may also be positioned near or proximate critical structures 44 which may include internal structures such as veins, arteries, and/or the like. The surgeon, doctor, radiologist, and/or other specialist that placed the fiducial 50 within the tissue 42 may make notes within the computing device 10 regarding the placement of the fiducial 50 in relation to the critical structures 44 (e.g., to inform future users, surgeons, etc. of the critical structures 44, etc.).

[0027] The fiducial 50 may be subsequently detected using an imaging device (e.g., ultrasound, etc.) to locate the ROI 40 and/or for other monitoring purposes. The ROI 40 may include an organ, a vein, an artery, tissue, a lesion, a tumor, and/or another object of interest or concern within the tissue 42. According to the exemplary embodiment shown in FIG. 2C, the computing device 10 is communicably coupled to the input device 24 which may include the imaging device (e.g., ultrasound, etc.). The imaging device may be used to scan the tissue 42 of the patient 60 to acquire image data including a plurality of pixels regarding at least one of the tissue 42, the fiducial 50, the critical structure(s) 44, and the ROI 40.

[0028] Referring back to FIG. 1, the imaging module 30 may include or be operatively and communicably coupled to the input device 24, such as an imaging device (e.g., ultrasound, MRI, etc.). The imaging module 30 may be configured to receive and interpret the imaging data acquired by the imaging device. As described above, the imaging data may be indicative of the tissue 42, the fiducial 50, the critical structures 44, and/or the ROI 40. The imaging module 30 may be further structured to store the imaging data for future use by other modules (e.g., the fiducial module 32, the critical structure module 34, etc.). According to one embodiment, the imaging module 30 may include communication circuitry configured to facilitate the exchange of information, data, values, non-transient signals, etc. between and among the imaging module 30, the imaging device of the input device 24, and one or more other modules of the system memory 14. In this regard, the imaging module 30 may include communication circuitry including, but not limited to, wired and wireless communication protocol to facilitate reception of the imaging data.

[0029] The fiducial module 32 may be configured to store models (e.g., three-dimensional models, three-dimensional representations, two-dimensional models, etc.) of various fiducials 50 that may be used with the imaging device to locate a ROI 40. The models of the fiducials 50 may be pre-stored within the fiducial module 32 and/or uploaded on an as-needed basis by a user. The models of the fiducials 50 provide dimensional and/or structural characteristics (e.g., height, width, depth, shape, etc.) of the fiducials 50. The fiducial module 32 may be further configured to automatically detect a fiducial 50 (e.g., one with a corresponding model stored within the fiducial module 32, etc.) within the tissue 42 from the imaging data acquired by the imaging module 30. In one embodiment, the fiducial module 32 receives a command to search through the imaging data for a specific fiducial 50. For example, the user interface module 38 may receive a command from a user to identify a desired fiducial 50 within the tissue 42. Thus, the fiducial module 32 may be configured to analyze the imaging data to locate and identify the desired fiducial 50. In other embodiments, the fiducial module 32 is configured to detect and identify any fiducial 50 that has a corresponding model stored within the fiducial module 32 (e.g., in real-time, etc.). In some embodiments, the fiducial module 32 is capable of identifying multiple fiducials 50 simultaneously. The process through which the fiducial module 32 performs the automatic detection of the fiducial(s) 50 is described more fully herein in regard to method 100 of FIG. 3.
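As a rough sketch of how stored model dimensions might be compared against a measured candidate, consider the following hypothetical example. The model names, dimensions, and tolerance below are invented for illustration and do not appear in the disclosure:

```python
# Hypothetical stored fiducial models with dimensional characteristics.
FIDUCIAL_MODELS = {
    "t_wire": {"length_mm": 50.0, "diameter_mm": 0.9},
    "clip":   {"length_mm": 3.0,  "diameter_mm": 1.0},
}

def match_fiducial(measured, tolerance=0.2):
    """Return the stored model whose dimensions best match the measured
    candidate, or None if nothing is within the relative tolerance."""
    best_name, best_err = None, tolerance
    for name, model in FIDUCIAL_MODELS.items():
        # Worst-case relative error across all stored dimensions.
        err = max(abs(measured[k] - v) / v for k, v in model.items())
        if err <= best_err:
            best_name, best_err = name, err
    return best_name

print(match_fiducial({"length_mm": 48.0, "diameter_mm": 0.85}))  # t_wire
print(match_fiducial({"length_mm": 20.0, "diameter_mm": 5.0}))   # None
```

In practice the module would derive the measured dimensions from the segmented candidate regions rather than receive them directly; this sketch only shows the comparison step.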

[0030] The critical structure module 34 may be configured to receive, identify, and/or store information regarding critical structures 44 near and/or proximate a fiducial 50. The information regarding the critical structures 44 may be inputted by a user via the input device 24 (e.g., a keyboard, etc.) and received by the user interface module 38. In some embodiments, the critical structure module 34 is configured to associate the critical structures 44 with the fiducials 50 such that once a fiducial 50 is detected, the critical structures 44 near and/or proximate the fiducial 50 are identified and/or characterized. The information may include instructions and/or positional characteristics for a current and/or a subsequent user (e.g., such as to avoid a certain area to reduce the risk of affecting the critical structure 44, a distance from the fiducial 50 that the critical structure 44 is located, etc.). In some embodiments, the critical structure module 34 is configured to identify and characterize the critical structures 44 in real-time by analyzing the imaging data (e.g., irrespective of the fiducials 50, etc.).

[0031] The display module 36 may include or be operatively and communicably coupled to the output device 22, such as a display or monitor. The display module 36 may be configured to provide a two-dimensional and/or a three-dimensional representation (e.g., on the output device 22, a monitor, etc.) of at least one of the tissue 42, the ROI 40, the fiducial 50, and/or the critical structures 44 in real-time. According to the exemplary embodiment shown in FIG. 2D, the display module 36 provides a three-dimensional representation 70 to a user of the computing device 10. According to an exemplary embodiment, the display module 36 is configured to display the surrounding tissue as transparent or at least partially transparent such that the depth and location of the fiducial 50 can be appreciated by the user. According to one embodiment, the display module 36 may include communication circuitry configured to facilitate the exchange of information, data, values, non-transient signals, etc. between and among the display module 36, the output device 22, and one or more other modules of the system memory 14. In this regard, the display module 36 may include communication circuitry including, but not limited to, wired and wireless communication protocols to facilitate displaying the 3D image and fiducial 50 (e.g., in real-time, etc.).

[0032] In addition to automatically detecting and providing a visual representation of the fiducial 50 to the user, the display module 36 may further be configured to provide other pertinent information to the user. For example, the display module 36 may provide information regarding the critical structures 44 associated with (e.g., positioned near as indicated by a prior user, etc.) the fiducial 50. The information may be used by the current user to preserve (e.g., not affect, etc.) the critical structure 44 during a surgical operation (e.g., prevent a surgeon from cutting a vein, etc.). In some embodiments, the display module 36 provides a visual aid for the current user regarding the critical structure 44. The visual aid may include a three-dimensional depiction of the critical structure 44, such as visual depiction 46. The display module 36 may alternatively provide a three- dimensional representation of the critical structure 44 in real time. The visual

representation of the critical structure 44 may provide the user (e.g., surgeon, etc.) with additional knowledge (e.g., depth perception, relative location, etc.) regarding the position of the critical structure 44 relative to the fiducial 50 (e.g., providing increased accuracy, to preserve the critical structure 44, etc.). All of the information provided by the display module 36 may be displayed in real-time, increasing efficiency, accuracy, and the like during monitoring and/or a surgical procedure.

[0033] According to an exemplary embodiment, automatically detecting the fiducials 50 and providing a visual representation in real-time (e.g., during a surgical procedure, etc.) allows surgeons to be guided appropriately (e.g., to perform the operation or surgery, etc.).

The user interface module 38 may be further configured to allow a user to manipulate the three-dimensional representation 70. For example, a user may be able to turn, rotate, zoom, etc. the three-dimensional representation 70. Thus, a surgeon no longer has to solely try to follow the fiducial 50 into the deeper structures of the tissue 42 to locate and/or remove the ROI 40 around the tip 52 of the fiducial 50, as he/she will be at least partially guided by the visual display (e.g., the three-dimensional representation 70, etc.).

Further, the effects of the movement of the tissue 42, which forces a constantly changing, swirled position of the fiducial 50, and the fact that the operator may only see the part of the fiducial 50 that is outside the skin may no longer negatively affect these procedures (e.g., reducing the number of incomplete resections, preventing repeat surgeries, increasing efficiency, decreasing costs, etc.).

[0034] Referring to FIG. 3, a method 100 for automatic detection of a fiducial is shown, according to an example embodiment. In one example embodiment, method 100 may be implemented with the computing device 10 of FIG. 1. Accordingly, method 100 may be described in regard to FIG. 1.

[0035] As a brief overview, method 100 describes a fiducial (e.g., wire, clip, internal marking, external marking, etc.) location detection approach based on a neutrosophic centerline finding algorithm in a 3D image (e.g., a breast ultrasound (BUS) image, etc.). At first, a Hessian filter is employed to enhance the fiducial-like structures in the image. Then an adaptive threshold value is selected automatically to obtain the region in which the fiducial candidate is located. The image is transformed into a binary volume using the adaptive threshold value, and regarded as the candidate fiducial volume. The volume may have a substantial amount of noise pixels and false positive regions, such as muscle, glands, lesions, and other fiducial-like tissues. The noise and false positive regions may make it difficult to find an accurate centerline for each object in the volume. Then, a distance map of the volume is obtained using a distance transform. A ridge point is defined using the values on the distance map, which can be regarded as the candidate pixels on each component's candidate centerline. A neutrosophic set is employed to define a cost function used in a path finding algorithm. The defined cost matrix is robust to noise and can find an accurate candidate centerline for each object in the volume. An optimization algorithm is then utilized to find the path with a minimum cost function value as the true centerline of each component in the volume. Finally, the true fiducial component is identified according to its curvature value.

[0036] In one embodiment, the fiducial detection algorithm is performed on a 3D BUS image and the fiducial component is a wire. Thereby, method 100 may be described in regard to the fiducial detection algorithm being performed on a wire fiducial component within a 3D BUS image. At 102, a 3D digital imaging and communications in medicine (DICOM) file of a breast ultrasound (BUS) image is loaded to the computing device 10. The BUS image includes a plurality of pixels that define the BUS image provided by an ultrasound imaging device (e.g., an input device 24, ultrasound imaging modality, etc.). The BUS image may be of an area or region of interest (e.g., tissue, tumor, lesion, organ, fiducial, etc.). Prior to loading the image of the region of interest obtained by the ultrasound device, a fiducial or fiducial marker (e.g., wire, clip, external marking, etc.) is positioned at or around an area or region of interest by trained personnel (e.g., doctor, radiologist, etc.). Thereby, the BUS image may include the fiducial, positioned for automatic localization purposes, as described more fully herein. In other embodiments, the modality in which the image is provided to the computing device 10 may be from a different imaging modality (e.g., not provided by ultrasound imaging, etc.). For example, the image may be provided by photoacoustic imaging, magnetic resonance imaging (MRI), computed tomography (CT) imaging, fluoroscopic imaging, x-ray imaging, fluorescence imaging and/or nuclear scan imaging. In other embodiments, the DICOM file may include an image of a different region of interest (ROI) and/or object (e.g., not breast tissue, breast tumor, etc.). This disclosure contemplates scanning other tissue and regions of interest other than tumors (e.g., organs, tissue, lesions, etc.) of humans (e.g., male, female, etc.) and non-human mammals of any age (e.g., adult, adolescent, etc.).

[0037] Following the loading of the BUS image, via computing device 10, a fully automatic (i.e., requiring no human interaction, etc.) fiducial location algorithm is applied on the BUS image. At 104, the BUS volume (i.e., 3D BUS image, etc.), VBUS, is enhanced. First, a Hessian matrix is calculated on the plurality of pixels of the BUS image. The Hessian matrix quantifies a local geometry (e.g., curvature, orientations, etc.) between the plurality of pixels in the 3D image (i.e., BUS image, etc.). In order to enhance the regions of the BUS image with a vessel structure (e.g., a wire-like structure, etc.), an assessment of local orientations via eigenvalue analysis of the Hessian matrix is performed. Thereby, the eigenvalues and eigenvectors of the Hessian matrix are analyzed. A filtering process, for vessel enhancement, searches for geometrical structures which may be regarded as tubular (e.g., wire-like, such as a wire fiducial marker, etc.). The idea of the eigenvalue analysis of the Hessian matrix is to extract the principal directions in which the local second order structures of the image may be decomposed. The eigenvectors point in the directions in which the second order image information takes extremal values, while the eigenvalues give these extremal values. In the case of a bright wire (e.g., an ideal tubular structure in a 3D image, fiducial, etc.) on a dark breast tissue or tumor background, and with the ordering of eigenvalues |λ1| ≤ |λ2| ≤ |λ3|, the direction along the wire is given by the eigenvector v1 when |λ1| ≈ 0 and |λ1| ≪ |λ2| ≈ |λ3|. The direction along the wire is used to determine the length of the wire with respect to the width of the BUS image, which is described more fully herein.
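As a concrete illustration of the eigenvalue analysis described above, the following is a minimal numpy sketch, not the patented implementation: it builds a per-voxel Hessian with finite differences and sorts the eigenvalues by magnitude. A production Frangi-style filter would first smooth with a Gaussian at each scale; the function name and toy volume are illustrative only.

```python
import numpy as np

def hessian_eigenvalues(vol):
    """Per-voxel Hessian eigenvalues of a 3D volume, sorted so |l1| <= |l2| <= |l3|.

    Second derivatives are approximated with central differences; Gaussian
    smoothing at multiple scales (as in a Frangi filter) is omitted here.
    """
    grads = np.gradient(vol)                      # first derivatives along each axis
    H = np.empty(vol.shape + (3, 3))
    for i, g in enumerate(grads):
        for j, gg in enumerate(np.gradient(g)):
            H[..., i, j] = gg                     # H[i, j] = d2 V / dx_i dx_j
    lam = np.linalg.eigvalsh(H)                   # eigenvalues in ascending order
    order = np.argsort(np.abs(lam), axis=-1)      # re-sort by magnitude
    return np.take_along_axis(lam, order, axis=-1)

# A bright line along axis 0: |l1| is ~0 along the line, l2 and l3 are negative
# at its core, matching the bright-tube-on-dark-background case in the text.
vol = np.zeros((9, 9, 9))
vol[:, 4, 4] = 1.0
lam = hessian_eigenvalues(vol)
print(lam[4, 4, 4])   # l1 near 0; l2, l3 negative for a bright tube
```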

[0038] When filtering the 3D BUS image, it is important to differentiate between the various structures within the image. The various structures may include at least one of a line-like structure, a plate-like structure, and a blob-like structure. A first ratio

A = |λ2| / |λ3| (1)
is essential for distinguishing between plate-like structures and line-like structures. A second ratio
B = |λ1| / √(|λ2| · |λ3|) (2)
accounts for blob-like structures, but cannot distinguish between a line-like and plate-like structure, unlike the first ratio in Eqn. (1). The two geometric ratios (i.e., Eqns. (1) and (2)) are grey-level invariant (i.e., they remain constant under intensity re-scalings). This ensures that only the geometric information of the 3D BUS image is captured. However, the 3D BUS image includes additional information (e.g., information other than geometric information, etc.). The foreground (e.g., a fiducial such as a wire structure, a clip, an external marking, etc.) is brighter than the background (e.g., tissue, tumor, etc.) and occupies a substantially smaller volume compared to the background. Thereby, the following expression
S = √(λ1² + λ2² + λ3²) (3)
differentiates between the foreground (e.g., wire, etc.) and the background (e.g., tissue, organ, tumor, lesion, etc.). A vesselness function may be used to calculate a multi-scale vesselness measure, i.e., each voxel in the output volume (e.g., BUS volume, 3D BUS image, etc.) indicates how similar the local structure is to a tube (e.g., the wire, a fiducial component, etc.). The formulation of the vesselness measure is as follows:

V(s) = 0, if λ2 > 0 or λ3 > 0
V(s) = (1 - exp(-A²/(2α²))) · exp(-B²/(2β²)) · (1 - exp(-S²/(2γ²))), otherwise (4)
where parameters α, β, and γ are weighting factors for determining the influence of A, B, and S. The vesselness function (i.e., enhancement function, etc.), as shown in Eqn. (4), consists of exponential functions. The exponential functions become advantageous when modifying the function to satisfy certain constraints imposed on the diffusion tensor to generate a scale representation. A wire-like response is calculated at multiple scales by computing the Hessian matrix with Gaussian derivatives at each of the multiple scales. The pixels with the highest response output are regarded as the wire-like structure pixels (e.g., wire candidate regions, etc.). Thereby, the BUS volume, VBUS, (e.g., 3D BUS image, etc.) is enhanced and the fiducial candidate regions (e.g., wire candidate regions, etc.) become the features of focus (e.g., feature value Vft, etc.).
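The vesselness measure of Eqns. (1)-(4) can be sketched for a single eigenvalue triple as follows. This is a hedged illustration: the default values for α, β, and γ are common choices in Frangi-style filters, not values given in this disclosure.

```python
import numpy as np

def vesselness(l1, l2, l3, alpha=0.5, beta=0.5, gamma=15.0):
    """Vesselness per Eqn. (4), for eigenvalues sorted |l1| <= |l2| <= |l3|.

    A bright tube on a dark background requires l2, l3 < 0; the weighting
    factors alpha, beta, gamma are illustrative defaults.
    """
    if l2 > 0 or l3 > 0:                              # not a bright tubular structure
        return 0.0
    A = abs(l2) / abs(l3)                             # plate vs. line, Eqn. (1)
    B = abs(l1) / np.sqrt(abs(l2) * abs(l3))          # blob-ness, Eqn. (2)
    S = np.sqrt(l1**2 + l2**2 + l3**2)                # structureness, Eqn. (3)
    return ((1 - np.exp(-A**2 / (2 * alpha**2)))
            * np.exp(-B**2 / (2 * beta**2))
            * (1 - np.exp(-S**2 / (2 * gamma**2))))

print(vesselness(0.0, -20.0, -20.0))   # ideal bright tube: high response
print(vesselness(0.0, -0.1, -20.0))    # plate-like structure: low response
```

Voxels whose response exceeds that of competing structures would be kept as the wire candidate regions.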

[0039] At 106, an adaptive thresholding method is employed to segment the fiducial candidate regions from the enhanced image. An adaptive threshold value is selected and refined iteratively to obtain an optimized adaptive threshold value. The optimized adaptive threshold value is determined based on the component analysis results (e.g., enhancement of the 3D BUS image, etc.) and prior knowledge (e.g., length, diameter, etc.) of the fiducial component (e.g., wire, etc.). With the optimized adaptive threshold value, the fiducial candidate regions are segmented in a binary volume, Vb. For example, a feature value is determined according to the Hessian matrix eigenvalues and pixel intensities as

Vft(p) = VH(p) · In(p) (5)

where Vft is the feature value on the BUS volume VBUS, VH(p) is the Hessian-based enhancement (vesselness) value, In(p) is the intensity of the pixels in the volume, and p is a pixel in the volume. An initial adaptive threshold value is determined according to the feature values determined from Eqn. (5) as

Th(0) = (1 / N) Σp Vft(p) (6)
where Th(0) is the initial adaptive threshold value and N is the number of pixels in the volume. Using the initial adaptive threshold value, the 3D BUS image is transformed into a binary image

Vb(0)(p) = 1 if Vft(p) ≥ Th(0); 0 otherwise (7)

where Vb(0)(p) is a 3D binary volume determined from the initial adaptive threshold value determined in Eqn. (6). A 3D connected component analysis is used to obtain different disconnected object regions in the 3D binary volume. Each component's majority axis is calculated as the length of the fiducial component (e.g., the wire's length, etc.), and the average value of the majority axis (e.g., longitudinal axis, etc.) lengths is used as an indicator to test if the fiducial component (e.g., wire, etc.) has been segmented entirely. To determine the average majority axis length over all components, the following expression may be used:

Ind(t) = (1 / Nc(t)) Σ(i=1 to Nc(t)) Mc,i(t) (8)

where Mc,i(t) is the majority axis length of the ith component at the tth iteration, Nc(t) is the number of components at the tth iteration, and Ind(t) is the average majority axis length over all 3D components at the tth iteration. According to the prior knowledge about the length of the wire (i.e., the fiducial component, etc.), the wire's length is larger than (e.g., at least, etc.) one-fourth of the width of the BUS image. Therefore, any component with a length of less than one-fourth the width of the BUS image does not meet a length threshold requirement and is not regarded as a fiducial candidate region going forward (i.e., is not segmented from the BUS image, etc.). For example, an object other than a wire (e.g., muscle, infection, etc.) may create the illusion of a wire-like structure. However, if the length of the object is less than the length threshold (i.e., one-fourth the width of the BUS image, etc.), the object is not segmented. In other embodiments, the length threshold may be either greater than or less than one-fourth (e.g., one-eighth, one-half, etc.) the width of the BUS image.

[0040] Initially, the adaptive threshold value is selected substantially high. When the adaptive threshold value is substantially high, the fiducial (e.g., wire, etc.) candidate regions are not segmented entirely. In order to extract the fiducial component entirely, the adaptive threshold value is reduced and the component analysis is performed again with a new adaptive threshold value, as shown in Eqns. (9) and (10):

Th(t) = Th(t-1) - Δ (9)

Vb(t)(p) = 1 if Vft(p) ≥ Th(t); 0 otherwise (10)
where Th(t) is the adaptive threshold value at the tth iteration (e.g., second, third, tenth, etc.), Th(t-1) is the previously determined adaptive threshold value (e.g., first, second, ninth, etc.), Δ is a reduction value applied to the prior adaptive threshold value (i.e., the adaptive threshold value at t-1, etc.), and Vb(t)(p) is the 3D binary volume determined from the current adaptive threshold value. The reduction value may be a preset constant reduction value, a function (e.g., exponential, logarithmic, etc.), or adaptive to alter the rate of reduction for each iteration. This procedure is performed iteratively until a determination criterion is satisfied. The determination criterion is satisfied when the average value of the majority axis of all fiducial candidate regions is greater than one-fourth of the width of the BUS image (e.g., the length threshold, etc.). When satisfied, the threshold selection procedure is terminated and the optimized adaptive threshold value Th* is obtained from the last iteration (i.e., Th(t)). Finally, the thresholding method results in the segmentation of the components regarded as the fiducial candidate regions (e.g., wire candidate regions, etc.) in the binary volume.
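The iterative threshold selection of Eqns. (6)-(10) can be sketched as follows, assuming scipy's ndimage package for the 3D connected component analysis. The starting threshold, the constant reduction value delta, and the bounding-box approximation of the majority axis are illustrative choices, not the exact procedure of this disclosure.

```python
import numpy as np
from scipy import ndimage

def select_threshold(v_ft, width, delta=0.05):
    """Iteratively lower the threshold (Eqns. (9)-(10)) until the average
    majority-axis length of the components exceeds width / 4 (Eqn. (8)).

    The majority axis is approximated by each component's largest
    bounding-box extent; delta is an illustrative constant reduction value.
    """
    th = v_ft.mean() + 2.0 * v_ft.std()          # start substantially high
    while th > v_ft.min():
        vb = v_ft >= th                          # binary volume, Eqn. (10)
        labels, n = ndimage.label(vb)            # 3D connected components
        if n > 0:
            extents = [max(s.stop - s.start for s in sl)
                       for sl in ndimage.find_objects(labels)]
            if np.mean(extents) > width / 4:     # determination criterion
                return th, vb
        th -= delta                              # Eqn. (9): reduce and retry
    return th, v_ft >= th

# Toy volume: one long bright "wire" plus an isolated bright speck.
v = np.zeros((8, 8, 16))
v[4, 4, 2:14] = 1.0        # wire-like component, length 12 > 16 / 4
v[1, 1, 1] = 1.0           # noise speck
th, vb = select_threshold(v, width=16)
print(int(vb.sum()))       # 13 voxels segmented once the criterion is met
```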

[0041] At 108, the fully automatic fiducial location algorithm is used on the BUS image to determine a candidate centerline for each fiducial candidate region (e.g., the wire candidate regions, etc.) in the binary volume. The candidate centerline is a single pixel thick (e.g., a skeleton line, etc.) and tracks the middle (i.e., center) of a component (e.g., candidate region, etc.), preserving the topology (e.g., geometric properties, spatial relations, etc.) of the component. The fiducial candidate regions may include a true fiducial region (e.g., a true wire region, etc.) and a false positive fiducial region (e.g., a false positive wire region, etc.), such as muscle lines (e.g., the boundary between skin and subcutaneous tissue, etc.). In order to obtain the true fiducial region (e.g., the true wire region, etc.) accurately, a neutrosophic centerline algorithm is used. As a brief overview of the neutrosophic centerline algorithm, a distance map is calculated on the binary volume that includes the fiducial candidate regions. Ridge points, used as centerline candidate points, are detected and defined in the distance map. A cost function is defined using a neutrosophic set, and a cost function matrix is constructed for each candidate point. A path finding algorithm is employed to identify the optimum path with the least cost value between two points, thereby identifying an optimized centerline of the fiducial candidate regions of the binary volume.

[0042] A distance map (DM) may be used as a representation of a binary 2D/3D image, in which each pixel of the image is assigned its distance to the closest boundary (e.g., edge of a candidate region, etc.). A Euclidean distance metric may be used to measure a pixel's distance, and a distance map of a volume, DMVb, is obtained by applying a distance map transform on the volume as shown by

DMVb = DT(Vb) (11)

where DT(Vb) is a distance map transform function which evaluates the distance of each pixel of the binary volume, Vb, to the closest boundary and DMVb is the distance map of the binary volume. The distance map transform is a fundamental geometrical operator with great applicability in object shape analysis and computational geometry.
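The distance map transform of Eqn. (11) can be illustrated with scipy's Euclidean distance transform, which stands in here for the DT() operator; the 7x7x7 toy volume is illustrative.

```python
import numpy as np
from scipy import ndimage

# Eqn. (11): each foreground voxel is assigned its Euclidean distance to the
# closest background boundary.
vb = np.zeros((7, 7, 7), dtype=bool)
vb[1:6, 1:6, 1:6] = True            # a 5x5x5 foreground cube
dm = ndimage.distance_transform_edt(vb)
print(dm[3, 3, 3])                  # 3.0: the center is farthest from the boundary
print(dm[1, 1, 1])                  # 1.0: a corner voxel touches the boundary
```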

[0043] Following the creation of the DM of the binary volume, ridge points are detected and defined in the DM. A ridge point may represent a convex part on the binary volume, and be used as a centerline candidate point. A centerline candidate point is used to represent a point along the candidate centerline of a component (e.g., the fiducial candidate region, the wire candidate region, etc.). The ridge points are usually inside the volume of the component and have larger distance values than those of neighboring points. Essentially, the centerline is the furthest location from a boundary (e.g., edge, etc.) of the component. Therefore, since the ridge points lie on the centerline of the component, the distance values of the ridge points are the highest of all the points in the component. The ridge points of a component are defined by the following expression:

DT(p) ≥ DT(q), ∀q ∈ Q (12)

where p is a voxel, DT(p) is a distance transform value for p, Q is a neighboring set of points to p, q is a point within the set Q, and DT(q) is a distance transform value for q. As long as Eqn. (12) holds true for an individual voxel (e.g., p, etc.) compared to all neighboring voxels (e.g., q, etc.), the individual voxel p is regarded as a ridge point along the candidate centerline of the component. By repeating this process for various locations along the length of the component, a ridge point set may be defined as

RPS = {p | DT(p) ≥ DT(q), ∀q ∈ Q} (13)

where RPS is a ridge point set which includes all ridge points along the candidate centerline of the component, defining the candidate centerline. In some embodiments, the ridge points may be affected by noise or small convex objects within the volume of the component. As a result, the candidate centerline as a whole may not be substantially smooth and may have a variety of small burrs between the various ridge points.
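The ridge-point test of Eqns. (12)-(13) can be sketched with a maximum filter over the 26-neighborhood: a voxel whose distance value equals the local maximum satisfies DT(p) ≥ DT(q) for all neighbors q. The helper name and toy bar volume are illustrative.

```python
import numpy as np
from scipy import ndimage

def ridge_points(dm):
    """Eqns. (12)-(13): voxels whose distance value is >= all neighbors'."""
    local_max = ndimage.maximum_filter(dm, size=3)   # max over 3x3x3 neighborhood
    return (dm >= local_max) & (dm > 0)              # ridge points inside the object

vb = np.zeros((5, 7, 7), dtype=bool)
vb[:, 2:5, 2:5] = True                  # a 3x3 bar running along axis 0
dm = ndimage.distance_transform_edt(vb)
rps = ridge_points(dm)
print(np.argwhere(rps))                 # centerline voxels, all at y = 3, x = 3
```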

[0044] A neutrosophic set is used to define a cost function. The cost function is configured to search for the optimum centerline path to substantially smoothen the candidate centerline defined by the RPS, and to make the candidate centerline robust to noise. In the neutrosophic set, each element (e.g., pixel, voxel, etc.) is defined by three subset memberships: a truth degree, T(p), an indeterminacy degree, I(p), and a false degree, F(p). The truth degree may be expressed as

T(p) = NDT(p) = DT(p) / max(DT(p)) (14)

where DT(p) is a value on the distance transform map and NDT(p) is the normalized result. The indeterminacy degree may be expressed as

I(p) = NGDT(p) = GDT(p) / max(GDT(p)) (15)
where GDT(p) is a gradient magnitude value on the distance transform map and NGDT(p) is the normalized result. From Eqns. (14)-(15), the neutrosophic set value, Ns(p), may be defined as

Ns(p) = (1 - T(p)) × I(p) (16)

The cost function value between two neighboring elements (e.g., pixels, voxels, ridge points, etc.) is defined by

C(p, q) = Ns(p) + Ns(q) (17)

where p and q are two neighboring voxels, and C(p, q) is the cost function value between the two neighboring voxels (e.g., p and q, etc.). The cost function of a connected path of voxels (e.g., ridge points, a RPS, a candidate centerline, etc.) may be represented by

C(P) = Σ(i=1 to n-1) C(pi, pi+1) (18)
where P is a connected path of n points (e.g., P = {p1, p2, ..., pn}, etc.) and C(P) is a cost function of the path P. The definition of the cost function in Eqn. (18) substantially guarantees the centerline has the ability to avoid the noise objects in the volume of the component. Thereby, the centerline (i.e., ideal centerline, optimized centerline, etc.) for each component in the binary volume may be determined.
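The neutrosophic cost of Eqns. (14)-(18) can be sketched as follows: points on the ridge have a high truth degree and low gradient magnitude, so their Ns value, and hence the path cost through them, is low. The function names and the tiny 2D distance map are illustrative; a real pipeline would feed a 3D distance map into a shortest-path search.

```python
import numpy as np

def neutrosophic_cost(dm):
    """Eqns. (14)-(16): per-element Ns = (1 - T) * I, where T is the normalized
    distance map and I the normalized gradient magnitude of the distance map."""
    T = dm / dm.max()                                  # truth degree, Eqn. (14)
    g = np.gradient(dm)
    gmag = np.sqrt(sum(gi**2 for gi in g))
    I = gmag / gmag.max() if gmag.max() > 0 else gmag  # indeterminacy, Eqn. (15)
    return (1 - T) * I                                 # Eqn. (16)

def path_cost(ns, path):
    """Eqns. (17)-(18): sum of pairwise costs along a connected path."""
    return sum(ns[p] + ns[q] for p, q in zip(path[:-1], path[1:]))

dm = np.array([[0., 1., 0.],
               [1., 2., 1.],
               [0., 1., 0.]])
ns = neutrosophic_cost(dm)
center_path = [(1, 0), (1, 1), (1, 2)]
print(path_cost(ns, center_path))   # low cost along the ridge of the distance map
```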

[0045] At 110, the centerline of the true fiducial region (e.g., a true wire region, etc.) is identified based on the curvature value of the 3D curve of the centerline. For example, after all candidate centerlines for the fiducial candidate regions are extracted, the true fiducial region (e.g., true wire region, etc.) may be defined using the curvature value of the path (i.e., centerline, etc.). For a parametrically defined space curve in three dimensions (e.g., x, y, and z) given by the Cartesian coordinates γ(t) = (x(t), y(t), z(t)), the curvature is defined as

κ = √((z''y' - y''z')² + (x''z' - z''x')² + (y''x' - x''y')²) / (x'² + y'² + z'²)^(3/2) (19)

where the prime and double prime denote a first order differentiation and a second order differentiation with respect to the parameter t, respectively. Based on the prior knowledge of the fiducial component (e.g., the true wire region is substantially similar to a smooth line in the 3D volume, component length, component shape, etc.) and the curvature values of all the candidate centerlines, the fiducial component (e.g., wire, etc.) is identified. In some embodiments, the true fiducial region is identified additionally or alternatively based on the dimensions (e.g., relative to the fiducial 50, the fiducial component, etc.), the intensity (e.g., relative to the background, ROI 40, etc.), and/or other features present within the fiducial candidate regions.
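Eqn. (19) can be sketched for a discretely sampled centerline, approximating the derivatives with finite differences; the averaging and the two test curves are illustrative, not part of this disclosure.

```python
import numpy as np

def mean_curvature(points):
    """Eqn. (19): curvature of a sampled 3D curve (x(t), y(t), z(t)),
    averaged over the samples; derivatives via finite differences."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    x1, y1, z1 = np.gradient(x), np.gradient(y), np.gradient(z)
    x2, y2, z2 = np.gradient(x1), np.gradient(y1), np.gradient(z1)
    num = np.sqrt((z2 * y1 - y2 * z1) ** 2
                  + (x2 * z1 - z2 * x1) ** 2
                  + (y2 * x1 - x2 * y1) ** 2)
    den = (x1**2 + y1**2 + z1**2) ** 1.5
    return float(np.mean(num / den))

t = np.linspace(0, 2 * np.pi, 200)
line = np.stack([t, 2 * t, 3 * t], axis=1)                # a straight wire-like path
circle = np.stack([np.cos(t), np.sin(t), 0 * t], axis=1)  # unit circle, curvature 1
print(mean_curvature(line))     # ~0: smooth-line candidates score lowest
print(mean_curvature(circle))   # ~1: strongly curved paths score higher
```

A candidate whose centerline curvature stays near zero would thus match the prior knowledge that the true wire region is substantially similar to a smooth line.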

[0046] At 112, the fiducial component (e.g., wire component, etc.) is recovered using a morphology operation (e.g., a dilation operation, etc.) on the centerline (e.g., skeleton line, etc.) of the true fiducial region to obtain the volume of the fiducial component with the fiducial component as the foreground of the image (e.g., based on the fiducial models stored within the computing device 10, etc.). The basic effect of the dilation operator on the binary volume is to gradually enlarge the boundaries of regions of foreground pixels (e.g., centerline of the fiducial, wire, etc.) until the component (e.g., wire, other fiducial, etc.) is recovered completely. For example, the wire candidate region with high line curvature values (i.e., true wire region, etc.) is merged using the dilation operation and the wire is recovered using the centerline and the size information of the wire (e.g., length, diameter, volume, etc.). The high line curvature values refer to curvature values of the components determined from Eqn. (19) which substantially represent the wire or other fiducial component. Therefore, the high line curvature values allow for only the wire to be recovered (e.g., only the true wire region is dilated, etc.) and brought to the foreground of the BUS image.
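The recovery step can be sketched with a binary dilation of the one-voxel centerline; the structuring element and the two iterations, standing in for sizing the element from the known wire diameter, are illustrative.

```python
import numpy as np
from scipy import ndimage

# Recover an approximate wire volume by gradually enlarging the boundaries of
# the one-voxel skeleton line, per the dilation operation described above.
centerline = np.zeros((5, 9, 9), dtype=bool)
centerline[:, 4, 4] = True                        # skeleton line along axis 0
struct = ndimage.generate_binary_structure(3, 1)  # 6-connected element
wire = ndimage.binary_dilation(centerline, structure=struct, iterations=2)
print(wire.sum() > centerline.sum())              # boundaries gradually enlarged
```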

[0047] At 114, an estimated (e.g., substantially accurate, etc.) fiducial volume (e.g., wire volume Vw, etc.) is presented on the BUS image to a user (e.g., physician, interventional specialist, surgeon, radiologist, doctor, etc.) on output device 22 (e.g., display, monitor, augmented reality head-mounted device, etc.) by the computing device 10 during a procedure. The image may be reconstructed in three dimensions (3D) in a real-time fashion. Thereby, the user may have an optimal visualization of the region of interest throughout the procedure. The user is able to see the whole wire, not just the portion of the wire that is outside the skin, thereby reducing the probability of an incomplete resection of the region of interest (e.g., tumor, tissue, organ, etc.). Overall, fewer patients may have to undergo a repeat surgery or other treatments that carry a substantial potential for a worse overall prognosis and a higher cost of treatment. This is substantially avoided by providing surgeons with the improved technique for localization of the region of interest as described above.

[0048] In other embodiments, the fiducial marker may not be a wire positioned in or near the region of interest (e.g., tumor, organ, lesion, etc.). The fiducial may instead be a mark disposed inside or outside of the patient. For example, an ultrasound may be used to locate and recreate the internal structures (e.g., organs, tissue, tumor, etc.) of the region of interest as a 3D image. After the region of interest is located, one or more external marks may be disposed on the surface of the patient's skin, positioned to locate the region(s) of interest. The one or more external marks may then be linked to the 3D image to be used to localize the region of interest in a future scan or operation.

[0049] The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.

[0050] Although the figures may show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.

[0051] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims

WHAT IS CLAIMED IS:
1. A medical imaging system, comprising:
an imaging device configured to scan a region of interest to acquire an image including a plurality of pixels of the region of interest;
a display output; and
a processing unit configured to:
receive the image including the plurality of pixels of the region of interest;
enhance the image;
segment fiducial candidate regions in a binary volume;
determine a candidate centerline for each of the fiducial candidate regions in the binary volume;
identify a true fiducial region from the fiducial candidate regions based on at least one of curvature, dimensions, and intensity present in the fiducial candidate regions;
recover a fiducial component from the true fiducial region; and
send an estimated fiducial volume of the recovered fiducial component to the display output.
2. The medical imaging system of claim 1, wherein the fiducial component is at least one of a wire, a clip, an internal marking, and an external marking.
3. The medical imaging system of claim 1, wherein the image provides at least one of a two-dimensional and a three-dimensional visualization of the region of interest.
4. The medical imaging system of claim 1, wherein the image includes a three-dimensional ultrasound image.
5. The medical imaging system of claim 1, wherein the region of interest includes at least one of a lesion, a tumor, an organ, tissue, and the fiducial component.
6. The medical imaging system of claim 1, wherein the imaging device uses an imaging modality including at least one of ultrasound imaging, photoacoustic imaging, magnetic resonance imaging, computed tomography imaging, fluoroscopic imaging, x-ray imaging, fluorescence imaging, and nuclear scan imaging.
7. The medical imaging system of claim 1, wherein the processing unit links the fiducial component to the region of interest and provides a visualization of at least one of the fiducial component and the region of interest throughout a procedure based on a fully automatic fiducial location algorithm.
8. A method for use with a medical imaging system and automatic detection of a fiducial component, comprising:
loading an image including a plurality of pixels of a region of interest to a processor;
enhancing the image with the processor;
segmenting, by the processor, fiducial candidate regions in a binary volume;
determining, by the processor, a candidate centerline for each of the fiducial candidate regions in the binary volume;
identifying, by the processor, a true fiducial region from the fiducial candidate regions based on at least one of curvature, dimensions, and intensity present in the fiducial candidate regions;
recovering, by the processor, the fiducial component from the true fiducial region; and
displaying, by a display device, the image with an estimated fiducial volume of the recovered fiducial component.
9. The method of claim 8, wherein the image provides at least one of a two- dimensional and a three-dimensional visualization of the region of interest.
10. The method of claim 8, wherein the image includes a three-dimensional ultrasound image.
11. The method of claim 8, wherein the region of interest includes at least one of a lesion, a tumor, an organ, tissue, and the fiducial component.
12. The method of claim 8, wherein the fiducial component is at least one of a wire, a clip, an internal marking, and an external marking.
13. The method of claim 8, wherein loading the image to the processor further comprises:
scanning the region of interest with an imaging device;
positioning the fiducial component to localize the region of interest; and
linking, by the processor, the position of the fiducial component to the image.
14. The method of claim 8, wherein enhancing the image with the processor further comprises:
filtering, by the processor, various structures within the image to differentiate between at least one of a line-like structure, a plate-like structure, and a blob-like structure;
differentiating, by the processor, between a foreground and a background of the image; and
applying, by the processor, a vesselness function to indicate the similarity between the various structures and the fiducial component to determine the fiducial candidate regions.
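The enhancement step of claim 14 corresponds to a Frangi-style vesselness measure built from the eigenvalues of the image Hessian at each voxel. The following is a minimal numpy/scipy sketch of that idea, not the application's actual implementation; the function name, the parameters `alpha`, `beta`, `c`, and the bright-tube-on-dark-background assumption are all illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness_3d(volume, sigma=1.0, alpha=0.5, beta=0.5, c=15.0):
    """Frangi-style vesselness for bright line-like structures in a 3D volume.

    Returns a map in [0, 1] that is high where the local Hessian eigenvalues
    resemble a tube (one small eigenvalue along the wire, two large negative
    eigenvalues across it). Parameter values are illustrative defaults.
    """
    v = volume.astype(np.float64)
    # Scale-normalised second-order Gaussian derivatives give the Hessian.
    H = np.empty(v.shape + (3, 3))
    for i in range(3):
        for j in range(i, 3):
            order = [0, 0, 0]
            order[i] += 1
            order[j] += 1
            d = gaussian_filter(v, sigma, order=order) * sigma ** 2
            H[..., i, j] = d
            H[..., j, i] = d
    # Eigenvalues sorted by absolute value: |l1| <= |l2| <= |l3|.
    eig = np.linalg.eigvalsh(H)
    idx = np.argsort(np.abs(eig), axis=-1)
    eig = np.take_along_axis(eig, idx, axis=-1)
    l1, l2, l3 = eig[..., 0], eig[..., 1], eig[..., 2]
    eps = 1e-10
    ra = np.abs(l2) / (np.abs(l3) + eps)                  # plate vs line
    rb = np.abs(l1) / (np.sqrt(np.abs(l2 * l3)) + eps)    # blob vs line
    s = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)              # structure strength
    vness = ((1.0 - np.exp(-ra ** 2 / (2 * alpha ** 2)))
             * np.exp(-rb ** 2 / (2 * beta ** 2))
             * (1.0 - np.exp(-s ** 2 / (2 * c ** 2))))
    # Bright tubes on a dark background require l2, l3 < 0.
    vness[(l2 > 0) | (l3 > 0)] = 0.0
    return vness
```

A voxel on a bright wire scores near the top of the map while flat background scores near zero, which is what makes the subsequent thresholding of claim 15 workable.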
15. The method of claim 8, wherein segmenting, by the processor, the fiducial candidate regions in the binary volume further comprises:
determining, by the processor, an initial adaptive threshold value;
transforming, by the processor, the image into a binary image represented by the binary volume;
determining, by the processor, a length of a majority axis of each fiducial candidate region;
comparing, by the processor, the length of the majority axis of each fiducial candidate region to a length threshold;
disregarding, by the processor, any of the fiducial candidate regions which do not exceed the length threshold; and
determining, by the processor, if a determination criterion is satisfied;
wherein if the determination criterion is not satisfied, the initial adaptive threshold value is reduced, by the processor, and a new adaptive threshold value is determined.
16. The method of claim 15, wherein the determination criterion is satisfied when an average value of the length of the majority axis of all fiducial candidate regions is greater than the length threshold.
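The threshold loop of claims 15–16 can be sketched as follows. This is an assumed implementation: the longest bounding-box edge of each connected component stands in for the majority-axis length, and the `step` and `min_thresh` parameters are invented for the sketch, not taken from the application:

```python
import numpy as np
from scipy import ndimage

def segment_fiducial_candidates(enhanced, init_thresh, length_thresh,
                                step=0.05, min_thresh=0.05):
    """Iteratively binarise an enhanced (vesselness-like) map, keeping only
    elongated components.

    The threshold starts at init_thresh and is reduced until the average
    majority-axis length of the surviving components exceeds length_thresh
    (the determination criterion of claim 16).
    """
    keep = np.zeros(enhanced.shape, dtype=bool)
    thresh = init_thresh
    while thresh >= min_thresh:
        binary = enhanced >= thresh
        labels, _ = ndimage.label(binary)
        keep = np.zeros_like(binary)
        lengths = []
        for i, sl in enumerate(ndimage.find_objects(labels), start=1):
            if sl is None:
                continue
            # Longest bounding-box edge as a proxy for the majority axis.
            length = max(s.stop - s.start for s in sl)
            if length > length_thresh:
                keep |= labels == i
                lengths.append(length)
            # Regions not exceeding the length threshold are disregarded.
        # Determination criterion: mean majority-axis length above threshold.
        if lengths and np.mean(lengths) > length_thresh:
            return keep
        thresh -= step  # criterion not met: reduce the adaptive threshold
    return keep
```

With each failed pass the adaptive threshold is lowered, so dimmer but still elongated wire segments are admitted while short, blob-like responses are discarded.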
17. The method of claim 8, wherein determining, by the processor, the candidate centerline for each of the fiducial candidate regions in the binary volume further comprises:
determining, by the processor, a distance map of the binary volume;
detecting, by the processor, ridge points in the distance map; and
defining, by the processor, a cost function based on a neutrosophic set;
wherein within the neutrosophic set, each of the plurality of pixels is defined by at least one of a truth degree, an indeterminacy degree, and a false degree.
18. The method of claim 17, wherein the cost function identifies an optimum centerline path with a least cost value between the ridge points to smooth the candidate centerline for each of the fiducial candidate regions in the binary volume.
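The centerline step of claims 17–18 can be illustrated with a Euclidean distance map and a least-cost (Dijkstra) path between two ridge points. This sketch keeps only a distance-based cost, which plays the role of the truth degree; the indeterminacy and falsity terms of the neutrosophic formulation are omitted, and the 2D grid, start/end points, and cost formula are assumptions for illustration:

```python
import heapq
import numpy as np
from scipy import ndimage

def centerline_path(binary, start, end):
    """Least-cost path between two ridge points of a 2D binary region.

    The Euclidean distance map peaks on the medial axis, so a per-pixel
    cost of 1/(1 + distance) steers Dijkstra along the centerline.
    """
    dist = ndimage.distance_transform_edt(binary)
    cost = np.where(binary, 1.0 / (1.0 + dist), np.inf)
    h, w = cost.shape
    best = np.full((h, w), np.inf)
    prev = {}
    best[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        c, (y, x) = heapq.heappop(heap)
        if (y, x) == end:
            break
        if c > best[y, x]:
            continue  # stale heap entry
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not (dy == 0 and dx == 0):
                    nc = c + cost[ny, nx]
                    if nc < best[ny, nx]:
                        best[ny, nx] = nc
                        prev[(ny, nx)] = (y, x)
                        heapq.heappush(heap, (nc, (ny, nx)))
    # Walk back from end to start to recover the path.
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Because off-axis pixels sit closer to the region boundary, their cost is strictly higher, so the recovered path hugs the medial axis of an elongated region.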
19. A method for use with a medical imaging system and automatic detection of a wire, comprising:
loading an image including a plurality of pixels of a region of interest to a processor;
segmenting, by the processor, wire candidate regions in a binary volume;
determining, by the processor, a candidate centerline for each of the wire candidate regions in the binary volume;
identifying, by the processor, a true wire region from the wire candidate regions;
recovering, by the processor, the wire from the true wire region; and
displaying, by a display device, the image with an estimated wire volume of the recovered wire;
wherein the processor provides a visualization of the wire throughout a procedure.
20. The method of claim 19, wherein the true wire region is identified based on at least one of curvature, dimensions, and intensity present in the wire candidate regions.
PCT/US2015/056522 2014-10-20 2015-10-20 Automatic detection of regions of interest in 3d space WO2016064921A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201462066231 true 2014-10-20 2014-10-20
US62/066,231 2014-10-20

Publications (1)

Publication Number Publication Date
WO2016064921A1 true true WO2016064921A1 (en) 2016-04-28

Family

ID=55761446

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/056522 WO2016064921A1 (en) 2014-10-20 2015-10-20 Automatic detection of regions of interest in 3d space

Country Status (1)

Country Link
WO (1) WO2016064921A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050171428A1 (en) * 2003-07-21 2005-08-04 Gabor Fichtinger Registration of ultrasound to fluoroscopy for real time optimization of radiation implant procedures
US20060018548A1 (en) * 2004-02-13 2006-01-26 Weijie Chen Method, system, and computer software product for automated identification of temporal patterns with high initial enhancement in dynamic magnetic resonance breast imaging
US20060177125A1 (en) * 2005-02-08 2006-08-10 Regents Of The University Of Michigan Computerized detection of breast cancer on digital tomosynthesis mammograms
US20070058865A1 (en) * 2005-06-24 2007-03-15 Kang Li System and methods for image segmentation in n-dimensional space
US20090322749A1 (en) * 2007-01-23 2009-12-31 Dtherapeutics, Llc Systems and methods for extracting a curve-skeleton from a volumetric image of a vessel
CN103942799A (en) * 2014-04-25 2014-07-23 哈尔滨医科大学 Breast ultrasonography image segmentation method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BURGNER, J. ET AL.: "TOWARD FLUOROSCOPIC SHAPE RECONSTRUCTION FOR CONTROL OF STEERABLE MEDICAL DEVICES", PROCEEDINGS OF THE ASME 2011 DYNAMIC SYSTEMS AND CONTROL CONFERENCE, DSCC2011-6029, 2 November 2011 (2011-11-02), pages 1 - 4, Retrieved from the Internet <URL:http://research.vuse.vanderbilt.edu/MEDLab/pub_files/BurgnerTowardDSCC11.pdf> [retrieved on 20151212] *

Similar Documents

Publication Publication Date Title
US20110178389A1 (en) Fused image moldalities guidance
US20050276455A1 (en) Systems and methods for segmenting an organ in a plurality of images
Rusko et al. Fully automatic liver segmentation for contrast-enhanced CT images
US20080101667A1 (en) Method and system for the presentation of blood vessel structures and identified pathologies
Correa et al. The occlusion spectrum for volume classification and visualization
US8620055B2 (en) Apparatus and method for registering two medical images
US20080021301A1 (en) Methods and Apparatus for Volume Computer Assisted Reading Management and Review
Massoptier et al. A new fully automatic and robust algorithm for fast segmentation of liver tissue and tumors from CT scans
US20090097728A1 (en) System and Method for Detecting Tagged Material Using Alpha Matting
US20070092864A1 (en) Treatment planning methods, devices and systems
US20120089008A1 (en) System and method for passive medical device navigation under real-time mri guidance
Shang et al. Vascular active contour for vessel tree segmentation
US20100135544A1 (en) Method of registering images, algorithm for carrying out the method of registering images, a program for registering images using the said algorithm and a method of treating biomedical images to reduce imaging artefacts caused by object movement
US20090326363A1 (en) Fused image modalities guidance
US20050111720A1 (en) Shape estimates and temporal registration of lesions and nodules
US20070217668A1 (en) Method and apparatus of segmenting an object in a data set and of determination of the volume of segmented object
US20110158491A1 (en) Method and system for lesion segmentation
Mansoor et al. A generic approach to pathological lung segmentation.
US20080171932A1 (en) Method and System for Lymph Node Detection Using Multiple MR Sequences
JP2003225231A (en) Method and system for detecting lung disease
US20130216110A1 (en) Method and System for Coronary Artery Centerline Extraction
US20070165927A1 (en) Automated methods for pre-selection of voxels and implementation of pharmacokinetic and parametric analysis for dynamic contrast enhanced MRI and CT
US20080101674A1 (en) Method and system for automatic analysis of blood vessel structures and pathologies
US20080170763A1 (en) Method and system for automatic analysis of blood vessel structures and pathologies in support of a triple rule-out procedure
Wei et al. Segmentation of lung lobes in high-resolution isotropic CT images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15853192

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15.09.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15853192

Country of ref document: EP

Kind code of ref document: A1