NL2011735C2 - Method and system for imaging a volume of interest in a human or animal body.


Info

Publication number
NL2011735C2
Authority
NL
Netherlands
Prior art keywords
voi
body surface
marker
location
data
Prior art date
Application number
NL2011735A
Other languages
Dutch (nl)
Inventor
Kenneth George Antonius Gilhuijs
Bruno Arsenali
Original Assignee
Umc Utrecht Holding Bv
Priority date
Filing date
Publication date
Application filed by Umc Utrecht Holding Bv filed Critical Umc Utrecht Holding Bv
Priority to NL2011735A (NL2011735C2)
Priority to EP14800161.3A (EP3065627A1)
Priority to PCT/NL2014/050766 (WO2015069106A1)
Application granted
Publication of NL2011735C2


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033: Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0035: Features or image-related aspects of imaging apparatus classified in A61B5/00 adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A61B5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062: Arrangements for scanning
    • A61B5/0064: Body surface scanning
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/397: Markers electromagnetic other than visible, e.g. microwave

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

In a method and system, visual information is provided about a location of a volume of interest, VOI, having predetermined dimensions under a body surface of a human or animal body. A specific point associated with the VOI is located by detecting radiation from a radioactive marker at the specific point. The body surface is located by optically scanning the body surface. Relative positions of the VOI and the body surface are calculated based on the located specific point, the predetermined dimensions, and the located body surface. A virtual image showing the relative positions of the VOI and the body surface is displayed on a display device. The VOI may be a spherical volume having the specific point at its center. A viewing direction relative to the VOI may be determined, whereby the virtual image may be displayed showing the VOI and the body surface from the viewing direction. Application in tumor surgery.

Description

Method and system for imaging a volume of interest in a human or animal body
FIELD OF THE INVENTION
The invention relates to the field of medical imaging, and more specifically to a method and system for imaging a volume of interest in a human or animal body. The volume of interest has predetermined dimensions, and is located under a body surface.
BACKGROUND OF THE INVENTION
In the treatment of cancer, in particular breast cancer, different approaches have been developed to locate a lesion in the body in a preoperative stage. A well-known technique is to use magnetic resonance imaging, MRI, to acquire images of the different tissues of the body. With such images, possibly enhanced by application of contrast fluids in the body, a tumor location can be established. The image data can be used for planning an operation to remove the tumor from the body.
During such an operation, a surgeon needs information on the location and extent of the tumor in order to remove the tumor and surrounding tissue such that a minimally invasive operation is performed and a minimum amount of surrounding tissue is removed.
As an example, R. Nakamura et al., Breast-conserving surgery using supine magnetic resonance imaging in breast cancer patients receiving neoadjuvant chemotherapy, The Breast 2008; 17: 245-251 discloses a breast-conserving surgery, BCS, procedure in which MRI scans are performed prior to neoadjuvant chemotherapy with the patient in the same position as adopted during surgery. In this procedure, the initial images are projected and reproduced on the skin of the breast just before the operation, and the projected images are then used to set a surgical excision line. When the BCS is performed, a cylindrical mammary resection vertical to the mammary surface about the lesion is usually performed because the three-dimensional reconstruction of the lesion during the operation is often difficult.
WO 2011/161197 A1 discloses a device and method for combined optical and nuclear imaging (also referred to as image acquisition) to determine a tracer concentration in a human or animal body. Three-dimensional, 3D, nuclear images showing a radionuclide distribution in the human or animal body are combined with a video image from an optical camera. A 3D imaging system may be provided for the image data acquired by the nuclear system. A virtual perspective image may be generated from the image data, having the same perspective as the image acquired by the optical camera. Nuclear and optical images can be superposed.
The above references provide visual information for a surgeon to perform surgery with a relatively low impact on the body, where the tumor to be removed can be located with some degree of accuracy. However, it is up to the surgeon to decide intraoperatively on the exact volume of tissue to be excised based on image information related to the tumor. Accordingly, a need exists for the surgeon to have more precise guidance and feedback when performing the operation.
SUMMARY OF THE INVENTION
It would be desirable to provide an improved method and system for providing visual information about a location of a volume of interest under a body surface of a human or animal body.
To better address one or more of these concerns, in a first aspect of the invention a method is provided to provide visual information about a location of a volume of interest, VOI, having predetermined dimensions under a body surface of a human or animal body. The method comprises the steps of: locating a specific point associated with the VOI; locating the body surface; calculating relative positions of the VOI and the body surface based on the located specific point, the predetermined dimensions, and the located body surface; and displaying a virtual image showing the relative positions of the VOI and the body surface.
Providing visual information is the provision of information which can be detected by the human eye, e.g. by direct projection of a moving or still image on the retina, or by display of an image on or at a surface which can be seen with the eye, such as an active screen of a display device, or a passive surface having the image projected thereon.
Locating is the determination of a location, such as in terms of coordinates or a set of coordinates in a predetermined three-dimensional coordinate system. Therefore, a location is a spatial (three-dimensional) location. Different processes of locating may involve different types of coordinate systems and/or different scales or orientations of coordinate systems, where coordinates in one coordinate system can be converted into coordinates of a different coordinate system using known and pre-established conversion rules. Such conversion rules may be determined on a theoretical basis and/or on a practical basis, such as by calibration, e.g. when different locating devices are used that each are associated with their own coordinate system.
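As a purely illustrative sketch (not part of the patent text), such a pre-established conversion rule between two locating devices can be represented as a calibrated rigid transform, here written as a 4x4 homogeneous matrix; the function names and numerical values below are hypothetical.

```python
# Illustrative sketch: converting a point located in the coordinate system of one
# device into that of another, assuming the conversion rule is a calibrated rigid
# transform (rotation R and translation t).
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def convert_point(p_device: np.ndarray, T_device_to_common: np.ndarray) -> np.ndarray:
    """Map a 3D point from a device coordinate system into a common coordinate system."""
    p_h = np.append(p_device, 1.0)          # homogeneous coordinates
    return (T_device_to_common @ p_h)[:3]

# Hypothetical example: a marker location reported by one locating device,
# converted into the coordinate system of another device using a calibration transform.
T_cal = make_transform(np.eye(3), np.array([0.10, 0.00, 0.35]))
marker_in_common_frame = convert_point(np.array([0.02, -0.01, 0.20]), T_cal)
```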
A VOI is an imaginary body volume, having predetermined dimensions, which is of particular interest for a medical treatment to which the body is subjected. As an example, the VOI may define a body volume comprising a tumor and surrounding tissue such that, when the VOI is removed from the body, only a low or minimal total amount of tissue, including the tumor, is taken away by the surgeon. The VOI can have an arbitrary shape, e.g. generally congruent to a tumor or lesion volume to be removed. Other shapes of a VOI are possible, such as a cylindrical or spherical shape.
According to an embodiment of the present invention, relative positions of the VOI and the body surface can be established and can be shown in a virtual image. First, a specific point associated with the VOI, and the body surface are located. The specific point may be inside or outside the VOI, or may be at a boundary surface of the VOI. The association between the specific point and the VOI in turn defines a location of the VOI based on the location of the specific point, and using information about the predetermined dimensions of the VOI. In other words, once the specific point is located, also the VOI is located through a one-to-one relationship between the location of the specific point and the location of the VOI which has been predefined. Accordingly, when the specific point is located and the body surface is located, the relative positions of the specific point and the body surface can be established, and thereby the relative positions of the VOI and the body surface can be established. These positions can be expressed in terms of coordinates in a three-dimensional coordinate system to render a virtual image showing these relative positions.
A great benefit of the present invention is that the virtual image provides the visual information a surgeon would need to plan his or her actions when actually performing the surgery to remove the VOI from the body. The virtual image may be updated very frequently to keep it in conformity with the actual situation as much as possible, in particular when the body surface and the position of underlying tissue change as a result of cutting into the surface. For this removal, a separation of tissue or cut should be made through the body surface or skin up to the boundary surface of the VOI, and further along the boundary surface of the VOI until the VOI is disconnected, or substantially disconnected, from the body. The virtual image may show a cross-sectional view, a surface-rendered view, a volume-rendered view, a beam's-eye-view projection, etc., to visualize the VOI in spaced relationship to the body surface. Such views can be enhanced with color overlays to provide further information to a surgeon, such as to indicate an orientation relative to, or a distance from, the body surface. The VOI may be a spherical volume, where the specific point may be at the center of the spherical volume.
It is noted that more than one specific point may be used and located, in particular when the VOI is not spherical.
Hereby, the virtual image may give the surgeon accurate guidance on the required or allowed depth of a cut through the body surface up to the boundary surface of the VOI, in particular when the virtual image shows a scale of length, and/or a scaled distance between the boundary surface of the VOI and the body surface. Here, the term ‘scaled’ is to be taken to mean ‘provided with a dimensional scale’ such as to show said distance expressed in units of length, or to show units of length from which an actual distance can be easily determined.
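As an illustrative sketch (assuming the spherical VOI mentioned above, with the specific point at its center), the scaled distance from each scanned body-surface point to the VOI boundary can be computed as the distance to the marker minus the VOI radius; the point coordinates and radius below are hypothetical.

```python
# Illustrative sketch: per-point distance from the scanned body surface to the
# boundary of a spherical VOI centered on the marker. Negative values would
# indicate surface points lying inside the VOI.
import numpy as np

def distance_to_voi_boundary(surface_points: np.ndarray,
                             marker_location: np.ndarray,
                             voi_radius_m: float) -> np.ndarray:
    """Distance (in metres) from each body-surface point to the spherical VOI boundary."""
    d_to_center = np.linalg.norm(surface_points - marker_location, axis=1)
    return d_to_center - voi_radius_m

# Hypothetical example: remaining cutting depth at two scanned surface points,
# for a VOI with a 15 mm radius.
pts = np.array([[0.00, 0.00, 0.05], [0.01, 0.02, 0.06]])
print(distance_to_voi_boundary(pts, np.array([0.0, 0.0, 0.0]), 0.015))
```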
In an embodiment, the step of locating the specific point comprises locating a marker, in particular an electromagnetic marker, or a radioactive marker (also referred to as a radioactive seed), under the body surface. The marker is or represents a material point in space, and has been provided in the body previously. As an example, the marker may have been provided in a tumor, in particular in a central portion thereof, whereby, for example, a spherical VOI can have been defined having the marker at its center, and comprising the tumor and surrounding tissue.
In case of a radioactive marker, such as an iodine-125 marker, the radiation (distribution) emitted by the marker as a point source can be detected at a distance from the marker. Thus, by properly evaluating the detected radiation, using one or more radiation detectors, a location of the radioactive marker in three dimensions may be calculated. Since the radiation distribution is known and expected to originate from a point source, calculations to determine the location of the marker can be performed relatively fast, with a low acquisition time, allowing for a high update frequency of the virtual image showing the location of the VOI based on the location of the marker as determined from the radiation. The distance between the marker and the radiation detector can be relatively large, for example at least 0.25 m, in particular at least 0.5 m.
In an embodiment, the step of locating the radioactive marker comprises: detecting a radiation distribution from the marker at a first position; detecting a radiation distribution from the marker at a second position different from the first position; and calculating a location of the marker from the radiation distributions detected at the first and second positions. Such an embodiment is particularly useful when the detection of the radiation distribution from the marker at each one of the first and second positions only leads to a two-dimensional determination of the marker location. When sensing the radiation distribution in two dimensions, a maximum radiation will be found at the marker location, with decreasing radiation at increasing distance from the marker. A calculation based on the detection results at the first and the second position then yields the three-dimensional location of the radioactive marker.
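One possible way to realize this calculation, shown here as an illustrative sketch rather than the patent's prescribed algorithm, is to treat each detected 2D radiation maximum as defining a line of response from its detector position and to estimate the 3D marker location as the point closest to both lines; the detector positions and directions below are hypothetical.

```python
# Illustrative sketch: triangulating the marker location from two lines of response,
# each defined by a detector position and the direction toward the detected 2D
# radiation maximum (e.g. perpendicular to the detector face for a parallel-hole collimator).
import numpy as np

def closest_point_of_two_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between the lines p1 + s*d1 and p2 + t*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)                      # direction of the shortest connecting segment
    A = np.array([d1, -d2, n]).T              # solve p1 + s*d1 + u*n = p2 + t*d2
    s, t, _ = np.linalg.solve(A, p2 - p1)
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Hypothetical example: two detectors at different positions above the patient.
marker = closest_point_of_two_lines(np.array([0.3, 0.0, 0.8]), np.array([0.0, 0.0, -1.0]),
                                    np.array([-0.3, 0.0, 0.8]), np.array([0.3, 0.0, -0.8]))
```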
In an embodiment, the step of locating the body surface comprises optically scanning the body surface. Optical scanning may be performed by one or more stationary or moving cameras, based on physical sensing principles, devices and methods as known in the art. As an example, at least one time-of-flight camera, in particular an infrared radiation time-of-flight camera, may be used. The optical scanning produces a set of coordinates representing the body surface in a predetermined coordinate system. The coordinate system used for the optical scanning may be the same as, or different from, a coordinate system used for locating the specific point and the VOI. In case of different coordinate systems, a conversion according to predetermined rules can be made to have the locations of the specific point, the VOI and the body surface available in the same coordinate system.
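As an illustrative sketch (assuming a time-of-flight camera with a pinhole model and known intrinsics, which the patent does not prescribe), a depth image can be converted into a set of 3D body-surface coordinates in the camera's own coordinate system; the intrinsics and depth values below are hypothetical.

```python
# Illustrative sketch: converting a time-of-flight depth image into a point cloud
# representing the body surface in the camera coordinate system.
import numpy as np

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Convert an H x W depth image (metres) into an (H*W) x 3 array of 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    pts = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                 # drop invalid (zero-depth) pixels

# Hypothetical example with made-up camera intrinsics and a flat depth image.
surface_points = depth_to_points(np.full((240, 320), 0.8), fx=525.0, fy=525.0, cx=160.0, cy=120.0)
```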
In an embodiment, when the radiation distribution from the radioactive marker is detected at first and second positions, the step of locating the body surface comprises optically scanning the body surface from a third position different from the first and second positions. This prevents the steps of locating the radioactive marker and locating the body surface from interfering with each other.
In an embodiment, the method further comprises: determining a viewing direction relative to the VOI; and displaying the virtual image showing the VOI and the body surface in the viewing direction. When the radiation distribution from the radioactive marker is detected at first and second positions, the method in particular may comprise: determining a viewing direction relative to the VOI; selecting the third position to coincide with the viewing direction; and displaying the virtual image showing the VOI and the body surface in the viewing direction. A viewing direction is determined by a line passing through a point of view and the VOI. By determining a particular viewing direction, such as an actual viewing direction of a surgeon performing a surgery, the virtual image may provide useful information on the relative locations of the VOI and the body surface, information that would not be available, or not be easily recognizable, from other viewing directions.
It is noted that in some embodiments the third position and the viewing direction need not coincide, i.e. the third position need not be on the line of sight determining the viewing direction. Optical scanning of the body surface may be performed from a direction different from the viewing direction, and the image data derived from the optical scanning may be processed to display the virtual image from the viewing direction.
In an embodiment, the step of determining the viewing direction comprises determining a location related to the eyes of a surgeon relative to the VOI, i.e. the viewing direction follows a line between a location related to the eyes of the surgeon, and a point of the VOI. Here, the location related to the eyes of a surgeon, which may be represented by a location between the eyes of the surgeon, determines said point of view. In a practical embodiment, the location of the eyes of the surgeon may be determined by sensing a location or orientation of an object associated with the head of the surgeon. In such an embodiment, the virtual image may be adapted automatically to the point of view of the surgeon as sensed.
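As an illustrative sketch, the viewing direction can be represented as a unit vector along the line from the sensed eye-related location toward a point of the VOI; the coordinates below are hypothetical.

```python
# Illustrative sketch: the viewing direction as a unit vector from the location
# related to the surgeon's eyes (e.g. a point between the eyes, as sensed from a
# head-mounted object) toward a point of the VOI, e.g. its center.
import numpy as np

def viewing_direction(eye_location: np.ndarray, voi_point: np.ndarray) -> np.ndarray:
    """Unit vector pointing from the sensed eye-related location toward the VOI."""
    v = voi_point - eye_location
    return v / np.linalg.norm(v)

view_dir = viewing_direction(np.array([0.4, -0.6, 1.5]),   # hypothetical sensed eye position
                             np.array([0.0,  0.0, 0.9]))   # marker / VOI center
```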
In a second aspect of the invention, a system for providing visual information about a location of a volume of interest, VOI, having predetermined dimensions under a body surface of a human or animal body is provided. The system comprises: a point locating device configured to locate a specific point associated with the VOI, and to provide point location data; a scanning device configured to locate the body surface, and to provide body surface location data; a data storage device configured to store VOI data relating to the predetermined dimensions of the VOI and to the association between the specific point and the VOI; a processing unit configured to process the point location data, the body surface location data and the VOI data to calculate relative positions of the VOI and the body surface, and to provide image data representing a virtual image showing the relative positions of the VOI and the body surface; and a display device configured to display the virtual image based on the image data. The system of the invention is configured to perform the functions described above.
In an embodiment, the point locating device comprises a marker locating device configured to detect a location of an electromagnetic marker.
In an embodiment, the point locating device comprises at least one radiation detector configured to detect a radiation distribution from a radioactive marker, and the processing unit is configured to calculate the point location data from the detected radiation distribution.
In an embodiment, the radiation detector comprises a gamma camera head to detect gamma radiation from the radioactive marker. The radiation detector may comprise different types of collimators, such as a collimator of a pinhole type or of a parallel hole type.
In a third aspect of the invention, a computer program is provided, comprising computer instructions which, when loaded in the processing unit of the system of the invention, cause the processing unit to process the point location data, the body surface location data, and the VOI data to calculate relative positions of the VOI and the body surface, and to provide image data representing a virtual image showing the relative positions of the VOI and the body surface.
In a fourth aspect of the invention, a computer program product is provided, comprising a data carrier having the computer instructions of the computer program according to the invention stored therein.
These and other aspects of the invention will be more readily appreciated as the same becomes better understood by reference to the following detailed description and considered in connection with the accompanying drawings in which like reference symbols designate like parts.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 depicts a block diagram of an embodiment of a system according to the present invention.
Figure 2 depicts a block diagram of another embodiment of a system according to the present invention.
Figure 3 depicts a flow diagram of an operation of an embodiment of the system according to the present invention.
Figure 4 schematically, and in perspective view, depicts components of embodiments of the system according to the present invention in an environment of a surgery room, also schematically showing a patient on an operation table.
DETAILED DESCRIPTION OF EMBODIMENTS
Figures 1 and 2 each depict a block diagram of an embodiment of a system for providing visual information about a location of a volume of interest, VOI, having predetermined dimensions under a body surface of a human or animal body. The system comprises a point locating device 10, a scanning device 12, a processing unit 14, a data storage device 16 and a display device 18. The point locating device 10, scanning device 12, data storage device 16 and display device 18 are coupled to the processing unit 14 for an exchange of data between the processing unit 14 and the devices 10, 12, 16 and 18 coupled thereto. The data exchange may be wired or wireless. Wireless data exchange may be conducted by transmission of suitable electromagnetic signals, such as (e.g. infrared) light or radio waves.
The point locating device 10 is configured to locate a specific point in a predetermined three-dimensional coordinate system. In most embodiments, the specific point is in a human or animal body, below a body surface thereof, or on a body surface thereof. However, the specific point can also be outside the human or animal body under consideration. In most embodiments, the specific point has a marker, i.e. a marker is positioned at the specific point to allow the specific point to be located by the point locating device 10. The marker may be an electromagnetic marker, i.e. a marker locatable by use of electromagnetic radiation, planted on or in the human or animal body. The marker may also be a radioactive marker, such as a so-called radioactive seed, planted on or in the human or animal body.
In case of a radioactive marker, radiation emitted by the marker is detected by a radiation detector of the point locating device 10, and a detected intensity distribution of the radiation is converted into point location data being at least partly representative of the location of the specific point. The radiation detector may be part of a gamma camera head known as such. A plurality of radiation detectors 11a, 11b (Figure 2) may be operated to allow for a determination of the location of the specific point in three dimensions. In the latter embodiment, a first radiation detector 11a is configured to detect a radiation distribution from the marker at a first position, and a second radiation detector 11b is configured to detect a radiation distribution from the marker at a second position different from the first position.
The scanning device 12 is configured to locate the body surface of the human or animal body in a predetermined three-dimensional coordinate system. This coordinate system may be the same as the coordinate system referred to above in relation to the point locating device 10, or may be different therefrom. If the coordinate system of the scanning device 12 and the coordinate system of the point locating device 10 are different from each other, transformation rules should be available to transform coordinates of one coordinate system into coordinates of the other coordinate system, in order to be able to relate locating results of the point locating device 10 to locating results of the scanning device 12. The scanning device 12 is positionable relative to the human or animal body such that it can acquire the body surface in three dimensions, to provide body surface location data. The scanning device 12 may be a device operable through the use of radiation or sound, such as (e.g. infrared) light, radio waves or sound waves. For example, the scanning device 12 may comprise a time-of-flight camera or a laser scanner. In some embodiments, a plurality of scanning devices 12 are provided at different positions, e.g. to resolve (partial) occlusion when a camera's viewing area is (partially) blocked by, for example, a part of the surgeon's body.
In the data storage device 16, VOI data may be stored relating to predetermined dimensions of a VOI, and relating to an association between the specific point and the VOI. These data allow the three-dimensional coordinates of the VOI to be defined once a location of the specific point has been established by the point locating device 10.
The point location data as provided by the point locating device 10, the body surface location data as provided by the scanning device 12, and the VOI data as provided by the data storage device 16, are processed in the processing unit 14 to calculate relative positions of the VOI and the body surface. Based on these calculations, the processing unit 14 provides image data representing a virtual image showing the relative positions of the VOI and the body surface.
The image data are provided to the display device 18, which is configured to display, based on the image data, a virtual image showing the relative positions of the VOI and the body surface.
As depicted in Figure 2, the system may further comprise a viewing direction sensing device 20 configured to sense a viewing direction relative to the VOI, and configured to provide viewing direction data. In an embodiment, the viewing direction sensing device 20 comprises a sensor configured to provide location data, in particular data relating to the location of the eyes of a surgeon. The location data may be processed in the processing unit 14 to determine a viewing direction relative to the VOI. The viewing direction may be calculated by the processing unit 14 by determining a direction of a line connecting a point in space determined by said location data with the three-dimensional coordinates of (a point of) the VOI. Then, the virtual image may be shown as seen in said viewing direction.
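As an illustrative sketch of one way the processing unit 14 could render the virtual image from the sensed viewing direction (the patent does not prescribe a specific rendering pipeline), a standard look-at view matrix can place a virtual camera at the eye-related location, looking toward the VOI; the coordinates below are hypothetical.

```python
# Illustrative sketch: a look-at view matrix placing a virtual camera at the sensed
# eye-related location, looking toward the VOI, so the virtual image can be rendered
# from the surgeon's viewing direction.
import numpy as np

def look_at(eye: np.ndarray, target: np.ndarray, up=np.array([0.0, 0.0, 1.0])) -> np.ndarray:
    """4x4 world-to-camera matrix for a camera at 'eye' looking at 'target'."""
    forward = (target - eye) / np.linalg.norm(target - eye)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye          # translation that moves 'eye' to the origin
    return view

view_matrix = look_at(eye=np.array([0.4, -0.6, 1.5]), target=np.array([0.0, 0.0, 0.9]))
```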
It is noted that the processing unit 14 in a practical embodiment may comprise one or more processors. Processors may be distributed in the system, where parts of the processing required for the system to produce the virtual image with the display device 18 may be performed in a distributed way in the point locating device 10, the scanning device 12, and the display device 18. Processors may be local or remote processors such as used in network or cloud computing.
It is further noted that the data storage device 16 may be separate from the processing unit 14, or may be integrated therewith, or may be part of the point locating device 10.
Figure 3 depicts a flow diagram of an operation of an embodiment of the system described above with reference to Figures 1 and 2, and some pre-operative steps.
According to step 21, a specific point having a predetermined relationship with a body or body part of a specific human or animal patient is determined. As an example, in a preoperative stage, a tumor in a human or animal body is diagnosed, imaged and located. The tumor, having a known extension or known dimensions, is then assigned at least one specific point, such as a center of gravity of the tumor, or a point at a tumor boundary, or a point outside the tumor.
According to step 22, at the location of the specific point, known in three dimensions, a marker, for example an electromagnetic marker, or a radioactive marker or seed, is placed by known techniques.
According to step 23, the specific point is associated with a VOI, i.e. a VOI and a location thereof are associated with the location of the specific point. As an example, the VOI is a volume to be removed from the human or animal body, and dimensioned such that a tumor under consideration, including a specific amount of surrounding tissue, is completely contained in the VOI. As an example, the specific point, in particular if it is in the center of gravity of the tumor, may be taken as a center of a spherical VOI. In such a case, the location of the VOI is known when the location of the specific point is known, and a radius of the spherical VOI is known. In other situations, more complex associations between the specific point and the VOI may exist, for example when the specific point is not at the center of gravity of a tumor of interest and/or the VOI is not spherical but has another shape.
Steps 21, 22 and 23 are usually taken in a pre-operative stage, before surgery to remove a VOI from a human or animal body starts.
According to step 24, in an operative stage, during surgery to remove a VOI from the human or animal body, the human or animal patient is placed on an operation table, and the specific point is located in three dimensions by locating the marker previously provided in or on the body of the human or animal patient in step 22. As an example, when the marker is a radioactive marker, radiation emitted from the marker is detected to calculate the location of the marker. Locating the specific point is performed quasi-real-time and quasi-continuously, depending on the locating technology used. This implies that when deformations or movements of the human or animal body during surgery occur, for example when the human or animal patient moves or is moved, or when the body surface is cut during the operation, the location of the specific point may change and may be detected accordingly.
According to step 25, (a part of) the surface of the body of the human or animal patient is located in three dimensions. As an example, the body surface may be scanned to acquire locations of an area of interest of the body surface. The scanning of the body surface is performed quasi-real-time and quasi-continuously, depending on the scanning technology used. This implies that when deformations or movements of the body surface occur during surgery, for example when the human or animal patient moves or is moved, or when the body surface is cut during the operation, the location of the body surface or a part thereof may change and may be detected accordingly.
According to step 26, based on steps 23, 24 and 25, in particular based on data generated during steps 23, 24 and 25, the relative positions of the VOI and the body surface are calculated, including, if necessary, a transformation of coordinates when one or more of steps 23, 24 and 25 has been performed using a different coordinate system than the other step(s).
According to optional step 27, a particular viewing direction relative to the VOI may be determined. The viewing direction may be a viewing direction based on a real-time position of the eyes of a surgeon, or may be from a different direction selected in an appropriate way, for example by positioning a scanning device to be used in step 25 in a particular position relative to the VOI. When no particular viewing direction is determined or selected, step 27 may be omitted.
According to step 28, a virtual image showing the relative positions of the VOI and the body surface is displayed on a display device to provide the surgeon performing the operation with real-time information about the VOI inside the human or animal body. The surgeon may plan his or her operation, in particular the cutting movements, in an optimal way by analyzing the virtual image as the operation progresses.
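The following illustrative sketch ties steps 24 to 28 together in a single refresh of the virtual image; the device interfaces (locate_marker, scan_surface, show_image) are hypothetical placeholders rather than an existing API, and a spherical VOI is assumed.

```python
# Illustrative sketch of one refresh of the virtual image (steps 24-26 and 28),
# using injected callables for the hypothetical device interfaces.
import numpy as np

def update_once(locate_marker, scan_surface, show_image,
                T_scanner_to_locator: np.ndarray, voi_radius_m: float) -> None:
    """One quasi-real-time update of the relative positions of the VOI and the body surface."""
    marker = locate_marker()                       # step 24: 3D marker location (3-vector)
    surface = scan_surface()                       # step 25: N x 3 array of body-surface points
    # Step 26: bring the scanned surface into the marker's coordinate system and compute,
    # per surface point, the remaining depth to the spherical VOI boundary.
    surface_h = np.c_[surface, np.ones(len(surface))]
    surface_common = (T_scanner_to_locator @ surface_h.T).T[:, :3]
    depth_to_voi = np.linalg.norm(surface_common - marker, axis=1) - voi_radius_m
    # Step 28: hand the geometry to the display side for rendering the virtual image.
    show_image(surface_common, marker, voi_radius_m, depth_to_voi)
```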
Figure 4 depicts a perspective view of a human patient 40 on an operation table 42, with components of an embodiment of the system of the present invention positioned above the operation table. It is assumed that the patient 40 has a tumor in the right breast, and that this tumor has previously been marked by a marker 57 inserted in the patient’s body at a specific point, such as near or in the tumor, more specifically in the center of gravity of the tumor. For the following explanation, it will be assumed that the marker 57 is a radioactive marker, sometimes also referred to as a radioactive seed. It is further assumed that the components discussed by reference to Figure 4 are part of a system as explained above by reference to Figure 2.
A first radiation detector embodied as a first gamma camera head 44, and a second radiation detector embodied as a second gamma camera head 45 are mounted above the operation table 42 on support structures 46 and 47, respectively. Each of the gamma camera heads 44, 45 may comprise a parallel hole collimator, or a multi-pinhole collimator. The centers of the substantially rectangular gamma camera heads 44, 45 are vertically above, and on a line parallel to, a longitudinal center line of the operation table 42. The gamma camera heads 44, 45 each are tilted to face a part of the body of the patient 40 where the tumor is situated, so as to enable locating radiation from the radioactive marker 57 from different directions. On gamma camera head 44, a first scanning camera 48 for scanning the body surface of the patient 40 from a first direction is mounted, whereas on gamma camera head 45 a second scanning camera 49 for scanning the body surface of the patient 40 from a second direction is mounted. The scanning cameras 48, 49 are facing the part of the body of the patient 40 where the tumor is situated so as to enable scanning this relevant body part. In other embodiments, the scanning cameras 48, 49 can be mounted independently from the gamma camera heads 44, 45, and at different positions depending on the circumstances of the surgery. In some embodiments, only one scanning camera is used. In the embodiment shown, the gamma camera heads 44, 45 and the scanning cameras 48, 49 are mounted so as not to impede each other's functioning.
It is noted that, if a gamma camera head 44 or 45 comprises a dual pinhole collimator, the marker location can be established with one of the gamma camera heads 44, 45, by using different sets of detector elements of said one gamma camera head each viewing the marker 57 from different positions.
A display device 50 is mounted on a support structure 52 in the vicinity of the operation table 42, and is configured to display virtual images 54, 56 based on the location of the radioactive marker 57 in the patient 40 established by one or both of the gamma camera heads 44, 45, and further based on the location of the body surface of the patient 40 scanned by the scanning cameras 48, 49. The location of the radioactive marker 57 provides the location of a specific point associated with a VOI 55 in the body of the patient 40. In the shown embodiment, scanning camera 48 and gamma camera heads 44, 45 provide virtual image 56, whereas virtual image 54 is provided by scanning camera 49 and gamma camera heads 44, 45. Based on the location of the specific point, the VOI 55 can be shown in the virtual images 54, 56 at the exact location. A scale of length 53 can be added to the virtual images to accurately show a location of the VOI relative to the body surface. Thereby, a surgeon planning his/her actions for performing the surgery to remove the VOI 55 is aided in determining the exact cutting locations.
A surgeon (not shown) may have an object attached to his/her head comprising a viewing direction sensing device 60 configured to sense a viewing direction 61 relative to the VOI 55 in the body of the patient 40. The viewing direction sensing device 60 may be a separate object coupled with the processing unit of the system of the invention, or may be connected to, for example, a pair of glasses 58, 59 worn by the surgeon. The system may be configured to display the virtual image showing the VOI 55 and the body surface from the viewing direction 61. So, optical scanning of the body surface by one or more scanning cameras may be performed from a direction different from the viewing direction, and the image data derived from the optical scanning by the one or more scanning cameras may be processed to display the virtual image from the viewing direction.
Furthermore, the pair of glasses 58, 59 or a similar device worn by the surgeon near the eyes may comprise a display device 64 for displaying to a surgeon's eye a virtual image showing the relative positions of the VOI 55 and the body surface, in the same or a similar way as the virtual image 54 or 56.
Thus, a system for providing visual information about a location of a VOI 55 having predetermined dimensions under a body surface of a human or animal body comprises a point locating device comprising the gamma camera heads 44, 45, and configured to locate a specific point (the marker 57) associated with the VOI 55, and to provide point location data. The system further comprises a scanning device comprising the scanning cameras 48, 49, and configured to locate the body surface, and to provide body surface location data. The system also comprises a data storage device configured to store VOI data relating to the predetermined dimensions of the VOI 55 and to the association between the specific point and the VOI 55. The system further comprises a processing unit configured to process the point location data, the body surface location data and the VOI data to calculate relative positions of the VOI 55 and the body surface, and to provide image data representing a virtual image 54, 56 showing the relative positions of the VOI 55 and the body surface. The system also comprises a display device 50, 64 configured to display the virtual image 54, 56 based on the image data.
In the description of Figure 4 above, it has been assumed that the marker 57 is a radioactive marker of which a location in space can be determined with the aid of gamma camera heads 44, 45. In an alternative embodiment, the marker is an electromagnetic marker which can be located by an electromagnetic marker locating device 62 arranged in the vicinity of the patient 40. The electromagnetic marker may be an active marker, i.e. a marker provided with an energy source and actively transmitting location data to the marker locating device 62, e.g. when receiving a command from the marker locating device 62, or can be a passive marker, i.e. a marker without an energy source, only transmitting location data when receiving energy from the marker locating device 62.
As explained above, in a method and system, visual information is provided about a location of a volume of interest, VOI, having predetermined dimensions under a body surface of a human or animal body. A specific point associated with the VOI is located by detecting radiation from a radioactive marker at the specific point. The body surface is located by optically scanning the body surface. Relative positions of the VOI and the body surface are calculated based on the located specific point, the predetermined dimensions, and the located body surface. A virtual image showing the relative positions of the VOI and the body surface is displayed on a display device. The VOI may be a spherical volume having the specific point at its center. A viewing direction relative to the VOI may be determined, whereby the virtual image may be displayed showing the VOI and the body surface from the viewing direction.
Application of the method and system is in tumor surgery in a human or animal body, where a specific point is marked with a marker. Based on the size of the tumor and the location of the specific point, the specific point is associated with a VOI having predetermined dimensions, and representing a volume of tissue to be removed from the body. The VOI includes the tumor and surrounding tissue. During surgery, visual information is provided about a location of the VOI, by: locating a specific point associated with the VOI; locating the body surface; calculating relative positions of the VOI and the body surface based on the located specific point, the predetermined dimensions, and the located body surface; and displaying a virtual image showing the relative positions of the VOI and the body surface. The surgeon removes the VOI from the human or animal body based on the visual information about the location of the VOI.
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting, but rather, to provide an understandable description of the invention.
The terms "a" or "an", as used herein, are defined as one or more than one. The term plurality, as used herein, is defined as two or more than two. The term another, as used herein, is defined as at least a second or more. The terms including and/or having, as used herein, are defined as comprising (i.e., open language, not excluding other elements or steps). Any reference signs in the claims should not be construed as limiting the scope of the claims or the invention.
The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The term coupled, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.
A single processor or other processing unit may fulfil the functions of several items recited in the claims.
The terms program, software application, and the like as used herein, are defined as a sequence of instructions designed for execution on a computer system. A program, computer program, or software application may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Claims (34)

1. Werkwijze voor het verschaffen van visuele informatie over een locatie van een interessevolume, VOI, met vooraf bepaalde afmetingen onder een lichaamsoppervlak van een menselijk of dierlijk lichaam, waarbij de werkwijze omvat: het lokaliseren van een specifiek punt dat is geassocieerd met het VOI; het lokaliseren van het lichaamsoppervlak; het berekenen van relatieve posities van het VOI en het lichaamsoppervlak op basis van het gelokaliseerde specifieke punt, de vooraf bepaalde afmetingen, en het gelokaliseerde lichaamsoppervlak; en het weergeven van een virtueel beeld dat de relatieve posities van het VOI en het lichaamsoppervlak toont.A method for providing visual information about a location of an interest volume, VOI, with predetermined dimensions below a body surface of a human or animal body, the method comprising: locating a specific point associated with the VOI; locating the body surface; calculating relative positions of the VOI and body surface based on the localized specific point, the predetermined dimensions, and the localized body surface; and displaying a virtual image showing the relative positions of the VOI and the body surface. 2. Werkwijze volgens conclusie 1, waarbij de stap van het lokaliseren van het specifieke punt het lokaliseren van een merkorgaan onder het lichaamsoppervlak omvat.The method of claim 1, wherein the step of locating the specific point comprises locating a marker under the body surface. 3. Werkwijze volgens conclusie 2, waarbij het merkorgaan een elektromagnetisch merkorgaan is.The method of claim 2, wherein the marker is an electromagnetic marker. 4. Werkwijze volgens conclusie 2, waarbij het merkorgaan een radioactief merkorgaan is.The method of claim 2, wherein the marker is a radioactive marker. 5. Werkwijze volgens conclusie 4, waarbij de stap van het lokaliseren van het radioactieve merkorgaan omvat: het detecteren van een stralingsverdeling afkomstig van het merkorgaan; en het berekenen van een locatie van het merkorgaan aan de hand van de gedetecteerde stralingsverdeling.The method of claim 4, wherein the step of locating the radioactive marker comprises: detecting a radiation distribution from the marker; and calculating a location of the marker on the basis of the detected radiation distribution. 6. Werkwijze volgens een van de voorgaande conclusies, waarbij de stap van het lokaliseren van het lichaamsoppervlak het optisch aftasten van het lichaamsoppervlak omvat.The method of any one of the preceding claims, wherein the step of locating the body surface comprises optically scanning the body surface. 7. Werkwijze volgens conclusie 4, waarbij de stap van het lokaliseren van het radioactieve merkorgaan omvat: het detecteren van een stralingsverdeling afkomstig van het merkorgaan op een eerste positie; het detecteren van een stralingsverdeling afkomstig van het merkorgaan op een tweede positie die verschilt van de eerste positie; en het berekenen van een locatie van het merkorgaan aan de hand van de stralingsverdelingen die zijn gedetecteerd op de eerste en tweede posities.The method of claim 4, wherein the step of locating the radioactive marker comprises: detecting a radiation distribution from the marker at a first position; detecting a radiation distribution from the marker at a second position different from the first position; and calculating a location of the marker based on the radiation distributions detected at the first and second positions. 8. 
Werkwijze volgens conclusie 7, waarbij de stap van het lokaliseren van het lichaamsoppervlak het optisch aftasten van het lichaamsoppervlak vanuit een derde positie die verschilt van de eerste en tweede posities omvat.The method of claim 7, wherein the step of locating the body surface comprises optically scanning the body surface from a third position different from the first and second positions. 9. Werkwijze volgens een of meer van de voorgaande conclusies, verder omvattende: het weergeven, in genoemd virtueel beeld, van een lengteschaal, of een geschaalde afstand tussen een grensoppervlak van het VOI en het lichaamsoppervlak.The method of any one of the preceding claims, further comprising: displaying, in said virtual image, a length scale, or a scaled distance between a boundary surface of the VOI and the body surface. 10. Werkwijze volgens een of meer van de voorgaande conclusies, verder omvattende: het bepalen van een zichtrichting ten opzichte van het VOI; en het weergeven van het virtuele beeld dat het VOI en het lichaamsoppervlak in de zichtrichting toont.The method of any one of the preceding claims, further comprising: determining a viewing direction relative to the VOI; and displaying the virtual image showing the VOI and body surface in the viewing direction. 11. Werkwijze volgens conclusie 8, verder omvattende: het bepalen van een zichtrichting ten opzichte van het VOI; het selecteren van de derde positie om samen te vallen met de zichtrichting; en het weergeven van het virtuele beeld dat het VOI en het lichaamsoppervlak in de zichtrichting toont.The method of claim 8, further comprising: determining a viewing direction relative to the VOI; selecting the third position to coincide with the viewing direction; and displaying the virtual image showing the VOI and body surface in the viewing direction. 12. Werkwijze volgens conclusie 10 of 11, waarbij de stap van het bepalen van de zichtrichting omvat: het bepalen van een locatie die is gerelateerd aan de ogen van een chirurg ten opzichte van het VOI.The method of claim 10 or 11, wherein the step of determining the viewing direction comprises: determining a location related to a surgeon's eyes relative to the VOI. 13. Werkwijze volgens conclusie 12, waarbij de locatie van de ogen van de chirurg wordt bepaald door het meten van een locatie van een object dat is geassocieerd met het hoofd van de chirurg.The method of claim 12, wherein the location of the surgeon's eyes is determined by measuring a location of an object associated with the surgeon's head. 14. Werkwijze volgens een of meer van de voorgaande conclusies, waarbij het VOI een bolvormig volume is.The method of any one of the preceding claims, wherein the VOI is a spherical volume. 15. Werkwijze volgens conclusie 14, waarbij het specifieke punt zich in het centrum van het bolvormige volume bevindt.The method of claim 14, wherein the specific point is in the center of the spherical volume. 16. 
Systeem voor het verschaffen van visuele informatie over een locatie van een interessevolume, VOI, met vooraf bepaalde afmetingen onder een lichaamsoppervlak een menselijk of dierlijk lichaam, waarbij het systeem omvat: een puntlokaliseerinrichting die is geconfigureerd voor het lokaliseren van een specifiek punt dat met het VOI is geassocieerd, en voor het verschaffen van puntlocatiedata; een aftastinrichting die is geconfigureerd voor het lokaliseren van het lichaamsoppervlak, en voor het verschaffen van lichaamsoppervlaklocatiedata; een dataopslaginrichting die is geconfigureerd voor het opslaan van VOI-data die betrekking hebben op de vooraf bepaalde afmetingen van het VOI en op de associatie tussen het specifieke punt en het VOI; een verwerkingseenheid die is geconfigureerd voor het verwerken van de puntlocatiedata, de lichaamsoppervlaklocatiedata en de VOI-data voor het berekenen van relatieve posities van het VOI en het lichaamsoppervlak, en voor het verschaffen van beelddata die een virtueel beeld representeren dat de relatieve posities van het VOI en het lichaamsoppervlak toont; en een weergave-inrichting die is geconfigureerd voor het weergeven van het virtuele beeld op basis van de beelddata.A system for providing visual information about a location of an interest volume, VOI, with predetermined dimensions below a body surface, a human or animal body, the system comprising: a point locator configured to locate a specific point that has the VOI is associated, and to provide point location data; a sensing device configured to locate the body surface, and to provide body surface location data; a data storage device configured to store VOI data relating to the predetermined dimensions of the VOI and to the association between the specific point and the VOI; a processor configured to process the point location data, the body surface location data and the VOI data for calculating relative positions of the VOI and the body surface, and for providing image data representing a virtual image representing the relative positions of the VOI and shows the body surface; and a display device configured to display the virtual image based on the image data. 17. Systeem volgens conclusie 16, waarbij de puntlokaliseerinrichting is geconfigureerd voor het lokaliseren van een merkorgaan onder het lichaamsoppervlak.The system of claim 16, wherein the point locator is configured to locate a marker below the body surface. 18. Systeem volgens conclusie 17, waarbij het merkorgaan een elektromagnetisch merkorgaan is.The system of claim 17, wherein the marker is an electromagnetic marker. 19. Systeem volgens conclusie 18, waarbij de puntlokaliseerinrichting een merkorgaanlokaliseerinrichting omvat die is geconfigureerd voor het detecteren van een locatie van het elektromagnetische merkorgaan.The system of claim 18, wherein the point locator comprises a marker locator configured to detect a location of the electromagnetic marker. 20. Systeem volgens conclusie 17, waarbij het merkorgaan een radioactief merkorgaan is.The system of claim 17, wherein the marker is a radioactive marker. 21. 
Systeem volgens conclusie 20, waarbij de puntlokaliseerinrichting omvat: ten minste een stralingsdetector die is geconfigureerd voor het detecteren van een stralingsverdeling afkomstig van het merkorgaan; en waarbij de verwerkingseenheid is geconfigureerd voor het berekenen van de puntlocatiedata aan de hand van de gedetecteerde stralingsverdeling.The system of claim 20, wherein the spot locator comprises: at least one radiation detector configured to detect a radiation distribution from the marker; and wherein the processing unit is configured to calculate the point location data based on the detected radiation distribution. 22. Systeem volgens een of meer van de conclusies 16-21, waarbij de aftastinrichting is geconfigureerd voor het optisch aftasten van het lichaamsoppervlak.The system of any one of claims 16 to 21, wherein the sensing device is configured to optically scan the body surface. 23. Systeem volgens conclusie 20, waarbij de puntlokaliseerinrichting omvat: een eerste stralingsdetector die is geconfigureerd voor het detecteren van een verdeling van straling afkomstig van het merkorgaan op een eerste positie; een tweede stralingsdetector die is geconfigureerd voor het detecteren van een verdeling van straling afkomstig van het merkorgaan op een tweede positie die verschilt van de eerste positie; en waarbij de verwerkingseenheid is geconfigureerd voor het berekenen van de puntlocatiedata aan de hand van de stralingsverdelingen die zijn gedetecteerd door de eerste en tweede stralingsdetectoren.The system of claim 20, wherein the spot locator comprises: a first radiation detector configured to detect a distribution of radiation from the marker at a first position; a second radiation detector configured to detect a distribution of radiation from the marker at a second position different from the first position; and wherein the processing unit is configured to calculate the point location data based on the radiation distributions detected by the first and second radiation detectors. 24. Systeem volgens conclusie 23, waarbij de aftastinrichting is geconfigureerd voor het optisch aftasten van het lichaamsoppervlak vanuit een derde positie die verschilt van de eerste en tweede posities.The system of claim 23, wherein the sensing device is configured to optically scan the body surface from a third position different from the first and second positions. 25. Systeem volgens conclusie 21 of 23, waarbij de stralingsdetector een gammacamerakop omvat.The system of claim 21 or 23, wherein the radiation detector comprises a gamma camera head. 26. Systeem volgens conclusie 25, waarbij de stralingsdetector verder een multi-pengatcollimator, in het bijzonder een dubbelpengatcollimator omvat.The system of claim 25, wherein the radiation detector further comprises a multi-pin hole collimator, in particular a double-pin hole collimator. 27. Systeem volgens conclusie 23, waarbij de eerste en tweede stralingsdetectoren elk een gammacamerakop en een collimator met evenwijdige gaten omvatten.The system of claim 23, wherein the first and second radiation detectors each comprise a gamma camera head and a collimator with parallel holes. 28. 
28. The system of any one of claims 16-27, wherein the processing unit is further configured to provide image data that represent, in the virtual image, a length scale or a scaled distance between a boundary surface of the VOI and the body surface.

29. The system of claim 22 or 24, wherein the scanning device comprises a time-of-flight camera, in particular an infrared time-of-flight camera.

30. The system of any one of claims 16-29, further comprising a viewing-direction measuring device configured to measure a viewing direction relative to the VOI and to provide viewing direction data, wherein the processing unit is further configured to provide image data on the basis of the viewing direction data for displaying the virtual image showing the VOI and the body surface in the viewing direction.

31. The system of claim 24, further comprising a viewing-direction measuring device configured to measure a viewing direction relative to the VOI and to provide viewing direction data, the system being configured to select a position of the scanning device such that the third position coincides with the viewing direction, and wherein the processing unit is further configured to provide image data on the basis of the viewing direction data for displaying the virtual image showing the VOI and the body surface in the viewing direction.

32. The system of claim 30 or 31, wherein the viewing-direction measuring device is configured to measure a location of the eyes of a surgeon relative to the VOI.
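Claims 30-32 concern displaying the virtual image along a measured viewing direction, for example that of the surgeon's eyes. A minimal sketch of one possible rendering step, assuming a simple orthographic projection onto a plane perpendicular to the viewing direction (an assumption of this sketch, not a feature stated in the claims), could look as follows; project_along_view and the example coordinates are hypothetical.

```python
# Illustrative sketch only -- one possible way to render the scene of
# claims 30-32 along a measured viewing direction (orthographic projection).
import numpy as np

def project_along_view(points, view_dir):
    """Project 3-D points onto a plane perpendicular to view_dir and return
    their 2-D image-plane coordinates."""
    view_dir = np.asarray(view_dir, float)
    view_dir = view_dir / np.linalg.norm(view_dir)

    # Build an orthonormal basis (right, up) spanning the image plane.
    helper = np.array([0.0, 0.0, 1.0])
    if abs(view_dir @ helper) > 0.9:          # avoid a degenerate cross product
        helper = np.array([0.0, 1.0, 0.0])
    right = np.cross(view_dir, helper)
    right /= np.linalg.norm(right)
    up = np.cross(right, view_dir)

    pts = np.asarray(points, float)
    return np.column_stack([pts @ right, pts @ up])

# Project the VOI centre for a viewer looking along the negative z-axis
# (all coordinates are made-up example values).
voi_2d = project_along_view([[0.0, 0.0, -30.0]], view_dir=[0, 0, -1])
print(voi_2d)  # the VOI centre lands at the origin of the image plane
```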
33. A computer program comprising computer instructions which, when loaded into the processing unit of the system of claim 16, cause the processing unit to process the point location data, the body surface location data and the VOI data in order to calculate relative positions of the VOI and the body surface, and to provide image data representing a virtual image that shows the relative positions of the VOI and the body surface.

34. A computer program product comprising a data carrier having stored thereon the computer instructions of the computer program according to claim 33.
NL2011735A 2013-11-05 2013-11-05 Method and system for imaging a volume of interest in a human or animal body. NL2011735C2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
NL2011735A NL2011735C2 (en) 2013-11-05 2013-11-05 Method and system for imaging a volume of interest in a human or animal body.
EP14800161.3A EP3065627A1 (en) 2013-11-05 2014-11-05 Method and system for imaging a volume of interest in a human or animal body
PCT/NL2014/050766 WO2015069106A1 (en) 2013-11-05 2014-11-05 Method and system for imaging a volume of interest in a human or animal body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NL2011735A NL2011735C2 (en) 2013-11-05 2013-11-05 Method and system for imaging a volume of interest in a human or animal body.
NL2011735 2013-11-05

Publications (1)

Publication Number Publication Date
NL2011735C2 true NL2011735C2 (en) 2015-05-07

Family

ID=50001221

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2011735A NL2011735C2 (en) 2013-11-05 2013-11-05 Method and system for imaging a volume of interest in a human or animal body.

Country Status (3)

Country Link
EP (1) EP3065627A1 (en)
NL (1) NL2011735C2 (en)
WO (1) WO2015069106A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11523869B2 (en) 2015-11-20 2022-12-13 Stichting Het Nederlands Kanker Instituut—Antoni van Leeuwenhoek Ziekenhuis Method and system of providing visual information about a location and shape of a tumour under a body surface of a human or animal body
KR102315803B1 (en) * 2019-12-16 2021-10-21 쓰리디메디비젼 주식회사 Supporter for medical camera

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100016713A1 (en) * 2008-07-16 2010-01-21 Dilon Technologies, Inc. Fiducial marker and method for gamma guided stereotactic localization
US20110164728A1 (en) * 2008-09-03 2011-07-07 Hitachi, Ltd. Radiation imaging apparatus
US20130168570A1 (en) * 2010-06-23 2013-07-04 Technische Universitat Munchen Device and method for combined optical and nuclear image acquisition
WO2014041512A1 (en) * 2012-09-13 2014-03-20 Koninklijke Philips N.V. Breast surgery guidance based on breast mr images and radioactive markers


Also Published As

Publication number Publication date
EP3065627A1 (en) 2016-09-14
WO2015069106A1 (en) 2015-05-14

Similar Documents

Publication Publication Date Title
US20220192611A1 (en) Medical device approaches
US20210212772A1 (en) System and methods for intraoperative guidance feedback
US20190321065A1 (en) Methods and systems for performing navigation-assisted medical procedures
CN105873517B (en) With automatic isocentric intervention x-ray system
US20180308247A1 (en) Tissue imaging system and method for tissue imaging
RU2464931C2 (en) Device for determining position of first object inside second object
EP2951779B1 (en) Three-dimensional image segmentation based on a two-dimensional image information
JP5837261B2 (en) Multi-camera device tracking
EP3079622B1 (en) Radiation-free registration of an optical shape sensing system to an imaging system
EP2760360B1 (en) Self-localizing medical device
EP2438880A1 (en) Image projection system for projecting image on the surface of an object
US9443161B2 (en) Methods and systems for performing segmentation and registration of images using neutrosophic similarity scores
US20210052329A1 (en) Monitoring of moving objects in an operation room
US10687900B2 (en) Method of image support for a person carrying out a minimally invasive procedure with an instrument in a procedure site of a patient, X-ray apparatus, computer program and electronically readable data carrier
US11596369B2 (en) Navigation system for vascular intervention and method for generating virtual x-ray image
CA2963865C (en) Phantom to determine positional and angular navigation system error
JP7463625B2 (en) Navigation Support
US20150301439A1 (en) Imaging Projection System
NL2011735C2 (en) Method and system for imaging a volume of interest in a human or animal body.
Shahin et al. Ultrasound-based tumor movement compensation during navigated laparoscopic liver interventions
Octorina Dewi et al. Position tracking systems for ultrasound imaging: a survey
Bekke et al. Optical surface scanning for respiratory motion monitoring in radiotherapy: a feasibility study
WO2024097249A1 (en) 3d spatial mapping in a 3d coordinate system of an ar headset using 2d images

Legal Events

Date Code Title Description
MM Lapsed because of non-payment of the annual fee

Effective date: 20161201