WO2005099581A1 - Ultrasound calibration and real-time quality assurance based on closed form formulation - Google Patents

Ultrasound calibration and real-time quality assurance based on closed form formulation

Info

Publication number
WO2005099581A1
WO2005099581A1 (PCT/US2005/013026; US 2005013026 W)
Authority
WO
WIPO (PCT)
Prior art keywords
phantom
ultrasound
ultrasound probe
probe
orientation
Prior art date
Application number
PCT/US2005/013026
Other languages
French (fr)
Inventor
Emad Moussa Boctor
Gregory D. Hager
Gabor Fichtinger
Anand Viswanathan
Original Assignee
Johns Hopkins University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Johns Hopkins University filed Critical Johns Hopkins University
Priority to EP05754852.1A priority Critical patent/EP1744676B1/en
Priority to US11/578,071 priority patent/US7867167B2/en
Publication of WO2005099581A1 publication Critical patent/WO2005099581A1/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/58 Testing, adjusting or calibrating the diagnostic device
    • A61B8/587 Calibration phantoms
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/5205 Means for monitoring or calibrating
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data

Definitions

  • the present invention involves the field of ultrasound imagery. More particularly, the present invention involves spatial calibration of ultrasound probes for intra-operative use.
  • the ultrasound system 100 includes a transmitter 105 having a transmitter reference frame 130; and an ultrasound probe 110 having a probe reference frame 135.
  • the ultrasound probe 110 transmits and receives energy in a scan plane 142, and projects a plurality of pixels 140 in a pixel reference frame 145.
  • a conventional ultrasound system 100 may also include tracking sensors 125 to monitor the position and orientation of the ultrasound probe 110.
  • the ultrasound system 100 is used to collect multiple 2D ultrasound images, which are assembled into a 3D image space 155 having a construction reference frame 150 (hereinafter "construction frame").
  • construction frame: a construction reference frame 150
  • 2D ultrasound images acquired by the ultrasound system 100 must be registered or mapped in real-time into a 3D image space 155, which encompasses a target volume within the patient undergoing surgery.
  • ultrasound probes that acquire 3D images need to be spatially calibrated as well.
  • Registering pixels from pixel reference frame 145 to the 3D image space 155 requires a transformation matrix encompassing a series of constituent coordinate transformation matrices: e.g., from the pixel frame 145 to the ultrasound probe reference frame 135; from the ultrasound probe frame 135 to the transmitter reference frame 130; and from the transmitter reference frame 130 to the construction frame 150.
  • of these transformation matrices, the most difficult to determine is the transformation matrix from the pixel reference frame 145 to the ultrasound probe reference frame 135 (hereinafter the "probe calibration matrix").
  • probe calibration matrix: the transformation matrix from the pixel reference frame 145 to the ultrasound probe reference frame 135
  • the ultrasound probe 110 is placed and oriented such that it acquires an image of a calibration target, or phantom, which has well defined spatial features.
  • image processing techniques such as segmentation
  • the well defined features of the phantom are identified and located in the acquired ultrasound image, and the position and orientation of the phantom is derived from the segmented image.
  • images are acquired with the ultrasound probe 110 placed in a single position and orientation. If the position and orientation of the phantom are known relative to the construction frame 150, the probe calibration matrix can be derived.
  • the orientation of the phantom may be determined relative to the orientation of the ultrasound probe, and the probe calibration matrix may be derived by correlating the segmented images of the phantom with the phantom's known spatial characteristics.
  • Image processing techniques such as segmentation are computationally intensive and may not be feasible to compute in real time, based on the number of images acquired. Typical segmentation is performed on several hundred images. The large number of images not only requires time to process, but it increases the likelihood of errors that may render the probe calibration matrix invalid.
  • a pixel 140 may be registered into the 3D image space 155 defined by the construction frame 150.
  • the transformation of a pixel 140 location from the pixel reference frame 145 to the construction frame 150 can be expressed as: x_C = T_C T_R T_P x_P, where T_P is the coordinate transformation matrix from the pixel reference frame 145 to the probe reference frame 135; T_R is the coordinate transformation from the probe reference frame 135 to the transmitter reference frame 130; and T_C is the coordinate transformation from the transmitter reference frame 130 to the construction frame 150.
  • the central difficulty in spatial calibration is in determining the probe calibration matrix T_P.
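The chain of constituent transformations can be sketched with 4x4 homogeneous matrices. The rotations and translations below are illustrative placeholders, not values from the patent:

```python
import numpy as np

def hom(R, t):
    """Pack a 3x3 rotation R and a translation t into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative transforms (identity rotations, made-up translations):
T_P = hom(np.eye(3), [1.0, 0.0, 0.0])   # pixel frame -> probe frame (calibration)
T_R = hom(np.eye(3), [0.0, 2.0, 0.0])   # probe frame -> transmitter frame (tracking)
T_C = hom(np.eye(3), [0.0, 0.0, 3.0])   # transmitter frame -> construction frame

# A pixel location in homogeneous coordinates maps into the 3D image space by
# composing the chain right-to-left: x_C = T_C @ T_R @ T_P @ x_pixel.
x_pixel = np.array([0.5, 0.25, 0.0, 1.0])
x_C = T_C @ T_R @ T_P @ x_pixel   # x_C[:3] == [1.5, 2.25, 3.0]
```

Each constituent matrix moves coordinates one frame along the chain; only T_P (the probe calibration matrix) is unknown and must be estimated.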
  • the present invention is directed to ultrasound calibration and real-time quality assurance based on closed form formulation that substantially obviates one or more of the problems due to limitations and disadvantages of the related art. In general, the present invention achieves this by deriving a probe calibration matrix in closed form from relative differences between images acquired at two or more probe positions.
  • An advantage of the present invention is to provide more reliable real-time ultrasound-based 3D imagery for use during medical procedures in that the ultrasound probe may be spatially calibrated intra-operatively. This helps mitigate post-calibration changes that may degrade the accuracy of 3D imagery without warning.
  • Another advantage of the present invention is to provide a more efficient and robust spatial calibration of an ultrasound probe. By spatially calibrating the ultrasound probe based on the relative differences between two or more images of the same phantom, the resulting calibration is less dependent on the precision to which the spatial characteristics of the phantom are known.
  • Another advantage of the present invention is to simplify the ultrasound probe calibration process.
  • the present invention identifies pixels corresponding to prominent feature points on a phantom, as opposed to segmenting an image in order to reconstruct an image of the phantom, which is more computationally intensive.
  • a method for spatially calibrating an ultrasound probe comprises placing the ultrasound probe in a first position and orientation relative to a phantom; measuring the first position and orientation of the ultrasound probe; acquiring a first ultrasound image of the phantom; determining a first transformation matrix corresponding to a phantom reference frame and a pixel reference frame, based on the first ultrasound image; repositioning the ultrasound probe in a second position and orientation relative to the phantom; measuring the second position and orientation of the ultrasound probe; acquiring a second ultrasound image of the phantom; determining a second transformation matrix corresponding to the phantom reference frame and the pixel reference frame, based on the second ultrasound image; and computing a probe calibration matrix based on the first position and orientation of the ultrasound probe, the first transformation matrix, the second position and orientation of the ultrasound probe, and the second transformation matrix.
  • a system for performing intra-operative calibration of an ultrasound probe comprises a position and angle encoder for measuring a position and angle of the ultrasound probe; and a data system having a computer readable medium encoded with a program for computing a probe calibration matrix according to a closed form formulation, and according to relative changes between the locations of prominent feature points in a first and a second ultrasound image, wherein the first ultrasound image corresponds to a first ultrasound probe position, and the second ultrasound image corresponds to a second ultrasound probe position.
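The relative-motion constraint behind this closed form can be checked algebraically. The sketch below adopts one plausible set of direction conventions (a fixed pixel-to-probe calibration X, tracker-reported probe poses F_i, image-derived pixel-to-phantom transforms Q_i, and a fixed phantom pose W); these names and conventions are assumptions for illustration, not the patent's notation:

```python
import numpy as np

def hom(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rand_rigid(rng):
    """Random rigid transform: QR-based rotation (det forced to +1) plus translation."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]
    return hom(Q, rng.normal(size=3))

rng = np.random.default_rng(7)
X = rand_rigid(rng)                        # unknown pixel -> probe calibration
W = rand_rigid(rng)                        # fixed phantom -> transmitter transform
Q1, Q2 = rand_rigid(rng), rand_rigid(rng)  # pixel -> phantom at poses 1 and 2 (from images)
# The same physical mapping seen two ways gives W @ Q_i == F_i @ X, so the
# tracker would report F_i = W @ Q_i @ inv(X) at each pose.
F1 = W @ Q1 @ np.linalg.inv(X)
F2 = W @ Q2 @ np.linalg.inv(X)

A = np.linalg.inv(Q2) @ Q1                 # relative motion derived from the images
B = np.linalg.inv(F2) @ F1                 # relative motion from the pose encoders
# The fixed calibration intertwines the two relative motions: B @ X == X @ A.
assert np.allclose(B @ X, X @ A)
```

Which side each relative motion lands on depends on the direction chosen for each constituent transform; either way the constraint has the standard hand-eye form with the calibration matrix as the unknown.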
  • FIG. 1 illustrates components of an ultrasound imaging system according to the related art
  • FIG. 2 illustrates an exemplary ultrasound imaging system according to the present invention
  • FIG. 3 illustrates an exemplary spatial calibration process according to the present invention
  • FIG. 4 illustrates an exemplary phantom according to the present invention
  • FIG. 5 illustrates an exemplary ultrasound imaging system, which includes at least one docking station
  • FIG. 6 illustrates an exemplary ultrasound imaging system that uses double-wedge phantoms
  • FIGs. 7A-7D illustrate the effects of misalignment and offset between an ultrasound probe and a double-wedge phantom
  • FIGs. 8A-D illustrate ultrasound images, and how misalignment and offset between an ultrasound probe and a double-wedge phantom are apparent in the images
  • FIG. 9 illustrates an exemplary double-wedge phantom according to the present invention
  • FIG. 10 illustrates an exemplary process for performing bootstrap calibration of an ultrasound probe according to the present invention.
  • the imaging system 200 includes an ultrasound transmitter 205 having a transmitter reference frame 230; an ultrasound probe 210 having a probe reference frame 235; position and angle encoders 216 for measuring the position and orientation of the probe reference frame 235 relative to the transmitter reference frame 230; an ultrasound processor 215 for providing power and signals to, and receiving signals from, the ultrasound transmitter 205 and the ultrasound probe 210; a data system 220 for sending commands to and receiving data from the ultrasound processor 215 and the position and angle encoders 216; and a user interface 225 connected to the data system 220.
  • the ultrasound probe 210 may transmit and receive energy in a scan plane 242, which includes a plurality of pixels 240 within the scan plane 242 and having a pixel reference frame 245.
  • the exemplary system 200 acquires ultrasound images, through use of the ultrasound probe 210, within a 3D image space 255 having a construction reference frame 250.
  • the exemplary system 200 may include one or more phantoms 260 and 265, which are located such that they can be imaged by the ultrasound probe 210, and wherein the phantoms 260 and 265 may be acoustically coupled to a target (not shown) to be imaged within the 3D image space 255.
  • FIG. 2 further illustrates a single ultrasound probe 210 in two separate positions, Position 1 and 2, in which the probe 210 may acquire images of the phantoms 260 and 265.
  • there may be a single phantom, which may be imaged by the ultrasound probe 210 from multiple positions and orientations.
  • phantom 260 will be referred to in the case in which there is a single phantom.
  • As used herein, the term "matrix," as in the probe calibration matrix T_P, may refer to any representation of a spatial relationship between coordinate frames.
  • this embodiment of the present invention may employ a SONOLINE™ Antares ultrasound scanner manufactured by Siemens Medical Solutions, USA, Inc., Ultrasound Division, Issaquah, WA, with a Siemens VF 10-5 linear array probe held in a rigid attachment mounted on an adjustable arm.
  • the position and angle encoders 216 include multiple optical markers attached to the ultrasound probe 210, which are tracked using, for example, an OPTOTRAK™ device, manufactured by Northern Digital, Inc.
  • the data system 220 may include one or more computers, which may be networked together either locally or over a network.
  • the data system 220 includes software (hereinafter "the software") for implementing processes according to the present invention.
  • the software may be stored and run on the data system 220, or may be stored and run in a distributed manner between the data system 220, the ultrasound processor 215, and the user interface 225.
  • FIG. 3 illustrates an exemplary process 300 for providing real-time spatial calibration according to the present invention, which may be implemented by the software.
  • Process 300 may be used in conjunction with system 200, illustrated in FIG. 2, in which a single phantom 260 is used.
  • the ultrasound probe 210 is placed in position 1 of N, wherein N may be at least three. Position 1 may be arbitrary or predetermined. Either way, the position should be such that the phantom 260 is within the scan plane 242 of the ultrasound probe 210, wherein prominent feature points within the phantom 260 are readily identifiable in the acquired ultrasound image.
  • the software acquires position and angle data of ultrasound probe 210 from the position and angle encoders 216 and stores the corresponding data values in memory.
  • the software may acquire and store position and angle data of the ultrasound probe 210 exclusively while the ultrasound probe 210 is in position 1, or the software may continuously acquire and store position and angle data values throughout exemplary process 300.
  • the software may provide time tag information corresponding to the position and angle data such that the time tag data may be used to synchronize the position and angle data with the ultrasound data acquired from the ultrasound processor 215.
  • the ultrasound processor 215 acquires and processes ultrasound image data from the ultrasound probe 210 while the ultrasound probe is held in position 1.
  • the software receives ultrasound image data from the ultrasound processor 215 and stores the corresponding data values in memory.
  • the software may acquire ultrasound data continuously throughout exemplary process 300, along with time tag data, and may store the ultrasound and time tag data values so that the ultrasound data may be synchronized with similarly time tagged position and angle data acquired from the position and angle encoders 216. If the data system 220 continuously acquires and stores ultrasound data values throughout exemplary process 300, the data system may additionally acquire and store data from the user interface 225, along with corresponding time tag data, which may provide a flag indicating that ultrasound data values corresponding to a given time were acquired while the ultrasound probe was in position 1.
  • [0045] In step 340, prominent feature points corresponding to the phantom 260 are identified from the ultrasound data acquired in step 330, as illustrated in FIG. 4.
  • FIG. 4 illustrates an exemplary phantom 260, along with its reference frame 410, and a scan plane 242 impinging on the phantom 260.
  • the phantom 260 may include a matrix of N-shaped wires stretched between two parallel plates.
  • In order for the phantom 260 to be used intra-operatively, it should be acoustically coupled with a target volume, such as a patient undergoing surgery, such that the user may periodically position the ultrasound probe 210 in a given position 1 and position 2 during an operation.
  • When being imaged by the ultrasound probe 210, the scan plane 242 may intersect a plane defined by the phantom at points E, K, and Z, as illustrated in FIG. 4.
  • In step 350, with the coordinates of the center point K determined, the software then computes a coordinate transformation between the pixel reference frame 245 and the phantom reference frame 410.
  • the transformation may be accomplished by, for example, Horn's quaternion rigid registration method, as described in B. Horn, Closed-form solution of absolute orientation using unit quaternions, Journal of the Optical Society of America A, Vol. 4, page 629, April 1987, which is incorporated herein by reference.
  • Other techniques may be used, such as those employed to transform a set of points between coordinate systems, as is done in the fields of photogrammetry.
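Closed-form rigid registration of this kind can be sketched as follows. This example uses the SVD-based formulation (Arun/Umeyama), which yields the same rigid transform as Horn's quaternion method in non-degenerate cases; the point sets are synthetic:

```python
import numpy as np

def rigid_register(P, Q):
    """Closed-form least-squares rigid transform (R, t) with Q ~= R @ P + t,
    for 3xN corresponding point sets: centroid subtraction followed by SVD
    of the cross-covariance (equivalent to Horn's quaternion result)."""
    cP = P.mean(axis=1, keepdims=True)
    cQ = Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((P - cP) @ (Q - cQ).T)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: recover a known rotation and translation from 6 matched points.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 6))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([[1.0], [2.0], [3.0]])
R, t = rigid_register(P, R_true @ P + t_true)
```

With exact correspondences the recovered (R, t) match the true transform; with noisy feature points the same formula gives the least-squares fit.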
  • In step 360, the software determines if there are more positions at which to acquire ultrasound data. If so, steps 310-350 are repeated for a new position. The next position may be chosen arbitrarily, or determined prior to executing the exemplary process 300. The next position should be chosen such that the phantom 260 is located within the scan plane 242 of the ultrasound probe 210, and that prominent feature points on the phantom 260 will be visible in the ultrasound imagery acquired by the ultrasound probe 210, as illustrated in FIG. 4. In a particular embodiment of the present invention, steps 310-350 are iterated three times.
  • In step 375, the software retrieves the stored data values for the following: the translation and rotation of the phantom 260 from the phantom reference frame 410 to the pixel reference frame 245 when the ultrasound probe 210 was in each position; and the position and angle of the ultrasound probe 210, as measured by the position and angle encoders 216, when the ultrasound probe was in each position.
  • In step 375, the software assembles this data into a closed form formulation of the form AX = XB, wherein:
  • A is the relative coordinate transformations between the locations of the respective pixels corresponding to the prominent feature points of the phantom
  • B is the relative coordinate transformation between the ultrasound probe reference frame at position 1 and position 2, as measured by the position and angle encoders
  • X is the probe calibration matrix T P .
  • the equation may be expressed in software as the following:
  • R_a12 and R_a23 are the rotations of the pixel reference frame 245 from position 1 to position 2, and from position 2 to position 3, respectively;
  • R_b12 and R_b23 are the respective rotations of the probe reference frame 235 from position 1 to position 2 and from position 2 to position 3, as measured by the position and angle encoders 216;
  • λ is a vector of translational scale factors, wherein each scale factor
  • R_a12 and t_a12 are obtained by estimating the translation and rotation of the prominent feature points of the phantom 260 between positions 1 and 2; and R_x, t_x,
  • T_P may be derived by extracting a unique solution from the null space
  • extracting the null space involves solving the closed form solution and selecting the vector corresponding to the lowest coefficient. [0052] If more than three positions are to be used, the left-most array in the closed form solution may be concatenated to include additional rows of the form [ I_9 - R_aij ⊗ R_bij, 0_9x3, 0_9x3 ] for each additional pair of positions i, j
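The null-space extraction described above can be sketched for the rotation part. The identity vec(Ra Rx Rb^T) = (Rb ⊗ Ra) vec(Rx) (column-major vec) turns Ra Rx = Rx Rb into (I_9 - kron(Rb, Ra)) vec(Rx) = 0; this is a generic sketch with synthetic rotations standing in for measured motions, not the patent's exact formulation:

```python
import numpy as np

def rand_rot(rng):
    """Random rotation via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q if np.linalg.det(Q) > 0 else -Q

def rotation_from_axxb(pairs):
    """Recover Rx satisfying Ra @ Rx == Rx @ Rb for every (Ra, Rb) pair:
    stack (I9 - kron(Rb, Ra)) and take the right singular vector belonging
    to the smallest singular value, i.e. the (approximate) null space."""
    M = np.vstack([np.eye(9) - np.kron(Rb, Ra) for Ra, Rb in pairs])
    _, _, Vt = np.linalg.svd(M)
    x = Vt[-1].reshape(3, 3, order="F")   # column-major reshape undoes vec()
    U, _, Wt = np.linalg.svd(x)           # project the scaled solution onto SO(3)
    R = U @ Wt
    return -R if np.linalg.det(R) < 0 else R

# Synthetic motions: at least two pairs with distinct axes pin down Rx uniquely.
rng = np.random.default_rng(1)
Rx = rand_rot(rng)
pairs = []
for _ in range(2):
    Rb = rand_rot(rng)
    pairs.append((Rx @ Rb @ Rx.T, Rb))    # Ra chosen so that Ra Rx = Rx Rb exactly
Rx_est = rotation_from_axxb(pairs)
```

A single motion pair leaves a 3-dimensional null space; two motions about different axes reduce it to the one-dimensional span of vec(Rx), which is why the process uses at least three probe positions.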
  • the calibration matrix may account for non-rigidity of the
  • the scale factor λ may be a
  • the probe calibration matrix T P may be
  • the software may then store the constituent values of the probe
  • FIG. 5 illustrates still another embodiment of the present invention, where exemplary system 200 includes one or more docking stations 510 and 520.
  • the docking stations 510 and 520 are each in a substantially fixed position and orientation relative to the phantom 260, and each includes an acoustically coupled fixture for placing the ultrasound probe 210 in a precise position and angle relative to the construction frame 250.
  • the user may place the ultrasound probe 210 more precisely at each of the two positions, which may improve the precision and accuracy of the measured position and orientation of the probe reference frame 235.
  • FIG. 6 illustrates another embodiment of the present invention, in which imaging system 600 includes certain substantially similar components to exemplary system 200.
  • system 600 also includes three double-wedge phantoms 610 mounted on a base plate 620, which has a base plate reference frame 630; and a cross-wire structure 640, which is located such that it may be imaged by the ultrasound probe 210 simultaneously with any of the double-wedge phantoms 610.
  • the base plate 620 may have holes located so that the double-wedge phantoms 610 may be affixed to the base plate 620.
  • the double-wedge phantoms 610 may be precisely located so that their relative locations are precisely known.
  • the double-wedge phantoms 610 are rigidly mounted so that their relative locations are known to within 100 μm.
  • Exemplary system 600 may be used in conjunction with exemplary process 300.
  • the ultrasound probe 210 is positioned and oriented to acquire images of the double wedge phantom 610 at pose 1 in step 310.
  • pose refers to the position and orientation of a given double-wedge phantom 610.
  • Ultrasound images and probe position and angle data are then acquired in steps 320-350. Steps 310-350 may be iterated, whereby the position and orientation of the ultrasound probe 210 may be adjusted based on the translation and rotation determined in step 350.
  • In step 340, the images of the double-wedge phantom 610 are identified in an ultrasound image.
  • FIGs. 7A-7D illustrate different scenarios in which an ultrasound beam 705 transmitted by the ultrasound probe 210 impinges on wedge features 710 and 720 of double-wedge phantom 610, and how the reflected energy from the transmitted beam 705 is distributed.
  • FIGs. 8A-8D illustrate how the wedges 710 and 720 may appear in a resulting ultrasound image 732.
  • any translational offset or angular misalignment in the transmitted beam 705 relative to the pose of the double-wedge phantom 610 is manifested in the ultrasound image 732.
  • FIGs. 7A and 8A correspond to a scenario in which the transmitted beam 705 is aligned with the pose of the double-wedge phantom 610 with no translational offset.
  • Line 721 refers to the "early echo," or the first reflected energy of the transmitted beam 705 to impinge on either wedge 710 or 720.
  • Line 722 refers to the "late echo,” or the end of the reflected energy from the transmitted beam 705.
  • Elements 725a and 725b refer to the geometry of the reflected energy, in which the dimension L corresponds to the length of the reflected energy, which is a function of the beam width BW and the slope of the wedge 710 or 720.
  • FIG. 8A illustrates an exemplary ultrasound image 732 corresponding to FIG. 7A.
  • the acoustic energy reflected from wedge 710 results in a "cloud" image 730a; and the acoustic energy reflected from wedge 720 results in cloud 730b.
  • clouds 730a and 730b are referred to as clouds since the acoustic energy in transmitted beam 705 spatially and temporally spreads as a result of the divergence of the transmitted beam 705, the shape of the acoustic pulse transmitted by the ultrasound probe 210, and the angle of the wedge from which the energy is reflected. Since the transmitted beam 705 is aligned with the pose of the double-wedge phantom, clouds 730a and 730b have substantially the same height, which corresponds to dimension L, which is due to the fact that the transmitted beam 705 impinges on wedges 710 and 720 at substantially the same (and opposite) angle.
  • clouds 730a and 730b are located substantially “side by side” in ultrasound image 732, which is due to the fact that there is substantially no translational offset between the center of the transmitted beam 705 and the point at which wedges 710 and 720 cross.
  • FIG. 7B illustrates how acoustic energy may be reflected from wedges 710 and 720 when the transmitted beam 705 is angularly aligned with the pose of the double-wedge phantom 610, but in which the transmitted beam 705 has a translational offset relative to wedges 710 and 720.
  • clouds 730a and 730b have substantially the same height, but are offset from one another in a manner proportional to the translational offset of the transmitted beam 705.
  • FIG. 7C illustrates how acoustic energy may be reflected from wedges 710 and 720 when the transmitted beam is angularly misaligned (at angle θ) with the pose of the double-wedge phantom 610, but does not have any translational offset.
  • clouds 730a and 730b have different heights S and B, wherein the height differential is proportional to the misalignment angle according to the following relation: tan(30° − θ) / tan(30° + θ) = S / B
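Given measured cloud heights S and B, the misalignment angle θ can be recovered by inverting this relation numerically; a minimal sketch using bisection (the 30° wedge half-angle comes from the relation itself):

```python
import math

def misalignment_angle(S, B, lo=-25.0, hi=25.0):
    """Solve tan(30 - theta) / tan(30 + theta) = S / B for theta in degrees.
    The left-hand side is strictly decreasing on (-30, 30), so bisection on
    the bracketing interval converges to the unique root."""
    f = lambda d: math.tan(math.radians(30 - d)) / math.tan(math.radians(30 + d)) - S / B
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Round trip: cloud heights generated from a known 5 degree misalignment.
theta_true = 5.0
S = math.tan(math.radians(30 - theta_true))
B = math.tan(math.radians(30 + theta_true))
theta = misalignment_angle(S, B)   # recovers ~5.0
```

With θ = 0 the ratio S/B is 1 (equal cloud heights), matching the aligned case of FIGs. 7A and 8A.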
  • FIG. 7D illustrates how acoustic energy may be reflected from wedges 710 and 720 with a transmitted beam 705 impinging on the double-wedge phantom 610 with both an angular misalignment with the pose of the double-wedge phantom 610 and a translational offset.
  • As illustrated in FIG. 8D, the resulting clouds 730a and 730b in the ultrasound image 732 have different heights S and B, the difference of which is proportional to the misalignment angle; and the clouds 730a and 730b are offset in a manner proportional to the translational offset of the transmitted beam 705.
  • the heights of the clouds 730a and 730b, and their offset may be determined automatically through image processing techniques known to the art. Alternatively, the heights of the clouds 730a and 730b and their offset may be determined by having the user place a cursor on the top and bottom of clouds 730a and 730b, and click a mouse.
  • the translation and rotation from the reference frame of the double-wedge phantom 610 to the pixel reference frame 245 may be determined.
  • the user may adjust the position and orientation of the ultrasound probe 210 to substantially eliminate the angular misalignment and translational offset of the ultrasound probe 210 relative to the double-wedge phantom 610.
  • the user may employ the ultrasound images of the wedges 710 and 720, like those illustrated in FIGs. 8A-8D, for feedback. If this is done, the translation and rotation between the reference frame of the double-wedge phantom 610 and the pixel reference frame 245 will be more precise.
  • FIG. 9 illustrates an exemplary double-wedge phantom 910 according to the present invention.
  • the double-wedge phantom 910 has multiple sets of wedges 710 and 720, each at different heights. Having multiple sets of wedges 710 and 720 at different heights substantially enables the divergence of the transmitted beam 705 to be characterized by determining the beam width BW at different heights, using the beam width relation described above.
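One way to turn cloud height into a beam-width estimate is sketched below. It assumes the echo cloud's axial extent L arises from a beam of lateral width BW sweeping a wedge face inclined at 30°, so L = BW tan(30°); this geometry is an assumption for illustration, since the patent's exact beam width equation is not reproduced here:

```python
import math

def beam_width(cloud_height, wedge_angle_deg=30.0):
    """Estimate lateral beam width from the axial extent (height) of a wedge
    echo cloud: a beam of width BW spanning a face inclined at angle a
    spreads its echoes over L = BW * tan(a), so BW = L / tan(a)."""
    return cloud_height / math.tan(math.radians(wedge_angle_deg))

# Wedge sets at several depths (as in phantom 910) let the growth of BW with
# depth -- i.e. the beam divergence -- be profiled. Illustrative heights in mm:
profile = {depth: beam_width(L) for depth, L in [(20.0, 1.2), (40.0, 1.8), (60.0, 2.6)]}
```

Repeating the estimate at each wedge-set depth yields a divergence profile of the transmitted beam.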
  • Exemplary system 600 may be used in a "bootstrap calibration" process.
  • FIG. 10 illustrates an exemplary process 1000 for performing bootstrap calibration according to the present invention.
  • Exemplary process 1000 includes process 300, in which ultrasound image data and probe position and angle data are collected.
  • the ultrasound images include an image of one of the double-wedge phantoms 610 and the cross-wire structure 640.
  • the bootstrapping calibration technique works more effectively if the cross-wire structure 640 is within the field of view of the ultrasound probe 210, but as far from the double-wedge phantom 610 as practicable.
  • the ultrasound probe 210 is placed such that it is sequentially centered
  • a probe calibration matrix T P is
  • Process 300 is implemented in such a way that a plurality of images may be acquired at each pose, and the mean and standard deviation corresponding to the resulting probe calibration matrices are computed.
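The mean and standard deviation across repeated acquisitions can be computed straightforwardly. The sketch below summarizes only the translation components of repeated estimates (element-wise statistics as a rough consistency check; averaging the rotation parts properly would use quaternions), with made-up numbers:

```python
import numpy as np

def calibration_spread(T_list):
    """Element-wise mean and standard deviation of the translation parts of
    repeated 4x4 probe-calibration estimates -- a rough repeatability check."""
    t = np.array([T[:3, 3] for T in T_list])
    return t.mean(axis=0), t.std(axis=0)

# Three hypothetical repeated estimates that differ only by axial jitter:
base = np.eye(4)
base[:3, 3] = [10.0, -5.0, 2.0]
reps = []
for dz in (-0.1, 0.0, 0.1):
    T = base.copy()
    T[2, 3] += dz
    reps.append(T)
mean_t, std_t = calibration_spread(reps)   # std highlights the unstable axis
```

A large standard deviation along any axis flags poses whose calibration estimates disagree, which is the quality-assurance signal this step relies on.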
  • In step 1010, an inverse of the probe calibration matrix T_P is
  • the reconstructed ultrasound image includes a reconstructed
  • In step 1020, the reconstructed image of the cross-wire structure 640 is compared with an actual image of the cross-wire structure, and a standard deviation is computed between the two images.
  • the accuracy of the reconstructed image of the cross-wire structure (and thus the accuracy of the probe calibration matrix T_P) is
  • In step 1030, the out-of-plane motion parameters are perturbed
  • the purpose of perturbing the U_k matrix is to substantially encompass the range of values for the elements of the U_k matrix such that the optimal
  • the system 200 illustrated in FIG. 2, in conjunction with exemplary process 300 illustrated in FIG. 3, may be implemented without the use of a phantom 260.
  • image registration may be done by use of speckle correlation.
  • Speckle refers to a situation in which a target tissue contains a plurality of small acoustic scatterers that form patterns of constructive and destructive interference within the tissue.
  • the speckle pattern is generally stable, and may provide a pattern with sufficient spatial variability to substantially enable computing correlations between successive ultrasound images.
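Speckle correlation between successive images is typically computed as a normalized cross-correlation over patches; the following is a minimal sketch with a synthetic scatterer pattern, not the patent's implementation:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def best_shift(ref, img, patch=32, search=5):
    """Find the integer displacement of a central patch of `ref` inside `img`
    by exhaustive NCC search over +/- `search` pixels in each direction."""
    r0 = (ref.shape[0] - patch) // 2
    c0 = (ref.shape[1] - patch) // 2
    template = ref[r0:r0 + patch, c0:c0 + patch]
    best, best_score = (0, 0), -2.0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = img[r0 + dr:r0 + dr + patch, c0 + dc:c0 + dc + patch]
            score = ncc(template, cand)
            if score > best_score:
                best_score, best = score, (dr, dc)
    return best

# Synthetic "speckle": a random scatterer pattern and a copy shifted by (2, -3).
rng = np.random.default_rng(3)
speckle = rng.normal(size=(64, 64))
shifted = np.roll(np.roll(speckle, 2, axis=0), -3, axis=1)
shift = best_shift(speckle, shifted)   # recovers (2, -3)
```

The NCC peak locates the in-plane displacement between frames; in practice sub-pixel interpolation around the peak and decorrelation-based out-of-plane estimates would be layered on top.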

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Disclosed is a system and method for intra-operatively spatially calibrating an ultrasound probe. The method includes determining the relative changes in ultrasound images of a phantom, or of high-contrast feature points within a target volume, for three different ultrasound probe positions. Spatially calibrating the ultrasound probe includes measuring the change in position and orientation of the probe and computing a calibration matrix based on the measured changes in probe position and orientation and the estimated changes in position and orientation of the phantom.

Description

ULTRASOUND CALIBRATION AND REAL-TIME QUALITY ASSURANCE BASED ON CLOSED FORM FORMULATION
[0001] This application claims the benefit of United States Provisional Patent Application No. 60/562,460, filed on April 15, 2004, which is hereby incorporated by reference for all purposes as if fully set forth herein. [0002] The research and development effort associated with the subject matter of this patent application was supported by the National Science Foundation under grant no. ERC 9731478.
BACKGROUND OF THE INVENTION Field of the Invention [0003] The present invention involves the field of ultrasound imagery. More particularly, the present invention involves spatial calibration of ultrasound probes for intra-operative use. Discussion of the Related Art
[0004] Computer Integrated Surgery has revolutionized surgical procedures, whereby 3D imagery of a target volume is created to enable a surgeon to more precisely and accurately position surgical tools within a patient. To serve this purpose, the imaging system, or guidance modality, should provide 3D imagery in real time; it must not be excessively obstructive or burdensome in an operating environment; and it must provide 3D imagery with sufficient accuracy and precision to provide effective surgical planning and execution. [0005] Ultrasound has become a popular guidance modality for medical procedures, due to its real-time operation, safety, low cost, and convenience of use in an operating room environment. Although it is not a "true 3D" imaging modality, such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT), techniques have been developed to convert multiple ultrasound 2D images into a 3D image in order to provide image guidance for surgeons while exploiting the benefits and conveniences of ultrasound. [0006] Components of a conventional ultrasound system 100 are illustrated in FIG. 1. The ultrasound system 100 includes a transmitter 105 having a transmitter reference frame 130; and an ultrasound probe 110 having a probe reference frame 135. The ultrasound probe 110 transmits and receives energy in a scan plane 142, and projects a plurality of pixels 140 in a pixel reference frame 145. A conventional ultrasound system 100 may also include tracking sensors 125 to monitor the position and orientation of the ultrasound probe 110. The ultrasound system 100 is used to collect multiple 2D ultrasound images, which are assembled into a 3D image space 155 having a construction reference frame 150 (hereinafter "construction frame").
[0007] In order to provide image guidance during a surgical procedure, 2D ultrasound images acquired by the ultrasound system 100 must be registered or mapped in real-time into a 3D image space 155, which encompasses a target volume within the patient undergoing surgery. Although there are ultrasound probes that acquire 3D images, these probes need to be spatially calibrated as well. Registering pixels from pixel reference frame 145 to the 3D image space 155 requires a transformation matrix encompassing a series of constituent coordinate transformation matrices: e.g., from the pixel frame 145 to the ultrasound probe reference frame 135; from the ultrasound probe frame 135 to the transmitter reference frame 130; and from the transmitter reference frame 130 to the construction frame 150. Of these transformation matrices, the most difficult to determine is the transformation matrix from the pixel reference frame 145 to the ultrasound probe reference frame 135 (hereinafter the "probe calibration matrix"). [0008] According to the related art, spatial calibration is the act of determining each of the aforementioned transformation matrices, which is typically done before a medical procedure. In related art spatial calibration, the ultrasound probe 110 is placed and oriented such that it acquires an image of a calibration target, or phantom, which has well defined spatial features. Using image processing techniques such as segmentation, the well defined features of the phantom are identified and located in the acquired ultrasound image, and the position and orientation of the phantom is derived from the segmented image. In the related art approach, images are acquired with the ultrasound probe 110 placed in a single position and orientation. If the position and orientation of the phantom are known relative to the construction frame 150, the probe calibration matrix can be derived.
By comparing the locations of the identified imaged features of the phantom with known locations and relative orientations of these features, the orientation of the phantom may be determined relative to the orientation of the ultrasound probe, and the probe calibration matrix may be derived by correlating the segmented images of the phantom with the phantom's known spatial characteristics. [0009] Image processing techniques such as segmentation are computationally intensive and may not be feasible to compute in real time, based on the number of images acquired. Typical segmentation is performed on several hundred images. The large number of images not only requires time to process, but also increases the likelihood of errors that may render the probe calibration matrix invalid. [0010] According to the related art, once the transformation matrices, including the probe calibration matrix, are known, a pixel 140 may be registered into the 3D image space 155 defined by the construction frame 150. The transformation of a pixel 140 location from the pixel reference frame 145 to the construction frame 150 can be expressed as:

Cx = TT · TR · TP · Px,

where Px is the location of pixel 140 in the pixel reference frame 145; Cx is the location of pixel 140 in the construction frame 150; TP is the coordinate transformation matrix from the pixel reference frame 145 to the ultrasound probe reference frame 135 (i.e., the probe calibration matrix); TR is the coordinate transformation from the ultrasound probe reference frame 135 to the transmitter reference frame 130, which may be measured using tracking sensors 125; and TT is the coordinate transformation from the transmitter reference frame 130 to the construction frame 150, which may be measured. [0011] The accuracy and precision of registering ultrasound image pixels 140 into the construction frame 150 is limited by the accuracy and precision of each of the above transformation matrices. The weakest link in this chain is the accuracy and precision of the probe calibration matrix TP. Accordingly, a primary challenge in spatial calibration is determining the probe calibration matrix TP.
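The transformation chain in paragraph [0010] can be sketched as a composition of homogeneous matrices. This is an illustrative NumPy sketch, not code from the patent; the function name and example transforms are our own, and TP is assumed to already fold in the pixel-to-millimeter scale factors:

```python
import numpy as np

def register_pixel(p_px, T_P, T_R, T_T):
    """Map a pixel into the construction frame: Cx = TT @ TR @ TP @ Px.

    p_px: (u, v) pixel coordinates in the scan plane (the plane z = 0).
    T_P:  pixel frame -> probe frame (the probe calibration matrix).
    T_R:  probe frame -> transmitter frame (from the tracking sensors).
    T_T:  transmitter frame -> construction frame.
    All transforms are 4x4 homogeneous matrices.
    """
    Px = np.array([p_px[0], p_px[1], 0.0, 1.0])  # scan plane is z = 0
    Cx = T_T @ T_R @ T_P @ Px
    return Cx[:3]
```

The weakest link noted above is T_P; the remainder of the document is about obtaining it.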
[0012] There are errors intrinsic to the conventional spatial calibration process that limit its precision and accuracy, including the following: imprecision in fabrication of the phantom, subsequent mechanical distortions of the phantom, lack of precision in characterizing the features of the phantom, spatial co-registration ambiguities, and limits to numerical solution optimizations. As such, the quality of the calibration is limited to the accuracy and precision to which the phantom is characterized. [0013] An additional disadvantage of the related art spatial calibration is that since it cannot be performed intra-operatively, partly because it cannot be performed in real time, it is vulnerable to subsequent changes that may render any or all of the calibration matrices invalid without warning. Such post-calibration changes may be brought on by mechanical alteration to the tracking sensors and changes in tissue temperature. The effect of post-calibration changes may include inaccurate 3D imagery, resulting in incorrect surgical instrument placement. [0014] Although the above discussion involves ultrasound, the same issues may be encountered for any imaging system for which 2D images are assembled into a 3D image space. More generally, the same issues may arise whenever a 2D imaging system is spatially calibrated in order to register image products into another reference frame. SUMMARY OF THE INVENTION [0015] Accordingly, the present invention is directed to ultrasound calibration and real-time quality assurance based on closed form formulation that substantially obviates one or more of the problems due to limitations and disadvantages of the related art. In general, the present invention achieves this by deriving a probe calibration matrix TP based on relative images of a phantom acquired from at least three positions and orientations, as opposed to deriving a probe calibration matrix TP from images of the phantom acquired from a single position and orientation and correlated with known characteristics of the phantom. [0016] An advantage of the present invention is to provide more reliable real-time ultrasound-based 3D imagery for use during medical procedures in that the ultrasound probe may be spatially calibrated intra-operatively. This helps mitigate post-calibration changes that may degrade the accuracy of 3D imagery without warning. [0017] Another advantage of the present invention is to provide a more efficient and robust spatial calibration of an ultrasound probe. By spatially calibrating the ultrasound probe based on the relative differences between two or more images of the same phantom, the resulting calibration is less dependent on the precision to which the spatial characteristics of the phantom are known. [0018] Another advantage of the present invention is to simplify the ultrasound probe calibration process. The present invention identifies pixels corresponding to prominent feature points on a phantom, as opposed to segmenting an image in order to reconstruct an image of the phantom, which is more computationally intensive. [0019] Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
[0020] To achieve these and other advantages and in accordance with the purpose of the present invention, a method for spatially calibrating an ultrasound probe comprises placing the ultrasound probe in a first position and orientation relative to a phantom; measuring the first position and orientation of the ultrasound probe; acquiring a first ultrasound image of the phantom; determining a first transformation matrix corresponding to a phantom reference frame and a pixel reference frame, based on the first ultrasound image; repositioning the ultrasound probe in a second position and orientation relative to the phantom; measuring the second position and orientation of the ultrasound probe; acquiring a second ultrasound image of the phantom; determining a second transformation matrix corresponding to the phantom reference frame and the pixel reference frame, based on the second ultrasound image; and computing a probe calibration matrix based on the first position and orientation of the ultrasound probe, the first transformation matrix, the second position and orientation of the ultrasound probe, and the second transformation matrix. [0021] In another aspect of the present invention, a system for performing intra-operative calibration of an ultrasound probe comprises a position and angle encoder for measuring a position and angle of the ultrasound probe; and a data system having a computer readable medium encoded with a program for computing a probe calibration matrix according to a closed form formulation, and according to relative changes between the locations of prominent feature points in a first and a second ultrasound image, wherein the first ultrasound image corresponds to a first ultrasound probe position, and the second ultrasound image corresponds to a second ultrasound probe position. 
[0022] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. BRIEF DESCRIPTION OF THE DRAWINGS [0023] The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention. [0024] FIG. 1 illustrates components of an ultrasound imaging system according to the related art; [0025] FIG. 2 illustrates an exemplary ultrasound imaging system according to the present invention; [0026] FIG. 3 illustrates an exemplary spatial calibration process according to the present invention; [0027] FIG. 4 illustrates an exemplary phantom according to the present invention; [0028] FIG. 5 illustrates an exemplary ultrasound imaging system, which includes at least one docking station; [0029] FIG. 6 illustrates an exemplary ultrasound imaging system that uses double-wedge phantoms; [0030] FIGs. 7A-7D illustrate misalignment and offset between an ultrasound probe and a double-wedge phantom, and their effects; [0031] FIGs. 8A-8D illustrate ultrasound images, and how misalignment and offset between an ultrasound probe and a double-wedge phantom are apparent in the images; [0032] FIG. 9 illustrates an exemplary double-wedge phantom according to the present invention; and [0033] FIG. 10 illustrates an exemplary process for performing bootstrap calibration of an ultrasound probe according to the present invention. DETAILED DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS [0034] FIG. 2 illustrates an exemplary ultrasound imaging system 200 according to the present invention.
The imaging system 200 includes an ultrasound transmitter 205 having a transmitter reference frame 230; an ultrasound probe 210 having a probe reference frame 235; position and angle encoders 216 for measuring the position and orientation of the probe reference frame 235 relative to the transmitter reference frame 230; an ultrasound processor 215 for providing power and signals to, and receiving signals from, the ultrasound transmitter 205 and the ultrasound probe 210; a data system 220 for sending commands to and receiving data from the ultrasound processor 215 and the position and angle encoders 216; and a user interface 225 connected to the data system 220. The ultrasound probe 210 may transmit and receive energy in a scan plane 242, which includes a plurality of pixels 240 within the scan plane 242 and having a pixel reference frame 245. [0035] The exemplary system 200 acquires ultrasound images, through use of the ultrasound probe 210, within a 3D image space 255 having a construction reference frame 250. Further, the exemplary system 200 may include one or more phantoms 260 and 265, which are located such that they can be imaged by the ultrasound probe 210, and wherein the phantoms 260 and 265 may be acoustically coupled to a target (not shown) to be imaged within the 3D image space 255. By acoustically coupling, it is understood that continuity in the propagation medium is maintained such that sound waves pass through. [0036] FIG. 2 further illustrates a single ultrasound probe 210 in two separate positions, Positions 1 and 2, in which the probe 210 may acquire images of the phantoms 260 and 265. Instead of two phantoms, there may be a single phantom, which may be imaged by the ultrasound probe 210 from multiple positions and orientations. For purposes herein, phantom 260 will refer to the case in which there is a single phantom. Although two positions are illustrated, at least three
positions are generally required for computing the probe calibration matrix TP
according to the present invention. [0037] As used herein, the term "matrix," as in the probe calibration matrix
TP , may refer to any representation of a spatial relationship between coordinate
frames, such as a quaternion. [0038] For the purposes of illustration, this embodiment of the present invention may employ a SONOLINE™ Antares ultrasound scanner manufactured by Siemens Medical Solutions, USA, Inc., Ultrasound Division, Issaqua, WA with a Siemens VF 10-5 linear array probe held in a rigid attachment mounted on an adjustable arm. However, it will be readily apparent to one skilled in the art that other commercially available ultrasound scanners may be used. [0039] In this exemplary embodiment of the present invention, the position and angle encoders 216 include multiple optical markers attached to the ultrasound probe 210, which are tracked using, for example, an OPTOTRAK™ device, manufactured by Northern Digital, Inc. It will be readily apparent to one skilled in the art that alternate devices and systems for providing real-time measurements of position and orientation of the ultrasound probe 210 may be used and are within the scope of the present invention. [0040] The data system 220 may include one or more computers, which may be networked together either locally or over a network. The data system 220 includes software (hereinafter "the software") for implementing processes according to the present invention. The software may be stored and run on the data system 220, or may be stored and run in a distributed manner between the data system 220, the ultrasound processor 215, and the user interface 225. [0041] FIG. 3 illustrates an exemplary process 300 for providing real-time spatial calibration according to the present invention, which may be implemented by the software. Process 300 may be used in conjunction with system 200, illustrated in FIG. 2, in which a single phantom 260 is used. [0042] In step 310, the ultrasound probe 210 is placed in position 1 of N, wherein N may be at least three. Position 1 may be arbitrary or predetermined. 
Either way, the position should be such that the phantom 260 is within the scan plane 242 of ultrasound probe 210, wherein prominent feature points within the phantom 260 are readily identifiable in the acquired ultrasound image. [0043] In step 320, the software acquires position and angle data of ultrasound probe 210 from the position and angle encoders 216 and stores the corresponding data values in memory. The software may acquire and store position and angle data of the ultrasound probe 210 exclusively while the ultrasound probe 210 is in position 1, or the software may continuously acquire and store position and angle data values throughout exemplary process 300. The software may provide time tag information corresponding to the position and angle data such that the time tag data may be used to synchronize the position and angle data with the ultrasound data acquired from the ultrasound processor 215. [0044] In step 330, the ultrasound processor 215 acquires and processes ultrasound image data from the ultrasound probe 210 while the ultrasound probe is held in position 1. The software then receives ultrasound image data from the ultrasound processor 215 and stores the corresponding data values in memory. The software may acquire ultrasound data continuously throughout exemplary process 300, along with time tag data, and may store the ultrasound and time tag data values so that the ultrasound data may be synchronized with similarly time tagged position and angle data acquired from the position and angle encoders 216. If the data system 220 continuously acquires and stores ultrasound data values throughout exemplary process 300, the data system may additionally acquire and store data from the user interface 225, along with corresponding time tag data, which may provide a flag indicating that ultrasound data values corresponding to a given time were acquired while the ultrasound probe was in position 1.
[0045] In step 340, prominent feature points corresponding to the phantom 260 are identified from the ultrasound data acquired in step 330, as illustrated in FIG. 4. The prominent feature points may be selected by the user via the user interface 225 by, for example, selecting the point with a cursor and mouse-click. Alternatively, the software may automatically identify prominent feature points using image processing techniques that are known to the art. [0046] FIG. 4 illustrates an exemplary phantom 260, along with its reference frame 410, and a scan plane 242 impinging on the phantom 260. In a particular embodiment, the phantom 260 may include a matrix of N-shaped wires stretched between two parallel plates. In order for the phantom 260 to be used intra-operatively, it should be acoustically coupled with a target volume, such as a patient undergoing surgery, such that the user may periodically position the ultrasound probe 210 in a given position 1 and position 2 during an operation. When being imaged by the ultrasound probe 210, the scan plane 242 may intersect a plane defined by the phantom at points E, K, and Z, as illustrated in FIG. 4. The x and y coordinates of the center point K of the phantom 260 in the phantom reference frame 410 may be determined from the relations: xk = xb + (KE/EZ)·(xc − xb), and yk = yb + (KE/EZ)·(yc − yb), in which xk and yk are the coordinates of the center image point K of the phantom 260 in the phantom reference frame 410; xb and yb are the coordinates of point B on the phantom 260 in the phantom reference frame 410; and xc and yc are the coordinates of point C on the phantom 260 in the phantom reference frame 410. [0047] In step 350, with the coordinates of the center point K determined, the software then computes a coordinate transformation between the pixel reference frame 245 and the phantom reference frame 410.
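The center-point relations in paragraph [0046] can be sketched directly, assuming the three wire intersections E, K, and Z have been segmented from the image (pixel coordinates) and that points B and C are known in the phantom reference frame 410. The function and argument names are illustrative, not from the patent:

```python
import math

def n_wire_center_k(e_px, k_px, z_px, b_xy, c_xy):
    """Phantom-frame coordinates of the center wire intersection K.

    e_px, k_px, z_px: segmented pixel locations of points E, K, Z.
    b_xy, c_xy: known phantom-frame coordinates of points B and C.
    The ratio KE/EZ along the diagonal wire is preserved by the imaging
    geometry, so it can be measured directly from the pixel locations.
    """
    r = math.dist(k_px, e_px) / math.dist(z_px, e_px)
    xb, yb = b_xy
    xc, yc = c_xy
    # xk = xb + (KE/EZ)*(xc - xb),  yk = yb + (KE/EZ)*(yc - yb)
    return (xb + r * (xc - xb), yb + r * (yc - yb))
```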
The transformation may be accomplished by, for example, Horn's quaternion rigid registration method, as described in B. Horn, Closed-form solution of absolute orientation using unit quaternions, Journal of the Optical Society of America A, Vol. 4, page 629, April 1987, which is incorporated herein by reference. Other techniques may be used, such as those employed to transform a set of points between coordinate systems, as is done in the field of photogrammetry. The result of this transformation is a translation and rotation of the image of the phantom 260 from the phantom reference frame 410 to the pixel reference frame 245. [0048] In step 360, the software determines if there are more positions at which to acquire ultrasound data. If so, steps 310-350 are repeated for a new position. The next position may be chosen arbitrarily, or determined prior to executing the exemplary process 300. The next position should be chosen such that the phantom 260 is located within the scan plane 242 of the ultrasound probe 210, and that prominent feature points on the phantom 260 will be visible in the ultrasound imagery acquired by the ultrasound probe 210, as illustrated in FIG. 4. In a particular embodiment of the present invention, steps 310-350 are iterated 3 times. [0049] In step 370, the software retrieves the stored data values for the following: the translation and rotation of the phantom 260 from the phantom reference frame 410 to the pixel reference frame 245 when the ultrasound probe 210 was in each position; and the position and angle of the ultrasound probe 210, as measured by the position and angle encoders 216, when the ultrasound probe was in each position. [0050] In step 375, the software assembles this data into a closed form
formulation for determining the probe calibration matrix TP according to the present invention, and then derives the probe calibration matrix TP from the closed form formulation. The closed form formulation is based on the homogeneous matrix equation AX = XB, in which A is the relative coordinate transformation between the locations of the respective pixels corresponding to the prominent feature points of the phantom; B is the relative coordinate transformation between the ultrasound probe reference frame at position 1 and position 2, as measured by the position and angle encoders; and X is the probe calibration matrix TP. This homogeneous matrix equation may be expressed in software as the following:

[ I9 − Ra12 ⊗ Rb12    0(9×3)      0(9×3) ]   [ vec(Rx) ]
[ I3 ⊗ tb'12          I3 − Ra12   −Du12  ] · [   tx    ] = 0
[ I9 − Ra23 ⊗ Rb23    0(9×3)      0(9×3) ]   [   λ     ]
[ I3 ⊗ tb'23          I3 − Ra23   −Du23  ]

where I is an identity matrix; Ra12 and Ra23 are the rotations of the pixel reference frame 245 from position 1 to position 2, and from position 2 to position 3, respectively; Rb12 and Rb23 are the respective rotations of the probe reference frame 235 from position 1 to position 2 and from position 2 to position 3, as measured by the position and angle encoders 216; tb'12 and tb'23 are respectively the transpose of the translation vectors corresponding to the translation of the probe reference frame 235 from position 1 to position 2 and from position 2 to position 3, as measured (for example, in mm) by the position and angle encoders 216; Du12 and Du23 are the translation vectors of the pixel reference frame 245 going from position 1 to position 2 and from position 2 to position 3, respectively; tx is the translation vector component corresponding to the calibration matrix (to be solved); Rx is the rotational component corresponding to the calibration matrix (to be solved); and λ is a vector of translational scale factors, wherein each scale factor converts the translation from number of pixels to a distance, such as millimeters. Of these variables, Ra and Du are obtained by estimating the translation and rotation of the prominent feature points of the phantom 260 between positions; and Rx, tx, and λ are the values to be solved using the above formulation. The ⊗ symbol refers to the Kronecker product of two matrices; and the vec(·) operator creates a column vector from a matrix as follows:

vec([ a11 a12 ; a21 a22 ]) = ( a11, a21, a12, a22 )'
[0051] The rotation and translation corresponding to the probe calibration matrix TP may be derived by extracting a unique solution from the null space associated with the above formulation, applying the unity constraint to the first nine coefficients, which represent the rotation Rx. As is known in the art, extracting the null space involves solving the closed form formulation and selecting the vector corresponding to the lowest coefficient (e.g., the smallest singular value). [0052] If more than three positions are to be used, the left-most array in the closed form solution may be concatenated to include the [ I9 − Ra ⊗ Rb   0(9×3)   0(9×3) ] and [ I3 ⊗ tb'   I3 − Ra   −Du ] expressions for subsequent motions to additional positions. Generally, the more motions used, the more precise the probe calibration matrix TP, at the expense of speed of computation.
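The closed form solve of paragraphs [0050]-[0051] can be sketched in NumPy. This is an illustrative sketch under our own naming, not the patent's implementation; it uses NumPy's row-major `flatten` for vec(·), under which the rotation rows read exactly I9 − Ra ⊗ Rb:

```python
import numpy as np

def solve_axxb(motions):
    """Closed-form AX = XB solve with per-axis pixel scales (a sketch).

    motions: list of (Ra, Rb, tb, du) tuples, one per relative motion:
      Ra -- 3x3 rotation of the imaged feature points (pixel frame),
      Rb -- 3x3 rotation of the probe frame, from the tracker,
      tb -- probe translation in mm (np.array), from the tracker,
      du -- image translation in pixels (np.array).
    Returns (Rx, tx, lam): rotation, translation (mm), pixel-to-mm scales.
    """
    blocks = []
    for Ra, Rb, tb, du in motions:
        I9, I3, Z = np.eye(9), np.eye(3), np.zeros((9, 3))
        # Rotation constraint: (I9 - Ra (x) Rb) vec(Rx) = 0
        blocks.append(np.hstack([I9 - np.kron(Ra, Rb), Z, Z]))
        # Translation constraint:
        # (I3 (x) tb') vec(Rx) + (I3 - Ra) tx - diag(du) lam = 0
        blocks.append(np.hstack([np.kron(I3, tb.reshape(1, 3)),
                                 I3 - Ra, -np.diag(du)]))
    M = np.vstack(blocks)
    v = np.linalg.svd(M)[2][-1]          # null-space vector
    # Unity constraint: rescale so the first nine coefficients form a rotation.
    d = np.linalg.det(v[:9].reshape(3, 3))
    v = v / (np.sign(d) * abs(d) ** (1.0 / 3.0))
    U, _, Wt = np.linalg.svd(v[:9].reshape(3, 3))
    return U @ Wt, v[9:12], v[12:15]     # nearest rotation, tx, lam
```

With the null-space vector rescaled so that its first nine coefficients form a rotation, the remaining six entries give tx and the scale vector λ directly; additional motions simply append more block rows, as described in paragraph [0052].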
[0053] An alternate approach is to solve the above formulation in two steps, wherein the rotation Rx is extracted first, and then the translation tx and its associated
scale factor λ are subsequently extracted. By solving for and extracting the scale
factor vector λ, the calibration matrix may account for non-rigidity of the
transformation between the pixel reference frame 245 and the probe reference frame
235, as opposed to a rigid transformation, in which case the scale factor λ may be a
scalar. The rigid transformation case is described in the context of robotic hand-eye coordination by N. Andreff, R. Horaud, and B. Espiau, Robotic Hand-Eye Calibration Using Structure-from-Motion, The International Journal of Robotics Research, Vol. 20, No. 3, pp. 228-248, the contents of which are incorporated herein by reference.
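The two-step alternative of paragraph [0053] can be sketched the same way: first recover Rx from the stacked rotation blocks, then solve a small linear system for tx and λ. Again an illustrative sketch with our own names and the same (Ra, Rb, tb, du) inputs as above, not the patent's code:

```python
import numpy as np

def solve_two_step(motions):
    """Two-step AX = XB solve: rotation first, then translation and scales."""
    # Step 1: rotation from the stacked (I9 - Ra (x) Rb) blocks.
    K = np.vstack([np.eye(9) - np.kron(Ra, Rb) for Ra, Rb, _, _ in motions])
    v = np.linalg.svd(K)[2][-1]
    R = v.reshape(3, 3)
    d = np.linalg.det(R)
    R = R / (np.sign(d) * abs(d) ** (1.0 / 3.0))   # unity constraint
    U, _, Wt = np.linalg.svd(R)
    Rx = U @ Wt                                    # nearest rotation matrix
    # Step 2: with Rx fixed, (I3 - Ra) tx - diag(du) lam = -Rx tb
    # is linear in (tx, lam); stack all motions and solve by least squares.
    A = np.vstack([np.hstack([np.eye(3) - Ra, -np.diag(du)])
                   for Ra, _, _, du in motions])
    b = np.concatenate([-(Rx @ tb) for _, _, tb, _ in motions])
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return Rx, x[:3], x[3:]
```

Splitting the solve this way decouples the rotation estimate from the pixel scale factors, which is convenient when λ is allowed to be a vector (non-rigid case) rather than a scalar.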
[0054] With the rotation Rx, the translation tx, and the scale factor vector λ derived from the null space of the above formulation, the probe calibration matrix TP may be assembled according to the following relation:

TP = [ Rx       tx ]
     [ 0 0 0    1  ],   with  tx = ( λx·ux, λy·uy, λz·uz ),

where (ux, uy, uz) is the translation vector in number of pixels, and the scale factor λ
converts the number of pixels into distance, such as millimeters. [0055] The software may then store the constituent values of the probe calibration matrix TP for use in subsequent pixel registration from the pixel reference frame 245 into the 3D image space 255 defined by the construction reference frame 250. [0056] FIG. 5 illustrates still another embodiment of the present invention, where exemplary system 200 includes one or more docking stations 510 and 520. The docking stations 510 and 520 are each in a substantially fixed position and orientation relative to the phantom 260, and each includes an acoustically coupled fixture for placing the ultrasound probe 210 in a precise position and angle relative to the construction frame 250. For example, by having two docking stations 510 and 520, one at position 1 and another at position 2, the user may place the ultrasound probe 210 more precisely at each of the two positions, which may improve the precision and accuracy of the measured position and orientation of the probe reference frame 235. [0057] Multiple ultrasound images may be acquired per position, with each image being used to compute a separate probe calibration matrix. For example, if 3 positions are used, and 10 images are acquired per position, then it is possible to compute 10x9x8=720 probe calibration matrices. Similarly, if 6 images are taken per position and 3 positions are used, then 6x5x4=120 probe calibration matrices may be generated. Computing the mean and standard deviation of any or all of these probe calibration matrices will provide an indication of the precision of the calibration. [0058] FIG. 6 illustrates another embodiment of the present invention, in which imaging system 600 includes certain substantially similar components to exemplary system 200. However, system 600 also includes three double-wedge phantoms 610 mounted on a base plate 620, which has a base plate reference frame 630; and a cross-wire structure 640, which is located such that it may be imaged by the ultrasound probe 210 simultaneously with any of the double-wedge phantoms 610. The base plate 620 may have holes located so that the double-wedge phantoms 610 may be affixed to the base plate 620.
The double-wedge phantoms 610 may be precisely located so that their relative locations are precisely known. In a particular embodiment, the double-wedge phantoms 610 are rigidly mounted so that their relative locations are known to within 100 μm. The double-wedge phantoms 610 and the base plate 620 may be immersed in an acoustically coupling material, such as a gel or water. [0059] Exemplary system 600 may be used in conjunction with exemplary process 300. In using exemplary system 600, the ultrasound probe 210 is positioned and oriented to acquire images of the double-wedge phantom 610 at pose 1 in step 310. As used herein, "pose" refers to the position and orientation of a given double-wedge phantom 610. Ultrasound images and probe position and angle data are then acquired in steps 320-350. Steps 310-350 may be iterated, whereby the position and orientation of the ultrasound probe 210 may be adjusted based on the translation and rotation determined in step 350. [0060] In step 340, the images of the double-wedge phantom 610 are identified in an ultrasound image. FIGs. 7A-7D illustrate different scenarios in which an ultrasound beam 705 transmitted by the ultrasound probe 210 impinges on wedge features 710 and 720 of double-wedge phantom 610, and how the reflected energy from the transmitted beam 705 is distributed. FIGs. 8A-8D illustrate how the wedges 710 and 720 may appear in a resulting ultrasound image 732. [0061] Given the shape of the double-wedge phantom 610, any translational offset or angular misalignment in the transmitted beam 705 relative to the pose of the double-wedge phantom 610 is manifested in the ultrasound image 732. By using the ultrasound image 732 as a form of feedback, the position and orientation of the probe 210 may be adjusted to correct for any misalignment and translational offset. [0062] FIGs.
7A and 8A correspond to a scenario in which the transmitted beam 705 is aligned with the pose of the double-wedge phantom 610 with no translational offset. Line 721 refers to the "early echo," or the first reflected energy of the transmitted beam 705 to impinge on either wedge 710 or 720. Line 722 refers to the "late echo," or the end of the reflected energy from the transmitted beam 705. Elements 725a and 725b refer to the geometry of the reflected energy, in which the dimension L corresponds to the length of the reflected energy, which is a function of the beam width BW and the slope of the wedge 710 or 720. [0063] FIG. 8A illustrates an exemplary ultrasound image 732 corresponding to FIG. 7A. In FIG. 8A, the acoustic energy reflected from wedge 710 results in a "cloud" image 730a; and the acoustic energy reflected from wedge 720 results in cloud 730b. Features 730a and 730b are referred to as clouds since the acoustic energy in transmitted beam 705 spatially and temporally spreads as a result of the divergence of the transmitted beam 705, the shape of the acoustic pulse transmitted by the ultrasound probe 210, and the angle of the wedge from which the energy is reflected. Since the transmitted beam 705 is aligned with the pose of the double-wedge phantom, clouds 730a and 730b have substantially the same height, corresponding to dimension L, because the transmitted beam 705 impinges on wedges 710 and 720 at substantially the same (and opposite) angle. Further, clouds 730a and 730b are located substantially "side by side" in ultrasound image 732, because there is substantially no translational offset between the center of the transmitted beam 705 and the point at which wedges 710 and 720 cross. [0064] The beam width BW of the transmitted beam may be computed from the height L of clouds 730a and 730b according to the relation BW = L·tan(30°).
It will be readily apparent that angles other than 30° may be used, which may result in differing sensitivities to angular misalignment and translational offset.

[0065] FIG. 7B illustrates how acoustic energy may be reflected from wedges 710 and 720 when the transmitted beam 705 is angularly aligned with the pose of the double-wedge phantom 610, but in which the transmitted beam 705 has a translational offset relative to wedges 710 and 720. In FIG. 8B, clouds 730a and 730b have substantially the same height, but are offset from one another in a manner proportional to the translational offset of the transmitted beam 705.

[0066] FIG. 7C illustrates how acoustic energy may be reflected from wedges 710 and 720 when the transmitted beam is angularly misaligned (at angle α) with the pose of the double-wedge phantom 610, but does not have any translational offset. As illustrated in FIG. 8C, clouds 730a and 730b have different heights S and B, wherein the height differential is related to the misalignment angle according to the following relation:

tan(30° − α) / tan(30° + α) = S / B

where 30° is the magnitude of the angle of wedges 710 and 720. As mentioned earlier, angles other than 30° may be used, which may result in different sensitivities to angular misalignment and translational offset.

[0067] FIG. 7D illustrates how acoustic energy may be reflected from wedges 710 and 720 with a transmitted beam 705 impinging on the double-wedge phantom 610 with both an angular misalignment with the pose of the double-wedge phantom 610 and a translational offset. As illustrated in FIG. 8D, the resulting clouds 730a and 730b in the ultrasound image 732 have different heights S and B, the difference of which is related to the misalignment angle; and the clouds 730a and 730b are offset in a manner proportional to the translational offset of the transmitted beam 705.

[0068] In step 350, the heights of the clouds 730a and 730b, and their offset, may be determined automatically through image processing techniques known to the art. Alternatively, the heights of the clouds 730a and 730b and their offset may be determined by having the user place a cursor on the top and bottom of clouds 730a and 730b, and click a mouse. With the cloud size differential and cloud offset determined, the translation and rotation from the reference frame of the double-wedge phantom 610 to the pixel reference frame 245 may be determined.

[0069] According to this exemplary embodiment of the present invention, the user may adjust the position and orientation of the ultrasound probe 210 to substantially eliminate the angular misalignment and translational offset of the ultrasound probe 210 relative to the double-wedge phantom 610. The user may employ the ultrasound images of the wedges 710 and 720, like those illustrated in FIGs. 8A-8D, for feedback. If this is done, the translation and rotation between the reference frame of the double-wedge phantom 610 and the pixel reference frame 245 will be more precise.
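The two wedge relations above lend themselves to a short numerical sketch. The following Python fragment is an illustration only (the 30° wedge angle and the bisection tolerance are assumptions, not values fixed by the text): it computes the beam width BW from the cloud height L, and recovers the misalignment angle α from the cloud-height ratio S/B by solving tan(30° − α)/tan(30° + α) = S/B with a simple bisection.

```python
import math

WEDGE_DEG = 30.0  # wedge angle; other angles change the sensitivity


def beam_width(cloud_height):
    """BW = L * tan(30 deg): beam width from the height L of a cloud."""
    return cloud_height * math.tan(math.radians(WEDGE_DEG))


def misalignment_angle(s, b, tol=1e-9):
    """Recover alpha (degrees) from tan(30 - a)/tan(30 + a) = S/B by bisection.

    The ratio equals 1 at alpha = 0 and decreases monotonically as alpha
    grows, so the root is bracketed on [0, WEDGE_DEG).
    """
    target = s / b
    ratio = lambda a: (math.tan(math.radians(WEDGE_DEG - a)) /
                       math.tan(math.radians(WEDGE_DEG + a)))
    lo, hi = 0.0, WEDGE_DEG - 1e-6
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ratio(mid) > target:   # ratio still above target: alpha is larger
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Equal cloud heights (S = B) give α = 0, matching the aligned case of FIGs. 7A and 8A.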
[0070] In step 375, the software computes the closed form formulation as is done with exemplary system 200, except that the relative coordinate transformation A (from the aforementioned AX = XB homogeneous equation) may correspond to the following relation:

A = Uk · Wk,k+1 · Uk+1^-1

where Uk is the transformation matrix from the coordinate frame of the double-wedge phantom 610 at pose k to the pixel coordinate frame 245; Uk+1^-1 is the inverse of the transformation matrix from the coordinate frame of the double-wedge phantom 610 at pose k+1 to the pixel coordinate frame 245; and Wk,k+1 is the transformation matrix from the coordinate frame of the double-wedge phantom 610 at pose k to that of the double-wedge phantom 610 at pose k+1. Of these, Wk,k+1 is known, to a precision determined by how accurately the base plate 620 was machined and characterized. With the closed form formulation assembled, the software extracts a unique solution from the null space in step 380.

[0071] FIG. 9 illustrates an exemplary double-wedge phantom 910 according to the present invention. The double-wedge phantom 910 has multiple sets of wedges 710 and 720, each at a different height. Having multiple sets of wedges 710 and 720 at different heights substantially enables the divergence of the transmitted beam 705 to be characterized by determining the beam width BW at each height, using the beam width equation described above.

[0072] Exemplary system 600 may be used in a "bootstrap calibration"
procedure, in which the probe calibration matrix TP is iteratively refined and its accuracy and precision are improved. FIG. 10 illustrates an exemplary process 1000 for performing bootstrap calibration according to the present invention.

[0073] Exemplary process 1000 includes process 300, in which ultrasound image data and probe position and angle data are collected. In this case, the ultrasound images include an image of one of the double-wedge phantoms 610 and the cross-wire structure 640. The bootstrapping calibration technique works more effectively if the cross-wire structure 640 is within the field of view of the ultrasound probe 210, but as far from the double-wedge phantom 610 as practicable. Within process 300, the ultrasound probe 210 is placed such that it is sequentially centered and aligned relative to pose 1, pose 2, and pose 3. A probe calibration matrix TP is computed according to process 300. Process 300 is implemented in such a way that a plurality of images may be acquired at each pose, and the mean and standard deviation corresponding to the resulting probe calibration matrices are computed.
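The closed-form step can be illustrated numerically. The fragment below is a sketch of the general AX = XB rotation solve, not a transcription of the patented formulation: each probe motion pair contributes a linear constraint R_A R_X = R_X R_B, the stacked constraints are sent through an SVD, and the one-dimensional null space yields the calibration rotation once it is normalized back onto a rotation matrix (the role the unity constraint plays in the text).

```python
import numpy as np


def rot(axis, angle):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)


def solve_rotation_ax_xb(As, Bs):
    """Solve R_A R_X = R_X R_B for R_X from a list of (R_A, R_B) pairs.

    Each pair contributes (I kron R_A - R_B^T kron I) vec(R_X) = 0, with
    column-stacked vec; two pairs with non-parallel rotation axes leave a
    one-dimensional null space spanned by vec(R_X).
    """
    I = np.eye(3)
    M = np.vstack([np.kron(I, A) - np.kron(B.T, I) for A, B in zip(As, Bs)])
    _, _, Vt = np.linalg.svd(M)
    X = Vt[-1].reshape(3, 3, order="F")   # null vector, column-stacked
    if np.linalg.det(X) < 0:              # the null vector's sign is arbitrary
        X = -X
    U, _, Vt2 = np.linalg.svd(X)          # project onto the nearest rotation
    return U @ Vt2


# Synthetic check: recover a known calibration rotation from two motions.
X_true = rot([1, 1, 0], 0.5)
As = [rot([0, 0, 1], 0.7), rot([0, 1, 0], 1.1)]
Bs = [X_true.T @ A @ X_true for A in As]   # B = X^-1 A X
X_est = solve_rotation_ax_xb(As, Bs)
```

The need for two motions with distinct rotation axes mirrors the three-pose acquisition described above: a single pose pair leaves the null space too large for a unique solution.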
[0074] In step 1010, an inverse of the probe calibration matrix TP is computed, and the ultrasound image is reconstructed according to that inverse. The reconstructed ultrasound image includes a reconstructed image of the cross-wire structure 640.

[0075] In step 1020, the reconstructed image of the cross-wire structure 640 is compared with an actual image of the cross-wire structure, and a standard deviation is computed between the two images. The accuracy of the reconstructed image of the cross-wire structure (and thus the accuracy of the probe calibration matrix TP) is assessed according to pre-determined accuracy requirements. If the probe calibration matrix TP is deemed sufficiently accurate, the probe calibration matrix TP is stored; if not, process 1000 proceeds to step 1030.

[0076] In step 1030, the out-of-plane motion parameters are perturbed, and
input into process 300 as a new estimate for Uk, the transformation matrix from the coordinate frame of the double-wedge phantom 610 at pose k to the pixel coordinate frame 245. The purpose of perturbing Uk is to span the range of plausible values for its elements so that the optimal version of Uk may be selected.
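The perturb-and-select loop of step 1030 can be sketched generically. In the fragment below everything is a stand-in: `reconstruction_error` is a hypothetical surrogate for the cross-wire reconstruction comparison of steps 1010-1020, and the grid limits are arbitrary. Only the control flow — perturb the out-of-plane parameters around the current estimate, re-evaluate, keep the best — mirrors the text.

```python
import itertools

import numpy as np


def reconstruction_error(params):
    # Hypothetical stand-in for steps 1010-1020: in practice this would
    # recompute TP from the perturbed Uk estimate and compare the
    # reconstructed cross-wire image against the actual one.
    optimum = np.array([0.12, -0.05])   # unknown "true" out-of-plane offsets
    return float(np.sum((np.asarray(params) - optimum) ** 2))


def perturb_and_select(initial, half_range=0.2, steps=9):
    """Grid-perturb the out-of-plane parameters and keep the estimate
    whose reconstruction error is smallest (step 1030's selection)."""
    offsets = np.linspace(-half_range, half_range, steps)
    best, best_err = initial, reconstruction_error(initial)
    for d in itertools.product(offsets, repeat=len(initial)):
        cand = tuple(p + dp for p, dp in zip(initial, d))
        err = reconstruction_error(cand)
        if err < best_err:
            best, best_err = cand, err
    return np.asarray(best), best_err
```

A finer grid (or a second pass around the first winner) trades computation for precision, which is the sense in which the bootstrap iteratively refines TP.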
[0077] In an additional embodiment of the present invention, the system 200 illustrated in FIG. 2, in conjunction with exemplary process 300 illustrated in FIG. 3, may be implemented without the use of a phantom 260. In this exemplary embodiment, image registration may be done by use of speckle correlation. Speckle refers to the pattern of constructive and destructive interference formed when a target tissue contains a plurality of small acoustic scatterers. The speckle pattern is generally stable, and may provide sufficient spatial variability to enable computing correlations between successive ultrasound images.

[0078] It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
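The speckle-correlation idea of paragraph [0077] can be illustrated with a toy registration. In the sketch below, the synthetic speckle field, the integer shift, and the use of FFT phase correlation are all assumptions for illustration — the text does not prescribe a particular correlation method. The point is that a stable speckle pattern shared by two successive images suffices to recover the in-plane translation between them.

```python
import numpy as np


def estimate_shift(img1, img2):
    """Estimate the integer (row, col) shift taking img1 into img2
    by FFT phase correlation on the shared speckle pattern."""
    F1 = np.fft.fft2(img1)
    F2 = np.fft.fft2(img2)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real         # delta peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Fold indices above N/2 back to negative shifts.
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))


# Synthetic stable speckle pattern and a shifted second frame.
rng = np.random.default_rng(7)
speckle = rng.standard_normal((64, 64))
frame2 = np.roll(speckle, shift=(5, -3), axis=(0, 1))
```

Decorrelation from out-of-plane motion or tissue deformation degrades the peak, which is why the text notes the speckle pattern must be "generally stable" for this registration to work.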

Claims

WHAT IS CLAIMED IS:
1. A method for spatially calibrating an ultrasound probe, comprising:
placing the ultrasound probe in a first position and orientation relative to a phantom;
measuring the first position and orientation of the ultrasound probe;
acquiring a first ultrasound image of the phantom;
determining a first spatial relationship between a phantom reference frame and a pixel reference frame, based on the first ultrasound image;
repositioning the ultrasound probe in a second position and orientation relative to the phantom;
measuring the second position and orientation of the ultrasound probe;
acquiring a second ultrasound image of the phantom;
determining a second spatial relationship between the phantom reference frame and the pixel reference frame, based on the second ultrasound image; and
computing a probe calibration matrix based on the first position and orientation of the ultrasound probe, the first spatial relationship, the second position and orientation of the ultrasound probe, and the second spatial relationship.
2. The method of claim 1, further comprising acquiring position and orientation data from a position and angle encoder.
3. The method of claim 1, wherein determining the first spatial relationship includes:
identifying prominent phantom feature points in the first ultrasound image;
identifying a center point of the phantom; and
determining a coordinate transformation between the center point of the phantom and a reference frame corresponding to the ultrasound probe.
4. The method of claim 3, wherein determining a coordinate transformation between the center point of the phantom and a reference frame corresponding to the ultrasound probe includes:
determining a translation between the center point of the phantom and a pixel location in the reference frame of the ultrasound probe; and
determining a rotation between the center point of the phantom and a pixel location in the reference frame of the ultrasound probe.
5. The method of claim 4, wherein determining a rotation between the center point of the phantom and a pixel location in the reference frame of the ultrasound probe includes representing the rotation as a quaternion.
6. The method of claim 1, wherein determining a first spatial relationship includes representing the first spatial relationship as a quaternion.

7. The method of claim 1, wherein determining a first spatial relationship includes representing the first spatial relationship as a transformation matrix.
8. The method of claim 1, wherein computing a probe calibration matrix includes:
assembling the first position and orientation of the ultrasound probe, the first spatial relationship, the second position and orientation of the ultrasound probe, and the second spatial relationship into a closed form formulation; and
extracting a unique solution from a null space corresponding to the closed form formulation.
9. The method of claim 8, wherein extracting a unique solution includes applying a unity constraint to the first nine coefficients representing a rotation component of the calibration matrix.
10. The method of claim 1, wherein computing a probe calibration matrix includes:
assembling the first position and orientation of the ultrasound probe, the first spatial relationship, the second position and orientation of the ultrasound probe, and the second spatial relationship into a closed form formulation;
extracting a rotational component from a null space corresponding to the closed form formulation;
extracting a translational component from the null space corresponding to the closed form formulation; and
extracting a vector of scale factors from the null space corresponding to the closed form formulation.

11. The method of claim 1, wherein measuring the first position and orientation of the ultrasound probe includes using optical markers that are disposed on the ultrasound probe.

12. The method of claim 1, wherein placing the ultrasound probe in a first position relative to a phantom includes placing the ultrasound probe in contact with a docking station, the docking station having a position and orientation that is substantially fixed relative to the phantom.
13. A system for performing intra-operative calibration of an ultrasound probe, comprising:
a position and angle encoder for measuring a position and angle of the ultrasound probe; and
a data system having a computer readable medium encoded with a program for computing a probe calibration matrix according to a closed form formulation, and according to a relative change between a first location in a first ultrasound image and a second location in a second ultrasound image, wherein the first ultrasound image corresponds to a first ultrasound probe position, and the second ultrasound image corresponds to a second ultrasound probe position.
14. The system of claim 13, wherein the first location includes a prominent feature point.
15. The system of claim 13, wherein the first location includes a speckle pattern.
16. The system of claim 13, further comprising a phantom.
17. The system of claim 16, wherein the phantom includes a wire arranged substantially in an N-shape.
18. The system of claim 16, wherein the phantom includes two substantially parallel plates.
19. The system of claim 16, wherein the phantom includes two wedges.
20. The system of claim 13, further comprising a docking station, the docking station having a substantially fixed position and orientation.
21. The system of claim 13, wherein the position and angle encoder includes optical markers that are disposed on the ultrasound probe.