US6306091B1 - Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation - Google Patents
- Publication number
- US6306091B1 (application US09/369,824)
- Authority
- US
- United States
- Prior art keywords
- matrix
- data
- transformation
- data sets
- ultrasound system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52034—Data rate converters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/06—Measuring blood flow
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/488—Diagnostic techniques involving Doppler signals
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/916—Ultrasound 3-D imaging
Definitions
- a microfiche appendix containing a computer program listing is included.
- the total number of microfiche is 1.
- the total number of frames is 26.
- This invention relates to medical diagnostic ultrasound systems and methods capable of constructing three-dimensional representations from disjoint two or three-dimensional image data sets of humans or animals. More particularly, this invention relates to ultrasound systems and methods which provide accurate three-dimensional reconstruction between any two ultrasonic data sets acquired by translating or rotating a transducer along one or more of the six degrees of freedom.
- Another approach to three-dimensional imaging is to collect multiple two-dimensional image data frames using a one-dimensional (“1D”) transducer array. These frames are subsequently assembled into the desired three-dimensional reconstruction using relative positional information. This approach is also used to scan multiple volumes with a 2D transducer. Multiple three-dimensional data sets are obtained and then subsequently assembled into a larger three-dimensional data set.
- 1D one-dimensional
- the preferred embodiments described below relate to an ultrasound system and method for accurately reconstructing three-dimensional image data sets from multiple two or three-dimensional image data sets. More particularly, the presently preferred embodiments relate to an improved ultrasound system and method utilizing affine transformations which adhere to a rigid body criterion.
- quaternions are used to perform the affine transformation.
- orthonormal matrices are used to perform the affine transformation. The resultant three-dimensional reconstructions are rendered more quickly, are more accurate and are less likely to be deformed or skewed.
- FIG. 1 depicts a perspective view of an ultrasound transducer for obtaining three-dimensional ultrasonic image data sets.
- FIG. 2 depicts two representative image planes obtained within the volume of FIG. 1 .
- FIG. 3 depicts a block diagram of an exemplary ultrasound imaging system according to one preferred embodiment.
- FIG. 4 depicts a block diagram of an exemplary three dimensional reconstruction computer.
- FIG. 5 shows a flow chart for estimating a three dimensional rigid body transformation according to a first preferred method.
- FIG. 6 shows a flow chart for estimating a three dimensional rigid body transformation according to a second preferred method.
- a sonographer uses an ultrasound transducer to scan a patient.
- the sonographer holds the ultrasound transducer against the patient's body in a location and at an orientation which scans the desired area.
- the sonographer moves the transducer to different places on the patient's body to acquire the desired images.
- Each image represents a two dimensional slice or three dimensional section through the patient's three dimensional body and organs.
- the word “image” is defined to include both two and three-dimensional ultrasonic image data sets.
- Each image includes a number of data points which make up the image and represent a particular slice/section of the three-dimensional body; the imaged object itself extends out of the plane of the scan.
- the one dimensional transducer 100 is moved along the elevation direction of the subject 102 while acquiring a series of images. Alternatively, the transducer 100 can be held in place while the ultrasound beam is electronically swept through the subject 102 , or this can be combined with moving the transducer 100 .
- the transducer 100 can be of a conventional or specialized type.
- the local displacement of a collection of points from each image to the next is then estimated by any suitable technique, such as speckle and/or feature correlation, speckle decorrelation, time delays, spectral broadening, power or spectral peak.
- the points for which the local displacements are calculated are selected by using a predefined criteria such as points spaced equally along the range direction, points with highest image information content or points placed by the user. Once the local displacements are estimated, the rigid body transformation that best describes the local displacements is determined. From this calculation, the rigid body transformation for any point in the image plane can be calculated.
- the sonographer may use a two-dimensional transducer array to acquire a three-dimensional data set.
- a second three-dimensional data set is acquired from a different view by moving or sweeping the transducer in an arbitrary fashion.
- the displacement of points from the first data set to the second data set can then be estimated using the above techniques.
- the rigid body transformation between the two data sets can also be found using the techniques disclosed herein.
- FIG. 2 shows a pair of two dimensional images, 1 and 2 , in a series of images acquired by moving the transducer 100 as described above and shown in FIG. 1 .
- Points p i and q i can be substantially close to one another or substantially far apart, i.e. the image misalignment can range from very small to very large.
- These points were selected by using the predefined criteria that the points be spaced equally along the range direction. Alternatively, criteria such as points with highest image information content or points placed by the user can also be used.
- the transformation between the images 1 and 2 is found that minimizes the least squared error between points.
- dx i , dy i and dz i are the components of the displacements of each point.
- An affine transformation is a linear transformation which preserves collinearity among the data points in the data set; i.e. if two lines are parallel in image 1 , they are parallel in image 2 as well.
- a rigid body transformation is an affine transformation which preserves distances, lengths and angles and does not substantially deform objects.
- Rigid body transformations are a subset of affine transformations. All rigid body transformations are affine transformations but not all affine transformations are rigid body transformations. While the disclosed embodiments are directed to linear transformations, those skilled in the art will recognize that non-linear transformations can also be used, such as cubic spline or conformal.
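The rigid body criterion can be checked numerically: the 3×3 linear part of the transformation must be orthonormal and must have determinant +1 (a determinant of −1 would indicate a reflection, which also preserves lengths but is not a physical rigid motion). A minimal sketch, assuming NumPy; the function name and tolerance are illustrative, not from the patent:

```python
import numpy as np

def is_rigid_body(A, tol=1e-6):
    """Check whether the 3x3 linear part A of an affine transform
    satisfies the rigid body criterion: A.T @ A == I (lengths and
    angles preserved) and det(A) == +1 (no reflection)."""
    orthonormal = np.allclose(A.T @ A, np.eye(3), atol=tol)
    proper = np.isclose(np.linalg.det(A), 1.0, atol=tol)
    return orthonormal and proper

# A pure rotation about z is rigid; a shear is affine but not rigid.
theta = np.deg2rad(30)
rotation = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
shear = np.array([[1.0, 0.5, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
```

Here the 30° rotation passes while the shear, although a valid affine (collinearity-preserving) transform, fails the orthonormality test.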
- the above method does not enforce the rigid body criterion on the estimates and therefore is not a rigid body transformation.
- Distances, lengths and angles, measured in a first image between two points, will be substantially identical to the distances, lengths and angles measured in the second image between the same two points. If the rigid body criterion is not enforced, the reconstructed images resulting from the above method may suffer from geometric distortion and may be deformed and/or skewed.
- the preferred embodiments described below relate to methods for estimating the three dimensional rigid body transformations between two ultrasonic data sets acquired by translating and/or rotating a transducer that may avoid deformation and/or skewing.
- the preferred embodiments enforce the rigid body criterion on the estimated local displacements. They employ transformation functions classified as being within the rotation group, special orthogonal group of order 3 transformations (“SO(3)”). See C. M. Brown, “Some Computational Properties of Rotation Representations,” Technical Report 303, University of Rochester, Department of Computer Science, Aug. 23, 1989, further describing rotation group SO(3) transformations.
- the disclosed embodiments estimate the rigid body transformation using unit quaternions (also known as Euler-Rodrigues parameters) or orthonormal matrices (also known as Euler angles), both of which are rotation group SO(3) transformations. It will be appreciated by one of ordinary skill in the art that other transformations from this classification can also be used, such as Conical or Cayley-Klein transformations.
- the disclosed methods are applicable to one-dimensional (“1D”), two-dimensional (“2D”), three-dimensional (“3D”) and larger dimensional ultrasonic data sets acquired using single-element, 1D, 1.5D, 2D and larger dimensional transducer arrays.
- each data set can be acquired by a different type of transducer array.
- the acquired data can be B-mode, Tissue Harmonic or any other ultrasonic mode such as color Doppler velocity and energy, or any combination thereof.
- the acquired data can be motion-compensated or not.
- One form of motion compensation is discussed in detail in U.S. Pat. No. 5,910,114, “SYSTEM AND METHOD FOR CORRECTING THE GEOMETRY OF ULTRASONIC IMAGES ACQUIRED WITH A MOVING TRANSDUCER”.
- each image is formed from a data set representing multiple spatial locations.
- the disclosed embodiments find the rigid body transformation of these data points from one data set to another.
- In each data set, the data points have a particular arrangement characteristic. This arrangement characteristic is given by each data point's coordinate location in relation to all of the other data points in the data set and can be expressed as distances, lengths and angles between the data points.
- this arrangement characteristic is preferably preserved in the resultant transformation. That is, as the data points are translated and rotated from one image to the next, the arrangement among the individual data points of the image, relative to the other data points, remains unchanged.
- the ultrasound system 300 includes a transmit beamformer 302 , a transducer 304 , a receive beamformer 306 , a filter block 308 , a signal processor 310 , a scan converter 312 , an image data storage 314 , a three-dimensional reconstruction computer 316 and a display 318 .
- the exemplary ultrasound system 300 is configurable to acquire information corresponding to a plurality of two-dimensional representations or image planes of a subject for three-dimensional reconstruction. Other systems, such as those for acquiring data with a two dimensional, 1.5 dimensional or single element transducer array, may be used.
- the ultrasound system 300 is configured to transmit, receive and process during a plurality of transmit events. Each transmit event corresponds to firing one or more ultrasound scan lines into the subject.
- the transmit beamformer 302 is of a construction known in the art, such as a digital or analog based beamformer capable of generating signals at different frequencies.
- the transmit beamformer 302 generates one or more excitation signals.
- Each excitation signal has an associated center frequency.
- the center frequency represents the frequency in a band of frequencies approximately corresponding to the center of the amplitude distribution.
- the center frequency of the excitation signals is within the 1 to 15 MHz range, such as 2 MHz, and accounts for the frequency response of the transducer 304 .
- the excitation signals preferably have non-zero bandwidth.
- Control signals are provided to the transmit beamformer 302 and the receive beamformer 306 .
- the transmit beamformer 302 is caused to fire one or more acoustic lines in each transmit event, and the receive beamformer 306 is caused to generate in-phase and quadrature (I and Q) information along one or more scan lines.
- I and Q in-phase and quadrature
- real value signals may be generated.
- a complete frame of information corresponding to a two-dimensional representation (a plurality of scan lines) is preferably acquired before information for the next frame is acquired.
- harmonic frequencies are frequencies associated with non-linear propagation or scattering of transmit signals.
- harmonic includes subharmonics and fractional harmonics as well as second, third, fourth, and other higher harmonics.
- Fundamental frequencies are frequencies corresponding to linear propagation and scattering of the transmit signals; the fundamental frequency is also known as the first harmonic.
- Non-linear propagation or scattering corresponds to shifting energy associated with a frequency or frequencies to another frequency or frequencies.
- the harmonic frequency band may overlap the fundamental frequency band.
- the filter block 308 passes information associated with a desired frequency band, such as the fundamental band using fundamental band filter 320 or a harmonic frequency band using the harmonic band filter 322 .
- the filter block 308 may be included as part of the receive beamformer 306 .
- the fundamental band filter 320 and the harmonic band filter 322 preferably comprise one filter that is programmable to pass different frequency bands, such as the fundamental, second or third harmonic bands.
- the filter block 308 demodulates the summed signals to baseband.
- the demodulation frequency is selected in response to the fundamental center frequency or another frequency, such as a second harmonic center frequency.
- the transmitted ultrasonic waveforms are transmitted at a 2 MHz center frequency.
- the summed signals are then demodulated by shifting by either the fundamental 2 MHz or the second harmonic 4 MHz center frequencies to baseband (the demodulation frequency). Other center frequencies may be used. Signals associated with frequencies other than near baseband are removed by low pass filtering.
- the filter block 308 provides band pass filtering.
- the signals are demodulated to an intermediate frequency (IF) (e.g., 2 MHz) or not demodulated, and a band pass filter is used.
- IF intermediate frequency
- signals associated with frequencies other than a range of frequencies centered around the desired frequency or an intermediate frequency (IF) are filtered from the summed signals.
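The demodulation path described above (mix the received band down to baseband, then low-pass filter) can be illustrated as follows. This is a toy NumPy sketch under invented parameters (the sample rate, filter, and test signal are assumptions for the example), not the filter block's actual implementation:

```python
import numpy as np

fs = 40e6                      # sample rate, chosen for the example
f0 = 2e6                       # 2 MHz fundamental center frequency
t = np.arange(4096) / fs
# Toy received signal: a fundamental plus a weaker second harmonic.
rf = np.cos(2 * np.pi * f0 * t) + 0.2 * np.cos(2 * np.pi * 2 * f0 * t)

def demodulate(rf, t, f_demod, taps=255):
    """Shift the band centered at f_demod down to baseband, then
    low-pass filter so only the near-baseband energy survives,
    yielding the complex I/Q signal."""
    mixed = rf * np.exp(-2j * np.pi * f_demod * t)   # heterodyne to baseband
    lp = np.ones(taps) / taps                        # crude moving-average low-pass
    return np.convolve(mixed, lp, mode="same")

iq_fund = demodulate(rf, t, f0)      # selects the fundamental band
iq_harm = demodulate(rf, t, 2 * f0)  # selects the second-harmonic band
```

Demodulating at 2 MHz leaves the fundamental's complex envelope near baseband; demodulating at 4 MHz selects the second-harmonic band instead, with the other band removed by the low-pass stage.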
- the demodulated or filtered signal is passed to the signal processor 310 as the complex I and Q signal, but other types of signals, such as real value signals, may be passed.
- By selectively filtering which frequencies are received and processed, the ultrasound system 300 produces images with varying characteristics.
- In tissue harmonic imaging, no additional contrast agent is added to the target, and only the nonlinear characteristics of the tissue are relied on to create the ultrasonic image.
- Medical ultrasound imaging is typically conducted in a discrete imaging session for a given subject at a given time. For example, an imaging session can be limited to an ultrasound patient examination of a specific tissue of interest over a period of 1 ⁇ 4 to 1 hour, though other durations are possible. In this case, no contrast agent is introduced into the tissue at any time during the imaging session.
- Tissue harmonic images provide a particularly high spatial resolution and often possess improved contrast resolution characteristics. In particular, there is often less clutter in the near field. Additionally, because the transmit beam is generated using the fundamental frequency, the transmit beam profile is less distorted by a specific level of tissue-related phase aberration than a profile of a transmit beam formed using signals transmitted directly at the second harmonic.
- the harmonic imaging technique described above can be used for both tissue and contrast agent harmonic imaging.
- In contrast agent harmonic imaging, any one of a number of well known nonlinear ultrasound contrast agents, such as micro-spheres or the FS069 agent by Schering of Germany, is added to the target or subject in order to enhance the non-linear response of the tissue or fluid.
- the contrast agents radiate ultrasonic energy at harmonics of an insonifying energy at fundamental frequencies.
- the signal processor 310 comprises one or more processors for generating two-dimensional Doppler or B-mode information. For example, a B-mode image, a color Doppler velocity image (CDV), a color Doppler energy image (CDE), a Doppler Tissue image (DTI), a Color Doppler Variance image, or combinations thereof may be selected by a user.
- the signal processor 310 detects the appropriate information for the selected image.
- the signal processor 310 comprises a Doppler processor 324 and a B-mode processor 326 .
- Each of these processors is preferably a digital signal processor and operates as known in the art to detect information.
- the Doppler processor 324 estimates velocity, variance of velocity and energy from the I and Q signals.
- the B-mode processor 326 generates information representing the intensity of the echo signal associated with the I and Q signals.
- the information generated by the signal processor 310 is provided to the scan converter 312 .
- the scan converter 312 includes detection steps as known in the art and described in U.S. application Ser. No. 08/806,922 (Atty. Ref. No. 5050/189), assigned to the assignee of the present invention.
- the scan converter 312 is of a construction known in the art for arranging the output of the signal processor 310 into two-dimensional representations or frames of image data.
- the scan converter 312 outputs formatted video image data frames, using a format such as the DICOM Medical industry image standard format or a TIFF format.
- the plurality of two-dimensional representations are generated.
- Each of the representations corresponds to a receive center frequency, such as a second harmonic center frequency, a type of imaging, such as B-mode, and positional information.
- the harmonic based representations may have better resolution and less clutter than fundamental images. By suppressing the harmonic content of the excitation signal, the benefits of harmonic imaging of tissue may be increased.
- the plurality of two-dimensional representations of the subject are stored in the image data storage 314 .
- the 3D reconstruction computer 316 operates on the stored plurality of two-dimensional representations and assembles them into a three-dimensional representation.
- the 3D computer 316 may also accept pre-scan-converted acoustic data and convert it to 3D data sets.
- the 3D reconstruction computer 316 receives the 2D representations in real time and assembles the 3D reconstruction. The completed 3D reconstruction is then displayed on the display 318 .
- the 3D reconstruction computer 316 includes a local memory 402 , a local motion detector 404 , a transform estimator 406 and a three-dimensional scan converter 408 .
- the local memory 402 receives and stores the 2D representations for 3D reconstruction. Alternatively, the local memory 402 receives and stores 3D representations.
- the local motion detector 404 operates on two successive images at a time. The local motion detector 404 first selects a series of points (e.g. 20) in the first image. These points could be on a rectangular grid and uniformly distributed substantially over the entire image. Alternatively, they could be points corresponding to features in the first image.
- the features could be selected using a number of criteria such as image intensity, maximum local image gradient, entropy or texture content.
- local motion is estimated using the speckle correlation technique as is known in the art.
- speckle correlation technique for each point for which the local motion is computed, a neighborhood of points in the first image is cross-correlated with a similar neighborhood of points in the second image.
- Other methods such as feature correlation, speckle decorrelation, time delays, spectral broadening, power and spectral peak may also be used for local motion detection.
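As a hedged illustration of the speckle correlation technique, the following sketch estimates the integer-pixel displacement of one point by normalized cross-correlation of its neighborhood in the first image with equally sized neighborhoods in the second image over a small search window. NumPy is assumed; the function name, window size and search range are illustrative, not the system's actual correlator:

```python
import numpy as np

def local_displacement(img1, img2, point, half=8, search=4):
    """Estimate the local displacement of `point` (row, col) from img1
    to img2: cross-correlate a (2*half+1)^2 neighborhood in img1 with
    candidate neighborhoods in img2 over a +/-search pixel window and
    return the shift with the highest normalized correlation."""
    r, c = point
    ref = img1[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    ref = ref - ref.mean()
    best, best_shift = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = img2[r + dr - half:r + dr + half + 1,
                        c + dc - half:c + dc + half + 1].astype(float)
            cand = cand - cand.mean()
            score = np.sum(ref * cand) / (np.linalg.norm(ref) *
                                          np.linalg.norm(cand) + 1e-12)
            if score > best:
                best, best_shift = score, (dr, dc)
    return best_shift

# Synthetic speckle-like check: img2 is img1 shifted by (2, -1) pixels.
rng = np.random.default_rng(0)
img1 = rng.random((64, 64))
img2 = np.roll(img1, (2, -1), axis=(0, 1))
```

Repeating this at each selected point yields the collection of local displacement estimates that the transform estimator then fits with a single rigid body transformation.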
- the transform estimator 406 receives the local motion estimates from the local motion detector 404 .
- the transform estimator 406 uses the local motion estimates for the selected set of points in the first image to compute the coordinates of the corresponding set of points in the second image.
- the transform estimator 406 directly receives the coordinates of the selected set of points in the first image and the corresponding set of points in the second image directly from the transducer, which contains a position sensing device such as a magnetic position sensor, optical position sensor, mechanical position sensor, an acoustic position sensor or other sensor or sensors as is known in the art.
- the transform estimator 406 then computes the rigid body transformation using unit quaternions or orthonormal matrices as disclosed by the methods described below. This rigid body transformation is preferably computed so as to minimize the error between points, such as by using a least squares algorithm (minimize the sum of the squared differences) or, alternatively, a weighted least squares algorithm.
- the 3D Scan Converter 408 receives the rigid body transformation data for each successive pair of images from the transform estimator 406 and the image data from the local memory 402 . It then generates a 3D grid data set corresponding to the scanned object and outputs this to the display 318 .
- the first preferred embodiment for computing the rigid body transformation represents data points and transformations as quaternions.
- the points for which the local displacements are calculated are selected by using a predefined criteria such as points spaced equally along the range direction, points with highest image information content or points placed by the user. Let us define p i and q i as quaternions:
- ε i is the error between the predicted and actual i th point in image 2 and r* is the conjugate of r.
- σ i is the standard deviation of the error between the predicted and actual i th point in image 2 .
- σ i may denote the confidence of the i th point being accurate. If the i th point is highly accurate, σ i is either estimated or selected to be low. If the i th point is not very accurate, σ i is large.
- Equation 4 reduces to:
- N 11 = S xx + S yy + S zz (13)
- N 12 = S yz − S zy (14)
- N 13 = S zx − S xz (15)
- N 14 = S xy − S yx (16)
- N 21 = S yz − S zy (17)
- N 22 = S xx − S yy − S zz (18)
- N 23 = S xy + S yx (19)
- N 24 = S zx + S xz (20)
- N 31 = S zx − S xz (21)
- N 32 = S xy + S yx (22)
- N 33 = −S xx + S yy − S zz (23)
- N 34 = S yz + S zy (24)
- N 41 = S xy − S yx (25)
- N 42 = S zx + S xz (26)
- N 43 = S yz + S zy (27)
- N 44 = −S xx − S yy + S zz (28)
- the rotation quaternion, r, is given by the unit eigenvector corresponding to the most positive eigenvalue of the matrix N.
- a closed form solution for the rotation, r, exists and is presented in B. K. P. Horn, “Closed-Form Solution of Absolute Orientation Using Unit Quaternions,” Journal of the Optical Society of America A, vol. 4, pp. 629-42, April 1987.
- other solutions for the rotation, r, known to those skilled in the art, can also be used.
- the computer program contained in the attached appendix (discussed in more detail below) utilizes this closed form solution.
- any point p o in the image 1 can be transformed into its corresponding point q o in image 2 as follows:
- Step 2: compute q̂ o = r p̂ o r* (39)
- In FIG. 5 there is shown a flow chart for an exemplary computer program to compute the rigid body transformation using quaternions.
- the program operates on two data sets representing two ultrasonic images.
- a set of points, p i is determined in the first data set (Block 502 ).
- a corresponding set of points, q i is determined in a second data set (Block 504 ).
- the points for which the local displacements are calculated are selected by using a predefined criteria such as points spaced equally along the range direction, points with highest image information content or points placed by the user.
- COG center of gravity
- a new set of data points, p̂ i, is computed in a new coordinate system centered about this first COG (Block 508 ).
- a second COG is then computed for the second set of data points, q i (Block 510 ), and a new set of data points, q̂ i, is computed in a new coordinate system centered about this second COG (Block 512 ).
- a 4 ⁇ 4 matrix containing linear combinations of weighted moments, N is then formed using equations 12-37 above (Block 514 ).
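The flow of Blocks 502 through 514, together with the eigenvector step described above, can be sketched as follows. This is an illustrative NumPy implementation of Horn's closed-form solution, not the microfiche appendix program; the function names and the quaternion component order (scalar first) are assumptions:

```python
import numpy as np

def rigid_transform_quaternion(p, q, sigma=None):
    """Estimate the rotation quaternion r and the two centers of gravity
    mapping point set p onto q (both Nx3) in the least squares sense:
    compute the COGs (Blocks 506, 510), recenter both sets (Blocks 508,
    512), form the 4x4 matrix N of weighted moments (Block 514), and take
    the unit eigenvector of the most positive eigenvalue of N."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    # Weights 1/sigma_i^2: a low sigma (high confidence) gives a large weight.
    w = np.ones(len(p)) if sigma is None else 1.0 / np.asarray(sigma, float) ** 2
    cog_p = np.average(p, axis=0, weights=w)
    cog_q = np.average(q, axis=0, weights=w)
    ph, qh = p - cog_p, q - cog_q
    S = (w[:, None] * ph).T @ qh          # weighted moments S_ab = sum w p_a q_b
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    N = np.array([                        # equations 13-28
        [Sxx + Syy + Szz, Syz - Szy,       Szx - Sxz,       Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz, Sxy + Syx,       Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,      -Sxx + Syy - Szz, Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,       Syz + Szy,      -Sxx - Syy + Szz]])
    vals, vecs = np.linalg.eigh(N)
    r = vecs[:, np.argmax(vals)]          # unit quaternion (w, x, y, z)
    return r, cog_p, cog_q

def apply_transform(p0, r, cog_p, cog_q):
    """Map a point p0 in image 1 to image 2: recenter about the first COG,
    rotate by r p r* (applied here as the equivalent rotation matrix), and
    recenter about the second COG."""
    w, x, y, z = r
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])
    return R @ (np.asarray(p0, float) - cog_p) + cog_q
```

With noise-free rigid displacements the recovered quaternion reproduces the rotation exactly; with noisy local motion estimates it is the (optionally σ-weighted) least squares fit, and unreliable points can be down-weighted through their σ values.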
- the source code of an exemplary computer program which performs the above algorithm is attached herein as a microfiche appendix.
- This program can be compiled using the GNU C++ compiler, published by the Free Software Foundation, located in Boston, Mass., the Visual C++ compiler, published by Microsoft Corporation, located in Redmond, Wash., or any other C++ compiler. It runs under SunOS, published by Sun Microsystems, Inc., located in Mountain View, Calif., Linux, published by Red Hat, located in Durham, N.C., and Windows NT, published by Microsoft Corporation, located in Redmond, Wash.
- the main function “find_weighted_transformation” accepts 4 parameters: (1) a set of points in the first image, (2) a set of points in the second image, (3) a set of sigmas corresponding to weights and (4) the number of points. It gets these parameters from the module where the points are selected as described elsewhere in the specification.
- This function outputs a record structure called “transform” which contains the two centers of gravity and the rotation quaternion.
- the second preferred embodiment for computing the rigid body transformation utilizes orthonormal matrices to represent data points and transformations.
- An orthonormal matrix is defined as a matrix that, when multiplied by its own transpose, yields the identity matrix.
- the points for which the local displacements are calculated are selected by using a predefined criteria such as points spaced equally along the range direction, points with highest image information content or points placed by the user.
- the orthonormal matrix, R describes the rotation between the two coordinate spaces and vector, t, describes the translation.
- ε i is, as before, the error between the predicted and actual i th point in image 2 .
- σ i is the standard deviation of the error between the predicted and actual i th point in image 2 .
- σ i may denote the confidence of the i th point being accurate. If the i th point is highly accurate, σ i is either estimated or selected to be low. If the i th point is not very accurate, σ i is large.
- equation 43 reduces to:
- the rotation matrix, R is then found in one of two ways:
- the matrix M is first decomposed into its singular values. Utilizing matrix M from equation 51, let:
- any point p o in the image 1 can be transformed into its corresponding point q o in image 2 as follows:
- Step 2: compute q̂ o = R p̂ o (58)
- In FIG. 6 there is shown a flow chart for an exemplary computer program to compute the rigid body transformation using orthonormal matrices.
- the program operates on two data sets representing two ultrasonic images.
- a set of points, p i is determined in the first data set (Block 602 ).
- a corresponding set of points, q i is determined in a second data set (Block 604 ).
- the points for which the local displacements are calculated are selected by using a predefined criteria such as points spaced equally along the range direction, points with highest image information content or points placed by the user.
- COG center of gravity
- a new set of data points, p̂ i, is computed in a new coordinate system centered about this first COG (Block 608 ).
- a second COG is then computed for the second set of data points, q i (Block 610 ), and a new set of data points, q̂ i, is computed in a new coordinate system centered about this second COG (Block 612 ).
- a 3×3 weighted moment matrix, M, is then formed using equation 51 and the corresponding equations from equations 29-37 above (Block 614).
- the eigenvalues and eigenvectors of M^T M can be found (Block 616B).
- the matrix (M^T M)^(-1/2) is then computed (Block 618B).
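Blocks 602 through 618B can be sketched end to end in Python with NumPy. This is a sketch under stated assumptions, not the patented implementation: the function name is hypothetical, the optional `sigma` argument stands in for the per-point 1/σ_i² weighting described above, and `numpy.linalg.eigh` is one way to obtain (M^T M)^(-1/2):

```python
import numpy as np

def estimate_rigid_transform(p, q, sigma=None):
    """Estimate R, t with q_i ≈ R p_i + t from matched Nx3 point sets.

    sigma, if given, holds per-point standard deviations; points with small
    sigma (high confidence) receive large weight, mirroring 1/sigma_i^2.
    """
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    w = np.ones(len(p)) if sigma is None else 1.0 / np.asarray(sigma, float) ** 2
    w = w / w.sum()
    p_cog = w @ p                        # first COG  (Block 606)
    q_cog = w @ q                        # second COG (Block 610)
    p_hat = p - p_cog                    # centered points (Block 608)
    q_hat = q - q_cog                    # centered points (Block 612)
    M = (w[:, None] * q_hat).T @ p_hat   # 3x3 weighted moment matrix (Block 614)
    vals, vecs = np.linalg.eigh(M.T @ M)              # (Block 616B)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T  # (M^T M)^(-1/2) (Block 618B)
    R = M @ inv_sqrt                     # nearest orthonormal matrix to M
    t = q_cog - R @ p_cog                # translation between the two COGs
    return R, t
```

With noise-free correspondences M^T M is symmetric positive definite, so the inverse square root exists and R = M (M^T M)^(-1/2) recovers the rotation exactly; with noisy points it yields the nearest orthonormal matrix to M.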
- the estimation of rigid body transformation between two ultrasonic data sets can be applied to the situation in which the object changes shape while the two data sets are acquired, due to transducer pressure, patient motion or other factors.
- a global rigid body transformation for the entire data set may not be applicable. Therefore, multiple rigid body transformations can be applied in respective small regions in the data set.
- a multitude of local rigid body transformations describing local regions of the data set are computed by applying the disclosed techniques to subsets of the larger ultrasonic data set. These sub-regions may be selected using image segmentation methods based on optical flow, texture, connectivity or border detection. Alternatively, the user may select them manually.
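Once sub-regions are chosen, one transform per region can be fitted independently from the point pairs falling inside it. A minimal sketch with hypothetical names, using the SVD-based least-squares fit of Arun et al. as the per-region estimator:

```python
import numpy as np

def fit_region(p, q):
    """Least-squares R, t for one sub-region (SVD method of Arun et al.)."""
    p_cog, q_cog = p.mean(axis=0), q.mean(axis=0)
    C = (q - q_cog).T @ (p - p_cog)                 # 3x3 correlation of centered points
    U, _, Vt = np.linalg.svd(C)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflections
    R = U @ D @ Vt
    return R, q_cog - R @ p_cog

def local_rigid_transforms(p, q, labels):
    """One (R, t) per sub-region; labels[i] names the region of point pair i."""
    return {r: fit_region(p[labels == r], q[labels == r])
            for r in np.unique(labels)}
```

Each region thus gets its own rigid motion, which together approximate the non-rigid deformation of the whole data set.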
- the unit quaternion, r, in equation 4 and the orthonormal matrix, R, in equation 43 may be replaced with a non-unit quaternion, r̄, and a non-orthonormal matrix, R̄. This is useful for modeling more complex object deformations that can occur between the two data sets.
- the transformation which maps the first image to the second can be an almost rigid body transformation wherein the objects in the images may change shape within pre-defined bounds or limits.
- some parts of the first image can be subject to a rigid body transformation while other parts are subject to a more general linear or non-linear transformation.
- the disclosed methods can be executed on a variety of hardware, including digital and analog processors, and can be run in real time or on stored data.
- the rigid body transformation computations on the ultrasound data can be performed remotely from the ultrasound system, such as on a remote 3D reconstruction computer.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
Description
Claims (65)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/369,824 US6306091B1 (en) | 1999-08-06 | 1999-08-06 | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/369,824 US6306091B1 (en) | 1999-08-06 | 1999-08-06 | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation |
Publications (1)
Publication Number | Publication Date |
---|---|
US6306091B1 true US6306091B1 (en) | 2001-10-23 |
Family
ID=23457086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/369,824 Expired - Lifetime US6306091B1 (en) | 1999-08-06 | 1999-08-06 | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation |
Country Status (1)
Country | Link |
---|---|
US (1) | US6306091B1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020094134A1 (en) * | 2001-01-12 | 2002-07-18 | Nafis Christopher Allen | Method and system for placing three-dimensional models |
US6491632B1 (en) * | 2001-06-26 | 2002-12-10 | Geoffrey L. Taylor | Method and apparatus for photogrammetric orientation of ultrasound images |
US6503201B1 (en) * | 2001-10-03 | 2003-01-07 | Koninklijke Philips Electronics N.V. | Correction of extended field of view images for distortion due to scanhead motion |
US20030186650A1 (en) * | 2002-03-29 | 2003-10-02 | Jung-Tao Liu | Closed loop multiple antenna system |
FR2841342A1 (en) * | 2002-06-21 | 2003-12-26 | Thales Ultrasonics Sas | INPUT STRUCTURE FOR ULTRASOUND ECHOGRAPHY |
US6775405B1 (en) * | 2000-09-29 | 2004-08-10 | Koninklijke Philips Electronics, N.V. | Image registration system and method using cross-entropy optimization |
US20050033173A1 (en) * | 2003-08-05 | 2005-02-10 | Von Behren Patrick L. | Extended volume ultrasound data acquisition |
US20050043619A1 (en) * | 2003-08-20 | 2005-02-24 | Siemens Medical Solutions Usa, Inc. | Computing spatial derivatives for medical diagnostic imaging methods and systems |
US20050096538A1 (en) * | 2003-10-29 | 2005-05-05 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US20060183992A1 (en) * | 2003-06-06 | 2006-08-17 | Olympus Corporation | Ultrasonic endoscope device |
WO2007085048A1 (en) * | 2006-01-24 | 2007-08-02 | Desmond Fitzgerald & Associates Pty Ltd | An improved method of interpolation between a plurality of observed tensors |
US20070255137A1 (en) * | 2006-05-01 | 2007-11-01 | Siemens Medical Solutions Usa, Inc. | Extended volume ultrasound data display and measurement |
US20080021317A1 (en) * | 2006-07-24 | 2008-01-24 | Siemens Medical Solutions Usa, Inc. | Ultrasound medical imaging with robotic assistance for volume imaging |
CN100376215C (en) * | 2003-06-09 | 2008-03-26 | Ge医药系统环球科技公司 | Method of sector probe driving and ultrasound diagnostic apparatus |
CN101183460B (en) * | 2007-11-27 | 2010-10-13 | 西安电子科技大学 | Color picture background clutter quantizing method |
US20130116562A1 (en) * | 2011-11-09 | 2013-05-09 | Samsung Electronics Co., Ltd. | Method and apparatus for generating diagnostic image and medical image system |
US20150148652A1 (en) * | 2012-07-17 | 2015-05-28 | Canon Kabushiki Kaisha | Object information acquiring apparatus and control method thereof |
US20150276907A1 (en) * | 2014-03-31 | 2015-10-01 | Toshiba Medical Systems Corporation | PSEUDO-CONTINUOUS ASYMMETRIC SIGNAL TARGETING ALTERNATING RADIO FREQUENCY (pASTAR) FOR MAGNETIC RESONANCE ANGIOGRAPHY |
US20180116635A1 (en) * | 2015-03-31 | 2018-05-03 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
US10034657B2 (en) | 2013-07-26 | 2018-07-31 | Siemens Medical Solutions Usa, Inc. | Motion artifact suppression for three-dimensional parametric ultrasound imaging |
CN110327048A (en) * | 2019-03-11 | 2019-10-15 | 浙江工业大学 | A kind of human upper limb posture reconstruction system based on wearable inertial sensor |
US11364012B2 (en) * | 2017-05-31 | 2022-06-21 | Bk Medical Aps | 3-D imaging via free-hand scanning with a multiplane US transducer |
US11398082B2 (en) * | 2017-01-26 | 2022-07-26 | Mindesk S.r.l. | Affine transformations of 3D elements in a virtual environment using a 6DOF input device |
US20230086369A1 (en) * | 2020-06-10 | 2023-03-23 | Chison Medical Technologies Co., Ltd. | Ultrasound Imaging Device and System and Breast Ultrasound Apparatus |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5655535A (en) | 1996-03-29 | 1997-08-12 | Siemens Medical Systems, Inc. | 3-Dimensional compound ultrasound field of view |
WO1998025509A2 (en) | 1996-12-10 | 1998-06-18 | Medsim Ltd. | A method of mosaicing ultrasonic volumes for visual simulation |
US5776067A (en) * | 1996-01-19 | 1998-07-07 | Hitachi Medical Corporation | Method of displaying a biplane image in real time and an ultrasonic diagnosing apparatus for displaying the biplane in real time |
US5871013A (en) * | 1995-05-31 | 1999-02-16 | Elscint Ltd. | Registration of nuclear medicine images |
US5899861A (en) | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US5910114A (en) | 1998-09-30 | 1999-06-08 | Siemens Medical Systems, Inc. | System and method for correcting the geometry of ultrasonic images acquired with a moving transducer |
US6064904A (en) * | 1997-11-28 | 2000-05-16 | Picker International, Inc. | Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures |
US6106466A (en) * | 1997-04-24 | 2000-08-22 | University Of Washington | Automated delineation of heart contours from images using reconstruction-based modeling |
US6126600A (en) * | 1994-12-02 | 2000-10-03 | Oxaal; John T | Ultrasound image assisted administering of medication |
1999
- 1999-08-06 US US09/369,824 patent/US6306091B1/en not_active Expired - Lifetime
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6126600A (en) * | 1994-12-02 | 2000-10-03 | Oxaal; John T | Ultrasound image assisted administering of medication |
US5899861A (en) | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US5871013A (en) * | 1995-05-31 | 1999-02-16 | Elscint Ltd. | Registration of nuclear medicine images |
US5776067A (en) * | 1996-01-19 | 1998-07-07 | Hitachi Medical Corporation | Method of displaying a biplane image in real time and an ultrasonic diagnosing apparatus for displaying the biplane in real time |
US5655535A (en) | 1996-03-29 | 1997-08-12 | Siemens Medical Systems, Inc. | 3-Dimensional compound ultrasound field of view |
WO1998025509A2 (en) | 1996-12-10 | 1998-06-18 | Medsim Ltd. | A method of mosaicing ultrasonic volumes for visual simulation |
US6106466A (en) * | 1997-04-24 | 2000-08-22 | University Of Washington | Automated delineation of heart contours from images using reconstruction-based modeling |
US6064904A (en) * | 1997-11-28 | 2000-05-16 | Picker International, Inc. | Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures |
US5910114A (en) | 1998-09-30 | 1999-06-08 | Siemens Medical Systems, Inc. | System and method for correcting the geometry of ultrasonic images acquired with a moving transducer |
Non-Patent Citations (7)
Title |
---|
B. K. P. Horn, "Closed-Form Solution of Absolute Orientation Using Unit Quaternions," Journal of Optical Society of America A, vol. 4, pp. 629-642, Apr. 1987. |
B.K.P. Horn, H. M. Hilden, and S. Negahdaripour, "Closed-Form Solution of Absolute Orientation Using Orthonormal Matrices," Journal of Optical Society of America A, vol. 5, pp. 1127-1135, Jul. 1988. |
C. M. Brown, "Some Computational Properties of Rotation Representations," Technical Report 303, University of Rochester, Department of Computer Science, Aug. 23, 1989. |
J. A. Hossack, J. W. Sliwa, S. H. Maslak, E.A. Gardner, G.L. Holley, and D. J. Napolitano, "Multiple Ultrasound Image Registration System, Method and Transducer." |
J. D. Foley, A. van Dam, S.K. Feiner, and J. F. Hughes, "Computer Graphics: Principles and Practice." New York: Addison-Wesley, 1990. |
J. M. Fitzpatrick, J.B. West, and C.R. Maurer, "Predicting Error in Rigid-Body Point-Based Registration," IEEE Transactions on Medical Imaging, vol. 15, pp. 694-702, Oct. 1998. |
K. S. Arun, T.S. Huang, and S. D. Blostein, "Least-Squares Fitting of Two 3D Point Sets," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-9, No. 5, pp. 698-700, 1987. |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6775405B1 (en) * | 2000-09-29 | 2004-08-10 | Koninklijke Philips Electronics, N.V. | Image registration system and method using cross-entropy optimization |
US20020094134A1 (en) * | 2001-01-12 | 2002-07-18 | Nafis Christopher Allen | Method and system for placing three-dimensional models |
US6491632B1 (en) * | 2001-06-26 | 2002-12-10 | Geoffrey L. Taylor | Method and apparatus for photogrammetric orientation of ultrasound images |
US6503201B1 (en) * | 2001-10-03 | 2003-01-07 | Koninklijke Philips Electronics N.V. | Correction of extended field of view images for distortion due to scanhead motion |
US20030186650A1 (en) * | 2002-03-29 | 2003-10-02 | Jung-Tao Liu | Closed loop multiple antenna system |
FR2841342A1 (en) * | 2002-06-21 | 2003-12-26 | Thales Ultrasonics Sas | INPUT STRUCTURE FOR ULTRASOUND ECHOGRAPHY |
WO2004001444A1 (en) * | 2002-06-21 | 2003-12-31 | Thales Ultrasonics Sas | Input arrangement for ultrasonic echography |
US20050256409A1 (en) * | 2002-06-21 | 2005-11-17 | Thales Ultrasonics Sas | Input arrangement for ultrasonic echography |
US20060183992A1 (en) * | 2003-06-06 | 2006-08-17 | Olympus Corporation | Ultrasonic endoscope device |
CN100376215C (en) * | 2003-06-09 | 2008-03-26 | Ge医药系统环球科技公司 | Method of sector probe driving and ultrasound diagnostic apparatus |
US20050033173A1 (en) * | 2003-08-05 | 2005-02-10 | Von Behren Patrick L. | Extended volume ultrasound data acquisition |
US7033320B2 (en) * | 2003-08-05 | 2006-04-25 | Siemens Medical Solutions Usa, Inc. | Extended volume ultrasound data acquisition |
US7037263B2 (en) | 2003-08-20 | 2006-05-02 | Siemens Medical Solutions Usa, Inc. | Computing spatial derivatives for medical diagnostic imaging methods and systems |
US20050043619A1 (en) * | 2003-08-20 | 2005-02-24 | Siemens Medical Solutions Usa, Inc. | Computing spatial derivatives for medical diagnostic imaging methods and systems |
US20050096538A1 (en) * | 2003-10-29 | 2005-05-05 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US20090030316A1 (en) * | 2003-10-29 | 2009-01-29 | Chomas James E | Image plane stabilization for medical imaging |
US7993272B2 (en) | 2003-10-29 | 2011-08-09 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US7998074B2 (en) | 2003-10-29 | 2011-08-16 | Siemens Medical Solutions Usa, Inc. | Image plane stabilization for medical imaging |
US8301379B2 (en) | 2006-01-24 | 2012-10-30 | Desmond Fitzgerald & Associates Pty Ltd | Method of interpolation between a plurality of observed tensors |
WO2007085048A1 (en) * | 2006-01-24 | 2007-08-02 | Desmond Fitzgerald & Associates Pty Ltd | An improved method of interpolation between a plurality of observed tensors |
JP2009524070A (en) * | 2006-01-24 | 2009-06-25 | デスモンド フィッツジェラルド アンド アソシエイツ ピーティーワイ エルティディー | Improved interpolation method between multiple observed tensors |
CN101375182B (en) * | 2006-01-24 | 2012-01-11 | 德斯蒙德-菲茨杰拉德联合有限公司 | An improved method of interpolation between a plurality of observed tensors |
AU2007209763B2 (en) * | 2006-01-24 | 2012-08-16 | Desmond Fitzgerald & Associates Pty Ltd | An improved method of interpolation between a plurality of observed tensors |
US20070255137A1 (en) * | 2006-05-01 | 2007-11-01 | Siemens Medical Solutions Usa, Inc. | Extended volume ultrasound data display and measurement |
US20080021317A1 (en) * | 2006-07-24 | 2008-01-24 | Siemens Medical Solutions Usa, Inc. | Ultrasound medical imaging with robotic assistance for volume imaging |
CN101183460B (en) * | 2007-11-27 | 2010-10-13 | 西安电子科技大学 | Color picture background clutter quantizing method |
US20130116562A1 (en) * | 2011-11-09 | 2013-05-09 | Samsung Electronics Co., Ltd. | Method and apparatus for generating diagnostic image and medical image system |
US20150148652A1 (en) * | 2012-07-17 | 2015-05-28 | Canon Kabushiki Kaisha | Object information acquiring apparatus and control method thereof |
US10413192B2 (en) * | 2012-07-17 | 2019-09-17 | Canon Kabushiki Kaisha | Object information acquiring apparatus and control method thereof |
US11426078B2 (en) | 2012-07-17 | 2022-08-30 | Canon Kabushiki Kaisha | Object information acquiring apparatus and control method thereof |
US10856851B2 (en) | 2013-07-26 | 2020-12-08 | Siemens Medical Solutions Usa, Inc. | Motion artifact suppression for three-dimensional parametric ultrasound imaging |
US10034657B2 (en) | 2013-07-26 | 2018-07-31 | Siemens Medical Solutions Usa, Inc. | Motion artifact suppression for three-dimensional parametric ultrasound imaging |
US20150276907A1 (en) * | 2014-03-31 | 2015-10-01 | Toshiba Medical Systems Corporation | PSEUDO-CONTINUOUS ASYMMETRIC SIGNAL TARGETING ALTERNATING RADIO FREQUENCY (pASTAR) FOR MAGNETIC RESONANCE ANGIOGRAPHY |
US9702954B2 (en) * | 2014-03-31 | 2017-07-11 | Toshiba Medical Systems Corporation | Pseudo-continuous asymmetric signal targeting alternating radio frequency (pASTAR) for magnetic resonance angiography |
US20180116635A1 (en) * | 2015-03-31 | 2018-05-03 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
US11006927B2 (en) * | 2015-03-31 | 2021-05-18 | Koninklijke Philips N.V. | Ultrasound imaging apparatus |
US11398082B2 (en) * | 2017-01-26 | 2022-07-26 | Mindesk S.r.l. | Affine transformations of 3D elements in a virtual environment using a 6DOF input device |
US11364012B2 (en) * | 2017-05-31 | 2022-06-21 | Bk Medical Aps | 3-D imaging via free-hand scanning with a multiplane US transducer |
CN110327048B (en) * | 2019-03-11 | 2022-07-15 | 浙江工业大学 | Human upper limb posture reconstruction system based on wearable inertial sensor |
CN110327048A (en) * | 2019-03-11 | 2019-10-15 | 浙江工业大学 | A kind of human upper limb posture reconstruction system based on wearable inertial sensor |
US20230086369A1 (en) * | 2020-06-10 | 2023-03-23 | Chison Medical Technologies Co., Ltd. | Ultrasound Imaging Device and System and Breast Ultrasound Apparatus |
US12089998B2 (en) * | 2020-06-10 | 2024-09-17 | Chison Medical Technologies Co., Ltd. | Ultrasound imaging device and system and breast ultrasound apparatus |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6306091B1 (en) | Diagnostic medical ultrasound systems and methods utilizing estimation of 3-dimensional rigid body transformation | |
US7033320B2 (en) | Extended volume ultrasound data acquisition | |
US6780152B2 (en) | Method and apparatus for ultrasound imaging of the heart | |
US20070255137A1 (en) | Extended volume ultrasound data display and measurement | |
US6988991B2 (en) | Three-dimensional ultrasound imaging method and apparatus using lateral distance correlation function | |
US6443894B1 (en) | Medical diagnostic ultrasound system and method for mapping surface data for three dimensional imaging | |
US6511426B1 (en) | Medical diagnostic ultrasound system and method for versatile processing | |
US6527717B1 (en) | Tissue motion analysis medical diagnostic ultrasound system and method | |
US6322505B1 (en) | Medical diagnostic ultrasound system and method for post processing | |
US6482161B1 (en) | Medical diagnostic ultrasound system and method for vessel structure analysis | |
EP0881506A2 (en) | Three dimensional M-mode ultrasonic diagnostic imaging system | |
US20110079082A1 (en) | Extended field of view ultrasonic imaging with a two dimensional array probe | |
US20110079083A1 (en) | Extended field of view ultrasonic imaging with guided efov scanning | |
Zhang et al. | Extension of Fourier-based techniques for ultrafast imaging in ultrasound with diverging waves | |
US20100004540A1 (en) | Dual path processing for optimal speckle tracking | |
CN104584074A (en) | Coupled segmentation in 3D conventional ultrasound and contrast-enhanced ultrasound images | |
US10679349B2 (en) | Method and system for estimating motion between images, particularly in ultrasound spatial compounding | |
WO2007046074A1 (en) | Ultrasonic imaging system and method | |
JP7346586B2 (en) | Method and system for acquiring synthetic 3D ultrasound images | |
US10856851B2 (en) | Motion artifact suppression for three-dimensional parametric ultrasound imaging | |
US6458082B1 (en) | System and method for the display of ultrasound data | |
Jørgensen et al. | Performance assessment of row–column transverse oscillation tensor velocity imaging using computational fluid dynamics simulation of carotid bifurcation flow | |
US20110054323A1 (en) | Ultrasound system and method for providing an ultrasound spatial compound image considering steering angle | |
US6306092B1 (en) | Method and apparatus for calibrating rotational offsets in ultrasound transducer scans | |
Bottenus et al. | Resolution and speckle reduction in cardiac imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ACUSON CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUMANAWEERA, THILAKA S.;HOSSACK, JOHN A.;REEL/FRAME:010158/0053;SIGNING DATES FROM 19990730 TO 19990802 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction | ||
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
AS | Assignment |
Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC.,PENNSYLVANIA Free format text: CHANGE OF NAME;ASSIGNOR:SIEMENS MEDICAL SYSTEMS, INC.;REEL/FRAME:024563/0051 Effective date: 20010801 |
|
AS | Assignment |
Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC., PENNSYLVANIA Free format text: RE-RECORD TO CORRECT CONVEYING PARTY NAME PREVIOUSLY RECORDED AT REEL 024563 FRAME 0051;ASSIGNORS:ACUSON CORPORATION;ACUSON LLC;ACUSON CORPORATION;SIGNING DATES FROM 20021218 TO 20050926;REEL/FRAME:024651/0673 |
|
FPAY | Fee payment |
Year of fee payment: 12 |