EP2945542A1 - Method, device and system for complete examination of tissue samples using portable imaging devices with mounted cameras - Google Patents

Method, device and system for complete examination of tissue samples using portable imaging devices with mounted cameras

Info

Publication number
EP2945542A1
Authority
EP
European Patent Office
Prior art keywords
image
scan
images
pixel
spacing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14740355.4A
Other languages
English (en)
French (fr)
Inventor
Philip E. Eggers
Scott P. Huntley
Eric A. Eggers
Bruce A. Robinson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tractus Corp
Original Assignee
Tractus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tractus Corp filed Critical Tractus Corp
Publication of EP2945542A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: Determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254: Determining the position of the probe using sensors mounted on the probe
    • A61B 8/54: Control of the diagnostic device
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0825: Detecting organic movements or changes for diagnosis of the breast, e.g. mammography
    • A61B 8/0833: Detecting or locating foreign bodies or organic structures
    • A61B 8/085: Locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4405: Device being mounted on a trolley
    • A61B 8/4444: Constructional features related to the probe
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Processing of medical diagnostic data
    • A61B 8/5238: Combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5246: Combining images from the same or different imaging techniques, e.g. color Doppler and B-mode

Definitions

  • Embodiments described relate generally to medical imaging and methods and devices for ensuring adequate quality and coverage of scanned and recorded images. In another aspect, embodiments described relate to reducing review time of scanned and recorded images from an imaging session or procedure.
  • The field is called radiology because of the historical use of radiation-based imaging techniques to view internal structures of the human body.
  • the origin of radiology is traditionally credited to Wilhelm Röntgen, a German physicist who discovered X-radiation (electromagnetic radiation with wavelengths of 0.01 to 10 nanometers and energy levels ranging from 100 eV to 100 keV) in 1895 as a result of his research on cathode ray tubes.
  • Dr. Röntgen discovered that radiation emitted from the cathode ray tubes could pass through some forms of human tissue with varying degrees of absorption and that the X-radiation could expose photographic film.
  • Effectiveness is the ability of the device or method to image internal structures and present the image viewer with sufficient information on the internal structure to make a medical decision. If a radiologist wishes to examine the knee joint of a patient presenting with complaints of pain, the effective imaging device or method will be able to distinguish the internal structures of the knee in a way that will allow the radiologist to determine the nature of the complaint. If it is a fractured bone, the image must display, in some fashion, both the bone and the fracture. If it is a torn meniscus, the image must display, in some fashion, the bone structure with the attached meniscus, and the tear in the meniscus.
  • Efficiency is a measure of the resources required to perform an effective procedure. If a device or method can replicate the effectiveness of an existing device or method and, because of an advance in materials, manufacturing method, or other factors, lower the cost of the device, then the decreased cost in performing the same function, or increase in efficiency, is a useful feature of the advancement. If a device or method can replicate the effectiveness of an existing device or method and, because of an advance in the functional design, can reduce the overall time required to perform the procedure, or if that advancement can shift the time requirements away from more highly trained and skilled personnel to less highly trained and skilled personnel, then the resource shifting is an increase in efficiency which is a useful feature of the advancement.
  • Embodiments described herein provide for devices and methods for recording manually-obtained medical images so that they may be reviewed at a later time.
  • the term "manual" is non-limiting and includes utilizing a device in which the image detection mechanism is designed to be used when held by the human hand.
  • Some embodiments are directed to solving the problem of recording scans that adequately capture information needed for a physician or other trained reviewer to properly screen or diagnose a patient.
  • some embodiments provide for devices and methods for alerting an ultrasound operator if the distance between scanned images exceeds a maximum distance. In such cases, the operator will be alerted to rescan to ensure completeness of the imaging.
  • Some embodiments described reduce the review time expended by reducing the number of images for review or the amount of time allocated for each image in the review. In such cases, these devices and methods allow the more highly trained image reviewer to be uncoupled from the time-consuming aspects of image acquisition and to focus on the tasks associated with image interpretation, and they allow the operators to benefit from the reduction in time consumed by more highly skilled personnel.
  • Embodiments described provide for devices and methods for recording and reviewing medical images for the purpose of diagnostic and screening image review. Applications of the described embodiments include use in screening and diagnosing many cancer types, such as cancer of the prostate, liver, pancreas, etc.
  • While the discussion below may reference breast cancer detection for describing embodiments and aspects of the invention, it should be understood, however, that the device has utility in the early discovery of other types of cancers and that omitting those cancers from this discussion does not limit the scope of the current invention.
  • the described embodiments are applicable to medical imaging in general and are not limited to any specific application provided as an example herein.
  • among women, breast cancer is a leading cause of cancer death. While methods for detecting and treating breast cancer initially were crude and unsophisticated, advanced instrumentation and procedures are now available which provide more positive outcomes for patients.
  • the technology is an emission-reflection-detection technology rather than an emission-absorption-detection technology, as is the case of the mammogram, and since the sonic energy source transmits in multiple frequencies, each frequency interacting with the tissue differently, ultrasound is not as subject to shadowing phenomena as is X-ray. Ultrasound is also one of the most prominent manual imaging technologies. That is, rather than the energy transmission and detection structures being mechanically fixed in place by other structure, the transmission and detection mechanisms are packaged in a single device which may be held in the human hand. The portability and small size of the device mean that it can be used in locations, both geographic and anatomic, that are difficult for larger, more expensive imaging devices such as X-ray and MRI.
  • Medical imaging applications may be generally considered to fall in to one of three categories: (1) screening of asymptomatic patients, (2) diagnostic evaluation of symptomatic patients (i.e., those presenting symptoms discovered through the screening process, or outside of the screening process because they did not participate in a screening program or the screening program failed them), and (3) guidance for therapeutic procedures (i.e., those patients whose symptoms were confirmed, by the diagnostic testing process, to require some form of treatment).
  • the clinical needs for each of these applications differ significantly, as do the needs, applications, and methods of the imaging techniques used in the three procedures.
  • the physician is already concerned with, and desires to characterize, a particular structure which has been previously characterized as "abnormal".
  • the suspected abnormality is typically a result of a physical finding, such as the physical palpation of a lump in a particular location in the breast, a complaint of pain in a particular location in the breast, the appearance of some sort of deformity, such as skin thickening, skin distortion, abnormal nipple discharge, or the appearance of an abnormal structure on a screening imaging examination, such as a mammogram.
  • the physician is not concerned with structures other than the identified region of interest.
  • the diagnostic examination is not only confined to the particular breast in which the abnormality was identified, but it is confined to the one particular quadrant of the particular breast in which the abnormality was found.
  • There may be abnormalities in the other seven quadrants (there are four quadrants per breast).
  • There may even be cancers in the other seven quadrants, but it is not the purpose of the diagnostic examination to find those possible, but previously unidentified, lesions.
  • the purpose of the diagnostic examination is to characterize known lesions in known locations.
  • the screening examination differs from the diagnostic examination because (1) it is performed on an asymptomatic patient (that is, a patient who is considered healthy), so the physician expects all of the internal structures to be normal, and (2) it is performed on the entire structure, not just a localized area with a predetermined abnormality.
  • the physician expects normal tissue because the patient is asymptomatic, but he or she also expects normal tissue because the vast majority of patients have no abnormalities. In the case of breast cancer screening in the United States, only 3 to 5 patients per 1,000 screened have cancer.
  • the Mammographer will compress the breast tissue between two paddles to pull as much of the breast as possible away from the chest wall to bring that tissue within the field of the X-ray source and X-ray detector.
  • the X-ray source and X-ray detector are fixed in space and the patient tissue is immobilized within the field of exposure. The process requires significant patient manipulation and tissue distortion to pull the mammary tissue as far into the field of view of the X-ray radiation emitting and detecting imaging device as is possible.
  • the image is a collection of "shadows" of structures within the breast and the entirety of the three-dimensional structure of the breast is reduced to a single two-dimensional image.
  • the radiologist can tell with a single view whether the mammogram represents the entire breast.
  • Mapping refers to determining the location of various tissue structures.
  • the ability to map the images is critical because the device is not effective in practice if an abnormality is identified, but the physician does not know where it is within the patient's anatomy. Different portions of a three-dimensional object may be seen in different discrete images.
  • the relative position of the slice is only known if the relative position of the patient to the imaging device is known when that image is obtained. Mapping can range from something as simple as identifying which limb was imaged by the X-ray, to accurate, three-dimensional location of small structures in the complex structure of the complete anatomy.
  • a lesion in the "upper-outer" quadrant is one that is located in the part of the breast which is nearest the shoulder and which presents lateral to the nipple ("outer") on the cranio-caudad view and above the nipple ("upper") on the medial-lateral-oblique view.
  • Another family of imaging devices maps the cellular tissue by taking more than one image on sequential parallel planes as a robotic element translates the imaging apparatus over the portion of the patient's anatomy which is to be studied. Each image is a slice, or cross-section of the region of cellular tissue that is to be imaged.
  • Computed Tomographic X-ray (CT) and Magnetic Resonance Imaging (MRI) image multiple "slices" or cross sections of the anatomy. Each slice, or frame, is a discrete image which describes all of the structures contained within that cross section, but does not describe information contained in adjacent slices.
  • Computed Tomographic X-ray (CT) systems use a mechanism to move the X-ray source and detector over the entire body of the patient. Magnetic Resonance Imaging devices require the patient to lie immobilized, possibly in a prone position, while he or she is literally moved, in totality, past the imaging structure. The rate of translation of that movement is controlled by a mechanical mechanism.
  • Both of these devices use a form of robotics to control the translation of the imaging device to the patient, or the translation of the patient to the imaging device, so that each image may be mapped.
  • the robotic control is designed to incorporate a real-time feedback mechanism to direct the path of the scanning and receiving mechanisms and to direct the speed at which the scanning and receiving mechanisms translate.
  • the goal of this real-time control is to assure that there is complete coverage (the path follows the directed course) and that the images are evenly spaced (to assure appropriate resolution).
  • the primary purpose for controlling the speed is that most recording devices record at regular time intervals.
  • With a constant recording interval (e.g., frames/sec) and a constant translation speed (e.g., mm/sec), the recorded images are evenly spaced.
  • the location of the manual imaging device is not controlled by an external mechanical structure when that device obtains the image.
  • the device does not know where the imaging component is in space if the device does not know where the hand holding the device is in space. Therefore it does not know where the image is in space.
  • One way that this problem has been addressed is to retrofit manual devices with location sensors that will provide spatial information for the images. For example, to obtain regularly spaced images which cover the desired area with a manual scan, the human operator is substituted for the robotic controls, and information from the location sensors is used to direct the human being, dynamically and in real time while he or she is scanning, to adjust the position, angle, and speed of the probe as it translates over the patient.
  • the probe will translate over the skin at a constant speed and the images will be recorded at regular intervals.
  • One drawback of this approach is that there is no quality control to assure that the user responded to the prompts appropriately and that the images are actually being recorded at regular intervals. The situation is exacerbated if the program just assumes that the user made the adjustments and saves the images at the presumed locations and does not confirm actual spacing of the images.
  • Zooming in on the image does not change the resolution. If one expanded one quarter of the screen to fit the size of the entire screen, then the entire screen would only contain 171 by 120 pixels of information. The display would still be 704 by 480 pixels, but the expanded image would not contain more information and the single pixels of a single color that were in the smaller image would be presented as four adjacent pixels, each of the same color. In effect the individual small pixels would be replaced by larger "pixels", but the resolution would not change by making that portion of the screen larger.
  • Modern high definition (HD) Television presents images in a 1920 by 1080 pixel format. When one adjusts for changes in aspect ratios (16:9 instead of 4:3), the modern television image can resolve structures which are 2.5 times smaller than the 20th Century 704 by 480 pixel broadcast models. The modern high definition television could distinguish, or resolve, that human hair.
  • the level of resolution can vary along dimensional axes.
  • a standard ultrasound system the iU22, Philips Healthcare, Andover, MA, USA
  • the system may be set to image variable depths of tissue.
  • the design of the system allows it to produce more than one pixel per element and the image is displayed on a video monitor in a format which is 600 pixels by 400 pixels, with each pixel representing a unique tissue structure in the space of the plane of the image.
  • an ultrasound image acquired from this system with a depth setting of 5 cm would have a resolution of 11.5 pixels/mm in the horizontal, or X, axis and 8.0 pixels/mm in depth, or the Y axis.
  • Changing the depth setting to 4 cm would change the Y pixel resolution to 10.0 pixels/mm (the X pixel density would remain unchanged).
  • the translational resolution can differ greatly from the resolution presented in the planar presentation of each discrete image. Even if the resolution of the X-Y presentation of any one discrete image is sufficient to distinguish 1mm structures, it is possible for a 1mm structure to be missed entirely if the space, or "Z" vector, between the discrete images is greater than 1mm.
  • the intra-image resolution is sufficient.
  • the early CT devices had 8 discrete images. Although any single X-Y slice could resolve lesions as small as a millimeter, the inter-slice spacing made resolution of lesions smaller than 8.6 mm unreliable. Modern 64-slice CT devices have a 0.5 mm inter-slice spacing, making the ability to diagnose millimeter-sized lesions possible.
  • the individual image slices are referred to as "discrete images” while the set of discrete images obtained in a single scan sequence are referred to as a "set of discrete images” or a “scan track”.
  • “scan” or “scan sequence” or “scan path” or “set of discrete images” are used in some embodiments to refer to a plurality of images recorded sequentially as the hand-held imaging probe is placed in contact with the patient and is moved from one location to another location on the patient.
  • Cartesian coordinates are essential when mapping tissue images and determining resolution. Since the discrete images are typically presented in a two-dimensional format, whether on paper or on a video screen, mapping of that format is typically presented in a manner compatible with the X and Y axes of a Cartesian coordinate system. For example, the previously described Philips ultrasound device displays the images on a video monitor in a format which is 600 pixels by 400 pixels. Thus, an ultrasound image acquired from this system (which has a probe width of 5.2 cm), with a depth setting of 5 cm, would have a pixel spacing of 0.087 mm/pixel in the X axis and 0.125 mm/pixel in the Y axis.
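  • As a minimal sketch (not taken from the patent), the in-plane resolution arithmetic described above can be written out as follows, using the example values of a 600 by 400 pixel display, a 5.2 cm wide probe, and a selectable depth setting; the function name and parameters are illustrative.

    def in_plane_resolution(width_mm, depth_mm, px_x=600, px_y=400):
        """Return ((pixels/mm X, pixels/mm Y), (mm/pixel X, mm/pixel Y))."""
        res_x = px_x / width_mm   # pixels per mm across the probe face (X axis)
        res_y = px_y / depth_mm   # pixels per mm into the tissue (Y axis)
        return (res_x, res_y), (1.0 / res_x, 1.0 / res_y)

    px_per_mm, mm_per_px = in_plane_resolution(width_mm=52.0, depth_mm=50.0)
    print(px_per_mm)   # ~(11.5, 8.0) pixels/mm at a 5 cm depth setting
    print(mm_per_px)   # ~(0.087, 0.125) mm/pixel
    # Changing the depth setting to 4 cm changes only the Y resolution (~10.0 pixels/mm).
    print(in_plane_resolution(width_mm=52.0, depth_mm=40.0)[0])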
  • a second image in the sequence would also represent a tissue slice that is 5.2 cm by 5 cm.
  • the corresponding pixels are the pixels which are at the same X-Y coordinate in both images.
  • the X-Y location of the first pixel of the first row of one image corresponds to the X-Y location of the first pixel of the first row of the second image; the X-Y location of the second pixel of the first row of the first image corresponds to the X-Y location of the second pixel of the first row of the second image, and so forth until the X-Y location of the last pixel of the last row of the first image, which corresponds to the X-Y location of the last pixel of the last row of the second image.
  • Hand-held imaging devices rely on a human operator to translate the imaging probe over the tissue to be examined and present resolution challenges that are very different from the robotic devices.
  • the X-Y resolution of a single image may be comparable to another method.
  • the pixel spacing in modern ultrasound systems is 0.125mm, approximately the same as a mammogram.
  • the primary challenges in the efficacy of a hand-held device are the ability to map individual images, the ability to resolve between the discrete images in the image set, and to determine whether the family of image sets represents complete coverage of the structure.
  • Coverage is a description of the extent of the field of imaging, not the quality of the imaging.
  • An X-ray of the kidney which images only half of the kidney may have finely detailed resolution, but it does not cover the entire kidney.
  • a blurry mammogram of the entire breast "covers" the entire breast, but may not do so with adequate resolution to be a useful examination.
  • the term "coverage” is not intended to be limited to any particular meaning.
  • the term broadly includes, at least, the distance, surface, volume, area, etc. that is imaged during a medical imaging session.
  • determining coverage of a scan would include evaluating whether there are any gaps in the relative positions of the images contained in (between) two or more scan track sets (e.g. scan-to-scan spacing or distance).
  • resolution describes at least the X-Y and X-Y-Z resolution of each individual image and the relative spacing of the discrete images within a single scan track (e.g., image-to-image spacing or distance).
  • Robotic devices have previously been used to achieve coverage because the desired field of view is predetermined, the systems are able to calculate the appropriate translational scan paths to encompass that field of view, and they are programmed to translate the energy scanning and receiving elements along the predetermined paths.
  • manual imaging devices are operated based on the technical experience and subjective judgment of the human operator.
  • the quality, particularly coverage, of the scanned recorded images varies widely depending on the operator. For example, if the operator scans too quickly, the images in a scan sequence may be spaced too far apart to show a potential cancerous region. Similarly, if the operator spaces two scan sequences too far apart, then there may be areas between scan rows that have not been scanned for review.
  • some embodiments described provide methods, devices, and systems for recording images to ensure that recorded images during a manual scanning session have adequate coverage.
  • a "scan track,” in some embodiments, refers to any set of discrete images recorded by a medical imaging method, device, or system.
  • the set of discrete images can be obtained by any method or device.
  • the set of discrete images are obtained when an operator (1) places the probe on the patient, (2) begins recording images, (3) translates the probe across the surface of the skin, (4) stops recording the images.
  • a scan track is a set of sequential discrete images with unique relative spacing between individual discrete images.
  • the set of discrete images can encompass a volume which is as wide as the imaging probe design allows, as deep into the tissue as the imaging probe allows, and as long as may be accomplished by the act of recording the images while translating the probe across the skin.
  • mammography and the robotic devices depend on separating the imaging process into two steps: (1) recording the image and (2) reviewing the image. With the hand-held devices the images can be presented in real time, so the reviewer can dynamically review structures. When performing the procedure in real time, the skilled operator may believe that he or she is skilled in appropriately translating the probe to cover the breast entirely and to translate the probe with appropriate speed, and may believe that he or she does not need real-time feedback to achieve these goals.
  • the reviewer does not have the ability to confirm the location of the image nor does he or she have the ability to confirm the spacing between adjacent images, if appropriate.
  • the reviewer does not have the ability to determine the resolution in the "z" plane.
  • X and Y axes of a Cartesian coordinate system are used to define a two-dimensional array of ultrasound scanning derived images containing a multiplicity of pixels, where the term pixel refers to the basic unit of a video screen image and can be defined by its X and Y coordinate value in any predetermined reference frame defining the location of zero for both the X and Y coordinates.
  • These two-dimensional ultrasound images are generated by an ultrasound probe comprising a linear scanning array.
  • a modern high-end scanning array consists of 256 transmitting and receiving transducers packaged in an ultrasound probe, said linear array of transducers having a width of 38mm to 60mm.
  • Each individual pixel within the ultrasound-derived planar image is defined by a unique X and Y coordinate value.
  • the two-dimensional resolution, or two-dimensional density, of the pixels within each ultrasound scan-derived two-dimensional image is constant and is a function of the ultrasound system hardware and remains the same for each adjacent image in the scan process. This resolution allows routine identification of tissue abnormalities (e.g., cancers) as small as 1 mm to 5 mm.
  • the primary challenges in the three-dimensional reconstruction are the spacing between adjacent pixels in the third axis of the XYZ Cartesian coordinate system, viz., the Z-axis and the relative location of the families of sets of discrete images obtained during the scanning process.
  • the spacing along the Z-axis is dependent, in part, on the rate of change of the position and angle of the ultrasound probe between the creation of any two sequential and adjacent two-dimensional images.
  • the change in the spacing between two sequential two-dimensional images depends on five factors:
  • One factor is the rate at which the ultrasound system hardware and software are capable of processing the reflected ultrasound signals and constructing the two-dimensional images (i.e., number of completed two-dimensional ultrasound scans per second).
  • the second factor is the rate at which the displayed images can be recorded, for example by a digital frame-grabber card.
  • By way of example, if the ultrasound system displays 10 discrete images per second and a frame-grabber card can record 20 frames per second, then the recorded set of images will have 20 images but will, in reality, have only 10 discrete images, with each image having a replicate.
  • If the ultrasound system displays 40 frames per second and the frame grabber records 20 frames per second, the recorded set of images will have 20 discrete images, but will not have recorded an additional 20 discrete images.
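  • The interaction of these first two factors can be illustrated with a small sketch (my own, with assumed names): the number of unique discrete images captured per second is limited by the slower of the display rate and the recording rate.

    def unique_recorded_fps(display_fps, grabber_fps):
        """Unique discrete images actually captured per second."""
        return min(display_fps, grabber_fps)

    print(unique_recorded_fps(10, 20))  # 10: 20 frames are stored, but half are replicates
    print(unique_recorded_fps(40, 20))  # 20: the other 20 displayed images are never recorded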
  • a third factor is the rate at which the ultrasound probe is translated along the scanned path.
  • the faster the operator moves the ultrasound probe, the greater the spacing will be in the Z direction. Likewise, the slower the combined rate at which the ultrasound system hardware and software can process the reflected ultrasound signals and construct the two-dimensional images and at which the image recording hardware can store the processed images (i.e., the lower the rate of completed two-dimensional ultrasound scans recorded and stored per second), the greater the spacing will be in the Z direction.
  • Conversely, the more slowly the operator moves the ultrasound probe, the smaller the spacing will be in the Z direction.
  • the fourth factor is the relative orientation of the hand-held probe during the scanning process. Because the probe is not held rigid by a mechanical mechanism, the translational distance between adjacent frames is not a constant. For example, if the discrete images within an image set were perfectly parallel, then the Z spacing between corresponding pixels would be the same for each pair of corresponding pixels in two discrete images. If the probe were rotated along the lateral axis (pivoted, or pitch), then the Z spacing of the corresponding pixels at the top of a pair of images would differ from the Z spacing of the corresponding pixels at the bottom of the pair. If the probe were rotated along its longitudinal axis (roll), then the Z spacing of corresponding pixels on the left side of a pair of images would differ from the Z spacing of the corresponding pixels on the right side of the pair.
  • the fifth factor is associated with the rotation of the probe along its vertical axis (yaw).
  • the distance between two corresponding pixels in a pair of images differs if the two images are recorded when rotation on the vertical axis differs.
  • each scan track has its own set of discrete images, and since each discrete image has its own mapping location coordinates, it is possible to determine whether two separate scan tracks represent the exact same region of tissue, adjacent regions of tissue with some overlap, adjacent regions of tissue with no overlap, adjacent regions of tissue with some gap in between, or regions of tissue with no anatomic relation to each other.
  • the reconstruction of a plurality of scan tracks can describe a covered region if the scan tracks between any two adjacent scan tracks can be reconstructed to form a contiguous region of images with no gaps in coverage and if the extent of the reconstruction encompasses the entire tissue structure to be imaged.
  • Robotic approaches to ultrasound imaging require the use of expensive mechanical equipment that is also subject to regular service and calibration to assure that the machine driven ultrasound probe is in the assumed position and computed orientation as required to assure that a complete and systematic diagnostic ultrasonic scan of the target living tissue has been actually achieved.
  • An objective of the present invention is to enable and assure the completeness of an ultrasound diagnostic scan of the target tissue (e.g., human breast), in terms of area covered and resolution of the relative spacing of the images within that area covered, without the need for robotic mechanical systems for the support, translation and computed orientation control of an ultrasound probe.
  • Some embodiments enable the use of hand-held diagnostic ultrasound probe scanning methods while assuring that a complete scan of the targeted tissue is achieved.
  • the review time could be as short as 200 seconds (less than 4 minutes).
  • the concept of the cine presentation goes back more than a century, to Edison, but Freeland describes the use of the cine viewing technique for the review of ultrasound images in 1992 (U.S. Patent No. 5,152,290).
  • Mapping the images and calculating the resolution and coverage of the resultant sets of images allows the ability to divide the imaging and reviewing tasks and, thus, allows the time savings associated with performing the procedure in a manner where it is recorded by one individual and reviewed by another and still provide some level of confidence as to the aforementioned resolution and coverage.
  • the incremental improvement in patient care may not warrant the additional 1.5 minutes of physician time to review the track. If one considers that there may be as many as 16 such scan tracks for each breast, then the time differential could be 320 seconds (just over five minutes) vs. 3,200 seconds (just under one hour).
  • Some embodiments described provide for systems and methods for providing a speeded review time by varying the dwell time between successive discrete images and calculating that dwell time as a function of the distance between adjacent images.
  • the resultant presentation would be provided in distance covered per second (dcps) not frames per second.
  • the review time for those 19 images at 10 fps (that is, a dwell time of 0.1 sec/frame) would be 1.8 sec. If individual dwell times were assigned unique values with criteria based on the amount of tissue to be imaged per second and the spacing between discrete images, then the review time could be shortened considerably.
  • the review time would be 1.00 seconds.
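  • A hedged sketch of this idea (the function name, the 10 mm/sec display rate, and the image positions are illustrative, not the patent's implementation): assign each image a dwell time proportional to the spacing that precedes it, so the cine review advances at a constant distance covered per second rather than a constant frame rate.

    def dwell_times(image_positions_mm, dcps_mm_per_sec=10.0):
        """Dwell time (sec) for each image transition, based on inter-image spacing."""
        return [(curr - prev) / dcps_mm_per_sec
                for prev, curr in zip(image_positions_mm, image_positions_mm[1:])]

    # An illustrative, unevenly spaced 10 mm scan track.
    positions = [0.0, 0.7, 0.9, 1.9, 2.5, 2.8, 3.7, 4.0, 4.7, 5.1,
                 5.6, 6.6, 7.0, 7.6, 8.2, 8.5, 9.5, 10.0]
    print(round(sum(dwell_times(positions)), 2))  # 1.0 second to review 10 mm at 10 mm/sec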
  • Some embodiments also provide for a means of speeding the review time by displaying only those images which provide incremental information that the operator deems useful.
  • the extra images are redundant.
  • the system and method may choose to not display the redundant images.
  • If the operator chooses an optimal image spacing of 1.0 mm, then the system would only display those images recorded at 0.0 mm, 0.9 mm, 1.9 mm, 2.8 mm, 3.7 mm, 4.7 mm, 5.6 mm, 6.6 mm, 7.6 mm, 8.5 mm, 9.5 mm and 10.0 mm.
  • the images recorded at 0.7 mm, 2.5 mm, 3.7 mm, 4.0 mm, 5.1 mm, 7.0 mm, and 8.2 mm would be culled. If the retained images were displayed at 10 fps (a dwell time of 0.1 seconds/frame) then the image review time would be 1.1 seconds, not the 1.8 seconds that would be required if all of the images were reviewed.
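  • The culling rule can be sketched as follows (an illustration under assumed names and a small tolerance, not the patent's code): walk the track in order and keep an image only once the probe has moved at least the operator-chosen spacing, less a tolerance, past the last kept image, always keeping the first and last images.

    def cull_by_spacing(positions_mm, interval_mm=1.0, tolerance_mm=0.1):
        kept = [positions_mm[0]]                       # always keep the first image
        for pos in positions_mm[1:]:
            if round(pos - kept[-1], 3) >= interval_mm - tolerance_mm:
                kept.append(pos)
        if kept[-1] != positions_mm[-1]:
            kept.append(positions_mm[-1])              # always keep the last image
        return kept

    track = [0.0, 0.7, 0.9, 1.9, 2.5, 2.8, 3.7, 4.0, 4.7, 5.1,
             5.6, 6.6, 7.0, 7.6, 8.2, 8.5, 9.5, 10.0]
    print(cull_by_spacing(track))
    # -> [0.0, 0.9, 1.9, 2.8, 3.7, 4.7, 5.6, 6.6, 7.6, 8.5, 9.5, 10.0]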
  • Another system and method for reducing the review time required by the radiologist would be to cull images whose information is contained completely within another set of discrete images.
  • If the operator is reviewing a scan of the breast which contains 12 sets of discrete images, each set originating at the nipple and extending radially to the base of the breast at each of the 12 clock positions, there will be images within some of those sets of discrete scans that image tissue structures that overlap or are partially or completely imaged by other images or groups of images.
  • If, for example, the 5 cm probe extends from 10 o'clock to 2 o'clock when the probe performing the 12 o'clock scan is only 1 cm from the nipple, and the probe extends from 1 o'clock to 5 o'clock when the probe performing the 3 o'clock scan is just 5 mm from the nipple, then there is a substantial and possibly complete overlap between these two scans, and the images recorded by the 1 o'clock scan at 5 mm from the nipple and the 2 o'clock scan at 5 mm from the nipple contain redundant information. If those images were removed from the review set then the result would be a time savings.
  • This system and method teaches a means of distinguishing which images contain information that is completely or partially contained in one or more images from other sets of discrete images in the scan and removing those images from the review set. Overlap of information in images could be anywhere from about 10% to about 100%. In some embodiments, images with information having 80%-100% overlap with other images are removed from the review image set.
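  • A hedged sketch of this cross-track culling (the helper names and the interval-based overlap measure are hypothetical simplifications): drop an image from the review set when the fraction of its tissue footprint already covered by images kept so far exceeds a threshold, e.g. the 80% figure mentioned above.

    def cull_redundant(images, overlap_fraction, threshold=0.8):
        """Keep an image only if its overlap with already-kept images is below threshold."""
        kept = []
        for img in images:
            if overlap_fraction(img, kept) < threshold:
                kept.append(img)
        return kept

    # Toy usage: each image footprint is a 1-D interval (start_mm, end_mm) along the tissue.
    def interval_overlap(img, kept):
        start, end = img
        covered = sum(max(0.0, min(end, ke) - max(start, ks)) for ks, ke in kept)
        return min(1.0, covered / (end - start))

    footprints = [(0.0, 5.0), (4.5, 9.5), (0.5, 5.2)]   # the third lies almost entirely within the first two
    print(cull_redundant(footprints, interval_overlap))  # -> [(0.0, 5.0), (4.5, 9.5)]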
  • the scan completeness auditing systems can include a position tracking system configured to track and record a position of a manual imaging probe.
  • the position tracking system can include a plurality of cameras adapted to couple to the manual imaging probe. The plurality of cameras can be configured to provide position data for the manual imaging probe.
  • the scan completeness auditing system can also include a receiver comprising a controller configured to electronically receive position data for the manual imaging probe from the position tracking system and to electronically receive and record a first scan sequence comprising a first set of scanned images representing cross-sections of the tissue from the manual imaging probe.
  • the controller can be further configured to compute an image-to-image spacing between successive images within the first scan sequence and to determine whether the computed image-to-image spacing exceeds a maximum limit.
  • the controller can also be adapted to provide an alert when the computed image-to-image spacing exceeds the maximum limit.
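  • As a minimal sketch of this audit (the data layout and the limit are assumed, not the patent's controller code): given the tracked position recorded with each image in a scan sequence, compute the spacing between successive images and flag any gap that exceeds the maximum limit so that a rescan alert can be raised.

    import math

    def audit_image_spacing(image_positions_mm, max_spacing_mm=1.0):
        """Return (index, spacing) pairs where the image-to-image limit is exceeded."""
        violations = []
        for i in range(1, len(image_positions_mm)):
            spacing = math.dist(image_positions_mm[i - 1], image_positions_mm[i])
            if spacing > max_spacing_mm:
                violations.append((i, spacing))
        return violations

    track = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.6), (0.0, 0.0, 1.1), (0.0, 0.0, 2.9)]
    for idx, gap in audit_image_spacing(track):
        print(f"ALERT: {gap:.1f} mm gap before image {idx} exceeds the 1.0 mm limit")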
  • the manual imaging probe is an ultrasonic imaging probe and the imaging console is an ultrasound imaging console.
  • the position tracking system further includes a plurality of position sensors.
  • the plurality of position sensors are configured to reflect electromagnetic radiation and the plurality of cameras are configured to detect said reflected electromagnetic radiation to determine a relative position between the position sensors and the cameras.
  • each of the plurality of sensors are optically unique.
  • the position tracking system is configured to track the position of the manual imaging probe to an accuracy within 1 millimeter at a distance of up to 3 meters between the plurality of cameras and the plurality of sensors.
  • the cameras are configured to determine a position of the plurality of cameras relative to a position of the plurality of position sensors with the position of the manual imaging probe determined based on a spatial relationship between the plurality of cameras and the manual imaging probe.
  • the plurality of position sensors are configured to be stationary when screening the volume of tissue.
  • the plurality of cameras are optical cameras.
  • the plurality of position sensors are configured to reflect wavelengths of light between about 750 nm and about 390 nm.
  • the plurality of cameras are infrared cameras. In any of the embodiments described herein the plurality of position sensors are configured to reflect wavelengths of light between about 100,000 nm and about 750 nm.
  • the plurality of cameras are ultraviolet cameras. In any of the embodiments described herein the plurality of position sensors are configured to reflect wavelengths of light between about 390 nm and about 10 nm.
  • the receiver is configured to receive position data at time intervals of about 0.05 seconds. In any of the embodiments described herein the receiver is configured to receive position data at time intervals of about 0.01 seconds.
  • the controller applies an image position tracking algorithm to determine a relative resolution between the scanned images within the scan sequence.
  • the controller is configured to measure a scan-to-scan spacing between the first scan sequence and a second scan sequence, the second scan sequence comprising a second set of scanned images representing cross-sections of the tissue.
  • the controller is configured to measure the scan-to-scan spacing between the first and second scan sequence by calculating a distance between a first boundary of the first scan sequence and a second boundary of the second scan sequence.
  • the controller is configured to measure the scan-to-scan spacing between the first and second scan sequences by computing a pixel density for a unit volume within the screened volume of tissue and comparing the computed pixel density to a minimum pixel density value.
  • the controller can be configured to provide an alert to rescan the tissue if the computed pixel density is less than the minimum pixel density value.
  • the controller is configured to modify the first or second scan sequences for display by removing redundancy from at least one of the scan sequences.
  • the controller is configured to compute the image-to-image spacing between scanned images within a scan sequence by measuring a distance between a first pixel in a first scanned image and a second pixel in a second scanned image with the first and second scanned images being sequential images. In any of the embodiments described herein the controller is configured to determine whether the measured distance between the first and second pixels exceeds a maximum distance.
  • the controller is configured to compute the image-to-image spacing within the first scan sequence by measuring a maximum chord distance between a plurality of successive planar images in the first scan sequence.
  • the controller is configured to compute the image-to-image spacing within the first scan sequence by calculating a pixel density for a unit volume within the screened volume of tissue, and the controller adapted to compare the calculated pixel density with a minimum pixel density value.
  • the minimum pixel density value is between about 9,000 pixels/cm3 to about 180,000,000 pixels/cm3.
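  • A hedged sketch of the pixel density check (the formula is assumed from the in-plane pixel spacing and the image-to-image spacing discussed above; the threshold is an illustrative value within the stated range):

    def pixel_density_per_cm3(dx_mm, dy_mm, dz_mm):
        """Pixels per cm^3 for dx-by-dy in-plane spacing and dz between discrete images."""
        return 1000.0 / (dx_mm * dy_mm * dz_mm)   # 1 cm^3 = 1000 mm^3

    # Example: 0.087 mm x 0.125 mm in-plane spacing with 2 mm between discrete images.
    density = pixel_density_per_cm3(0.087, 0.125, 2.0)
    print(int(density))                           # ~46,000 pixels/cm^3
    MIN_DENSITY = 45_000                          # illustrative minimum pixel density value
    if density < MIN_DENSITY:
        print("ALERT: rescan - computed pixel density is below the minimum value")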
  • the controller is configured to only display images of a recorded scan sequence that satisfy a predetermined imaging spacing interval.
  • the controller is configured to change an image display rate of a recorded scan sequence to provide a substantially uniform spatial- temporal display of the recorded scan sequence.
  • the controller is configured to assign a dwell time to each image in a recorded scan sequence, wherein the dwell time for each image is based on a relative spacing for that image in the recorded scan sequence.
  • the receiver includes a cable configured to engage with a video output of the ultrasound imaging console.
  • methods for screening a tissue can include scanning the tissue with a manual ultrasonic imaging probe of an ultrasound imaging console along a first scanning path on the tissue, generating a first scan sequence comprising a first set of discrete digital images representing cross-sections of the scanned tissue along the first scanning path, electronically transmitting the first scan sequence to a controller, collecting position data for the manual ultrasonic imaging probe from a plurality of cameras engaged with the manual ultrasound imaging probe while scanning the tissue, electronically communicating the position data for the manual ultrasonic imaging probe to the controller, and assigning a display dwell time to each image based on a relative spacing for that image in the first scan sequence.
  • the methods further include determining the position data for the manual ultrasonic imaging probe based on a spatial relationship between the plurality of cameras and a plurality of sensors.
  • the plurality of sensors are stationary during the scanning step.
  • the plurality of cameras are optical cameras and the method further includes determining the position data for the manual ultrasonic imaging probe by reflecting wavelengths of light between about 750 nm and about 390 nm off of the plurality of sensors.
  • the plurality of cameras are infrared cameras and the methods further include determining the position data for the manual ultrasonic imaging probe by reflecting wavelengths between about 100,000 nm and about 750 nm off of the plurality of sensors.
  • the plurality of cameras are ultraviolet cameras and the method further includes determining the position data for the manual ultrasonic imaging probe by reflecting wavelengths between about 390 nm and about 10 nm off of the plurality of sensors.
  • the methods further include tracking the position data for the manual ultrasonic imaging probe with an accuracy within 1 millimeter at a distance of up to 3 meters between the plurality of cameras and the plurality of sensors.
  • the position data for the manual ultrasonic imaging probe is communicated to the controller at time intervals of about 0.05 seconds. In any of the embodiments described herein the position data for the manual ultrasonic imaging probe is communicated to the controller at time intervals of about 0.01 seconds.
  • the methods further include computing an image-to-image spacing between successive images in the first scan sequence based on the position data communicated to the controller, determining whether the image-to-image spacing exceeds a maximum limit, and generating an alert when the spacing exceeds a maximum limit.
  • the computing an image-to-image spacing step includes calculating a pixel density for a unit volume of the screened tissue; and the determining step comprises comparing the calculated pixel density to a minimum pixel density value. In any of the embodiments described herein the computing the image-to-image spacing step includes calculating a maximum chord distance between images in the first scan sequence.
  • the methods can further include generating a second scan sequence, the second scan sequence comprising a second set of discrete digital images along a second scanning path on the tissue, computing a scan-to-scan spacing between the first and second scan sequences, determining whether the computed scan-to-scan spacing exceeds a scan-to-scan spacing limit, and generating an alert when the scan-to-scan spacing exceeds the scan-to-scan spacing limit.
  • the methods can further include removing a redundant image from the first scan sequence or the second scan sequence.
  • the image-to-image spacing and the scan-to-scan spacing are calculated based on the position data communicated to the controller and orientation data derived from the communicated position data.
  • computing the image-to-image spacing step includes measuring a distance between a first pixel in a first image and a second pixel in a second image of the first scan sequence with the first image and the second image being sequential images.
  • the methods further include deriving orientation data for the manual ultrasonic imaging probe based on the position data
  • computing the image-to-image spacing within the first scan sequence includes calculating a maximum pixel distance between a first image and a second image of the first scan sequence with the first image having a first pixel matrix and the second image having a second pixel matrix and the first and second pixel matrices each having the same number of rows and columns, and determining the maximum pixel distance by measuring a pixel-to-pixel distance between at least two corresponding pixels with one of the at least two corresponding pixels in the first pixel matrix and the other of the at least two corresponding pixels in the second pixel matrix and the corresponding pixels having the same row and column locations in respective matrices.
  • determining the maximum pixel distance comprises computing the pixel-to-pixel distance between a corner pixel on the first pixel matrix and a corresponding corner pixel on the second pixel matrix.
  • the methods further include computing a plurality of corner-pixel-to-corner-pixel distances between corresponding corner pixels in the first and second images and the image-to-image spacing between the first and second images is a maximum absolute value computed for the plurality of corner-pixel-to-corner-pixel distances.
  • the first scan sequence includes a first planar image adjacent to a second planar image with the first and second planar images each having four corners and a matrix of pixels and the controller computing the image-to-image spacing by determining a plurality of pixel distance values between corresponding pixels for the adjacent images at each of the four corners and the controller selecting the greatest pixel distance value from the plurality of pixel distance values as the image-to-image spacing.
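  • An illustrative sketch of the corner-pixel computation (the coordinate layout is assumed): once the pixel positions of two adjacent planar images have been mapped into a common X-Y-Z frame, take the distance between each pair of corresponding corner pixels and use the largest of the four values as the image-to-image spacing.

    import math

    def corner_spacing(corners_a_mm, corners_b_mm):
        """Maximum distance between corresponding corners of two mapped planar images."""
        return max(math.dist(a, b) for a, b in zip(corners_a_mm, corners_b_mm))

    # A slightly pitched probe: the top edge of the image moved 0.5 mm, the bottom edge 1.5 mm.
    image1 = [(0, 0, 0.0), (52, 0, 0.0), (0, 50, 0.0), (52, 50, 0.0)]
    image2 = [(0, 0, 0.5), (52, 0, 0.5), (0, 50, 1.5), (52, 50, 1.5)]
    print(corner_spacing(image1, image2))   # 1.5 mm, to be compared with the maximum limit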
  • computing the scan-to-scan spacing comprises calculating a pixel density for a unit volume of the screened tissue.
  • the methods further include determining whether the calculated pixel density for the unit volume exceeds a minimum pixel density value.
  • each of the images in the first and second sets of discrete digital images comprises a matrix of pixels, each matrix having the same fixed number of rows and columns and each pixel in each matrix having a row and column location designated by rx, cx, x being the same or different for r and c, wherein computing the scan-to-scan spacing between the first and second scan sequences comprises calculating a plurality of pixel-to-pixel distances between a first pixel P(rx, cx) in a first image of the first scan sequence and a plurality of pixels in the second scan sequence, wherein the plurality of pixels in the second scan sequence have the same row location rx as the first pixel P.
  • the methods further include determining whether a minimum pixel-to-pixel distance value from the calculated plurality of pixel-to-pixel distances exceeds the scan-to-scan spacing limit.
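  • A hedged sketch of this scan-to-scan check (the data layout and the limit are assumed): take the mapped position of a pixel P(r, c) in an image of the first scan sequence, compute its distance to every pixel with the same row r in the images of the second scan sequence, and compare the minimum of those distances with the scan-to-scan spacing limit.

    import math

    def scan_to_scan_gap(pixel_p_mm, same_row_pixels_mm):
        """Minimum distance (mm) from pixel P to the same-row pixels of the other scan."""
        return min(math.dist(pixel_p_mm, q) for q in same_row_pixels_mm)

    p = (26.0, 25.0, 0.0)                                      # a row-r pixel of scan 1
    row_in_scan2 = [(26.0, 25.0, z) for z in (4.0, 5.0, 6.5)]  # row-r pixels of scan 2
    gap = scan_to_scan_gap(p, row_in_scan2)
    SCAN_TO_SCAN_LIMIT_MM = 5.0                                # illustrative limit
    print("ALERT: gap between scan tracks" if gap > SCAN_TO_SCAN_LIMIT_MM else f"OK ({gap} mm)")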
  • the methods further include prior to scanning, attaching the plurality of cameras to the manual ultrasonic probe.
  • In any of the embodiments described herein, the methods further include, prior to scanning, deploying the plurality of sensors at known locations in a room such that the sensors are viewable by the plurality of cameras when scanning tissue.
  • the first scan sequence is transmitted from a video output of an ultrasound imaging console in communication with the ultrasonic imaging probe to the controller.
  • the methods further include, prior to scanning, attaching a cable from the video output of the ultrasound imaging console to the controller, wherein the first scan sequence is electronically transmitted by the cable.
  • Some embodiments described provide for methods, apparatus and systems for determining the resolution or spacing of the image-to-image spacing of discrete images within sets of discrete images, or scan sequences, and determining the coverage of multiple sets of discrete images, or scan sequences, in a hand-held imaging scan of targeted human tissue such as the human breast.
  • the range of the image-to-image resolution within each scan sequence is about 0.01mm to 10.0mm.
  • the image-to-image resolution within each scan sequence is about 0.1mm to 0.4mm.
  • the image-to-image resolution within each scan sequence is about 0.5mm to 2.0mm.
  • the range of the image-to-image resolution within each scan sequence is a pixel density between 9,000 and 180,000,000 pixels/cm3. In other embodiments, the pixel density is between 22,500 and 18,000,000 pixels/cm3. In further embodiments, the pixel density is between 45,000 and 3,550,000 pixels/cm3.
  • the range of coverage, in terms of the overlap of the border of adjacent scan tracks is between about -50.0mm to +50.0mm (where a negative overlap value indicates a positive gap value, or spacing between the borders of adjacent scan tracks).
  • the overlap of the border of adjacent scan tracks is between about -25.0mm to +25.0mm (where a negative overlap value indicates a positive gap value, or spacing between the borders of adjacent scan tracks).
  • the overlap of the border of adjacent scan tracks is about -10.0mm to +10.0mm (where a negative overlap value indicates a positive gap value, or spacing between the borders of adjacent scan tracks).
  • Examples of hand-held imaging procedures include, but are not restricted to, ultrasound examinations. Objective determination that user-defined levels of coverage and resolution are achieved is critical, particularly when one clinical practitioner performs the recording function during the hand-held scan and another practitioner, who was not present at the recording procedure, reviews those pre-recorded images. Objective determination of coverage and image-to-image resolution or spacing is critical to assure that the subsequent review of the recorded images by a trained clinical specialist following the scanning procedure does not result in a false negative assessment due to the fact that some regions of the targeted tissue volume were inadvertently omitted.
  • Such omissions can be caused by inadvertent excessive spacing between successive hand-held scans that are intended to cover the tissue structure, by excessive image-to-image spacing within a single hand-held scan that can result from variations in the rate of translation of the hand-held imaging probe, and/or by an excessive rate of change of the orientation of the hand-held imaging probe during the scanning of a targeted tissue volume such as the human breast.
  • the tracking of the position and computed orientation of a hand-held imaging probe can be accomplished by affixing cameras on the body of the ultrasound probe at predetermined locations relative to the design geometry of the hand-held imaging probe imaging elements. Three or more cameras are affixed to the hand-held imaging probe to enable the computation of the position (viz., x, y, z coordinates) of the hand-held imaging probe imaging elements and the computation of the orientation of the longitudinal axis of the hand-held imaging probe body. Said orientation coincides with the axis of image, for example the planar ultrasound beam emitted into the tissue being interrogated.
  • the accurate and dynamic computation of the position of the hand-held imaging probe's imaging elements enables the determination of the actual spatial position and computed orientation of manually scanned, sequential pathways completed along the tissue surface.
  • the computed position and computed orientation of each manually scanned, sequential pathway, combined with information regarding the dimensional size of each recorded image, along the tissue surface enables the further computation of the physical spacing or distance between scan sequences.
  • This computation can be rapidly completed during the course of the manual scanning process or procedure and a visual and optional audible cue as well as an image is provided showing the paths of completed scan sequences to identify where re-scanning is required.
  • This intra-procedure computation of the distances between adjacent scan sequences determines whether complete coverage of the targeted tissue volume is achieved with the hand-held imaging probe. Accordingly, this intra- procedure computation of the distances between adjacent scan sequences assures that the completed scan sequences cover the targeted tissue structure by assuring that the individual scan sequences overlap, or are separated by an acceptable distance.
  • the accurate and dynamic computation of the position of the hand-held imaging probe's imaging elements enables the determination of the actual spatial position and computed orientation of each image within the sequential and manually scanned pathways completed along the tissue surface of the targeted defined volume of tissue.
  • the physical spacing between discrete images in scanned pathways can be determined by using the computed position and computed orientation of each manually scanned, sequential pathway with information regarding the dimensional size of each recorded image.
  • This computation can be rapidly completed during the course of the manual scanning process and a visual and optional audible cue as well as an image is provided showing the paths of completed scan sequences to identify where re-scanning is required.
  • This intra-procedure computation of the distances between adjacent discrete images determines whether the image-to-image resolution of the targeted tissue region is achieved with the hand-held imaging probe by identifying distances between completed discrete scan images that are inadvertently separated by an unacceptably large distance.
  • the accurate and dynamic computation of the orientation (based on the positions of the three or more sensors) of the hand-held imaging probe's longitudinal axis enables the computation of image-to-image resolution or spacing by enabling the computation of a chord length between the planar images at the maximum depth of tissue being scanned for any two successive time steps at which images are obtained and recorded during any manual scan sequence along the tissue surface.
  • the computed rate of change of orientation of the hand-held imaging probe (derived from position sensors affixed to the hand-held imaging device) during a manual scan sequence along the tissue surface enables the further computation of the physical spacing (i.e., chord length) between planar ultrasound scans between two successive time steps during a scan sequence.
  • This intra-procedure computation of the chord distances between handheld imaging planar scans acquired and recorded for any two consecutive time steps assures that a complete hand-held imaging scan of the targeted tissue region is achieved in terms of image-to- image resolution or spacing. This is accomplished through position change computations, thereby identifying any completed scan sequence in which the chord distances, at the maximum depth of interrogation, between adjacent discrete images are unacceptably large.
  • the accurate and dynamic computation of the orientation (based on the positions of the three or more sensors) of the hand-held imaging probe's lateral axis enables the computation of image-to-image resolution by enabling the computation of a chord length between the sides of two planar images, from the surface of the tissue to the maximum depth of tissue being scanned for any two successive time steps at which images are obtained and recorded during any manual scan sequence along the tissue surface.
  • the computed rate of change of orientation of the hand-held imaging probe (derived from position sensors affixed to the hand-held imaging device) during a manual scan sequence along the tissue surface enables the further computation of the physical spacing (i.e., chord length) between planar ultrasound scans between two successive time steps during a scan sequence.
  • This intra-procedure computation of the chord distances between hand-held imaging planar scans acquired and recorded for any two consecutive time steps assures that a complete hand-held imaging scan of the targeted tissue region is achieved in terms of image-to-image resolution.
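  • As a purely illustrative sketch of the chord computation described above (assuming the frame-to-frame motion is a rotation about the transducer face, with any translation of the face added as a worst-case bound), the chord at the maximum interrogation depth can be estimated as shown below; for example, a 5 degree change of orientation at a 5 cm depth gives a chord of roughly 2 x 5 cm x sin(2.5 degrees), or about 0.44 cm.

      import math

      def chord_at_depth(delta_theta_deg, depth_cm, translation_cm=0.0):
          # Chord between two successive image planes at the maximum depth of
          # interrogation for a rotation of delta_theta_deg about the transducer
          # face; translation_cm is added as a worst-case (triangle-inequality) bound.
          delta_theta_rad = math.radians(delta_theta_deg)
          return 2.0 * depth_cm * math.sin(delta_theta_rad / 2.0) + translation_cm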
  • An alternative method for assuring the completeness of any individual scan sequence involves computation of the pixel density in each unit volume within the swept volume of the scan sequence.
  • the swept volume of the scan sequence would be the volume defined by (a) the width of the ultrasound beam, which is defined by the length of the ultrasound transducer array (e.g., 5 cm), (b) the depth of recorded penetration of the ultrasound beam into the targeted living tissue (e.g., 5 cm) and (c) the total length traversed in the individual scan sequence (e.g., 15 cm).
  • This total volume (375 cubic cm in the present example) is then subdivided into unit volumes (e.g., cubical volume of dimensions 1.0 cm x 1.0 cm x 1.0 cm).
  • the swept volume would be subdivided into 375 unit volumes.
  • the number of ultrasound pixels within that unit volume would be the total number of pixels in the portion of each discrete ultrasound image which is defined as being within the three-dimensional boundaries of the unit volume.
  • the number of ultrasound scan pixels contained in each unit volume is computed and this number is compared to a predetermined Minimum Pixel Density number. If the computed pixel density within any unit volume (i.e., any of the 375 unit volumes in this example) within the swept volume is less than the Minimum Pixel Density, then the operator is alerted at the end of the scan sequence that the scan sequence just completed is incomplete and that it must be repeated, including a display of instructions to improve the scanning method (e.g., reduce scanning speed and/or rate of change of orientation of the hand-held ultrasound probe during the repeated scan sequence).
  • another embodiment also provides a receiving device to detect and digitally record and store a digitized set of numbers which indicate the position and computed orientation of the hand-held imaging probe as well as the time associated with said position and computed orientation at each time step (i.e., time-stamped position and computed orientation data).
  • a digital data storage device provides for the recording of hand-held imaging image data at multiple times per second, images which are also time stamped for purposes of subsequent review by an individual or software capable of expert analysis of handheld imaging images to detect the presence of suspicious lesions within the targeted tissue volume.
  • the complete set of consecutive hand-held imaging images can be reviewed by play back of the recorded images at regular time steps (e.g., 6 to 12 frames per second).
  • an imaging system for acquiring a sequence of two-dimensional images of a target volume represented by an array of pixels I(x,y,z) comprising [a] a hand-held imaging probe to scan said target volume along a path, which may be predetermined or may be determined dynamically as the operator performs the procedure, and generate a sequence of digitized two-dimensional images thereof representing cross-sections of said target volume on a plurality of planes spaced along said scanning path; said scanning path may be any geometric path determined by the scanning personnel and is not required to be linear; [b] a data storage medium for storage of digital data associated with each pixel of each two-dimensional image in a sequence of digitized two-dimensional images together with other related image data defining the location of said two-dimensional images in said memory and defining interpretation information relating to the relative position of pixels within said two-dimensional images and to the relative position of pixels in adjacent two-dimensional images within said target volume; and [c] a software algorithm to determine if the relative position of pixels in
  • an imaging system for acquiring two or more sequences of two-dimensional images of a target volume represented by an array of pixels I(x,y,z) comprising [a] a hand-held imaging probe to scan said target volume along two or more scanning paths, which may be predetermined or may be determined dynamically as the operator performs the procedure, and generate two or more sequences of digitized two-dimensional images thereof representing cross-sections of said target volume on a plurality of planes spaced along said scanning paths; said scanning paths may be any geometric path determined by the scanning personnel and are not required to be linear; [b] a data storage medium for storage of digital data associated with said sequences of digitized two-dimensional images together with other related image data defining the location of said two-dimensional images in said data storage medium and spatial and temporal information relating to the relative position of pixels at the edge of said two-dimensional images and to the relative position of pixels in one or more adjacent two-dimensional images at the edge of the adjacent scan sequence; and [c] software
  • an imaging system for acquiring two or more sequences of two-dimensional images of a target volume represented by an array of pixels I(x,y,z): [a] a hand-held imaging probe to scan said target volume along two or more scanning paths, which may be predetermined or may be determined dynamically as the operator performs the procedure, and generate two or more sequences of digitized two-dimensional images thereof representing cross-sections of said target volume on a plurality of planes spaced along said scanning paths; said scanning paths may be any geometric path determined by the scanning personnel and are not required to be linear; [b] a data storage medium for storage of digital data associated with each pixel of said sequences of digitized two-dimensional images together with other related image data defining the location of said two-dimensional images in said data storage medium and constructing a three-dimensional array of said pixel locations; and [c] a software algorithm to determine if the pixel density within a predetermined volume is greater than a predetermined limit.
  • Another embodiment of the present invention incorporates methods, apparatus, and system for optimizing image review time on the part of the physician.
  • the recorded images are reviewed as a series of still images, those images being presented for a fixed period of time (e.g. 0.1 sec each).
  • while optimizing (that is, reducing) review time is an important aspect of any image review procedure, care must be taken that the review is thorough, but not excessive.
  • because the images will be recorded with a hand-held probe, it is possible that the relative spacing of adjacent images will vary. Some images may be spaced so closely that they are, in effect, redundant, while others may be spaced so far apart that it is possible to miss important structures.
  • the prior part of this application describes methods for dealing with the latter scenario. Some embodiments described will optimize physician review time by one of two methods:
  • the system will choose an optimal image spacing parameter and a maximum allowable image spacing parameter.
  • the spacing between adjacent images will be calculated and the images for which the relative spacing is closest to the optimal spacing parameter shall be saved, and intermediate images shall be culled. For example, if the operator varies his or her scan so that images are recorded at 0.0mm, 1.0mm, 1.5mm, 2.0mm, 2.8mm, 3.0mm, 3.2mm, 3.5mm, 3.7mm, 4.0mm, 4.3mm, 4.7mm, 5.0mm, 5.5mm, and 6.0mm, and the review time is 0.1 sec per image, the time to review these images is 1.5 seconds.
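  • The following sketch is illustrative only (the greedy selection rule and the parameter names are assumptions, not the claimed method): it keeps the recorded images whose positions are closest to multiples of the optimal spacing parameter and culls the intermediate images. Applied to the recorded positions in the example above with a 1.0 mm optimal spacing, it keeps 7 of the 15 images, so review time at 0.1 sec per image drops from 1.5 seconds to about 0.7 seconds.

      def cull_images(positions_mm, optimal_mm, max_allowable_mm):
          # Greedy sketch: keep the first image, then repeatedly keep the recorded
          # image closest to (last kept position + optimal_mm); images skipped over
          # are culled.  gaps_ok reports whether every remaining spacing stays within
          # the maximum allowable image spacing parameter.
          positions = sorted(positions_mm)
          kept = [positions[0]]
          remaining = positions[1:]
          while remaining:
              target = kept[-1] + optimal_mm
              best = min(remaining, key=lambda p: abs(p - target))
              remaining = [p for p in remaining if p > best]
              kept.append(best)
          gaps_ok = all(b - a <= max_allowable_mm for a, b in zip(kept, kept[1:]))
          return kept, gaps_ok

      recorded = [0.0, 1.0, 1.5, 2.0, 2.8, 3.0, 3.2, 3.5, 3.7,
                  4.0, 4.3, 4.7, 5.0, 5.5, 6.0]
      kept, gaps_ok = cull_images(recorded, optimal_mm=1.0, max_allowable_mm=2.0)
      print(len(kept), kept, gaps_ok)   # 7 images kept -> about 0.7 sec of review time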
  • the tissue structure to be examined is the human torso. In other embodiments, the tissue structure to be examined is the human breast. In further embodiments, the tissue structure to be examined is the female human breast.
  • the plurality of cameras may be mounted on the imaging probe and the reflective position sensors are mounted at physical locations in the surrounding environment.
  • the position sensors may reflect electromagnetic radiation in the optical spectrum, or wavelengths between about 750nm and about 390nm.
  • the position sensor can be a register which reflects electromagnetic radiation in the infrared spectrum, or wavelengths between about 100,000nm and about 750nm, which may be detected by an infrared camera, and the locating system can include three or more infrared cameras which can record the relative position between the register and the camera.
  • the position sensor can be a register which reflects electromagnetic radiation in the ultraviolet spectrum, or wavelengths between about 390nm and about 10nm, which may be detected by an ultraviolet camera, and the locating system can include three or more ultraviolet cameras which can record the relative position between the register and the camera.
  • the system comprises a storage device to store the discrete image data.
  • the system comprises a storage device to store the position sensor data corresponding to each discrete image.
  • Further embodiments include a viewer to display the discrete images, wherein the viewer can provide a sequential display of said discrete images.
  • the relative image resolution algorithm measures the three dimensional spacing between a pixel in one discrete image and a pixel at the same location of a second image recorded in a sequentially acquired image set.
  • an audible signal is issued in the event that the image resolution is not within a user-defined limit.
  • a visual signal is issued in the event that the image resolution is not within user-defined limits.
  • the visual signal identifies the discrete image sequence wherein the image resolution is not within user-defined limits.
  • the image resolution algorithm creates a set of discrete image subsets by superimposing a three-dimensional volumetric boundary on adjacent images, determining which images have discrete image subsets which are described within that boundary, segregating the portions of each image subset which is described within that boundary, and calculating the pixels within the described subset of image portions.
  • an image coverage algorithm measures the three-dimensional spatial distance between the three-dimensional locations of the edge boundaries of one set of sequentially-recorded images and a second set of sequentially-recorded images.
  • a method for screening a defined volume of tissue with an image scanning device comprising the following steps: scanning tissue within the defined volume using a manual imaging probe; detecting the position of the imaging probe using three or more position sensors coupled with the imaging probe; receiving a set of discrete images from the image scanning device; receiving position data from a locating system comprising three or more position sensors for each image in said set of discrete images; application of a position tracking algorithm to determine the resolution of that set of discrete images of tissue within said defined volume; and application of a position tracking algorithm to determine the relative coverage of that set of discrete images of tissue, relative to another set of discrete images of tissue within said defined volume.
  • the manual image scanning device is an ultrasound scanning device and the imaging probe is an ultrasound probe.
  • a viewer is used to display discrete images, providing a sequential display of said discrete images.
  • Some embodiments include one or more microprocessors to calculate the image resolution by calculating the three dimensional spacing between a pixel in one discrete image and a pixel at the same location of a second image recorded in a sequentially acquired image set.
  • Some embodiments provide for using one or more microprocessors to create a set of discrete image subsets by superimposing a three-dimensional volumetric boundary on adjacent images, determining which images have discrete image subsets which are described within that boundary, segregating the portions of each image subset which is described within that boundary, and calculating the pixels within the described subset of image portions.
  • a locating system issues one or more audible signals in the event that the image resolution is not within user-defined limits to alert the operator to obtain additional discrete images.
  • the locating system issues one or more visual signals in the event that the image resolution is not within user-defined limits to alert the operator to obtain additional discrete images.
  • the visual signal identifies the discrete image sequence wherein the image resolution is not within user-defined limits to direct the operator to the location within the defined volume requiring one or more additional discrete images.
  • one or more microprocessors measure the three-dimensional spatial distance between the three-dimensional locations of the edge boundaries of one set of sequentially-recorded images and a second set of sequentially-recorded images.
  • Some embodiments describe a method of displaying sequential images of tissue, wherein each image has assigned spatial coordinates and a discrete image display algorithm calculates the relative spacing between discrete images and modifies the rate of display of recorded discrete images to provide a uniform spatial-temporal display interval between successive discrete images.
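  • A minimal sketch of such a display algorithm (illustrative only; the constant sweep speed mm_per_second and the dwell time for the final image are assumed parameters) holds each image on screen for a time proportional to its spatial gap to the next image, so that the displayed sweep advances through the tissue at a uniform rate.

      def display_durations(positions_mm, mm_per_second=10.0, last_s=0.1):
          # One display duration per image: spatial gap to the next image divided by
          # the desired sweep speed; the final image receives a fixed dwell time.
          durations = [abs(b - a) / mm_per_second
                       for a, b in zip(positions_mm, positions_mm[1:])]
          durations.append(last_s)
          return durations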
  • Other embodiments describe a method of displaying sequential images of tissue, wherein each image has assigned spatial coordinates and a discrete image display algorithm is used to determine whether a plurality of images are described within a user-defined interval for image spacing. Further embodiments provide that one or more of the plurality of images described within a user-defined interval for image spacing is not displayed as part of the set of discrete images.
  • Additional embodiments describe a method of displaying multiple sets of sequential images of tissue, wherein each image has assigned spatial coordinates and a discrete image display algorithm is used to not display one or more discrete images when the plane of that discrete image falls within a boundary of one or more sets of other sequential images.
  • FIG. 1 is a schematic view of the disclosed system including its various subsystem components.
  • FIG. 2 illustrates the hand-held ultrasound probe assembly including the affixed position sensors.
  • FIG. 3 illustrates an exploded view of the hand-held ultrasound probe assembly revealing the first and second support members, which encase the hand held ultrasound probe and incorporate the position sensors.
  • FIG. 4 illustrates a side view of the first support member shown in FIG. 3;
  • FIG. 5 illustrates a first transverse sectional view of the first support member shown in FIG. 3 revealing the conduits for incorporation of the position sensors and leads;
  • FIG. 6 illustrates a second transverse sectional view of the first support member shown in FIG. 3 revealing the conduits for incorporation of the position sensors and leads.
  • FIG. 7 illustrates a first cross-sectional view of the human breast including the handheld ultrasound probe assembly shown at various positions during the course of a scan sequence.
  • FIG. 8A illustrates discrete images in a scan sequence.
  • FIG. 8B illustrates a second cross-sectional view of the human breast including the hand-held ultrasound probe assembly shown at various positions during the course of a scan sequence.
  • FIG. 9 illustrates a perspective view of the human breast and an ultrasound scan sequence including the hand-held ultrasound probe assembly shown at one position during the course of a scan sequence.
  • FIG. 10A illustrates a first top view of the human breast illustrating the locations of 14 scan sequences.
  • FIG. 10B illustrates a second top view of the human breast illustrating the locations of 13 scan sequences.
  • FIG. 10C illustrates a perspective view of the human breast illustrating the locations of 2 scan sequences and volume of tissue included within 2 scan sequences.
  • FIG. 10D illustrates a third top view of the human breast with a plurality of scan sequences.
  • FIG. 10E illustrates a fourth top view of the human breast with a plurality of scan sequences.
  • FIG. 10F illustrates two radial scan sequences.
  • FIGS. 10G-10L illustrate discrete images in two scan sequences.
  • FIG. 10M illustrates two radial scan sequences.
  • FIGS. 11A-11F combine as labeled thereon to show a flow chart of the procedure associated with a described embodiment.
  • FIG. 12A illustrates the superposition of a single component volume unit on two sequential two-dimensional ultrasound scan images;
  • FIG. 12B illustrates the superposition of four component volume units at each of the corners of both planes of two sequential two-dimensional ultrasound scan images.
  • FIG. 13 is a schematic view of the disclosed system based on optical-based position sensing including its various subsystem components.
  • FIGS. 14A-14C illustrate a hand-held ultrasound probe assembly including affixed optically unique position sensors.
  • FIG. 15 illustrates an exploded view of a hand-held ultrasound probe assembly revealing the first and second support members, which encase the hand held ultrasound probe and incorporate the optically unique position sensors.
  • FIGS. 16A-16B illustrate the spacing between adjacent ultrasound scan images as a function of the depth of the ultrasound image within the tissue.
  • FIGS. 17A-17B illustrate a top view of a plurality of scan sequences with overlap.
  • FIG. 18 illustrates a schematic view of the disclosed system including a camera mounted on the imaging probe.
  • embodiments contemplated provide for methods, devices, and systems that can be used with manual imaging techniques to ensure satisfactory quality and adequate completeness of a scanning procedure for a patient's target region.
  • Some embodiments employ rapid-response position sensors or rapidly imaged optical registers affixed to an existing hand-held imaging system, for example, a diagnostic ultrasound system, and associated handheld imaging probes.
  • one type of ultrasound system that can be used with some embodiments described is the Phillips iU22 xMatrix Ultrasound System with hand-held L12-50 mm Broadband Linear Array Transducer (Andover, Massachusetts).
  • a commercially available system which provides accurate x, y, z position coordinates for multiple sensors as a function of time, providing said position information at a rapid tracking rate, is, by way of example, the Ascension Technology 3D Guidance trakSTAR (Burlington, Vermont).
  • a first subsystem is the hand-held imaging system 12, which includes hand-held imaging monitor console 18, display 17, hand-held imaging probe 14 and connecting cable 16.
  • a second system (referred to hereinafter as the "Scan Completeness Auditing System"), according to the invention, is represented in general at 10.
  • the Scan Completeness Auditing System 10 comprises a data acquisition and display module/controller 40 including microcomputer/storage/DVD ROM recording unit 41, display 3 and footpedal or other control 11. Foot pedal 11 is connected to
  • the Scan Completeness Auditing System 10 also comprises position-tracking system 20, which includes, by way of example, position tracking module 22 and position sensor locator, such as a magnetic field transmitter 24.
  • position-tracking system 20 includes, by way of example, position tracking module 22 and position sensor locator, such as a magnetic field transmitter 24.
  • the Scan Completeness Auditing System 10 also comprises a plurality of position sensors 32a, 32b and 32c affixed to the handheld imaging probe 14.
  • the hand-held imaging system 12 is shown as a subsystem separate from the scanning completeness auditing system 10, in some embodiments, the two systems are part of the same overall system. In some cases, the imaging device may be part of the scanning completeness auditing system.
  • hand-held imaging system 12 is connected to data acquisition and display module/controller 40 via data transmission cable 46 to enable each frame of imaging data (typically containing about 10 million pixels per frame) to be received by the microcomputer/storage/DVD ROM recording unit 41, the frequency of which is a function of the recording capabilities of the microcomputer/storage/DVD ROM recording unit 41 and the image data transmission capabilities, whether it is raw image data or video output of the processed image data, of the hand-held imaging system 12.
  • Position information from the plurality of position sensors 32a, 32b, and 32c is transmitted to the data acquisition and display module/controller 40.
  • Cable 46 is removably attached to microcomputer/storage/DVD ROM recording unit 41 of data acquisition and display module/controller 40 with removably attachable connector 43 and is removably connected to diagnostic ultrasound system 12 with connector 47.
  • the successive scans associated with the hand-held imaging procedure are stored and subjected to computational algorithms to assess completeness of the diagnostic ultrasound scanning procedure as described in greater detail in the specifications which follow.
  • position tracking module 22 is connected to data acquisition and display module/controller 40 via data transmission cable 48 wherein cable 48 is removably attached to microcomputer/storage/DVD ROM recording unit 41 of data acquisition and display module/control 40 with connector 45 and is removably connected to position tracking module with connector 49.
  • Position sensor locator such as a magnetic field transmitter 24 is connected to position tracking module 22 via cable 26 with removably attachable connector 25.
  • Hand-held imaging probe assembly 30 seen in FIG. 1 includes, by way of example, position sensors 32a- 32c, which are affixed to hand-held imaging probe 14 and communicate position data to position tracking module 22 via leads 34a-34c, respectively, and removably attachable connectors 36a- 36c, respectively.
  • Position sensor cables 34a-34c may be removably attached to ultrasound system cable 16 using cable support clamps 5a-5f at multiple locations as seen in FIG. 1.
  • first and second "clamshell" type support members 42 and 44 respectively.
  • First support member 42 incorporates three raised ridges 35a-35c, which provide three conduits (not shown) for position sensors 32a-32c, respectively, and position sensor cables 34a-34c, respectively.
  • Said first support member 42 includes the aforementioned raised ridges 35a-35c and associated conduits 33a-33c, respectively, which accommodate position sensors 32a-32c and their corresponding cables 34a-34c, respectively.
  • First support member 42 also incorporates extension ears 36a and 36b, each with a drilled hole to enable secure mechanical attachment to second support member 44.
  • Said second support member 44 likewise incorporates extension ears 38a and 38b, each with a drilled hole which matches the drilled holes in first support member 42 to enable secure mechanical attachment to first support member 42 using screws 39a and 39b, respectively.
  • First and second support members may be
  • first and second support members 42 and 44 are designed to match the particular contour and dimensions of the off-the-shelf hand-held ultrasound probe being instrumented with the position sensors 32a-32c. Accordingly, the contours and dimensions of the first and second support members 42 and 44 will vary according to the hand-held ultrasound probe design.
  • the exact location of the position sensors 32a-32c relative to the ultrasound transducer array at the end face of the hand-held imaging probe (not shown) will accordingly be known for each set of first and second support members since they are designed to attach to and operate in conjunction with a specific hand-held ultrasound probe.
  • FIGS. 4, 5 and 6 illustrate an embodiment of the first support member 42 in a side view (see FIG. 4) and sectional views (see FIGS. 5 and 6) at two locations along the length of first support member 42.
  • the raised ridge 35a is seen which extends along most of the length of first support member 42.
  • extension ear 36a is seen at one end of the first support member 42.
  • conduits 33a, 33b and 33c are revealed. The dimensions of conduits 33a-33c are selected to accommodate position sensors 32a-32c and their corresponding cables 34a-34c, respectively.
  • position sensors are commercially available which have a diameter of nominally 2 mm or less. Accordingly, one described embodiment provides conduits 33a-33c dimensioned to accommodate a 2 mm diameter position sensor. As seen in FIGS. 2, 3, 5 and 6, position sensors 32a-32c and their respective cables 34a-34c can be affixed within conduits 33a- 33c using an adhesive (e.g., epoxy or cyanoacrylate).
  • the first and second support members 42 and 44 are sized to correspond to the particular contour and dimensions of a specific hand-held ultrasonic probe design.
  • injection-molded plastic e.g., a
  • the inner dimensions of said first and second support members 42 and 44 are designed to closely match the outer dimensions of the hand-held ultrasound probe 14.
  • the wall thickness, tl (see FIG. 5) of the injection molded plastic support members 42 and 44 is preferably in the range from 0.05 to 0.10 inch.
  • An example of the use of described embodiments is seen in FIG. 7 for the case of the hand-held ultrasound examination of a human breast 60.
  • a hand- held ultrasound probe assembly 30 with affixed position sensors is illustrated at a starting position on the human breast 60 adjacent to the nipple 64 and areola 62.
  • the hand-held ultrasound probe assembly 30 starts immediately over the nipple and progresses radially and follows the contour of the human breast as illustrated by translation vectors 52a-52b and 52b-52c corresponding to hand-held ultrasound probe assembly 30 successive positions 30a, 30b and 30c with the latter two positions shown in "phantom" format.
  • the ultrasound transducer array 57 is maintained in direct contact with the skin, usually with an intervening layer of an ultrasound coupling gel.
  • An ultrasound coupling gel is usually used (e.g., Aquasonics 100, Parker Laboratories, Inc., Fairfield, New Jersey) to improve ultrasound interrogation by providing an improved acoustic pathway between the ultrasound transducer array and the skin.
  • the hand-held ultrasound probe assembly 30 is moved by the operator using a manual technique along the pathway illustrated in FIG. 7, referred to herein as a single scan sequence, beginning at the nipple 64 and ending when the ultrasound transducer array has reached the surface of the chest 61 beyond the perimeter of the breast 60, or beginning at the chest wall and ending when the ultrasound transducer has reached the nipple. If this example scan sequence is performed within the acceptable limits of translation speed and rate of change of the orientation of the hand-held ultrasound probe assembly 30, then this scan sequence would be verified as a complete scan sequence. As seen in FIG. 7, a planar ultrasound beam 50a-50c is emitted and a corresponding ultrasound image is obtained at each momentary position 30a-30c of the hand-held ultrasound probe assembly 30.
  • an ultrasound beam is emitted and an image is received, constituting a single image frame, at a rate in the range from about 10 to 40 times (or frames) per second.
  • a typical frame may contain an array of 400 x 600 pixels of image data or 240,000 pixels per frame.
  • a new frame is obtained at a rate of about 10 to 40 frames per second.
  • An important aspect of the present invention is illustrated in FIGS. 8A, 8B, and 9 related to computing (or auditing) the completeness of each scan sequence.
  • This described method and algorithm assures the frame-to-frame resolution of any individual scan sequence (e.g., any individual path scanned beginning at the nipple of the breast and ending at the chest surface beyond the perimeter of the breast boundary, or scan beginning at the chest surface and ending at the nipple, or any scan beginning at the clavicle and ending at the base of the rib cage, or any scan beginning at the base of the rib cage and ending at the clavicle, or any scan beginning in the crevice of the armpit and ending at the inferior lateral side of the rib cage).
  • measuring or calculating the spacing or distance between individual images in a scan sequence may be referred to as determining the image-to-image resolution or spacing between discrete images in a scan sequence.
  • frame to frame resolution may also be used to describe the spacing/distance between images in a scan sequence.
  • the hand-held ultrasound probe assembly 30 is translated across the surface of the skin by the human hand 700. That translation will follow a linear or non-linear path 704, and there are a series of corresponding ultrasound beam positions 50s-50v, each with a corresponding ultrasound image that is recorded, as depicted in FIG. 1, by the acquisition and display module/controller 40 via the data transmission cable 46, to be received by the microcomputer/storage/DVD ROM recording unit 41, the frequency of which is a function of the recording capabilities of the microcomputer/storage/DVD ROM recording unit 41 and the image data transmission capabilities.
  • the images are stored as a set of pixels, including pixels 94a-94l, which are displayed in a two-dimensional matrix of pixels, each matrix consisting of horizontal rows 708a-708h and vertical columns 712a-712h.
  • a single pixel 94a-94h, as displayed, has a unique display address P(r_x, c_x), where r_x is the row of pixels on the image, r_first being the row at the top, e.g. 708e, or the row representing structures closest to the probe, and r_last being the row at the bottom (e.g.
  • a typical recorded ultrasound image will have between 300 and 600 horizontal rows 708 and between 400 and 800 vertical columns 712. Thus, a typical recorded ultrasound image shall have between 120,000 and 480,000 pixels 94.
  • the recorded image for each ultrasound beam position 50s-50v will have an identical pixel format.
  • a corresponding row is the row 708 which is displayed at the same distance, vertical from the top, in every image.
  • the depth, as measured as distance away from probe, shall be the same for corresponding horizontal rows 708.
  • the information in the 8th horizontal row 708 in one image represents structures which are the same distance, away from the probe at the time they are recorded, as the location of the information in the 8th horizontal row 708 in another image at the time that image is recorded.
  • the same logic applies to the corresponding vertical columns 712.
  • the information in the 12th vertical column 712 in one image represents structures that are the same distance, horizontally, from the center of the probe at the time that image is recorded as the location of the information in the 12th vertical column 712 in another image at the time it is recorded.
  • the information described by any one pixel 94, P(r_x, c_x), in one image is the same distance away from the surface of the probe (depth) and from the center line of the probe as the information described at the same pixel 94 location P(r_x, c_x) in another image.
  • These pixels 94 that share common locations on the image format for the discrete images in the image sets are termed corresponding pixels 94.
  • One embodiment for calculating the completeness of the scan sequence in terms of frame-to-frame resolution is to calculate the maximum distance between any two adjacent image frames. Since the concept of minimum acceptable resolution, by definition, requires the establishment of a maximum acceptable spacing, then that resolution requirement will be met if the largest distance 716 between any two corresponding pixels 94 in adjacent image frames is within the acceptable limit. Since the frames are planar, then the largest distance between any two frames will occur at the corresponding pixels 94 that are at one of the four corners. Thus, the maximum distance 716 between any two corresponding frames shall be (EQ. 1):
  • MAX(DISTANCE(P(FIRST-ROW, FIRST-COLUMN) - P'(FIRST-ROW, FIRST-COLUMN)), DISTANCE(P(FIRST-ROW, LAST-COLUMN) - P'(FIRST-ROW, LAST-COLUMN)), DISTANCE(P(LAST-ROW, FIRST-COLUMN) - P'(LAST-ROW, FIRST-COLUMN)), DISTANCE(P(LAST-ROW, LAST-COLUMN) - P'(LAST-ROW, LAST-COLUMN)))
  • P and P' are the corresponding pixels 94 in two adjacent images
  • MAX is the maximum function which chooses the largest of the numbers in the set (in this example 4)
  • DISTANCE is the absolute distance 716 between the corresponding pixels.
  • Exemplary distances are shown in FIG. 8A at 716a between pixel 94a and corresponding pixel 94b; 716b between pixels 94b and 94c; 716c between 94c and 94d; 716d between 94e and 94i; 716e between 94f and 94i; 716f between 94g and 94k; and 716g between 94i and 94l.
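  • The following sketch is illustrative only (it assumes the tracked x-y-z coordinates of the four corner pixels of each frame are available, for example from the position and orientation data described above); it evaluates EQ. 1 for a pair of adjacent frames so the result can be compared against the user-defined maximum acceptable spacing.

      import numpy as np

      def max_corner_distance(frame_a, frame_b):
          # EQ. 1: the largest three-dimensional distance 716 between corresponding
          # corner pixels of two adjacent image frames; frame_a and frame_b map a
          # corner label to the tracked x-y-z position of that corner pixel.
          corners = [("first_row", "first_col"), ("first_row", "last_col"),
                     ("last_row", "first_col"), ("last_row", "last_col")]
          return max(np.linalg.norm(np.asarray(frame_a[c]) - np.asarray(frame_b[c]))
                     for c in corners)

      # A frame pair passes the frame-to-frame resolution check when the returned
      # distance does not exceed the acceptable spacing/distance limit.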
  • This method of assuring frame-to-frame resolution may be used to assure that the resolution remains within limits regardless of the speed of longitudinal translation of the probe, speed of lateral rotation of the probe, speed of axial rotation of the probe, or speed of vertical rotation of the probe.
  • the user may be prompted during or at the end of the process/procedure to rescan a region.
  • the acceptable spacing/distance is a preselected or predetermined value.
  • the value is a user defined limit.
  • the system may provide a range or acceptable spacing/distances for selection based on the type of exam or characteristics of the patient or target region for scanning.
  • FIG. 8B provides another method of assuring adequate frame-to-frame or image-to- image spacing.
  • FIG. 8B shows the hand-held ultrasound probe assembly 30 at two adjacent positions 30d and 30i.
  • new ultrasound images are produced at a rate of 10 frames/second.
  • the hand-held ultrasound probe assembly 30 is translated from position 30d with corresponding ultrasound beam 50d and a corresponding ultrasound image to position 30i with corresponding ultrasound beam position 50i and a corresponding ultrasound image, there are 4 intermediate positions as seen by ultrasound beams 50e-50h.
  • the rate of longitudinal rotation of the hand-held ultrasound probe assembly 30 during the translation from position 30d to 30i is not uniform and an increased rate of rotation of the hand-held ultrasound probe assembly 30 inadvertently occurs between ultrasound beam 50g and 50h.
  • the time step, Δt, is 0.10 second based on an ultrasound scan rate of 10 frames per second.
  • If a suspicious lesion 73 were within omitted zone 70d, it would not be detected or recorded in the diagnostic ultrasound procedure. Consequently, it would be impossible for the expert (e.g., radiologist) who analyzes the ultrasound images following the ultrasound procedure to detect the presence of what could become a life-threatening malignant lesion. It is not mathematically possible to eliminate these omitted zones 70a-70e without an infinite number of ultrasound beams 50d-50i and corresponding ultrasound images, but the user can determine a level of resolution, that is, the maximum acceptable size of the zones 70a-70e, and the system can notify the user if any one of those zones exceeds that acceptable limit.
  • a preferred algorithm for computing spacing between images in a scan is to compute the maximum chord or distance, x between successive planar ultrasound scan frames at the maximum intended depth of ultrasound interrogation (i.e., maximum depth of the breast tissue in the present example).
  • This maximum distance, x, can be computed between the distal boundaries of each successive ultrasound scan frame (e.g., between ultrasound beam 50g and 50h, and corresponding images), since the position of the ultrasound transducer array 57 and the orientation of the hand-held ultrasound probe assembly 30 are precisely known at all time points when ultrasound scan frames are generated and recorded.
  • the position of each sensor is determined (in one example version of a product sold by Ascension Technologies but not intended as a limitation as the data update rate may be higher or lower) at a rate of 120 times per second which is an order of magnitude more frequently than the repetition rate for ultrasound scan frames.
  • the precise location of the ultrasound scan frame and, thereby, the precise location of the 240,000 pixels within each ultrasound scan frame will be known in three- dimensional space as each ultrasound scan frame is generated by the ultrasound system 12 and recorded by the data acquisition and display module/controller 40.
  • knowing the position of all pixels within each successive frame will enable the maximum distances between corresponding pixels in successive frames to be computed, focusing on those portions of successive ultrasound beams 50d-50h, and corresponding ultrasound images, that are known to be furthest apart, i.e., at locations within the recorded scan frame most distant from the ultrasound transducer array 57.
  • This alternative method and algorithm for assuring the completeness of any individual scan sequence involves computation of the pixel density in each unit volume 96 within the swept volume 90 of the scan sequence i containing N ultrasound beams 50[i,j(i)] and associated recorded frames, where i identifies the scan sequence and j(i) equals the number of emitted beams 50 and associated recorded frames for scan sequence i.
  • the rate of translation of the hand-held ultrasound probe assembly 30 along scan sequence, i, having path length, L2 is 1.0 cm/second
  • length L2 equals 15 cm
  • the ultrasound system 12 scanning rate is 10 frames/second
  • the resultant images are recorded by the data acquisition and display module/controller 40 at 10 frames/second.
  • the total time to complete the scan is 15 seconds and the total number of ultrasound scan frames recorded is 150.
  • j(i) equals 150. If each frame contains, for example, 240,000 pixels, then the total volume will include 150 frames x 240,000 pixels/frame which equals a total of 36 million pixels in the swept volume 90 of an individual scan sequence, i.
  • the swept volume 90 of the scan sequence would be the volume defined by (a) the width, W2 of the ultrasound beam, which is defined by the length of the ultrasound transducer array (e.g., 5 cm), (b) the depth, D2 of the recorded penetration of the ultrasound beam into the targeted living tissue (e.g., 5 cm) and (c) the total length, L2 traversed in an individual scan sequence (e.g., 15 cm).
  • This total volume (375 cubic cm in the present example) is then subdivided into unit volumes exemplified by unit volume 96 (e.g., cubical volume of dimensions 1.0 cm x 1.0 cm x 1.0 cm).
  • the swept volume 90 would be subdivided into 375 unit volumes 96.
  • the number of ultrasound scan pixels 94 contained in each unit volume 96 is computed and this number is compared to a predetermined Minimum Pixel Density number.
  • the number of ultrasound scan pixels 94 within a unit volume 96 may be computed by comparing the x-y-z coordinates of each of the ultrasound scan pixels 94 in the 150 frames which comprise the swept volume 90, with the x-y-z coordinates of the boundaries of the perimeter of the unit volume 96. If the x-y-z coordinates of the ultrasound scan pixel 94 is within the boundaries of the perimeter of the unit volume 96, it is counted.
  • the operator is alerted at the end of the scan sequence that the scan sequence just completed is incomplete and that all or part of it must be repeated, or that the operator must accept that the scan sequence is incomplete.
  • Said alert includes a display of the scan path just completed as well as instructions to the operator to improve scanning method to achieve a complete scan. For example, these instructions include reducing the scanning speed and/or the rate of change of orientation of hand-held ultrasound probe during the repeated scan sequence.
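  • One possible realization of this unit-volume audit is sketched below (illustrative only; the layout of the pixel coordinate array and of the unit-volume corner pairs is assumed). It counts the pixels whose x-y-z coordinates fall within each unit volume and returns the unit volumes whose count falls below the Minimum Pixel Density number, which could then trigger the operator alert described above.

      import numpy as np

      def count_pixels_in_unit_volume(pixel_xyz, lower_corner, upper_corner):
          # pixel_xyz: (N, 3) array of the x-y-z coordinates of every recorded pixel in
          # the swept volume; a pixel is counted when it lies within the axis-aligned
          # boundaries of the unit volume.
          pixel_xyz = np.asarray(pixel_xyz, dtype=float)
          lower = np.asarray(lower_corner, dtype=float)
          upper = np.asarray(upper_corner, dtype=float)
          inside = np.all((pixel_xyz >= lower) & (pixel_xyz < upper), axis=1)
          return int(np.count_nonzero(inside))

      def incomplete_unit_volumes(pixel_xyz, unit_volumes, min_pixel_density):
          # unit_volumes: list of (lower_corner, upper_corner) pairs, e.g. the 375 cubes
          # of 1.0 cm edge length in the example above; returns the unit volumes whose
          # pixel count is below the Minimum Pixel Density number.
          return [(lo, hi) for lo, hi in unit_volumes
                  if count_pixels_in_unit_volume(pixel_xyz, lo, hi) < min_pixel_density]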
  • the range of the image-to-image resolution (spacing) within each scan sequence is a pixel density between 9,000 and 180,000,000 pixels/cm³. In other embodiments, the pixel density is between 22,500 and 18,000,000 pixels/cm³. In further embodiments, the pixel density is between 45,000 and 3,550,000 pixels/cm³.
  • An equally important aspect of the present invention is illustrated in FIGS. 10A and 10B related to computing (or auditing) the tissue coverage by comparing the scan sequence just completed based on its relative distance from the previously completed scan sequence.
  • the accurate and dynamic computation of the position of the hand-held ultrasound probe's transducer array enables the computation of the actual spatial position and computed orientation of sequential and manually scanned pathways completed along the tissue surface.
  • relatively uniformly and closely spaced radial scan sequences 80a-80l are superimposed on a top view of the human breast 60 as seen in FIG. 10A with scan sequences 80 spanning the distance between the nipple 64 and some distance radially outward from the nipple, for example, the chest surface 61.
  • Each scan sequence 80 has a length L and a width W.
  • each sequential and manually derived scan sequence 80a-80l scanned along the tissue surface enables the further computation of the physical spacing between the boundaries of each adjacent and successive scan sequence 80.
  • This computation can be rapidly completed during the course of the manual scanning process and a visual and audible cue as well as an image is provided showing the paths of completed scan sequences to identify where re-scanning is required.
  • This intra-procedure computation of the distances between adjacent scan sequences 80a-80l assures that complete coverage of the ultrasound scan of the targeted tissue region is achieved by identifying any completed scan sequences that are separated by an unacceptably large distance.
  • radial scan sequences 80a-801 are superimposed on a top view of the human breast 60 with scan sequences 80 spanning the distance between the nipple 64 and the chest surface 61.
  • this example illustrates an abnormally large spacing between scan sequence 80d and 80e.
  • a zone 72 of tissue within the breast 60 (as revealed by the shaded region in FIG. 10B) is not included in the diagnostic ultrasound procedure.
  • the distance between successive scan sequences can be computed since the precise location and computed orientation of the hand-held ultrasound probe assembly 30 is known for each scan sequence 80. If the spacing between scan sequences exceeds a
  • the result of a computed physical spacing between successive scan sequences 80d and 80e being greater than a predetermined maximum spacing value is an un-scanned or omitted zone 72 within the targeted tissue (i.e., the human breast 60 in this example).
  • If a suspicious lesion 73 were within omitted zone 72, it would not be detected or recorded in the diagnostic ultrasound procedure, and it would not be possible for the expert (e.g., radiologist) who reviews the recorded images following the procedure to detect its presence.
  • FIGS. 10D and 10E show scan-to-scan spacing between relatively linear scan sequences.
  • FIG. 10D shows scan sequences 80m-80q following a substantially linear pathway across the breast 60. The sequences show overlapping imaging at 3999, 4001, 4003, and 4005.
  • FIG. 10E illustrates a gap of unscanned tissue between scan sequence 1500 and scan sequence 1502. In such circumstances, embodiments described would be used to calculate, measure, or determine the size of the unscanned region 63. If the distance is greater than an acceptable spacing for scan-to-scan spacing, then the operator would be alerted during the procedure to scan the region 63.
  • FIGS. 10F and 10M show scan-to-scan spacing between relatively radial scan sequences.
  • Two scan sequences 1500 and 1502 show unscanned regions 1504a and 1504b.
  • embodiments described would be used to calculate, measure, or determine the size of the unscanned region. If the distance is greater than an acceptable spacing for scan-to-scan spacing, then the operator would be alerted during the procedure to scan the region.
  • measuring or calculating the spacing or distance between scan sequences may be referred to as determining the scan-to-scan spacing between scan sequences.
  • Scan-to-scan spacing is a method of measuring, calculating, or otherwise determining coverage.
  • In FIG. 10G, two adjacent scan sequences 2900a-2900d and 2904a-2904d are depicted.
  • One means of measuring whether there is overlap or gap spacing is to measure the distances 2908a-2908d from one of the corner pixels of one image, for example P(FIRST-ROW, LAST-COLUMN) 2916, to each of the pixels in the same row, but on the opposite side of the image, in all of the images in the adjacent row, for example P(FIRST-ROW, FIRST-COLUMN) 2920a-2920d.
  • the shortest of those distances represents the spacing between adjacent images in adjacent rows. In the example of FIG 10G, that would be distance 2908b.
  • the distance between the corner pixels of the two adjacent images represents an overlap. In other words, if the angle 2915 between the two vectors 2912 and 2913 is less than 180 degrees, then the two pixels overlap.
  • the shortest distance is between pixel 2948 and 2920d.
  • the vector of that distance 2945 is in the opposite general direction from the vector 2944 along the top row of the image, so the distance represents a gap. In other words, if the angle 2949 between the two vectors 2944 and 2945 is greater than 180 degrees, then the two pixels represent a gap.
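  • As an illustrative sketch of the angle test described above (the sign convention is an assumption: the row vector is taken as pointing from the border pixel back into the body of its own image, so that a separation vector with a positive projection onto it indicates overlap and a negative projection indicates a gap), the shortest distance to the opposite-side corner pixels of the adjacent scan sequence can be classified as follows.

      import numpy as np

      def classify_edge_spacing(corner_pixel, row_vector, opposite_corners):
          # corner_pixel: x-y-z position of the border pixel (e.g. 2916 or 2948);
          # row_vector: vector along the top row of that image (assumed sign convention
          # described above); opposite_corners: positions of the corresponding
          # opposite-side corner pixels in the adjacent scan sequence (e.g. 2920a-2920d).
          corner = np.asarray(corner_pixel, dtype=float)
          row = np.asarray(row_vector, dtype=float)
          vectors = [np.asarray(p, dtype=float) - corner for p in opposite_corners]
          distances = [np.linalg.norm(v) for v in vectors]
          nearest = vectors[int(np.argmin(distances))]
          kind = "overlap" if np.dot(nearest, row) >= 0.0 else "gap"
          return min(distances), kind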
  • In FIGS. 10I and 10K, two adjacent scan sequences 2900a-2900d and 2904a-2904d are depicted.
  • One means of measuring whether there is overlap or gap spacing is to measure the distances 2908a-2908d from one of the corner pixels of one image, for example P(FIRST-ROW, LAST-COLUMN) 2916, to each of the pixels in the same row, but on the opposite side of the image, in all of the images in the adjacent row, for example P(FIRST-ROW, FIRST-COLUMN) 2920a-2920d.
  • the shortest of those distances represents the spacing between adjacent images in adjacent rows. In the example of FIGS. 10I and 10K, that would be distance 2908b.
  • the border pixel 2916 is considered to overlap with the adjacent scan sequence of images 2900a-2900b if the pixel is within the borders of the area 2953 described, in part, by the row of the closest image 2900b and the adjacent image 2900a.
  • the shortest distance is between pixel 2948 and 2920d.
  • the border pixel 2948 is considered to have a gap with the adjacent scan sequence of images 2900a-2900b if the pixel is outside of the borders of the area 2955 described, in part, by the row of the closest image 2900d and the adjacent image 2900c.
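  • The containment test of FIGS. 10I and 10K can be sketched as a simple point-in-quadrilateral check (illustrative only; the corners of the bounded area, e.g. area 2953 or 2955, are assumed to be projected onto a common plane, listed in order around the area, and to form a convex quadrilateral).

      import numpy as np

      def border_pixel_overlaps(border_pixel, area_corners):
          # border_pixel: projected 2-D position of the border pixel; area_corners: the
          # four projected corners of the area bounded by the rows of the two closest
          # images in the adjacent scan sequence, listed in order around the area.
          p = np.asarray(border_pixel, dtype=float)
          corners = [np.asarray(c, dtype=float) for c in area_corners]
          cross = []
          for a, b in zip(corners, corners[1:] + corners[:1]):
              edge, rel = b - a, p - a
              cross.append(edge[0] * rel[1] - edge[1] * rel[0])   # 2-D cross product
          # Inside a convex quadrilateral when all cross products share one sign.
          return all(c >= 0 for c in cross) or all(c <= 0 for c in cross)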
  • an alternative algorithm is employed wherein the volume subjected to successive scan sequences 80a-80m is transformed into the computed distribution of ultrasound scan image pixels based on the known position and computed orientation of the hand-held ultrasound probe assembly 30 for each scan sequence as described above in connection with FIG. 9.
  • the pixel density per unit volume (e.g., pixel density per cubical 1.0 cubic centimeter or pixel density per cubical 0.5 cubic centimeter unit volumes)
  • the included volume 75 bounded by successive scan sequences 80d and 80e would be subdivided into smaller unit volumes 79.
  • the computed position of all pixels within the included volume 75 between scan sequences 80d and 80e would then be computed, based on the known position and computed orientation of the hand-held ultrasound probe assembly 30 during periods within each scan sequence, thereby allowing the computation of pixel density within each unit volume 79.
  • the number of ultrasound scan pixels (as described above in connection with FIG. 9) contained in each unit volume 79 is computed and this number is compared to a predetermined Minimum Pixel Density number.
  • the operator is alerted at the end of the scan sequence that the scan sequence just completed is incomplete and that it must be repeated, including a display of instructions to improve the scanning method (e.g., reduce the spacing between the previous scan sequences and the present scan sequence to be repeated).
  • In FIGS. 11A through 11E, a flow chart describes one embodiment of the method and system of the present invention. Beginning as represented by symbol 3100 and continuing as represented by arrow 3102 to block 3104, connectivity of the components of the system is verified. The user must verify that the hand-held ultrasound imaging probe is connected to the ultrasound system, that the position sensors are attached to the hand-held ultrasound probe, that the position sensors are connected to the position tracking module, that the magnetic field transmitter (MFT) component of the position tracking module is within 24 inches of the targeted patient volume (e.g.
  • the position tracking module (i.e., a requirement specifically related to the use of visible detection technologies, such as is employed when an infrared camera tracks a visible register), and that the position tracking module is connected to the data acquisition and display module/controller.
  • the operator now proceeds to positioning the hand-held imaging probe at the starting position of the target tissue site on the patient (e.g., at the nipple of the right breast).
  • the operator now proceeds to activate both the position tracking module and the associated data acquisition and display module/controller by depressing the foot pedal continuously during the entire period of each scan sequence performed using the hand-held ultrasound probe assembly with an audible tone issued and/or visible indicator confirming that the position sensing detection and recording function for the hand-held ultrasound probe assembly is currently active.
  • the operator releases the foot pedal to pause (i.e., to temporarily deactivate) the image recording function of the data acquisition and display module/controller.
  • the time- stamped hand-held imaging probe position and computed orientation data acquired within the data acquisition and display module/controller is combined with the time-stamped ultrasound scan frames received from the ultrasound system to enable rapid computation of the image-to- image resolution of the scan sequence just completed.
  • the chord distances between any two successive scan frames are computed to determine if they are within pre-selected limits as illustrated with regard to FIG. 8B discussed above.
  • an alternative embodiment of the present invention can be substituted at block 3136, which utilizes the imaging scan pixel density within the swept volume of the complete scan sequence as was described with regard to FIG. 9.
  • the time-stamped hand-held imaging probe position and computed orientation data acquired within the data acquisition and display module/controller is combined with the time- stamped imaging scan frames received from the ultrasound system to enable rapid computation of the completeness of the scan sequence just completed.
  • the pixel density within unit volumes within the swept volume is computed to determine if the computed pixel density is less than the preselected Minimum Pixel Density value.
  • block 3140 is reached via arrow 3138.
  • an audible alarm and visual error message is issued to instruct the operator that the scan failed to comply with the minimum user requirements for frame-to-frame resolution.
  • the user is queried as to whether he or she wishes to accept this scan sequence, SS(i), which does not meet the user-defined minimum limits of frame-to-frame resolution. If the operator does not choose to accept the scan sequence SS(i), which does not meet the user-defined minimum limits of frame-to-frame resolution, then, as represented by arrow 3160 to block 3120, the operator repeats the scan sequence previously performed but determined to be incomplete due to the failure of the frame-to-frame resolution to meet the minimum user-defined requirements. If the user chooses to accept the scan sequence SS(i), which does not meet the user-defined minimum limits of frame-to-frame resolution, then block 3146 is reached via arrow 3143.
  • a computation is performed to determine if the scan sequence just completed is essentially the same as the initial scan sequence performed or, alternatively, if the last scan sequence has been performed for the target tissue volume.
  • the last scan sequence is obtained when the first scan sequence is essentially repeated.
  • the operator designates on the data acquisition and display module/controller that the last scan sequence has been performed. If the scan sequence just completed is not the last scan sequence required for the ultrasound examination, proceed as represented by arrow 3170 to block 3120 to initiate the sequence of steps for the next scan sequence.
  • scan sequence i is greater than 1
  • one of the above two algorithms (e.g., either computation of the distance between two successive scan sequences or the volumetric pixel density within unit volumes of the included volume between successive scan sequences)
  • the edge-to-edge coverage of the two successive scan sequences just completed is evaluated as specified in block 3152. If the predetermined requirement is met (i.e., the maximum allowed distance between the adjacent edges of scan frames in successive scan sequences is not exceeded or the pixel density in any unit volume is not less than the minimum required pixel density), then block 3164 is reached via arrow 3162.
  • block 3156 is reached via arrow 3154.
  • an audible alarm and visual error message is issued to inform the operator that the coverage requirement, as defined by the user-defined edge-to-edge spacing of adjacent edges in successive scan sequences or by the user-defined minimum pixel density in any unit volume, has not been met.
  • block 3159 is reached via arrow 3157.
  • the user is queried regarding whether he or she wishes to accept scan sequence SS(i) even though the coverage requirement, as defined by the user-defined edge-to-edge spacing of adjacent edges in successive scan sequences or by the user-defined minimum pixel density in any unit volume, has not been met. If the user chooses to accept scan sequence SS(i) despite the unmet coverage requirement, then block 3164 is reached via arrow 3163.
  • If the user chooses not to accept scan sequence SS(i) because the coverage requirement, as defined by the user-defined edge-to-edge spacing of adjacent edges in successive scan sequences or by the user-defined minimum pixel density in any unit volume, has not been met, then the scan sequence is repeated at a closer spacing relative to the prior scan sequence pathway.
  • arrow 3158 joins arrow 3160 to block 3120, wherein the operator repeats the scan sequence previously performed since it was determined to be incomplete due to regions of the target tissue not being included in the series of ultrasound scan frames just obtained.
  • arrow 3190 joins block 3192 in which the user is queried regarding whether he or she wishes to view the scan sequences before processing the data and saving the procedure study.
  • the viewer allows playback of the scanned images by the expert reviewer (e.g., radiologist) in a manner that is optimized for screening for cancers and other anomalies. If the user chooses to forego review, then arrow 3194 joins block 3196.
  • the expert reviewer e.g., radiologist
  • arrow 3198 proceeds to 3200, in which the scan sequence images are displayed on a video monitor, such as a digital computer monitor.
  • the system queries the user whether he or she wishes to accept the study. As depicted by arrow 3204 proceeding to join arrow 3194, which proceeds to block 3196, the images are processed. If the user chooses to not accept the images then a rescanning sequence is initiated as depicted by arrow 3208 proceeding to block 3210.
  • the complete set of sequenced image frames is assigned patient, ultrasound instrument, time, and location information as depicted in block 3196.
  • the processed data is then stored on electronic media (such as a DVD ROM, disc drive, or flash memory drive). This process is depicted by arrow 3214 proceeding to block 3216.
  • the DVD-ROM (or other suitable recording media) is physically transferred from the data acquisition and display module/controller to the expert (e.g., radiologist) for subsequent analysis and evaluation of the diagnostic ultrasound data with the confidence that the entire target tissue volume has been included in the supplied data recording.
  • This last step defines the end of the diagnostic examination procedure for a particular patient.
  • the image procedure is concluded as depicted by arrow 3218 proceeding to block 3220.
  • each of the pixels in each ultrasound scan-derived two-dimensional image i is specified by a unique set of coordinates X{i,j} and Y{i,j} in two-dimensional space.
  • the position of each pixel is transformed into three-dimensional space and can be defined by the three Cartesian coordinates X{i,j}, Y{i,j} and Z{i,j}.
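  • As an illustration of the coordinate transformation just described, the following sketch maps a pixel's in-plane coordinates into three-dimensional space using a rotation matrix and translation vector obtained from the tracked probe pose. The pixel pitch, pose values, and function name are assumptions for the example, not values taken from the specification.

```python
import numpy as np

def pixel_to_world(px, py, pixel_pitch_mm, R, t):
    """Map the pixel at column px, row py of a 2-D scan image into 3-D space.

    pixel_pitch_mm: physical size of one pixel in the image plane (mm/pixel).
    R: 3x3 rotation matrix of the image plane, taken from the tracked probe pose.
    t: 3-vector giving the world position of the image origin (pixel 0, 0).
    """
    point_in_plane = np.array([px * pixel_pitch_mm, py * pixel_pitch_mm, 0.0])
    return R @ point_in_plane + t

# Example: image plane rotated 90 degrees about X and offset 5 mm along Z.
R = np.array([[1, 0, 0],
              [0, 0, -1],
              [0, 1, 0]], dtype=float)
t = np.array([0.0, 0.0, 5.0])
print(pixel_to_world(10, 20, 0.1, R, t))  # world (X, Y, Z) of that pixel
```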
  • the overall volume circumscribed by any two adjacent two-dimensional scans is subdivided into smaller component volumes.
  • said smaller component volumes have two opposite square side faces measuring 2 mm x 2 mm and are defined, as seen in FIG 12A, by the coordinates listed below.
  • the physical spacing between sequential two-dimensional ultrasound scan images 2200 and 2201 has been significantly increased and is not drawn to scale relative to the overall dimensions of the ultrasound scan regions 2200 and 2201.
  • the maximum spacing between the square 2 mm x 2 mm faces on adjacent two-dimensional images 2200 and 2201 for the first component volume is determined by comparing the following four distances along the Z axis:
  • the computed first component volume is the product of the unit area, A and the maximum spacing between the square faces 2210 and 2211 (2 mm x 2 mm for this example):
  • the First Component Volume Pixel Density for the First Component Volume is given by dividing the combined total number of pixels within the 2 mm x 2 mm areas, A on faces 2210 and 2211 on the two sequential two-dimensional images (e.g., 400 pixels on each image for a combined total of 800 pixels for two sequential images) by the First Component Volume given in Equation 3 as follows:
  • the computed First Component Volume Pixel Density obtained in Equation 3 is compared with a predetermined Minimum Allowed Volumetric Pixel Density, which is selected to ensure that all regions within the targeted tissue volume are included in the ultrasound scan.
  • the above example process is repeated (a) for each component volume defined by the boundaries of two sequential two-dimensional images 2200 and 2201 and (b) for all pairs of sequential two-dimensional images acquired during a screening procedure. If any sequential pair of two-dimensional ultrasound scans results in a Component Volume Pixel Density which is less than the Minimum Allowed Volumetric Pixel Density, then a warning is displayed on the data acquisition and display module/controller 40 so that the operator can repeat the ultrasound scan sequence just completed to increase the pixel density to meet the requirements of the predetermined Minimum Allowed Volumetric Pixel Density. By this process, a complete ultrasound screening is assured which includes all tissue volumes within the targeted tissue region.
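  • A compact sketch of the Component Volume Pixel Density check described above is given below. The 2 mm x 2 mm face area and the 400 pixels per face are taken from the worked example in the text; the minimum density threshold and the function name are assumptions for illustration, not values from the specification.

```python
def component_volume_pixel_density(z_face_i, z_face_next, face_area_mm2=4.0,
                                   pixels_per_face=400):
    """Pixel density of one component volume bounded by matching 2 mm x 2 mm
    faces on two sequential scan images.

    z_face_i, z_face_next: Z coordinates of the four corners of the matching
    faces on image i and image i+1.  The spacing used is the largest
    corner-to-corner separation along Z, as in the example of FIG. 12A.
    """
    max_spacing = max(abs(a - b) for a, b in zip(z_face_i, z_face_next))
    volume_mm3 = face_area_mm2 * max_spacing   # component volume = area * spacing
    total_pixels = 2 * pixels_per_face         # both faces contribute pixels
    return total_pixels / volume_mm3

MIN_ALLOWED_DENSITY = 100.0  # pixels per mm^3; assumed threshold for the example

density = component_volume_pixel_density([0, 0, 0, 0], [1.5, 1.6, 1.4, 1.5])
if density < MIN_ALLOWED_DENSITY:
    print("Warning: repeat the scan sequence - density %.1f pixels/mm^3" % density)
else:
    print("Component volume OK: %.1f pixels/mm^3" % density)   # 125.0 here
```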
  • Another embodiment of the present invention utilizes the geometrical relationship of any two sequential ultrasound scan images to reduce the number of component volumes that need to be analyzed to determine if [a] the maximum spacing limit between sequential ultrasound scan images has been exceeded and/or [b] the minimum pixel density in a component volume has not been achieved.
  • two sequential two-dimensional ultrasound scan images 2200 and 2201 are shown in a spaced apart relationship with vector 2320 referring to the direction of transmitted and reflected ultrasound signals emanating from and received by the hand-held ultrasound probe.
  • the physical spacing between sequential two-dimensional ultrasound scan images 2200 and 2201 has been significantly increased and is not drawn to scale relative to the overall dimensions of the ultrasound scan regions 2200 and 2201.
  • Each two-dimensional ultrasound scan image (e.g., scan images 2200 and 2201)
  • the boundary of the i-th two-dimensional scan image (e.g., scan image 2200)
  • the boundary of the (i+1)-th two-dimensional scan image (e.g., scan image 2201).
  • Said component volume 2310a is comprised of two isosceles trapezoids 2300a and 2301a corresponding to end faces of the component volume 2310a located at one of four corners of the planar two-dimensional ultrasound scan images 2200 and 2201, respectively.
  • the coordinates of 2300a are X28Y28Z28 (1128), X29Y29Z29 (1129), X26Y26Z26 (1126), X27Y27Z27 (1127).
  • the coordinates of 2301a are X16Y16Z16 (1116),
  • the described embodiments greatly reduce the computation time required to assure that each subsequent two-dimensional ultrasound scan image meets the requirements for maximum allowed spacing and/or minimum required pixel density, and that the operator can be alerted immediately after each scan path has been completed.
  • the ultrasound scanning-derived image recording is time-based, with the images obtained in a temporally uniform manner. This approach can present several problems. First, if the image spacing varies from one part of the scan to the next, then the ability to present the images in a spatially uniform manner is compromised. One portion may have images spaced on 0.01mm centers while another may have them spaced on 1mm centers.
  • Another embodiment of the present invention is seen in FIGS. 16A-16B and includes analyzing the complete data set from the ultrasound screening procedure to identify those two-dimensional scan images 400a-400o that are separated by a function of the translational speed of the ultrasound probe during the scanning procedure and the image recording rate of the data acquisition and control module.
  • those images that are separated by a Z-axis spacing close to the predetermined minimum spacing interval are saved while any additional two-dimensional scan images located between a pair of properly spaced two-dimensional scan images, consequently being separated by a spacing interval much less than the predetermined minimum spacing interval, are excluded from the final video presentation of the ultrasound scanning procedure.
  • Another embodiment of the present invention, also seen in FIGS. 16A-16B, includes analyzing the complete data set from the ultrasound screening procedure to identify the spacing between each pair of adjacent scan images and to present those images in a spatially consistent manner, rather than a temporally consistent manner, as is the custom with most presentations of video images.
  • the presentation of images is provided as a function of sweep volume and the dwell time for each image is determined as a function of the spacing between adjacent images. By way of example, as described in FIG.
  • the dwell times for 400c is 0.8sec
  • for 400d is 0.2sec
  • for 400e is 0.2sec
  • for 400f is 0.3sec
  • for 400g is 0.2sec
  • for 400h is 0.3sec
  • for 400i is 0.3sec
  • for 400j is 0.4sec
  • for 400k is 0.3sec
  • for 400l is 0.5sec
  • for 400m is 0.5sec.
  • No dwell time is listed for 400o in this example because there is no sequential frame following 400o.
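  • The dwell-time assignment can be illustrated with a simple proportional rule: each image is shown for a time proportional to its spacing from the following image, and the last image receives no dwell time. The 0.1 second-per-millimeter factor in the sketch below is an assumed constant for illustration; the specification does not fix a particular proportionality.

```python
def dwell_times(z_positions_mm, seconds_per_mm=0.1):
    """Assign a review dwell time to each image proportional to the spacing
    between it and the following image; the last image gets no dwell time,
    mirroring the example in which no time is listed for frame 400o."""
    return [(z_next - z) * seconds_per_mm
            for z, z_next in zip(z_positions_mm, z_positions_mm[1:])]

# Example: images recorded at uneven positions (mm) along the sweep direction.
positions = [0.0, 1.0, 3.0, 3.5, 5.0]
print(dwell_times(positions))   # roughly [0.1, 0.2, 0.05, 0.15] seconds
```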
  • the position tracking module 22 and the data acquisition and display module/controller 40 poll the location of the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed at time intervals that are more frequent than the expected recording time interval to determine when the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed is at a location which would represent an acceptable spacing, regarding the previously recorded image 400.
  • When the hand-held imaging probe is at the appropriate spacing, the data acquisition and display module/controller 40 will record an image. For example, in FIGS. 16A-16B, if images 400a-400o represent the location of the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed at 0.1sec intervals, then the data acquisition and display module/controller 40 would only record an image at 0.0 seconds 400a (when the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed is at its initial location), another image at 0.1sec 400b (when the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed is 1.0mm past the previously recorded image, or at 1.0mm), another image at 0.3sec 400d (when the hand-held imaging probe 14 to which the plurality of position sensors 32a, 32b and 32c are affixed is 1.0mm past the previously recorded image
  • Some embodiments described provide for the control of the imaging recording process by taking into consideration several factors during the scanning process. For example, these factors include image-to-image spacing, angular position of the probe, and scan-to-scan spacing. This allows the images to be recorded with uneven or non-constant spacing between one or more images. Uneven or non-constant spacing is often the result of variable translation speed as the operator moves the probe across a target region. Variable speed creates images of varying distances from one another. Some embodiments allow the operator to vary the speed of scanning while still ensuring adequate resolution and coverage of the scanned images. This can be accomplished by maintaining a minimum image-to-image distance, minimum scan-to-scan distance, or minimum pixel density.
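  • One way to realize the spacing-controlled recording described above is to poll the probe position at a rate faster than the recording rate and trigger a frame capture only once the probe has moved a minimum distance from the last recorded frame. The sketch below is illustrative only; the polling stream, the 1.0 mm threshold, and the function name are assumptions rather than details of the actual system.

```python
import math

def record_on_spacing(polled_positions, min_spacing_mm=1.0):
    """Simulate the position-polling recording loop: given a stream of polled
    probe positions (x, y, z), record an image only when the probe has moved
    at least min_spacing_mm from the position of the last recorded image."""
    recorded = []
    last = None
    for pos in polled_positions:
        if last is None or math.dist(pos, last) >= min_spacing_mm:
            recorded.append(pos)   # in the real system: grab a frame here
            last = pos
    return recorded

# Probe polled every 0.01 s while translating at a varying speed along Z.
polls = [(0.0, 0.0, 0.02 * i) for i in range(120)]
print(len(record_on_spacing(polls)))   # 3 images recorded over a 2.38 mm sweep
```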
  • the system and method can reduce the review time by calculating which of those images provide useful information and should be displayed during the review process, and which, because they are so closely spaced to the previous or following image, should not be displayed.
  • the system and method may perform calculations using one or more microprocessors to determine which of the recorded images is closest to the desired spacing.
  • the desired spacing is 1.0mm
  • only images 400a, 400b, 400d, 400f, 400j, 400m, and 400o are required to provide the desired resolution.
  • the system can choose, through logic that selects only those images closest to the desired spacing parameters, not to display images 400c, 400e, 400g, 400h, 400i, 400k, 400l, and 400n.
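  • The selection of review images closest to the desired spacing can be sketched as follows. The target-stepping logic and the "nearest image to each target position" criterion below are one possible reading of the selection rule, not the exact algorithm used by the system.

```python
def select_for_review(image_positions_mm, desired_spacing_mm=1.0):
    """Pick, for each multiple of the desired spacing, the recorded image whose
    position is closest to that target; all other images are hidden from review."""
    if not image_positions_mm:
        return []
    selected = []
    target = image_positions_mm[0]
    last_pos = image_positions_mm[-1]
    while target <= last_pos + desired_spacing_mm / 2:
        nearest = min(range(len(image_positions_mm)),
                      key=lambda i: abs(image_positions_mm[i] - target))
        if nearest not in selected:
            selected.append(nearest)
        target += desired_spacing_mm
    return selected

# Unevenly spaced recordings along the sweep direction (mm).
positions = [0.0, 0.4, 1.1, 1.5, 1.9, 2.6, 3.0]
print(select_for_review(positions))   # [0, 2, 4, 6]: frames kept for review
```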
  • the system and method can reduce the review time by calculating how long each of those images should be displayed during the review process, and which, because they are so closely spaced to the previous or following image, should not be displayed.
  • the system and method may perform calculations to determine how long to display each image, depending on the speed at which the reviewer wants to translate, from a virtual point of view, through the tissue.
  • the total review time for this sequence is 0.56sec. If the images were reviewed at 0.1 seconds per frame, as would be suggested from the spacing of images 400a and 400b, then the review time of the entire set of images would be 1.3sec.
  • the probe is at 0.0mm at 0.0sec, it is at 1.0mm at approximately 0.21sec, it is at 2.0mm at approximately 0.3167sec, it is at 3.0mm at approximately 0.5125sec, it is at 4.0mm at 0.8sec, 5.0mm at approximately 0.975sec, 6.0mm at approximately 1.15sec, 7.0mm at 1.3sec, 8.0mm at approximately 1.567sec, 9.0mm at approximately 1.65sec, and 10.0mm at 1.8sec. Although it would take 1.8sec to record these 11 images, they could be replayed in 1.0sec, at 10 frames per second.
  • a redundant image is an image for which all of the information contained within that image are contained in other images, or combinations of other images.
  • the two radial scans 1600 and 1602 of the breast begin at the periphery of the breast 60 and progress to the nipple 64. There is no overlap of scan information on the periphery, but overlap does occur as the scans approach the nipple 64. Any additional images which are recorded within the bounds of the two scans would be redundant. In this example, if a third scan 1608 were obtained between the first two, then, as with the other scans, there would be no overlap of information at the periphery of the breast 60.
  • a single image 1612 were captured within that portion of the scan, there may be some information that is redundant to other images, but there is other information that has not been imaged. Therefore, this image is not entirely redundant. If the operator continues with that scan, however, he or she will scan a region 1610 which has been completely scanned by the other scans 1600 and 1602. If a single image 1614 were captured in this region then all of the information contained therein would be redundant. In this example the region 1610 may contain a plurality of images, all of which are redundant. Significant review time may be saved by simply not reviewing these images. Some embodiments described provide for reducing review time by determining the overlap or redundancy between images in a scanned set of images. The scan set of images may then be modified to remove overlapping or redundant information.
  • Determining redundancy or overlap may be accomplished by any of the methods described above, for example, by determining distances between pixels or comparing pixel density for scanned images.
  • the phrase uniform temporal display or review refers broadly to modifying a scan sequence such that the review time satisfies a predetermined time regardless of the number of images in the scan sequence. In some cases, this is accomplished by allocating dwell times or review times for each image in the scan sequence. For example, a scan sequence having 10 images may have a predetermined review time of 10 seconds for all 10 images.
  • the review time allocated to each image within the 10 image scan sequence can vary from image to image. Some images may be assigned 1.0 second dwell times. Other images may be apportioned 0.75 second dwell times. Such allotment may be a function of the relative spacing between the images. In some embodiments, uniform temporal display or review indicates that the overall total time for review of the scan sequence is substantially the same regardless of the individual dwell times or review times for each discrete image within the scan sequence.
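  • A minimal sketch of uniform temporal review under these assumptions: a fixed total review time is apportioned across the images in proportion to their spacing, so the sum of dwell times equals the predetermined review time regardless of how many images were recorded. The function name and the example values are assumptions for illustration.

```python
def allocate_dwell_times(spacings_mm, total_review_time_s=10.0):
    """Apportion a fixed total review time across a scan sequence so that each
    image's dwell time is proportional to its spacing from the next image,
    keeping the overall review time constant regardless of image count."""
    total_spacing = sum(spacings_mm)
    return [total_review_time_s * s / total_spacing for s in spacings_mm]

# Ten images with uneven spacing; total review time fixed at 10 s.
spacings = [1.0, 0.5, 0.5, 1.5, 1.0, 0.75, 0.75, 1.0, 2.0, 1.0]
times = allocate_dwell_times(spacings)
print([round(t, 2) for t in times], round(sum(times), 2))   # sums to 10.0
```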
  • the phrase uniform spatial display or review refers broadly to modifying a scan sequence such that the relative spacing between discrete images within a scan sequence is substantially the same.
  • a scan sequence may have recorded images at 0mm, 1.0mm, 1.5mm, 2.0mm, 2.2mm, 2.5mm, and 3.0mm.
  • Such a scan sequence may be modified to have uniform spatial display or review by removing images that do not have a preferred relative spacing.
  • the relative spacing may be, for example, 1.0mm image-to-image spacing.
  • the recorded images for review would not include 1.5mm, 2.2mm, and 2.5mm.
  • the modified scan sequence would provide for a uniform spatial display or review.
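  • The worked example above (recordings at 0mm through 3.0mm with a preferred 1.0mm spacing) can be reproduced with the short filter below; the 0.25 mm tolerance and the function name are assumed parameters, not values specified in the text.

```python
def uniform_spatial_subset(positions_mm, preferred_spacing_mm=1.0, tol_mm=0.25):
    """Keep only images whose position falls within tol_mm of the next multiple
    of the preferred spacing; the rest are dropped from the review sequence."""
    kept, next_target = [], positions_mm[0]
    for p in positions_mm:
        if abs(p - next_target) <= tol_mm:
            kept.append(p)
            next_target += preferred_spacing_mm
    return kept

positions = [0.0, 1.0, 1.5, 2.0, 2.2, 2.5, 3.0]
print(uniform_spatial_subset(positions))   # [0.0, 1.0, 2.0, 3.0]
```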
  • the review images may exhibit uniform spatial -temporal display or review having both uniform spatial and uniform temporal characteristics or some combination within the review scan sequence images.
  • Some embodiments provide for methods, systems, or devices that allow the reviewer to mark or otherwise annotate the images for review.
  • the annotation or marking indicates a location on the scanned image that may need to be reviewed further.
  • the marked section in the image may indicate the site of a suspicious lesion or structure, e.g., potential tumor.
  • Another embodiment of the present invention is seen in FIG 13 wherein optical recognition is used for continuously detecting the position and orientation of a hand-held ultrasound probe assembly 230 in place of the use of electromagnetic radiofrequency position sensors as described in the preceding specification related to FIGS 1 through 9 and FIG 11.
  • the optical recognition based position and orientation detection method, apparatus and system is used to accurately determine the position of each two-dimensional ultrasound scan image and, thereby, the temporal position of each pixel within each two-dimensional ultrasound scan image.
  • a first subsystem is the diagnostic ultrasound system 12, which includes ultrasound monitor console 18, display 17, hand-held ultrasound probe 214 and connecting cable 16.
  • a second system (referred to hereinafter as the "Optically Based Optically Based Ultrasound Scan Completeness Auditing System"), is represented in general at 218.
  • the Optically Based Ultrasound Scan Completeness Auditing System 218 comprises a data acquisition and display module/controller 240 including microcomputer/storage/DVD ROM recording unit 241, display 213 and foot pedal control 212. Foot pedal 212 is connected to microcomputer/storage/DVD ROM recording unit 241 via cable 215 and removably attachable connector 13.
  • Position-tracking system 220 which includes position tracking module 222 and two or more, preferably three or more cameras 235 (e.g., infrared cameras).
  • the Optically Based Ultrasound Scan Completeness Auditing System 210 also comprises two or more optically unique (i.e., uniquely identifiable) position markers 232 affixed to the hand-held ultrasound probe 214. Said two or more, preferably three or more, cameras may operate in the visible spectrum or infrared spectrum.
  • infrared cameras 235a-235d are shown at predetermined fixed positions whose fields of view include the hand-held ultrasound probe assembly 230 including six optically unique position markers with three position markers 232a-232c visible on the front side of hand-held ultrasound probe assembly 230 (232d-232f on back side of hand-held ultrasound probe assembly 230 but not shown).
  • Said infrared cameras are removably connected to position tracking module 222 at connectors 236a-236d via cables 243a-243d.
  • Said optically based position detection method, system and apparatus is capable of obtaining 100 position measurements per second at a camera-to-object distance of up to 3 meters with position accuracies to within less than 1 millimeter. See, for example, an off-the-shelf optically based position detection device, Spotlight Tracker, manufactured by Ascension Technology Corporation, Burlington, Vermont.
  • diagnostic ultrasound system 12 is connected to data acquisition and display module/controller 240 via data transmission cable 46 to enable each frame of ultrasound data (typically containing about 10 million pixels per frame) to be received by the microcomputer/storage/DVD ROM recording unit 241 at the end of each individual scan, which is completed about every 0.1 to 0.02 seconds.
  • Cable 248 is removably attached to microcomputer/storage/DVD ROM recording unit 241 of data acquisition and display module/controller 240 with removably attachable connector 245 and is removably connected to diagnostic ultrasound system 12 with connector 47.
  • the successive scans associated with the diagnostic ultrasound procedure are stored and subjected to computational algorithms to assess completeness of the diagnostic ultrasound scanning procedure as described in greater detail in the specifications which follow.
  • hand-held ultrasound probe position tracking module 222 is connected to data acquisition and display module/controller 240 via data transmission cable 248 wherein cable 248 is removably attached to microcomputer/storage/DVD ROM recording unit 241 of data acquisition and display module/control 240 with connector 245 and is removably connected to position tracking module with connector 249.
  • Hand-held ultrasound probe assembly 230 seen in FIG. 1 includes, by way of example, six optically unique position markers 232a-232c (232d-232f on back side of hand-held ultrasound probe assembly 230 and not shown), which are affixed to ultrasound hand-held probe 214.
  • infrared cameras 235a-235d are positioned at known locations around the perimeter and in unobstructed view of the hand-held ultrasound probe assembly 230.
  • Optical recognition and vectoring software contained within the position-tracking module 222 provides the exact position and orientation of the hand-held ultrasound probe assembly 230, preferably at time intervals of 0.05 seconds and more preferably at time intervals of 0.01 seconds.
  • optically unique position markers 232a-232c (232d-232f on back side of hand-held ultrasound probe assembly 230 and not shown) are affixed to the hand-held ultrasound probe 214 as described now in greater detail.
  • These optical position markers can be differentiated from each other by the geometry of the reflective pattern, the reflective wavelength, or a combination thereof.
  • the optical markers can be affixed to the probe assembly 214 by means of an adhesive bond.
  • a hand-held ultrasound probe 214 is enclosed within first and second "clamshell" type support members 242 and 244, respectively,
  • three optically unique position markers 232a-232c are affixed to the exterior surface of first support member 242.
  • three optically unique position markers 232d-232f (not shown) are affixed to the exterior surface of second support member 244.
  • the number of sensors is only limited by the ability to generate optically unique geometries and colors and the amount of surface area on the probe.
  • three cameras 271a-271c individually locate three markers 232b, 232h, 232i. Since the locations of the markers 232b, 232h, 232i relative to the geometry of the probe assembly 230 are known, the location and calculated orientation of the probe assembly 230 can be determined.
  • the location and calculated orientation of the probe assembly 230 can be determined even if one or more or all of the original markers 232b, 232h, 232i are obscured from the line-of-sight of the cameras 271a-271c. As depicted in FIG 14C, this may be accomplished as the cameras 271a-271c can locate an additional marker such as 232j, 232k for each marker that is obscured 232b, 232i. In some embodiments, the locations of three markers 232h, 232j, 232k are known and, since the locations of these three markers 232h, 232j, 232k are also known relative to the probe assembly 230, the location and the orientation of the probe assembly 230 may be determined. In other embodiments, any number or subset of a plurality of sensors/markers may be used to determine location and orientation of the probe assembly.
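  • Recovering the probe pose from any three or more identified markers whose positions are known in the probe's own coordinate frame is a standard rigid-registration problem. The sketch below uses the SVD-based Kabsch method with hypothetical marker coordinates; it illustrates the principle only and is not the tracking module's actual software.

```python
import numpy as np

def probe_pose_from_markers(markers_probe_frame, markers_world):
    """Estimate the rigid transform (R, t) that maps marker coordinates defined
    in the probe's own frame onto their measured world positions (Kabsch/SVD).
    Any subset of three or more non-collinear, identified markers suffices."""
    P = np.asarray(markers_probe_frame, dtype=float)
    W = np.asarray(markers_world, dtype=float)
    Pc, Wc = P - P.mean(axis=0), W - W.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Wc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = W.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Hypothetical marker layout on the probe shell (mm) and their tracked positions.
probe_markers = [[0, 0, 0], [30, 0, 0], [0, 50, 0]]
world_markers = [[100, 20, 5], [100, 50, 5], [50, 20, 5]]
R, t = probe_pose_from_markers(probe_markers, world_markers)
print(np.round(R, 3))   # recovered rotation of the probe
print(np.round(t, 3))   # recovered translation of the probe
```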
  • Said first support member 242 includes the aforementioned three optically unique position markers 232a-232c.
  • First support member 242 also incorporates extension ears 236a and 236b, each with a drilled hole to enable secure mechanical attachment to second support member 244.
  • Said second support member 244 likewise incorporates extension ears 238a and 238b, each with a drilled hole which matches the drilled holes in the first support member to enable secure mechanical attachment to first support member 242 using screws 239a and 239b, respectively.
  • First and second support members may be manufactured using metal, metal alloy or, preferably, a rigid plastic material.
  • the interior contours and dimensions of the first and second support members 242 and 244 are designed to match the particular contour and dimensions of the off-the-shelf hand-held ultrasound probe being instrumented with the optically unique position markers 232a-232c. Accordingly, the contours and dimensions of the first and second support members 242 and 244 will vary according to the hand-held ultrasound probe design. The exact location of the optically unique position markers 232a-232c relative to the ultrasound transducer array at the end face of the hand-held ultrasound probe (not shown) will accordingly be known for each set of first and second support members since they are designed to attach to and operate in conjunction with a specific hand-held ultrasound probe.
  • the first and second support members 242 and 244 are sized to correspond to the particular contour and dimensions of a specific hand-held ultrasonic probe design.
  • the inner dimensions of said first and second support members 242 and 244 are designed to closely match the outer dimensions of the hand-held ultrasound probe 214.
  • the wall thickness of the injection molded plastic support members 242 and 244 is preferably in the range from 0.05 to 0.10 inch.
  • a position sensor may not be a separate sensor added to the imaging device but may be a geometric or landmark feature of the imaging device, for example, the corners of the probe.
  • the optical, infrared, or ultraviolet cameras could capture an image of the probe and interpret the landmark feature as a unique position on the imaging device.
  • sensors may not need to be added to the imaging device. Rather, location and motion detection systems can be used to track the position of the imaging device by using geometric or landmark features of the imaging device. For example, a location system may track the corners or edges of an ultrasound imaging probe while it is scanned across a target tissue.
  • either the electromagnetic radiofrequency-based method, apparatus and system or the optical recognition- based method, apparatus and system can be used to detect the position of the hand-held ultrasound probe at all time points corresponding to the time of any two-dimensional ultrasound scan image.
  • This position and orientation data is used to compute the maximum distance between sequential two-dimensional ultrasound scan images to determine if predetermined maximum spacing limits are exceeded or predetermined pixel density limits are not achieved. If any predetermined requirements are not achieved, the ultrasound screening operator is alerted with a visual display identifying that the scan just completed [a] was performed with an excessive spacing relative to the previous scan in the sequence and/or [b] was performed at a rate of translation and/or rotation that was too fast to meet pixel density or spacing requirements.
  • Another embodiment of the present invention is shown in FIG. 18 where optical recognition is used for continuously detecting the position and orientation of a hand-held ultrasound probe 214. This system may be used as an alternative to the use of the electromagnetic radiofrequency position sensors described above.
  • the optical recognition based position and orientation detection method, apparatus and system is used to accurately determine the position of each two-dimensional ultrasound scan image and, thereby, the temporal position of each pixel within each two-dimensional ultrasound scan image.
  • a first subsystem is the diagnostic ultrasound system 12, which includes ultrasound monitor console 18, display 17, hand-held ultrasound probe 214 and connecting cable 16.
  • a second system (referred to hereinafter as the "Optically Based Optically Based Ultrasound Scan Completeness Auditing System"), is represented in general at 218.
  • the Optically Based Ultrasound Scan Completeness Auditing System 218 comprises a data acquisition and display module/controller 240 including microcomputer/storage/DVD ROM recording unit 241, display 213 and foot pedal control 212. Foot pedal 212 is connected to microcomputer/storage/DVD ROM recording unit 241 via cable 215 and removably attachable connector 13.
  • Position-tracking system 220 which includes position tracking module 222 and two or more, preferably three or more cameras 1235a-d (e.g., optical cameras, infrared cameras, or ultraviolet cameras) affixed to the hand-held ultrasound probe 214.
  • the Optically Based Ultrasound Scan Completeness Auditing System 210 also comprises two or more optically unique (i.e., uniquely identifiable) position markers 1232a-d affixed to locations in the surrounding environment. Said two or more, preferably three or more, cameras 1235a-d may operate in the visible spectrum or infrared spectrum or ultraviolet spectrum.
  • infrared cameras 1235a-1235d are shown at predetermined fixed positions on the hand-held ultrasound probe assembly 230, whose fields of view include four optically unique position markers 1232a-1232d visible at various locations throughout the room.
  • Said infrared cameras are removably connected to position tracking module 222 at connectors 1236a-1236d via cables 1243a-1243d.
  • Said optically based position detection method, system and apparatus is capable of obtaining 100 position measurements per second at a camera-to-object distance of up to 3 meters with position accuracies to within less than 1 millimeter. See, for example, an off-the-shelf optically based position detection device, Spotlight Tracker, manufactured by Ascension Technology Corporation, Burlington, Vermont.
  • diagnostic ultrasound system 12 is connected to data acquisition and display module/controller 240 via data transmission cable 46 to enable each frame of ultrasound data (typically containing about 10 million pixels per frame) to be received by the microcomputer/storage/DVD ROM recording unit 241 at the end of each individual scan, which is completed about every 0.1 to 0.02 seconds.
  • Cable 248 is removably attached to microcomputer/storage/DVD ROM recording unit 241 of data acquisition and display module/controller 240 with removably attachable connector 245 and is removably connected to diagnostic ultrasound system 12 with connector 47.
  • the successive scans associated with the diagnostic ultrasound procedure are stored and subjected to computational algorithms to assess completeness of the diagnostic ultrasound scanning procedure as described in greater detail in the specifications which follow.
  • hand-held ultrasound probe position tracking module 222 is connected to data acquisition and display module/controller 240 via data transmission cable 248 wherein cable 248 is removably attached to microcomputer/storage/DVD ROM recording unit 241 of data acquisition and display module/control 240 with connector 245 and is removably connected to position tracking module with connector 249.
  • Hand-held ultrasound probe assembly 230 seen in FIG. 1 includes, by way of example, four infrared cameras 1235a-1235d which are affixed to ultrasound hand-held probe 214. As seen in the example arrangement shown in FIG 18, four optically unique position markers 1232a-1232d are positioned at known locations in the room and in unobstructed view of the hand-held ultrasound probe assembly 230.
  • Optical recognition and vectoring software contained within the position-tracking module 222 provides the exact position and orientation of the hand-held ultrasound probe assembly 230, preferably at time intervals of 0.05 seconds and more preferably at time intervals of 0.01 seconds.
  • Images may be retrieved and stored in a variety of manners.
  • the microprocessor/storage/DVD ROM recording unit 41 of the data acquisition and display module/controller 40 could be a standard computer with a video frame grabber card.
  • the data transmission cable 46 could connect to the video output of the hand-held imaging system 12 and record discrete images in a wide variety of formats including, but not restricted to JPG, BMP, PNG.
  • Each image would be stored with an information header containing, but not restricted to, the location of the image at the time it was recorded.
  • the individual images could be stored in sets of scan tracks, and the scan tracks could be stored as a complete examination, or the images could be stored using another data management protocol.
  • the resulting set of images could be comprised of several thousand individual, discrete images.
  • the set of images may be stored as a set, along with the location information and other information, such as patient identification, etc., to a portable storage device 9, such as a DVD ROM, portable hard drive, network hard drive, cloud-based memory, etc. These data may be viewed on the data acquisition display module/controller 40, or an external computer equipped with software designed to review the image data.
  • a portable storage device 9 such as a DVD ROM, portable hard drive, network hard drive, cloud-based memory, etc.
  • an optical image projector can be included in either the Ultrasound Scan Completeness Auditing System or the Optically Based Ultrasound Scan Completeness Auditing System to superimpose optical information on the surface of the targeted tissue (e.g., the human female breast).
  • Said optical information may, by way of example, include the ultrasound scan path(s) that need to be repeated due to excessive inter-scan distances, inadequate overlap and/or excessive scanning translation speed and/or rate of rotation. Said optical information can thereby guide the conduct of additional two- dimensional ultrasound scans to overcome any determined deficiencies.
  • a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/- 1% of the stated value (or range of values), +/- 2% of the stated value (or range of values), +/- 5% of the stated value (or range of values), +/- 10% of the stated value (or range of values), etc. Any numerical range recited herein is intended to include all sub-ranges subsumed therein.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)
EP14740355.4A 2013-01-17 2014-01-16 Verfahren, vorrichtung und system zur vollständigen untersuchung von gewebeproben mit tragbaren bildgebungsvorrichtungen mit montierten kameras Withdrawn EP2945542A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361753832P 2013-01-17 2013-01-17
PCT/US2014/011781 WO2014113530A1 (en) 2013-01-17 2014-01-16 Method, apparatus and system for complete examination of tissue with hand-held imaging devices having mounted cameras

Publications (1)

Publication Number Publication Date
EP2945542A1 true EP2945542A1 (de) 2015-11-25

Family

ID=51210052

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14740355.4A Withdrawn EP2945542A1 (de) 2013-01-17 2014-01-16 Verfahren, vorrichtung und system zur vollständigen untersuchung von gewebeproben mit tragbaren bildgebungsvorrichtungen mit montierten kameras

Country Status (3)

Country Link
EP (1) EP2945542A1 (de)
JP (1) JP2016506781A (de)
WO (1) WO2014113530A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104095653B (zh) * 2014-07-25 2016-07-06 上海理工大学 一种自由臂三维超声成像系统及成像方法
US10045758B2 (en) 2014-11-26 2018-08-14 Visura Technologies, LLC Apparatus, systems and methods for proper transesophageal echocardiography probe positioning by using camera for ultrasound imaging
US10265046B2 (en) 2014-11-26 2019-04-23 Visura Technologies, Inc. Apparatus, system and methods for proper transesophageal echocardiography probe positioning by using camera for ultrasound imaging
JP7104243B2 (ja) 2019-06-06 2022-07-20 富士フイルム株式会社 3次元超音波撮像支援装置、方法、及びプログラム
CN112704514B (zh) * 2020-12-24 2021-11-02 重庆海扶医疗科技股份有限公司 病灶定位方法及病灶定位系统
JPWO2022196090A1 (de) * 2021-03-17 2022-09-22

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69736549T2 (de) * 1996-02-29 2007-08-23 Acuson Corp., Mountain View System, verfahren und wandler zum ausrichten mehrerer ultraschallbilder
US5873830A (en) * 1997-08-22 1999-02-23 Acuson Corporation Ultrasound imaging system and method for improving resolution and operation
JP5146692B2 (ja) * 2006-03-30 2013-02-20 アクティビューズ リミテッド 光学的位置測定ならびに剛性または半可撓性の針の標的への誘導のためのシステム
US20080021317A1 (en) * 2006-07-24 2008-01-24 Siemens Medical Solutions Usa, Inc. Ultrasound medical imaging with robotic assistance for volume imaging
US7940970B2 (en) * 2006-10-25 2011-05-10 Rcadia Medical Imaging, Ltd Method and system for automatic quality control used in computerized analysis of CT angiography
US8715188B2 (en) * 2007-07-12 2014-05-06 Siemens Medical Solutions Usa, Inc. Medical diagnostic ultrasound scanning and video synchronization
US8135198B2 (en) * 2007-08-08 2012-03-13 Resonant Medical, Inc. Systems and methods for constructing images
US9895135B2 (en) * 2009-05-20 2018-02-20 Analogic Canada Corporation Freehand ultrasound imaging systems and methods providing position quality feedback

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014113530A1 *

Also Published As

Publication number Publication date
WO2014113530A1 (en) 2014-07-24
JP2016506781A (ja) 2016-03-07

Similar Documents

Publication Publication Date Title
US20180132722A1 (en) Method, apparatus and system for complete examination of tissue with hand-held imaging devices
US20160100821A1 (en) Hand-held imaging devices with position and/or orientation sensors for complete examination of tissue
US20150366535A1 (en) Method, apparatus and system for complete examination of tissue with hand-held imaging devices having mounted cameras
US20230103969A1 (en) Systems and methods for correlating regions of interest in multiple imaging modalities
US20050089205A1 (en) Systems and methods for viewing an abnormality in different kinds of images
EP2945542A1 (de) Verfahren, vorrichtung und system zur vollständigen untersuchung von gewebeproben mit tragbaren bildgebungsvorrichtungen mit montierten kameras
JP6974354B2 (ja) 同期された表面および内部腫瘍検出
JP2014504918A (ja) 画像診断に使用するために複数の異なる画像処理システムからの3次元画像データを重ね合わせるシステムおよび方法
JP7451802B2 (ja) 乳房マッピングおよび異常定位
US20230098305A1 (en) Systems and methods to produce tissue imaging biomarkers
CN115348837A (zh) 用于在多种成像模态中识别关注区域的系统和方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150814

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160802