WO2014165662A2 - Portable electronic devices with integrated imaging capabilities - Google Patents

Portable electronic devices with integrated imaging capabilities

Info

Publication number
WO2014165662A2
WO2014165662A2 (PCT/US2014/032803)
Authority
WO
WIPO (PCT)
Prior art keywords
portable electronic
electronic device
image
imaging
target
Prior art date
Application number
PCT/US2014/032803
Other languages
French (fr)
Other versions
WO2014165662A3 (en)
Inventor
Noah Zachary ROTHBERG
Original Assignee
Butterfly Network, Inc.
Priority date
Filing date
Publication date
Application filed by Butterfly Network, Inc. filed Critical Butterfly Network, Inc.
Priority to CN201480031564.XA priority Critical patent/CN105263419A/en
Priority to EP14725300.9A priority patent/EP2981215A2/en
Priority to CA2908631A priority patent/CA2908631C/en
Priority to KR1020157031515A priority patent/KR20150145236A/en
Priority to JP2016506609A priority patent/JP6786384B2/en
Publication of WO2014165662A2 publication Critical patent/WO2014165662A2/en
Publication of WO2014165662A3 publication Critical patent/WO2014165662A3/en

Classifications

    • G06T 7/0012: Biomedical image inspection (G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G06T 7/40: Analysis of texture
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/90: Determination of colour characteristics
    • G06T 19/006: Mixed reality (G06T 19/00 Manipulating 3D models or images for computer graphics)
    • A61B 8/4411: Device being modular (A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device)
    • A61B 8/4427: Device being portable or laptop-like
    • A61B 8/4455: Features of the external shape of the probe, e.g. ergonomic aspects
    • A61B 8/4477: Constructional features using several separate ultrasound transducers or probes
    • A61B 8/4494: Characterised by the arrangement of the transducer elements (A61B 8/4483 Characterised by features of the ultrasound transducer)
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • G01S 15/899: Combination of imaging systems with ancillary equipment (G01S 15/89 Sonar systems for mapping or imaging; G01S 15/8906 Short-range pulse-echo imaging systems)
    • H04N 5/30: Transforming light or analogous information into electric information (H04N 5/00 Details of television systems)
    • G06T 2207/10136: 3D ultrasound image (G06T 2207/10 Image acquisition modality; G06T 2207/10132 Ultrasound image)
    • G06T 2207/30008: Bone (G06T 2207/30004 Biomedical image processing)
    • G06T 2207/30101: Blood vessel; Artery; Vein; Vascular
    • G06T 2219/016: Exploded view (G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics)

Definitions

  • the present disclosure relates generally to imaging devices and methods (e.g., ultrasound imaging devices and methods).
  • Imaging technologies are used at various stages of medical care. For example, imaging technologies are used to non-invasively diagnose patients, to monitor the performance of medical (e.g., surgical) procedures, and/or to monitor post-treatment progress or recovery.
  • In some embodiments, a portable electronic device (e.g., a smart phone and/or tablet computer) generates and displays an image (e.g., a 2-dimensional or 3-dimensional image) that acts as a window into a body.
  • the window and corresponding image displayed on a display screen of the portable electronic device change as the portable electronic device is moved over various portions of the body (e.g., abdomen, thorax, etc.).
  • the image displayed by the portable electronic device may identify, for example, organs, arteries, veins, tissues, bone, and/or other bodily contents or parts.
  • the image may be presented in 3 dimensions such that it appears to the viewer as if the viewer is looking into the body, or as if the body parts have been projected up (e.g., exploded view) from the body.
  • the present disclosure provides numerous embodiments of systems, apparatus, computer readable media, and methods for providing imaging functionality using a portable electronic device, such as, for example, a smart phone or a tablet computer.
  • the portable electronic device is configured to generate and display an image of what appears to be an exploded view (e.g., 3-dimensional, upwardly projected image) of an object or its constituent parts.
  • movement of the portable electronic device results in the rendering of a different internal image of the target (e.g., different portion(s) of a human body).
  • a portable electronic device includes a processor configured to generate an image (e.g., ultrasound image) of an internal feature of a target when the device is positioned at an external surface of the target, and a display configured to display the image.
  • In some embodiments according to another aspect of the present disclosure, a portable ultrasound device includes multiple ultrasound elements configured to receive ultrasound radiation reflected by or passing through a target when the ultrasound device is pointed at the target.
  • the portable ultrasound device also includes a display configured to display an image of an internal feature of the target based at least in part on the ultrasound radiation received by the plurality of ultrasound elements.
  • In some embodiments according to another aspect of the present disclosure, a method includes pointing a portable electronic device at an external surface of a subject, and viewing, on a display of the portable electronic device, an image of an internal feature of the subject while pointing the portable electronic device at the external surface of the subject.
  • In some embodiments, the portable electronic device includes a radiation sensor, and the method further includes receiving, with the radiation sensor, radiation reflected by or passing through the subject, and creating the image of the internal feature based at least in part on the radiation received by the radiation sensor.
  • a portable electronic device renders within a window on a display of the device an image (e.g., 3-dimensional image) of an inside of a human body when the device is directed at the body (e.g., within about one meter or less of the body).
  • the image changes to reflect additional body parts as the device is moved relative to the body.
  • the portable electronic device renders the image by processing radiation signals, received by a radiation sensor, that are reflected by or pass through the human body.
  • the portable electronic device is positioned within approximately one meter from the body (e.g., within 0.75 to 1.25 meters from the body).
  • In some embodiments according to another aspect of the present disclosure, a portable electronic device includes multiple imaging elements configured to receive radiation signals transmitted through or reflected by an imaging target, and an imaging interface.
  • the portable electronic device also includes one or more processors configured to receive one or more sensing signals from at least one of the plurality of imaging elements, and to render an image of the imaging target for display through the imaging interface based at least in part on the one or more sensing signals.
  • In some embodiments, signals from each of the imaging elements are separately processed by a corresponding imaging processor, such that combining the processed signals of the imaging elements can be utilized to render images having higher resolution and/or higher frame rate than images rendered based on a single imaging element (a sketch of one such combining scheme follows below).
  • the combined processed signals are used to generate a three-dimensional image of the target.
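The disclosure does not fix a particular reconstruction algorithm; delay-and-sum beamforming is one common way to combine per-element signals into a higher-resolution (and potentially 3-D) image of the kind described in the bullets above. The following sketch is illustrative only; all function names and parameters are assumptions, not taken from the patent.

```python
import numpy as np

def delay_and_sum(element_signals, element_x, fs, c, focus_points):
    """Combine per-element echo traces into image values by delay-and-sum
    beamforming (a common method, assumed here for illustration).

    element_signals: (n_elements, n_samples) received traces, one per element
    element_x:       (n_elements,) lateral element positions, meters
    fs:              sampling rate, Hz
    c:               speed of sound, m/s (~1540 in soft tissue)
    focus_points:    (n_points, 2) array of (x, z) image points, meters
    """
    n_elem, n_samp = element_signals.shape
    image = np.zeros(len(focus_points))
    for i, (x, z) in enumerate(focus_points):
        # Round-trip distance from each element to the focus point and back.
        dist = np.sqrt((element_x - x) ** 2 + z ** 2)
        idx = np.round(2.0 * dist / c * fs).astype(int)  # delay in samples
        valid = np.flatnonzero(idx < n_samp)
        # Coherent sum: echoes originating at the focus point add in phase.
        image[i] = element_signals[valid, idx[valid]].sum()
    return image
```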
  • the portable electronic device is a handheld portable electronic device, such as a cellular phone or a tablet computer.
  • the processing circuitry may be fabricated on the same semiconductor chip as the ultrasound elements.
  • the portable electronic device may include one or more processors for identifying structure(s) identified in the image based at least in part on stored data (e.g., data stored in random access memory or other storage device of portable electronic device).
  • data stored within device may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by personal electronic device to identify and/or predict the type(s) of structures depicted in an image rendered by device.
  • data stored within device may identify characteristics of particular disease(s) such as cancer or other abnormalities (e.g., based on image data corresponding to previously stored characteristics of particular structures associated with a disease) for use by personal electronic device to identify and/or predict the type(s) of structures depicted in an image rendered by device.
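As a purely hypothetical illustration of the stored-characteristics matching described in the bullets above, the sketch below labels a segmented image region by nearest neighbor against stored shape/texture descriptors. The feature set and reference values are invented placeholders, not data from the patent.

```python
import numpy as np

# Invented reference descriptors: (mean echogenicity, texture variance,
# aspect ratio) per structure type. A real system would store far richer
# characteristics (shapes, colors, textures, tissue properties, etc.).
REFERENCE_STRUCTURES = {
    "vessel": np.array([0.15, 0.02, 4.0]),
    "bone":   np.array([0.90, 0.10, 2.5]),
    "organ":  np.array([0.45, 0.05, 1.2]),
}

def identify_structure(region):
    """Label a segmented 2-D image region by its nearest stored descriptor."""
    desc = np.array([
        region.mean(),                          # echogenicity proxy
        region.var(),                           # texture proxy
        max(region.shape) / min(region.shape),  # crude shape proxy
    ])
    return min(REFERENCE_STRUCTURES,
               key=lambda k: np.linalg.norm(desc - REFERENCE_STRUCTURES[k]))
```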
  • FIG. 1A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a human body or a portion of a human body according to some embodiments of the present disclosure.
  • FIG. 1B illustrates a three-dimensional internal image of a portion of a human body that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 2A illustrates a front view of portable electronic device including an imaging interface according to some embodiments of the present disclosure.
  • FIG. 2B illustrates a back view of portable electronic device including imaging elements according to some embodiments of the present disclosure.
  • FIG. 3 illustrates a transmissive imaging system and method according to some embodiments of the present disclosure.
  • FIG. 4 illustrates a reflective imaging system and method according to some embodiments of the present disclosure.
  • FIG. 5 illustrates a transmissive and/or reflective imaging system and method according to some embodiments of the present disclosure.
  • FIG. 6A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments of the present disclosure.
  • FIG. 6B illustrates a three-dimensional internal image of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 6C illustrates a three-dimensional internal image of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 7A illustrates a front view of a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 7B illustrates a back view of a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 7C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 7D illustrates a back view of a case including imaging elements for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8A illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8B illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8D illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8E illustrates a modular unit including an imaging circuit according to some embodiments of the present disclosure.
  • FIG. 9A illustrates how, in some embodiments, a single transducer element may fit within a larger transducer array.
  • FIGs. 9B-F show five different examples of how a given transducer element within an array might be configured in some embodiments.
  • FIG. 10A shows an illustrative example of a monolithic ultrasound device according to some embodiments.
  • FIG. 10B is a block diagram illustrating how, in some embodiments, the TX control circuit and the RX control circuit for a given transducer element may be used either to energize the element to emit an ultrasonic pulse, or to receive and process a signal from the element representing an ultrasonic pulse sensed by it.
  • FIG. 11 illustrates an example technique for biasing transducer elements in an array or other arrangement.
  • FIGs. 12 and 13 show illustrative examples of components that may be included within the analog processing block and the digital processing block of the RX control circuit shown in FIG. 10B.
  • FIGs. 14A-14K illustrate a process sequence for fabricating a CMOS ultrasonic transducer (CUT) having a membrane formed above a cavity in a CMOS wafer, according to a non-limiting embodiment of the present application.
  • The present disclosure describes a portable electronic device that includes an imaging interface and one or more imaging elements.
  • the portable electronic device may be a cellular phone, personal digital assistant, smart phone, tablet device, digital camera, laptop computer, or the like.
  • An image may be generated and/or rendered utilizing the portable electronic device.
  • the portable electronic device may be utilized to simulate a "window" into an imaging target, such as a human body or portion of the body.
  • the simulated "window" may provide a view of the inside of a human body or portion of the body, including organs, arteries, veins, tissues, bone, and/or other bodily contents or parts.
  • In some embodiments, an image (e.g., an ultrasound or sonographic image) of an area of the target may be displayed as a real-time continuous or substantially real-time continuous image (e.g., at 10 frames/second, 20 frames/second, 25 frames/second, 30 frames/second, or the like), such that internal movement of the target object (e.g., expansion and/or contraction of organs) can be observed in a substantially real-time updated image of the area.
  • the portable electronic devices and methods described herein may include, be coupled to (e.g., via a suitable communications connection or port such as a USB link), or otherwise utilize one or more radiation sources, sensors, and/or transducers (e.g., array(s) of ultrasound transducers), front-end processing circuitry and associated processing techniques, and/or image reconstruction devices and/or methods, in order to generate and/or render images to a user according to the non-limiting embodiments described in detail throughout the present disclosure.
  • one or more of the devices described in Figures 1A-8E herein may include or be coupled to one or more ultrasound imaging elements (e.g., one or more arrays of ultrasound sources, sensors, and/or transducers).
  • One or more computers or processors within the portable electronic device may perform image analysis and/or image rendering based at least in part on radiation signals received by an imaging device.
  • FIG. 1A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a human body or a portion of a human body 106 according to some embodiments.
  • FIG. 1B illustrates a three-dimensional internal image 110 of a portion of a human body that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
  • the portable electronic device 100 may be positioned in an area near (e.g., in contact with the surface of or within about one meter from the surface of) a portion of a human body that is to be imaged and/or analyzed.
  • the portable electronic device 100 may include imaging elements 104 that are configured to transmit and/or receive radiation signals.
  • An internal image 110 as shown in FIG. IB may be generated by the portable electronic device 100.
  • the internal image 110 may be a three-dimensional internal image of a portion of the human body that appears to a viewer 117 to project upward from a surface of the portable electronic device 100, giving the viewer the perception of a viewing window into the underlying body.
  • the portable electronic device 100 may provide a window into the internal areas of the human body that are below the surface, as described in greater detail below.
  • FIG. 2A illustrates a front view of portable electronic device 100 including an imaging interface 102 according to some embodiments of the present disclosure.
  • the imaging interface 102 of the portable electronic device 100 may include a display that is configured to output a two-dimensional (2-D) or three-dimensional (3-D) image of an imaging target.
  • the imaging interface 102 is interactive and is capable of receiving user input, for example through a touch-screen.
  • An image that is displayed via the imaging interface 102 may be adjusted based on the received inputs, for example, to adjust zoom level, centering position, level of detail, depth of an underlying object to be imaged, resolution, brightness, color and/or the like of the image.
  • imaging interface 102 may be configured to allow a user to selectively traverse various layers and imaging depths of the underlying object using, for example, the touch screen.
  • Portable electronic device 100 may render a three-dimensional image of the imaging target using any suitable method or combination of methods (e.g., anaglyph, polarization, eclipse, interference filtering, and/or autostereoscopy).
  • the imaging interface 102 includes a circular polarizer and/or a linear polarizer such that a viewer having polarizing filtering spectacles can view a three-dimensional image.
  • the imaging interface 102 is configured to display alternating left and right images such that a viewer having spectacles with shutters that alternate in conjunction with the displayed image views the image as a three-dimensional image.
  • the imaging interface 102 may utilize an autostereoscopy method such that 3-D spectacles are not necessary for use by a viewer to view the three-dimensional image.
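Of the stereoscopic presentation methods listed above, anaglyph composition is the simplest to sketch. The following is a minimal illustration assuming left- and right-eye renders already exist; the device could equally use polarization, shutter, or autostereoscopic display, as the bullets above note.

```python
import numpy as np

def anaglyph(left, right):
    """Compose a red/cyan anaglyph frame from left- and right-eye renders.

    left, right: (H, W) grayscale images in [0, 1]. Viewed through red/cyan
    spectacles, each eye sees only its own render, producing the depth
    illusion (one of several methods the disclosure lists).
    """
    frame = np.zeros(left.shape + (3,))
    frame[..., 0] = left    # red channel -> left eye
    frame[..., 1] = right   # green \
    frame[..., 2] = right   # blue  / cyan -> right eye
    return frame
```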
  • portable electronic device 100 may display information (e.g., text and/or graphics) in addition to (e.g., graphically overlaid on top of or adjacent to) an image of a targeted object, such as, for example, text and/or graphics identifying the structure(s) identified in the image (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts).
  • portable electronic device 100 may include one or more processors for identifying structure(s) identified in the image based at least in part on stored data (e.g., data stored in random access memory or other storage device of portable electronic device 100).
  • data stored within device 100 may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100.
  • data stored within device 100 may identify characteristics of particular disease(s) such as cancer or other abnormalities for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100.
  • the image, text, graphics, and/or other information displayed on the imaging interface 102 may be adjusted through user interaction with one or more inputs (e.g., touch screen, buttons, touch-sensitive areas, or the like) of the portable electronic device 100.
  • FIG. 2B illustrates a back view of portable electronic device 100 including imaging elements 104 according to some embodiments of the present disclosure.
  • the imaging elements 104 may be configured as sources (emitters) and/or sensors of ultrasound radiation and/or other radiation.
  • the imaging elements 104 may be of substantially the same size and/or may be arranged in an array as shown in FIG. 2B.
  • the imaging elements 104 may be of different sizes and/or arranged in an irregular or scattered configuration.
  • one or more (e.g., all) of the imaging elements 104 may be arranged in the same plane. In other embodiments, at least some of the imaging elements may be arranged in at least two different planes.
  • all of the imaging elements 104 included in the portable electronic device 100 may be either emitting elements or sensing elements. In some embodiments, the imaging elements 104 may include both emitting elements and sensing elements.
  • The embodiment shown in FIG. 2B includes a 4x6 array of imaging elements 104 by way of illustration only, and is not intended to be limiting. In other embodiments, any other suitable number of imaging elements may be provided (e.g., 10, 20, 30, 40, 50, 100, 200, 500, 1000, or any number in between, or more) and may be arranged in any suitable configuration.
  • the imaging elements 104 may be integrated within a circuit board (e.g., a printed circuit board) that includes, for example, processing (e.g., image processing) components of the portable electronic device 100.
  • the imaging elements 104 may be provided on a separate circuit board or layer of a circuit board than the processing components of the portable electronic device 100, and may be in communication with the processing circuitry through a suitable communications link (e.g., an internal bus, USB link, or other port).
  • the imaging elements 104 may be microfabricated on a semiconductor chip having the processing circuitry.
  • the imaging elements 104 may include their own dedicated processing circuitry, such as a graphic processing unit (GPU), digital signal processor (DSP), and/or central processing unit (CPU), and/or may utilize processing circuitry of the portable electronic device 100.
  • the CPU and/or GPU of the portable electronic device 100 may be utilized for image acquisition/reconstruction and image rendering.
  • the CPU of portable electronic device 100 may be utilized to process computations based on received signals (e.g., back-scattered signals and/or transmissive signals) in order to generate an image or topography, while the GPU may be utilized to render an image based on the information received from the CPU to generate a real-time or substantially real-time image display.
  • portable electronic device 100 may include one or more components for processing, filtering, amplification, and/or rendering images.
  • FIG. 3 illustrates a transmissive imaging system and method 301 according to some embodiments of the present disclosure.
  • the transmissive imaging system 301 includes two portable electronic devices 100A and 100B that are on opposing or generally opposing sides of an imaging target 306.
  • devices 100A and 100B may be positioned in any other relationship with respect to one another.
  • devices 100A and/or 100B may include one or more sensors for determining the relative positions of these devices to aid in the generation of image(s).
  • device 100B may be a dedicated sensing and/or emitting device such as an array of ultrasound elements and associated circuitry.
  • Signals (e.g., waves or beams 308) emitted from the portable electronic device 100B are sensed by the portable electronic device 100A and are utilized to render a 2-D or 3-D image 310 (e.g., real-time or substantially real-time image) of the target 306.
  • a generated 3-D image may be in the form of a pop-out image or a depth image.
  • the portable electronic device 100A may be configured to transmit signals (e.g., waves or beams) 308 through the target 306 to be received by the portable electronic device 100B.
  • the portable electronic device 100B may simultaneously or substantially simultaneously render an image (e.g., back view or alternate view or level of detail of an image rendered by device 100A) based at least in part on processing sensed signals.
  • the portable electronic devices 100A and/or 100B may communicate the results of the sensed signals to one another in order to generate or improve a rendered image, for example, by providing higher resolution and/or a greater frame rate.
  • a rendering device may send feedback to a signal emission device regarding the rendered image, and in response, the signal emitting device may adjust a power level, signal type, signal frequency, or other signaling parameter in order to improve the image rendered by the rendering device.
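A minimal sketch of the feedback loop just described, in which the rendering device scores its reconstructed frames and asks the emitting device to adjust a signaling parameter. The class names, quality metric, and adjustment rule are all invented for illustration; the patent does not specify them.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class EmitterSettings:
    """Signaling parameters the emitting device could adjust (hypothetical)."""
    power_level: float   # normalized 0..1
    frequency_hz: float  # center frequency of the emitted signal

def estimate_snr(frame):
    """Placeholder frame-quality metric; a real system would use a proper
    image-quality or SNR estimate."""
    frame = np.asarray(frame, dtype=float)
    return frame.mean() / (frame.std() + 1e-9)

def feedback_step(frame, settings, target_snr=10.0):
    """Raise emit power when frames are too noisy; back off when ample."""
    if estimate_snr(frame) < target_snr:
        settings.power_level = min(1.0, settings.power_level * 1.10)
    else:
        settings.power_level = max(0.1, settings.power_level * 0.95)
    return settings
```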
  • FIG. 4 illustrates a back-scatter or reflective imaging system and method 401 according to some embodiments of the present disclosure.
  • a portable electronic device 100 may utilize emission and/or sensing elements 104 in order to render an image 410 based at least in part on reflection (e.g., back-scatter effect) of the signals 408.
  • portable electronic device 100 is the only device utilized in order to image the target (e.g., to produce an image appearing as a window into a human body).
  • the portable electronic device 100 may include both radiation sources and sensors (e.g., separate sources and sensors, and/or multiple transducers functioning as both sources and sensors), where all or substantially all of the radiation utilized by the sensors to reconstruct image(s) is backscatter radiation or radiation produced through a similar effect.
  • FIG. 5 illustrates a transmissive and/or reflective imaging system and method 501 according to some embodiments of the present disclosure.
  • a plurality of devices such as portable electronic devices 500A, 500B, 500C, and/or 500D may be utilized in order to render one or more image(s) 510 of target 506 on portable electronic device 500B.
  • Each of the portable electronic devices 500A-500D may be configured to emit signals (e.g., waves or beams) 508 as shown in FIG. 5.
  • the image 510, or alternate views of the image or imaged structure, may be rendered on the other portable electronic devices (e.g., 500A, 500C, and 500D) through communication with one another.
  • each of the devices may be configured as emitting and/or sensing devices only.
  • the image 510 that is rendered on portable device 500B may be based at least in part on signals 508 that are emitted by one or more of the devices 500A-500D, and which are sensed through reflection (e.g., back-scatter) and/or transmission by one or more of the devices 500A-500D.
  • one or more portable electronic devices may generate and/or render an image based solely on signals received by one or more sensors (e.g., ultrasound transducers) of the device.
  • one or more portable electronic devices according to the present disclosure may generate and/or render an image based at least in part on information stored in memory (e.g., random access memory) of the portable device(s) identifying detail(s) regarding the structure(s), part(s), composition(s), and/or other characteristic(s) of object(s) to be imaged.
  • the portable electronic devices may use stored data in addition to the received data in order to generate an image of the object and/or its constituent part(s), and/or to provide additional detail or explanation regarding an object and/or its constituent parts.
  • stored data may be compared to the received data in order to determine differences between a previously rendered image or frame and a current image or frame such that an updated image can be generated by changing the output of pixels corresponding to the determined differences.
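The compare-and-update scheme described in the bullet above can be sketched in a few lines: compare the previous frame to the new one and redraw only the pixels that changed. The threshold value is illustrative, not from the patent.

```python
import numpy as np

def update_changed_pixels(prev_frame, new_frame, threshold=0.05):
    """Return the updated frame plus a mask of pixels that actually changed,
    so only those pixels need to be redrawn."""
    diff = np.abs(new_frame - prev_frame)
    changed = diff > threshold          # pixels whose content moved/changed
    updated = prev_frame.copy()
    updated[changed] = new_frame[changed]
    return updated, changed
```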
  • the generated and/or rendered image may be a real-time or substantially real-time image that is dynamically updated based on movement of a portable electronic device 100 along a surface of an imaging target and/or motion of the imaging target.
  • FIG. 6A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments.
  • FIG. 6B illustrates a three-dimensional internal image 610 of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
  • FIG. 6C illustrates a three-dimensional internal image 610 of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
  • a three-dimensional internal image 610 of a portion of the human body may be generated and displayed to a viewer 617.
  • the three-dimensional image 610 may appear to the viewer 617 as an image having variations in, for example, topography that correspond to the surfaces and/or other aspects or features of the internal portion of the body at the first position of the portable electronic device 100 as shown in FIG. 6A.
  • the three-dimensional image 610 may be a real-time continuous image (e.g., video image) that is dynamically updated based on movement of the portable electronic device 100 and/or the internal portion of the body that is being analyzed. As shown in FIG. 6C, a different three-dimensional internal image 610 is displayed to the viewer 617 showing different underlying structures and/or aspects (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts). The three-dimensional internal image 610 shown in FIG. 6C corresponds to the internal image of the body portion at the second position of the portable electronic device 100 as shown in FIG. 6A, and is illustrated as a different image showing different topographical and/or other aspects or features of the body portion than the internal image 610 shown in FIG. 6B.
  • different types of internal images of a target may be generated, such as a three-dimensional view of an entire organ or multiple organs.
  • The imaging elements, including sensors and/or sources (e.g., transducers), may be provided on, in, or otherwise coupled to a case for a portable electronic device.
  • FIG. 7A illustrates a front view of a portable electronic device 700 according to some embodiments of the present disclosure.
  • the portable electronic device 700 includes an imaging interface 702.
  • FIG. 7B illustrates a back view of the portable electronic device 700 according to some embodiments of the present disclosure.
  • the portable electronic device 700 does not include imaging elements 104 as part of the main housing or enclosure of device 700.
  • FIG. 7C illustrates a front view of a case 711 for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 7D illustrates a back view of the case 711 including imaging elements for a portable electronic device according to some embodiments of the present disclosure.
  • the case 711 may be configured to be attached to the portable electronic device so as to at least partially enclose the portable electronic device 700.
  • case 711 may simultaneously provide imaging capabilities to portable electronic device 700 and serve as a protective case.
  • The case may be made of any suitable material such as rubber, plastic, leather, and/or the like. As shown in FIG. 7D, an imaging circuit 712 (e.g., an integrated circuit) may be included in the case 711.
  • Case 711 may be considered part of portable electronic device 700.
  • the imaging circuit 712 may include one or more imaging elements 104. As discussed above, the imaging elements 104 may include sources and/or sensors.
  • the imaging circuit 712 may also include a communication device 714 configured to communicate with the portable electronic device 700 via a wired or wireless link.
  • the imaging circuit 712 may include a communication transmitter/receiver which utilizes an infrared signal, a Bluetooth communication signal, a near-field communication signal, and/or the like to communicate with the portable electronic device 700.
  • the communication device 714 may be in communication with the processing circuitry of a portable electronic device through a wired communications link (e.g., a USB port, or other data port), or combination of wired and wireless links.
  • the imaging circuit 712 may receive power through wired and/or wireless connection(s) to the portable electronic device. In some embodiments, the imaging circuit 712 may receive power from a separate power source (e.g., a battery) that is coupled to the imaging circuit 712. In some embodiments, when the portable electronic device 700 is coupled to or attached to the case 711, a software application and/or drivers are automatically loaded and/or executed by the portable electronic device 700 in order to render an image based on communication with the imaging circuit 712. The software application and/or drivers may be stored in a memory of the imaging circuit 712 and communicated to the portable electronic device 700 and/or may be retrieved by the portable electronic device through a network (e.g., the internet).
  • the portable electronic device 700 receives raw data from the communication device 714 and processes the raw data using processing circuitry (e.g., image signal processor, digital signal processor, filters, and/or the like) included in the portable electronic device 700.
  • the imaging circuit 712 includes a local imaging processor 716 configured to process signals received by imaging elements 104.
  • the communication device 714 may be configured to communicate data received from the imaging elements 104 (e.g., such as raw sensor data) and/or may communicate processed data that is received from the local imaging processor 716.
  • the portable electronic device 700 includes an interface 702 for displaying an image that is rendered by processing signals received from the communication device 714.
  • In some embodiments, an imaging circuit (e.g., an integrated circuit) may be provided in a removable modular unit, as described below with reference to FIGs. 8A-8E.
  • FIG. 8A illustrates a front view of a case 811A for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8B illustrates a back view of the case 811A including a retaining mechanism 820 for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8C illustrates a front view of a case 811B for a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8D illustrates a back view of the case 811B including a retaining mechanism for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure.
  • FIG. 8E illustrates a modular unit 830 including an imaging circuit 712 according to some embodiments of the present disclosure.
  • the case 811A has a different shape than the case 811B.
  • the case 811A may be utilized for a first portable electronic device, while the case 811B may be utilized for a second portable electronic device having a different size and/or shape than the first portable electronic device.
  • Each of the cases 811A and 811B includes a retaining mechanism 820 that is configured to retain the modular unit 830.
  • the modular unit 830 may include the imaging circuit 712 as discussed above with reference to FIGs. 7A-7D.
  • the imaging circuit 712 may include one or more imaging elements 104, a communication device 714, and/or a local imaging processor 716.
  • the modular unit 830 also includes a coupling mechanism 832 that is configured to engage with the retaining mechanism 820 of the cases 811A and 811B.
  • the retaining mechanism 820 may correspond to a slot on the case 811A and/or 811B that is configured to receive the modular unit 830.
  • the coupling mechanism 832 may be shaped to correspond to the slot of the case 811A and/or 811B such that the modular unit 830 may be secured by the case 811A and/or 811B.
  • the retaining mechanism 820 and the coupling mechanism 832 may include corresponding structures for locking the modular unit 830 in place during use.
  • the retaining mechanism 820 may include one or more magnets having a first polarity, and the coupling mechanism 832 may include one or more magnets having a second polarity that is opposite the first polarity, such that the modular unit 830 can be retained by the case 811A and/or 811B.
  • Because the modular unit 830 may be incorporated with different cases 811A and/or 811B that are utilized for different portable electronic devices, the modular unit 830 may advantageously provide flexibility in the incorporation of an imaging system with different portable electronic devices.
  • different cases 811A and 811B may be manufactured using any suitable techniques (e.g., 3-D printing, injection molding, or the like).
  • case 811A and/or case 811B may be manufactured at low cost such that the different cases 811A and 811B may be discarded and/or upgraded while remaining compatible with the modular unit 830.
  • the modular unit 830 can be integrated into and utilized by a user with a plurality of portable electronic devices even when the design of the portable electronic devices is changed (e.g., updated and/or upgraded).
  • Examples of suitable imaging devices that may be integrated within or coupled to a portable electronic device according to some embodiments of the present disclosure are described in connection with FIGs. 9A-9F, 13, and 14A-14K below, and in commonly-owned U.S. Patent Application Serial No. 13/654,337, filed October 17, 2012, entitled "Transmissive Imaging and Related Apparatus and Methods;" U.S. Provisional Application Serial No. 61/798,851, filed March 15, 2013, entitled "Monolithic Ultrasonic Imaging Devices, Systems and Methods;" and U.S. Provisional Application Serial No. 61/794,744, filed March 15, 2013, entitled "Complementary Metal Oxide Semiconductor (CMOS) Ultrasonic Transducers and Methods for Forming the Same," each of which is incorporated by reference in its entirety.
  • FIG. 9A illustrates how, in some embodiments, a single transducer element 104 may fit within a larger transducer array 900.
  • FIGs. 9B-F show five different examples of how a given transducer element 104, comprised of circular transducer cells 902, within an array 900 might be configured in some embodiments.
  • each transducer element 104 in an array 900 may include only a single transducer cell 902 (e.g., a single CUT or CMUT).
  • each transducer element 104 in an array 900 may include a group of individual transducer cells 902 (e.g., CUTs or CMUTs).
  • Other possible configurations of transducer elements 104 include trapezoidal elements, triangular elements, hexagonal elements, octagonal elements, etc.
  • each transducer cell 902 (e.g., CUT or CMUT) making up a given transducer element 104 may itself take on any of the aforementioned geometric shapes, such that a given transducer element 104 may, for example, include one or more square transducer cells 902, rectangular transducer cells 902, circular transducer cells 902, asterisk-shaped transducer cells 902, trapezoidal transducer cells 902, triangular transducer cells 902, hexagonal transducer cells 902, and/or octagonal transducer cells 902, etc.
  • At least two (e.g., all) of the transducer cells 902 within each given transducer element 104 act as a unit and together generate outgoing ultrasonic pulses in response to the output of the same pulser (described below) and/or together receive incident ultrasonic pulses and drive the same analog reception circuitry.
  • the individual transducer cells 902 may be arranged in any of numerous patterns, with the particular pattern being chosen so as to optimize the various performance parameters, e.g., directivity, signal-to-noise ratio (SNR), field of view, etc., for a given application.
  • an individual transducer cell 902 may, for example, be on the order of about 20-110 μm wide and have a membrane thickness of about 0.5-1.0 μm, and an individual transducer element 104 may have a depth on the order of about 0.1-2.0 mm and a diameter of about 0.1 mm-3 mm, or any values in between.
  • Fig. 10A shows an illustrative example of a monolithic ultrasound device 1000 according to some embodiments.
  • the device 1000 may include one or more transducer arrangements (e.g., arrays) 900, a transmit (TX) control circuit 1004, a receive (RX) control circuit 1006, a timing & control circuit 1008, a signal conditioning/processing circuit 1010, a power management circuit 1018, and/or a high-intensity focused ultrasound (HIFU) controller 1020.
  • Although the illustrated example shows both a TX control circuit 1004 and an RX control circuit 1006, in alternative embodiments only a TX control circuit or only an RX control circuit may be employed. For example, such embodiments may be employed where one or more transmission-only devices 1000 are used to transmit acoustic signals and one or more reception-only devices 1000 are used to receive acoustic signals that have been transmitted through or reflected by a subject being ultrasonically imaged.
  • Fig. 10B is a block diagram illustrating how, in some embodiments, the TX control circuit 1004 and the RX control circuit 1006 for a given transducer element 104 may be used either to energize the transducer element 104 to emit an ultrasonic pulse, or to receive and process a signal from the transducer element 104 representing an ultrasonic pulse sensed by it.
  • In some embodiments, the TX control circuit 1004 may be used during a "transmission" phase, and the RX control circuit 1006 may be used during a "reception" phase that is non-overlapping with the transmission phase.
  • one of the TX control circuit 1004 and the RX control circuit 1006 may simply not be used in a given device 1000, such as when a pair of ultrasound units 200 is used for only transmissive imaging.
  • a device 1000 may alternatively employ only a TX control circuit 1004 or only an RX control circuit 1006, and aspects of the present technology do not necessarily require the presence of both such types of circuits.
  • each TX control circuit 1004 and/or each RX control circuit 1006 may be associated with a single transducer cell 902 (e.g., a CUT or CMUT), a group of two or more transducer cells 902 within a single transducer element 104, a single transducer element 104 comprising a group of transducer cells 902, a group of two or more transducer elements 104 within an array 900, or an entire array 900 of transducer elements 104.
  • In some embodiments, the timing & control circuit 1008 may be responsible for synchronizing and coordinating the operation of all of the TX control circuit 1004/RX control circuit 1006 combinations on the die 1012, while the signal conditioning/processing circuit 1010 may be responsible for handling inputs from all of the RX control circuits 1006 (see element 1005 in Fig. 10) on the die 1012.
  • the timing and control circuit 1008 may output either a "TX enable” signal to enable the operation of each TX control circuit 1004, or an "RX enable” signal to enable operation of each RX control circuit 1006.
  • a switch 1003 in the RX control circuit 1006 may always be opened before the TX control circuit 1004 is enabled, so as to prevent an output of the TX control circuit 1004 from driving the RX control circuit 1006.
  • the switch 1003 may be closed when operation of the RX control circuit 1006 is enabled, so as to allow the RX control circuit 1006 to receive and process a signal generated by the transducer element 104.
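The enable/switch ordering described above can be summarized in a small sequencing sketch: switch 1003 must open before the TX control circuit is enabled, so the pulser output never drives the receive chain. The classes and methods below are invented for illustration; no real driver API is implied.

```python
class TxRxSequencer:
    """Enforces the non-overlapping TX/RX phase ordering described above.
    All attributes (rx_switch, tx, rx) are hypothetical driver objects."""

    def __init__(self, rx_switch, tx_circuit, rx_circuit):
        self.rx_switch = rx_switch
        self.tx = tx_circuit
        self.rx = rx_circuit

    def transmit_phase(self):
        self.rx_switch.open()   # open switch 1003 *before* enabling TX...
        self.tx.enable()        # ...so the pulser cannot drive the RX chain
        self.tx.fire_pulse()
        self.tx.disable()

    def receive_phase(self):
        self.rx_switch.close()  # let the transducer element drive the RX chain
        self.rx.enable()        # receive and process the sensed signal
```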
  • the TX control circuit 1004 for a respective transducer element 104 may include both a waveform generator 1007 and a pulser 1009.
  • the waveform generator 1007 may, for example, be responsible for generating a waveform that is to be applied to the pulser 1009, so as to cause the pulser 1009 to output a driving signal to the transducer element 104 corresponding to the generated waveform.
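As a concrete example of a waveform the generator 1007 might hand to the pulser, the sketch below generates a linear frequency modulated (LFM) pulse of the kind discussed later in connection with the matched filter. All parameter values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def lfm_chirp(f0, f1, duration, fs):
    """Generate an LFM (chirp) pulse sweeping from f0 to f1 Hz.

    duration: pulse length in seconds; fs: sample rate in Hz.
    """
    t = np.arange(int(duration * fs)) / fs
    k = (f1 - f0) / duration  # chirp rate, Hz/s
    return np.sin(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

# e.g., a 10 us chirp sweeping 1 -> 5 MHz, sampled at 40 Msps
pulse = lfm_chirp(1e6, 5e6, 10e-6, 40e6)
```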
  • the RX control circuit 1006 for a respective transducer element 104 includes an analog processing block 1011, an analog-to-digital converter (ADC) 1013, and a digital processing block 1015.
  • the ADC 1013 may, for example, comprise a 10-bit, 20Msps, 40Msps, or 80Msps ADC.
  • the outputs of all of the RX control circuits 1006 on the die 1012 are fed to a multiplexer (MUX) 1017 in the signal conditioning/processing circuit 1010.
  • the MUX 1017 multiplexes the digital data from the various RX control circuits 1006, and the output of the MUX 1017 is fed to a multiplexed digital processing block 1019 in the signal conditioning/processing circuit 1010, for final processing before the data is output from the die 1012, e.g., via one or more high-speed serial output ports 1014. Examples of implementations of the various circuit blocks shown in Fig. 10B are discussed further below. As explained in more detail below, various components in the analog processing block 1011 and/or the digital processing block 1015 may serve to decouple waveforms from the received signal and otherwise reduce the amount of data that needs to be output from the die 1012 via a high-speed serial data link or otherwise.
  • one or more components in the analog processing block 1011 and/or the digital processing block 1015 may thus serve to allow the RX control circuit 1006 to receive transmitted and/or scattered ultrasound pressure waves with an improved signal-to-noise ratio (SNR) and in a manner compatible with a diversity of waveforms.
  • the inclusion of such elements may thus further facilitate and/or enhance the disclosed "ultrasound-on-a-chip" solution in some embodiments.
  • Although particular components that may optionally be included in the analog processing block 1011 are described below, it should be appreciated that digital counterparts to such analog components may additionally or alternatively be employed in the digital processing block 1015. The converse is also true. That is, although particular components that may optionally be included in the digital processing block 1015 are described below, it should be appreciated that analog counterparts to such digital components may additionally or alternatively be employed in the analog processing block 1011.
  • Fig. 11 illustrates an example of a technique for biasing the transducer elements 104 in an array 900.
  • the side of each of the transducer elements 104 that faces the patient may be connected to ground, so as to minimize risk of electric shock.
  • the other side of each transducer element 104 may be connected to the output of the pulser 1009 via a resistor 1102. Accordingly, each transducer element 104 is always biased via the output of the pulser 1009, regardless of whether the switch S1 is open or closed.
  • the bias voltage applied across the element may be on the order of 100V.
  • the switch S1 may be closed during a transmit operation and may be open during a receive operation.
  • the switch S2 may be closed during a receive operation and may be open during a transmit operation. (Note that there is always a gap between the opening of switch S1 and the closing of switch S2, as well as between the opening of switch S2 and the closing of switch S1, so as to ensure the pulser 1009 does not apply an outgoing pulse to the LNA 1101 in the RX control circuit 1006.)
  • the pulser 1009 may hold the bottom plate of the transducer element 104 at its high output level at all times except when it is applying a waveform pulse to its transducer element 104, and the waveform pulse applied during the transmit phase may be referenced from the high output level of the pulser 1009. Accordingly, each individual pulser 1009 is able to maintain a bias on its corresponding transducer element 104 at all times.
  • a capacitor 1104 may be placed between the switch S2 and the LNA 1101 of the RX control circuit 1006 so as to block the DC bias signal (i.e., the high output of the pulser 1009) from reaching the LNA 1101 during receive operations (i.e., when switch S2 is closed).
  • Biasing the transducer elements 104 via their respective pulsers 1009 may provide benefits in some embodiments, such as reducing cross-talk that would otherwise occur if the elements 104 were biased via a common bus, for example.
  • Fig. 12 shows an example implementation of the RX control circuit 1006 that includes a matched filter 1202 that may, for example, perform waveform removal and improve the signal-to-noise ratio of the reception circuitry.
  • the analog processing block 1011 may, for example, include a low-noise amplifier (LNA) 1201, a variable-gain amplifier (VGA) 1204, and a low-pass filter (LPF) 1206.
  • the VGA 1204 may be adjusted, for example, via a time-gain compensation (TGC) circuit.
  • the LPF 1206 provides for anti-aliasing of the acquired signal.
• the LPF 1206 may, for example, comprise a second-order low-pass filter having a cutoff frequency on the order of 5 MHz.
  • Other implementations are, however, possible and contemplated.
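• As a rough behavioral sketch of the analog chain just described (LNA, TGC-controlled VGA, second-order anti-aliasing LPF), the following Python fragment may help; the gain figures and sampling rate are illustrative assumptions, not values from the application:

    import numpy as np
    from scipy.signal import butter, lfilter

    def analog_front_end(x, fs, lna_gain_db=20.0, tgc_db_per_us=1.0,
                         f_cutoff=5e6):
        # Fixed LNA gain, then a gain ramp that grows with time of
        # flight (time-gain compensation), then a 2nd-order low-pass
        # filter with a cutoff on the order of 5 MHz for anti-aliasing.
        t_us = np.arange(len(x)) / fs * 1e6
        y = x * 10 ** (lna_gain_db / 20)
        y = y * 10 ** (tgc_db_per_us * t_us / 20)
        b, a = butter(2, f_cutoff / (fs / 2))
        return lfilter(b, a, y)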
• the digital processing block 1015 of the RX control circuit 1006 includes a digital quadrature demodulation (DQDM) circuit 1208 and an output buffer 1216.
  • the DQDM circuit 1208 may, for example, be configured to mix down the digitized version of the received signal from center frequency to baseband, and then low-pass filter and decimate the baseband signal.
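• A minimal sketch of such digital quadrature demodulation (mix down to baseband, low-pass filter, decimate), assuming NumPy/SciPy and illustrative parameter values:

    import numpy as np
    from scipy.signal import firwin, lfilter

    def dqdm(x, fs, f_center, decim=4, ntaps=64):
        n = np.arange(len(x))
        lo = np.exp(-2j * np.pi * f_center * n / fs)  # complex LO
        baseband = x * lo                             # mix down to baseband
        h = firwin(ntaps, (fs / decim) / 2, fs=fs)    # anti-alias low-pass
        return lfilter(h, 1.0, baseband)[::decim]     # filter and decimate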
  • An illustrative embodiment of a circuit suitable for use as the matched filter 1202 is shown in FIG. 13.
  • the filter circuit 1202 may actually operate as either a matched filter or a mismatched filter so as to decouple waveforms from the received signal.
  • the matched filter 1202 may work for either linear frequency modulated (LFM) or non-LFM pulses.
  • the matched filter 1202 may, for example, include a padding circuit 1302, a fast Fourier transformation (FFT) circuit 1304, a multiplier 1306, a low-pass filter 1308, a decimator circuit 1310, and an inverse FFT circuit 1312.
  • the padding circuit 1302 may, for example, apply padding to the incoming signal sufficient to avoid artifacts from an FFT implementation of circular convolution.
• for matched filtering, the value G(ω) applied to the multiplier 1306 should be a conjugate X*(ω) of the transmission waveform X(ω).
• the filter 1202 may thus indeed operate as a "matched" filter, by applying a conjugate of the transmission waveform X(ω) to the multiplier 1306.
• the "matched" filter 1202 may instead operate as a mismatched filter, in which case some value other than a conjugate of the transmission waveform X(ω) may be applied to the multiplier 1306.
• as shown in FIG. 14A, the process may begin with a CMOS wafer 1400 including a substrate 1402, a dielectric or insulating layer 1404, a first metallization layer 1406 and a second metallization layer 1408, which in some embodiments may be a top metallization layer of the CMOS wafer 1400.
  • the substrate 1402 may be silicon or any other suitable CMOS substrate.
  • the CMOS wafer 1400 may include CMOS integrated circuitry (IC), and thus the substrate 1402 may be a suitable substrate for supporting such circuitry.
• the insulating layer 1404 may be formed of SiO2 or any other suitable dielectric insulating material. In some embodiments, the insulating layer 1404 may be formed via tetraethyl orthosilicate (TEOS), though alternative processes may be used.
• although the CMOS wafer 1400 is shown as including two metallization layers 1406 and 1408, it should be appreciated that CMOS wafers according to the various aspects of the present application are not limited to having two metallization layers, but rather may have any suitable number of metallization layers, including more than two in some embodiments. Such metallization layers may be used for wiring (e.g., as wiring layers) in some embodiments, though not all embodiments are limited in this respect.
  • the first and second metallization layers 1406 and 1408 may have any suitable construction.
  • at least the second metallization layer 1408 may have a multi-layer construction, including a middle conductive layer 1412 (e.g., formed of aluminum or other suitable conductive material) and upper and lower liner layers 1410 and 1414, respectively.
  • the liner layers 1410 and 1414 may be formed of titanium nitride (TiN) or other suitable conductive material (e.g., metals other than TiN, such as tantalum, or other suitable metals for acting as a liner).
• the upper liner layer 1410 may be used as an etch stop, for example during one or more etch steps used as part of a process for forming a cavity for an ultrasonic transducer.
  • the liner layer 1410 may be formed of a material suitable to act as an etch stop in some embodiments.
  • the first and second metallization layers 1406 and 1408, as well as any other metallization layers described herein may optionally include silicon oxynitride (SiON) as an upper layer (e.g., on top of liner layer 1410) to serve as an anti-reflective coating during lithography stages.
  • the second metallization layer 1408 may be used to make electrical contact to a membrane of a CUT to be formed on the CMOS wafer. Accordingly, as shown in FIG. 14B, the second metallization layer 1408 may be suitably patterned to form an electrode 1416 and one or more contacts 1418.
• While FIG. 14B illustrates a configuration in which an electrode (e.g., electrode 1416) and electrical contacts (e.g., electrical contacts 1418) are formed on a CMOS wafer from a metallization layer, conductive materials other than metals but suitable to act as electrodes and/or electrical contacts may alternatively be processed on the CMOS wafer to form the illustrated electrode and/or electrical contacts.
  • An insulating layer 1420 may then be deposited as shown in FIG. 14C.
• the insulating layer 1420 may be SiO2 or any other suitable insulator, and may be formed in any suitable manner.
  • the insulating layer 1420 may be formed by high density plasma (HDP) deposition.
  • the insulating layer 1420 may then be planarized (not shown), for example using chemical mechanical polishing (CMP) or other suitable planarization technique.
  • the insulating layer 1420 may be etched as shown to expose the upper surface of the electrode 1416 and electrical contacts 1418.
  • the upper liner layer 1410 may be used as an etch stop for a selective etch used to etch the insulating layer 1420.
  • the liner layer 1410 may be formed of TiN and may be used as an etch stop, though not all embodiments are limited in this respect.
  • a further insulating layer 1422 may be deposited as shown in FIG. 14E to cover the upper surfaces of the electrode 1416 and electrical contacts 1418 and may then be patterned as shown in FIG. 14F to open contact holes 1424 for the electrical contacts 1418.
• the insulating layer 1422 may be SiO2 or any other suitable insulator.
  • a conductive layer 1426 may be deposited.
• the conductive layer may be used to form electrical contacts to a membrane of an ultrasonic transducer, as will be shown in connection with FIG. 14J.
  • the conductive layer 1426 may be patterned to form a cavity therein for a CUT, with a remaining portion of the conductive layer 1426 defining one or more sidewalls of the cavity.
  • the conductive layer 1426 may also represent a spacer in that a membrane may be separated from the surface of the CMOS wafer 1400 by the height of the conductive layer 1426.
  • the conductive layer 1426 may serve one or more of multiple possible functions.
  • the conductive layer 1426 may be formed of any suitable conductive material.
  • the conductive layer 1426 may be formed of a metal.
  • the conductive layer 1426 may be TiN in some embodiments.
  • the conductive layer 1426 may be planarized (not shown) using CMP or other suitable planarization technique, and then may be patterned as shown in FIG. 14H to form contacts 1428. It can be seen that at this stage a cavity 1430 has been formed in the CMOS wafer with the contacts 1428 serving to at least partially define the cavity.
  • the contacts 1428 (which in some embodiments may represent a single contact forming a closed contour) function as sidewalls of the cavity 1430 in the embodiment illustrated and, as will be further appreciated from consideration of FIG. 14K, create a standoff between the electrode 1416 and a membrane overlying the cavity 1430.
  • a second wafer 1431 may be bonded to the CMOS wafer.
  • the second wafer may be any suitable type of wafer, such as a bulk silicon wafer, a silicon-on-insulator (SOI) wafer, or an engineered substrate including a polysilicon or amorphous silicon layer with an insulating layer between a single crystal silicon layer and the polysilicon or amorphous silicon layer.
  • the second wafer 1431 may include four layers including a base layer or handle layer 1432, insulating layer 1434, layer 1436, and layer 1438.
  • the second wafer 1431 may be used to transfer layers 1436 and 1438 to the CMOS wafer for forming a membrane over cavity 1430, and thus may be referred to herein as a transfer wafer.
• the base layer 1432 may be a silicon layer (e.g., single crystal silicon), the insulating layer 1434 may be SiO2 and may represent a buried oxide (BOX) layer, and layer 1436 may be silicon.
• the layer 1436 may be degenerately doped silicon phosphide (SiP+).
  • the layer 1436 may be polysilicon or amorphous silicon, though other embodiments may utilize single crystal silicon.
  • the layer 1438 may be formed of a material suitable for bonding to the contacts 1428 on the CMOS wafer.
  • the contacts 1428 and layer 1438 may be formed of the same material.
  • the contacts 1428 and layer 1438 may be formed of TiN.
  • the process used for bonding the second wafer 1431 to the CMOS wafer 1400 may be a low temperature bonding process, for example not exceeding 450° C.
  • the temperature of the bonding process may be between approximately 200° C and 450° C, between approximately 300° C and approximately 400° C, any temperature(s) within those ranges, any other temperature described herein for low temperature bonding, or any other suitable temperature.
• in this way, damage to the metallization layers on the CMOS wafer, and to any ICs on the CMOS wafer, may be avoided.
  • the wafer bonding process may be one of various types.
  • the wafer bonding may be direct bonding (i.e., fusion bonding).
  • the wafer bonding may involve energizing respective surfaces of the CMOS and second wafers and then pressing the wafers together with suitable pressure to create the bond.
  • a low temperature anneal may be performed.
  • fusion bonding represents one example of a suitable bonding technique, other bonding techniques may alternatively be used, including for example bonding two wafers through the use of one or more intermediate layers (e.g., adhesive(s)).
  • anodic or plasma assisted bonding may be used.
  • the bonding illustrated in FIGs. 14I-14J may result in the second wafer 1431 being monolithically integrated with the CMOS wafer 1400. Thus, the two may form a unitary body in some situations.
  • a membrane may then be formed from the second wafer 1431.
  • the second wafer 1431 may be thinned from the backside.
• Such thinning may be performed in stages. For example, mechanical grinding providing coarse thickness control (e.g., 10 micron control) may initially be implemented to remove a relatively large amount of the bulk wafer. In some embodiments, the thickness control of the mechanical grinding may vary from coarse to fine as the thinning process progresses. Then, CMP may be performed on the backside, for example to get to a point close to the layer 1436. Next, a selective etch, such as a selective chemical etch, may be performed to stop on the layer 1436. Other manners of thinning are also possible.
• thus, as shown in FIG. 14K, the base layer or handle layer 1432 and insulating layer 1434 may be removed.
  • a membrane 1440 formed of the layer 1436 and layer 1438 may remain.
  • the membrane may be any suitable thickness TM, non-limiting examples of which are described below.
  • the layer 1436 may be etched or otherwise thinned to provide a desired membrane thickness.
  • the structure includes a sealed cavity 1430 which is sealed by the membrane 1440.
  • the sidewalls of the cavity are conductive, i.e., the contacts 1428 are conductive and form the sidewalls of the sealed cavity.
  • the contacts 1428 represent a conductive standoff for the membrane 1440 from the surface of the CMOS wafer.
• the contacts 1428 may be relatively large-area electrical contacts that contact a relatively large area of the membrane, thus providing a low-resistance electrical path to/from the membrane.
• the contacts may provide an electrical connection between the membrane and an IC on the CMOS wafer (e.g., disposed beneath the cavity), which may interact with the membrane to provide/receive electrical signals and thus, in some embodiments, control operation of the membrane.
• it should be appreciated that the membrane 1440 has a first side 1442 proximate the cavity 1430 and a second side 1444 distal the cavity, and that direct electrical contact is made to the first side 1442 via the contacts 1428.
  • the first side 1442 may be referred to as a bottom side of the membrane and the second side 1444 may be referred to as a top side of the membrane.
  • Local connection to the membrane 1440 may be made in this manner, and the membrane 1440 may be connected to integrated circuitry in the CMOS wafer via this connection (e.g., via contact 1418).
  • an IC may be positioned beneath the cavity 1430 and the conductive path configuration illustrated may facilitate making connection between the integrated circuitry beneath the cavity and the membrane 1440.
• FIG. 14K provides a non-limiting example of an embedded contact to the membrane, in that electrical contact is provided by way of a conductive path in the CMOS wafer (e.g., to contact 1418) rather than a contact made on the second side 1444.
• Such a configuration may be preferable to making electrical contact on the second side 1444, since any contact on the second side 1444 may negatively impact vibration of the membrane 1440.
  • the electrode 1416 is narrower than the cavity 1430. Namely, the electrode 1416 has a width Wl less than a width W2 of the cavity 1430.
  • Such a configuration may be desirable at least in those embodiments in which the cavity has conductive sidewalls (e.g., the contacts 1428) to provide electrical isolation between the sidewalls and the electrode.
• the structure of FIG. 14K may be altered in an embodiment by not including the layer 1438.
  • a direct bond may be formed between contacts 1428 (e.g., formed of TiN) and layer 1436 (e.g., silicon).
• the structure illustrated in FIG. 14K may have any suitable dimensions. Non-limiting examples of dimensions for the membrane 1440 and cavity 1430 are described further below.
• the width W2 of the cavity 1430 may be between approximately 5 microns and approximately 500 microns, between approximately 20 microns and approximately 100 microns, may be approximately 30 microns, approximately 40 microns, approximately 50 microns, any width or range of widths in between, or any other suitable width.
  • the width may be selected to maximize the void fraction, i.e., the amount of area consumed by the cavity compared to the amount of area consumed by surrounding structures.
  • the width dimension may also be used to identify the aperture size of the cavity, and thus the cavities may have apertures of any of the values described above or any other suitable values.
• the depth D1 of the cavity 1430 may be between approximately 0.05 microns and approximately 10 microns, between approximately 0.1 microns and approximately 5 microns, between approximately 0.5 microns and approximately 1.5 microns, any depth or range of depths in between, or any other suitable depth.
• when the contacts 1428 are formed of TiN, it may be preferable in such embodiments for D1 to be less than 5 microns, since TiN is commonly formed as a thin film.
  • the cavity dimensions and/or the membrane thickness of any membrane overlying the cavity may impact the frequency behavior of the membrane, and thus may be selected to provide a desired frequency behavior (e.g., a desired resonance frequency of the membrane).
• for example, the dimensions may be selected to provide an ultrasonic transducer with a center resonance frequency of between approximately 20 kHz and approximately 200 MHz, between approximately 1 MHz and approximately 10 MHz, between approximately 2 MHz and approximately 5 MHz, between approximately 50 kHz and approximately 200 kHz, of approximately 2.5 MHz, approximately 4 MHz, any frequency or range of frequencies in between, or any other suitable frequency.
  • the dimensions of the cavity and/or membrane may be selected accordingly.
  • the membrane thickness TM (e.g., as measured in the direction generally parallel to the depth Dl) may be less than 100 microns, less than 50 microns, less than 40 microns, less than 30 microns, less than 20 microns, less than 10 microns, less than 5 microns, less than 1 micron, less than 0.1 microns, any range of thicknesses in between, or any other suitable thickness.
  • the thickness may be selected in some embodiments based on a desired acoustic behavior of the membrane, such as a desired resonance frequency of the membrane.
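• As a back-of-the-envelope illustration of how these dimensions set frequency (the formula and material values below are textbook assumptions, not taken from the application), the fundamental resonance of a clamped circular plate in vacuum is approximately f0 = 0.47 (t / a^2) sqrt(E / (rho (1 - nu^2))):

    import math

    def clamped_plate_f0(t, a, E=170e9, rho=2330, nu=0.28):
        # Clamped circular plate in vacuum; defaults assume a silicon
        # membrane. Fluid loading and residual stress, both ignored
        # here, shift the real resonance.
        return 0.47 * t / a**2 * math.sqrt(E / (rho * (1 - nu**2)))

    # A 0.5 micron membrane over a 50-micron-wide cavity (25 micron
    # radius) lands in the 2-5 MHz band discussed above:
    print(clamped_plate_f0(0.5e-6, 25e-6) / 1e6)  # ~3.3 MHz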
  • FIGs. 22A-22D illustrate various potential shapes for cavity 1430 and the other cavities described herein.
  • FIGs. 22A-22D illustrate top views of a portion 2200 of a CMOS wafer having cavities 1430 formed therein of various shapes.
  • FIG. 22A illustrates that the cavities 1430 may have a square aperture.
• FIG. 22B illustrates that the cavities 1430 may have a circular aperture.
• FIG. 22C illustrates that the cavities 1430 may have a hexagonal aperture.
• FIG. 22D illustrates that the cavities 1430 may have an octagonal aperture.
  • Other shapes are also possible.
  • FIG. 14K illustrates an ultrasonic transducer which has a membrane 1440 overlying the cavity 1430, wherein the membrane has a substantially uniform thickness.
  • Ultrasonic transducers such as that illustrated in FIG. 14K may be used to send and/or receive acoustic signals.
  • the operation of the transducer in terms of power generated, frequencies of operation (e.g., bandwidth), and voltages needed to control vibration of the membrane may depend on the shape and size of the membrane.
  • a membrane shaped as a piston with a center mass-like portion that is connected to a CMOS wafer by a thinner peripheral portion may provide various beneficial operating characteristics.
  • an aspect of the present application provides ultrasonic transducers having piston membranes.
  • Such transducers may be formed by wafer bonding processes according to some embodiments of the present application.
  • the thicker center portion of such membranes may be formed on the top side or bottom side of the membrane, and may be formed prior to or after wafer bonding.
• inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a non-transitory computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above.
  • the computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above.
  • computer readable media may be non-transitory media.
• the terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present application need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present application.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • data structures may be stored in computer-readable media in any suitable form.
• data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that conveys the relationship between the fields.
• any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish a relationship between data elements.
  • the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
• a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
• Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet.
  • networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • some aspects may be embodied as one or more methods. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
• a reference to "A and/or B," when used in conjunction with open-ended language such as "comprising," can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase "at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
• "At least one of A and B" can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Gynecology & Obstetrics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Telephone Set Structure (AREA)

Abstract

A portable electronic device (e.g., smart phone or tablet computer) is provided for generating and displaying images (e.g., 2-dimensional or 3-dimensional images) of an imaging target such as a human body. The portable electronic device may include imaging elements configured to receive radiation signals transmitted through and/or reflected by the imaging target, an imaging interface, and one or more processors. The portable electronic device may display what appears to be a window into the imaging target (e.g., a human body), and/or an exploded view (e.g., 3-dimensional, upwardly projected image) of the target. The generated image may be a real-time continuous image of the internal features of the target (e.g., a human body) that is updated to track movements of the target (e.g., breathing patterns) and the relative position of the portable electronic device as the portable electronic device moves relative to a surface of the target.

Description

PORTABLE ELECTRONIC DEVICES
WITH INTEGRATED IMAGING CAPABILITIES
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Patent Application No. 13/856,252, filed on April 3, 2013, and entitled "Portable Electronic Devices with Integrated Imaging Capabilities," and incorporates its disclosure herein by reference in its entirety.
FIELD OF THE TECHNOLOGY
[0002] The present disclosure relates generally to imaging devices and methods (e.g., ultrasound imaging devices and methods).
BACKGROUND
[0003] Imaging technologies are used at various stages of medical care. For example, imaging technologies are used to non-invasively diagnose patients, to monitor the performance of medical (e.g., surgical) procedures, and/or to monitor post-treatment progress or recovery.
[0004] Conventional imaging devices and methods, including magnetic resonance imaging (MRI) technology, are typically configured for and limited to use within a fixed location in a hospital setting. MRI technology is also generally slow, and suffers from other drawbacks including high cost, loud noise, and the use of potentially harmful magnetic fields.
[0005] In view of the foregoing, it would be desirable to provide portable electronic devices and associated methods with integrated imaging capabilities.
SUMMARY
[0006] Some embodiments of the present disclosure relate to a portable electronic device (e.g., smart phone and/or tablet computer) for generating and displaying an image (e.g., 2-dimensional or 3-dimensional image) of what appears to be a window into an underlying object, such as a human body, when placed in proximity to (e.g., on or close to) the object. The window and corresponding image displayed on a display screen of the portable electronic device change as the portable electronic device is moved over various portions of the body (e.g., abdomen, thorax, etc.). The image displayed by the portable electronic device may identify, for example, organs, arteries, veins, tissues, bone, and/or other bodily contents or parts. In various embodiments, the image may be presented in 3 dimensions such that it appears to the viewer as if the viewer is looking into the body, or as if the body parts have been projected up (e.g., exploded view) from the body.
[0007] The present disclosure provides numerous embodiments of systems, apparatus, computer readable media, and methods for providing imaging functionality using a portable electronic device, such as, for example, a smart phone or a tablet computer. In some embodiments, the portable electronic device is configured to generate and display an image of what appears to be an exploded view (e.g., 3-dimensional, upwardly projected image) of an object or its constituent parts. In some embodiments, movement of the portable electronic device results in the rendering of a different internal image of the target (e.g., different portion(s) of a human body). In some embodiments, the generated window of the underlying object (e.g., the portion of the human body) may provide an internal view of the object (e.g., a three-dimensional rendering of an organ or a portion of an organ).
[0008] In some embodiments according to one aspect of the present disclosure, a portable electronic device is provided that includes a processor configured to generate an image (e.g., ultrasound image) of an internal feature of a target when the device is positioned at an external surface of the target, and a display configured to display the image.
[0009] In some embodiments according to another aspect of the present disclosure, a portable ultrasound device is provided that includes multiple ultrasound elements configured to receive ultrasound radiation reflected by or passing through a target when the ultrasound device is pointed at the target. The portable ultrasound device also includes a display configured to display an image of an internal feature of the target based at least in part on the ultrasound radiation received by the plurality of ultrasound elements.
[0010] In some embodiments according to another aspect of the present disclosure, a method is provided that includes pointing a portable electronic device at an external surface of a subject, and viewing, on a display of the portable electronic device, an image of an internal feature of the subject while pointing the portable electronic device at the external surface of the subject. In some embodiments, the portable electronic device includes a radiation sensor, and the method further includes receiving, with the radiation sensor, radiation reflected by or passing through the subject, and creating the image of the internal feature based at least in part on the radiation received by the radiation sensor.
[0011] In some embodiments according to yet another aspect of the present disclosure, a portable electronic device is provided that renders within a window on a display of the device an image (e.g., 3-dimensional image) of an inside of a human body when the device is directed at the body (e.g., within about one meter or less of the body). In some embodiments, the image changes to reflect additional body parts as the device is moved relative to the body.
[0012] In some embodiments, the portable electronic device renders the image by processing radiation signals received by a radiation sensor which are reflected by or passing through the human body. In some embodiments, the portable electronic device is positioned within approximately one meter from the body (e.g., within 0.75 to 1.25 meters from the body).
[0013] In some embodiments according to another aspect of the present disclosure, a portable electronic device is provided that includes multiple imaging elements configured to receive radiation signals transmitted through or reflected by an imaging target and an imaging interface. The portable electronic device also includes one or more processors configured to receive one or more sensing signals from at least one of the plurality of imaging elements, and to render an image of the imaging target for display through the imaging interface based at least in part on the one or more sensing signals.
[0014] In some embodiments, each of the imaging elements is separately processed by a corresponding imaging processor, such that the processed signals of the imaging elements may be combined to render images having higher resolution and/or higher frame rate than images rendered based on a single imaging element. In some embodiments, the combined processed signals are used to generate a three-dimensional image of the target.
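Although the application does not name a combining algorithm, delay-and-sum beamforming is one conventional way to fuse per-element receive signals into a focused image sample; the following sketch (all names, the linear-array geometry, and the parameter defaults are illustrative assumptions) aligns each channel by its round-trip delay to a focal point and sums:

    import numpy as np

    def delay_and_sum(element_signals, element_x, focus, c=1540.0, fs=40e6):
        # element_signals: (n_elements, n_samples) receive traces
        # element_x: element x-positions in meters; focus: (x, z) in meters
        fx, fz = focus
        out = 0.0
        for sig, ex in zip(element_signals, element_x):
            dist = np.hypot(fx - ex, fz)            # element-to-focus distance
            idx = int(round(2.0 * dist / c * fs))   # round-trip delay, samples
            if idx < len(sig):
                out += sig[idx]                     # align and sum channels
        return out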
[0015] In some embodiments, the portable electronic device is a handheld portable electronic device, such as a cellular phone or a tablet computer.
[0016] In some embodiments, imaging elements may include their own dedicated processing circuitry, such as a graphic processing unit (GPU), digital signal processor (DSP), and/or central processing unit (CPU), and/or may utilize processing circuitry of the portable electronic device. For example, in some embodiments, the CPU and/or GPU of the portable electronic device 100 may be utilized for image acquisition/reconstruction and image rendering. In some embodiments, the CPU of portable electronic device may be utilized to process computations based on received signals (e.g., back-scattered signals and/or transmissive signals) in order to generate an image or topography, while the GPU may be utilized to render an image based on the information received from the CPU to generate a real-time or substantially real-time image display. In some embodiments, portable electronic device may include one or more components for processing, filtering, amplification, and/or rendering images.
[0017] In some embodiments, the processing circuitry may be fabricated on the same semiconductor chip as the ultrasound elements.
[0018] In some embodiments, the portable electronic device may include one or more processors for identifying structure(s) identified in the image based at least in part on stored data (e.g., data stored in random access memory or other storage device of portable electronic device). For example, data stored within device may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by personal electronic device to identify and/or predict the type(s) of structures depicted in an image rendered by device. In some embodiments, data stored within device may identify characteristics of particular disease(s) such as cancer or other abnormalities (e.g., based on image data corresponding to previously stored characteristics of particular structures associated with a disease) for use by personal electronic device to identify and/or predict the type(s) of structures depicted in an image rendered by device.
BRIEF DESCRIPTION OF DRAWINGS
[0019] Aspects and embodiments of the present disclosure will be described with reference to the following figures. It should be appreciated that the figures are not necessarily drawn to scale. Items appearing in multiple figures are indicated by the same reference number in all the figures in which they appear.
[0020] FIG. 1A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a human body or a portion of a human body according to some embodiments of the present disclosure.
[0021] FIG. IB illustrates a three-dimensional internal image of a portion of a human body that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
[0022] FIG. 2A illustrates a front view of portable electronic device including an imaging interface according to some embodiments of the present disclosure.
[0023] FIG. 2B illustrates a back view of portable electronic device including imaging elements according to some embodiments of the present disclosure.
[0024] FIG. 3 illustrates a transmissive imaging system and method according to some embodiments of the present disclosure.
[0025] FIG. 4 illustrates a reflective imaging system and method according to some embodiments of the present disclosure.
[0026] FIG. 5 illustrates a transmissive and/or reflective imaging system and method according to some embodiments of the present disclosure.
[0027] FIG. 6A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments of the present disclosure.
[0028] FIG. 6B illustrates a three-dimensional internal image of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
[0029] FIG. 6C illustrates a three-dimensional internal image of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure.
[0030] FIG. 7A illustrates a front view of a portable electronic device according to some embodiments of the present disclosure.
[0031] FIG. 7B illustrates a back view of a portable electronic device according to some embodiments of the present disclosure.
[0032] FIG. 7C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
[0033] FIG. 7D illustrates a back view of a case including imaging elements for a portable electronic device according to some embodiments of the present disclosure.
[0034] FIG. 8A illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
[0035] FIG. 8B illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure.
[0036] FIG. 8C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure.
[0037] FIG. 8D illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure.
[0038] FIG. 8E illustrates a modular unit including an imaging circuit according to some embodiments of the present disclosure.
[0039] FIG. 9A illustrates how, in some embodiments, a single transducer element may fit within a larger transducer array.
[0040] FIGs. 9B-F show five different examples of how a given transducer element within an array might be configured in some embodiments.
[0041] FIG. 10A shows an illustrative example of a monolithic ultrasound device according to some embodiments.
[0042] FIG. 10B is a block diagram illustrating how, in some embodiments, the TX control circuit and the RX control circuit for a given transducer element may be used either to energize the element to emit an ultrasonic pulse, or to receive and process a signal from the element representing an ultrasonic pulse sensed by it.
[0043] FIG. 11 illustrates an example technique for biasing transducer elements in an array or other arrangement.
[0044] FIGs. 12 and 13 show illustrative examples of components that may be included within the analog processing block and the digital processing block of the RX control circuit shown in Fig. 10.
[0045] FIGs. 14A-14K illustrate a process sequence for fabricating a CMOS ultrasonic transducer (CUT) having a membrane formed above a cavity in a CMOS wafer, according to a non-limiting embodiment of the present application.
DETAILED DESCRIPTION
[0046] According to some embodiments of the present disclosure, a portable electronic device is provided that includes an imaging interface and one or more imaging elements. For example, the portable electronic device may be a cellular phone, personal digital assistant, smart phone, tablet device, digital camera, laptop computer, or the like. An image may be generated and/or rendered utilizing the portable electronic device. For example, the portable electronic device may be utilized to simulate a "window" into an imaging target, such as a human body or portion of the body. The simulated "window" may provide a view of the inside of a human body or portion of the body, including organs, arteries, veins, tissues, bone, and/or other bodily contents or parts. For example, an image (e.g., ultrasound or sonographic image) may be generated that illustrates and/or simulates internal features of the imaging target for a user. In some embodiments, a real-time continuous or substantially real-time continuous image (e.g., 10 frames/second, 20 frames/second, 25 frames/second, 30 frames/second, or the like) may be generated and/or rendered such that movement of the portable electronic device results in a substantially real-time updated image of the area that corresponds to the new position of the portable electronic device. In some embodiments, internal movement of the target object (e.g., such as expansion and/or contraction of organs) may be rendered in real-time by the portable electronic device.
[0047] In some embodiments, the portable electronic devices and methods described herein may include, be coupled to (e.g., via a suitable communications connection or port such as a USB link), or otherwise utilize one or more radiation sources, sensors, and/or transducers (e.g., array(s) of ultrasound transducers), front-end processing circuitry and associated processing techniques, and/or image reconstruction devices and/or methods, in order to generate and/or render images to a user according to the non-limiting embodiments described in detail throughout the present disclosure.
[0048] In some embodiments of the present disclosure, one or more of the devices described in Figures 1A-8E herein may include or be coupled to one or more ultrasound imaging elements (e.g., one or more arrays of ultrasound sources, sensors, and/or transducers). One or more computers or processors within the portable electronic device may perform image analysis and/or image rendering based at least in part on radiation signals received by an imaging device.
[0049] FIG. 1A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a human body or a portion of a human body 106 according to some embodiments. FIG. 1B illustrates a three-dimensional internal image 110 of a portion of a human body that is generated and/or rendered by a portable electronic device 100 according to some embodiments. As shown in FIG. 1A, the portable electronic device 100 may be positioned in an area near (e.g., in contact with the surface of or within about one meter from the surface of) a portion of a human body that is to be imaged and/or analyzed. The portable electronic device 100 may include imaging elements 104 that are configured to transmit and/or receive radiation signals. The imaging elements 104, along with other components and functions of the portable electronic device 100 according to some embodiments of the present disclosure, will be described in greater detail below with reference to FIGs. 2A-2B. An internal image 110 as shown in FIG. 1B may be generated by the portable electronic device 100. The internal image 110 may be a three-dimensional internal image of a portion of the human body that appears to a viewer 117 to project upward from a surface of the portable electronic device 100, giving the viewer the perception of a viewing window into the underlying body. Through generation of the internal image, the portable electronic device 100 may provide a window into the internal areas of the human body that are below the surface. As will be described in greater detail below with reference to FIGs. 6A-6C, the generated images may be real-time continuous images such that the images are dynamically updated based on movement of the portable electronic device 100 and/or the image target (e.g., internal organs of the human body).
[0050] FIG. 2A illustrates a front view of portable electronic device 100 including an imaging interface 102 according to some embodiments of the present disclosure. The imaging interface 102 of the portable electronic device 100 may include a display that is configured to output a two-dimensional (2-D) or three-dimensional (3-D) image of an imaging target. In some embodiments, the imaging interface 102 is interactive and is capable of receiving user input, for example through a touch-screen. An image that is displayed via the imaging interface 102 may be adjusted based on the received inputs, for example, to adjust zoom level, centering position, level of detail, depth of an underlying object to be imaged, resolution, brightness, color and/or the like of the image. For example, in some embodiments, imaging interface 102 may be configured to allow a user to selectively traverse various layers and imaging depths of the underlying object using, for example, the touch screen.
[0051] Portable electronic device 100 may render a three-dimensional image of the imaging target using any suitable method or combination of methods (e.g., anaglyph, polarization, eclipse, interference filtering, and/or autostereoscopy). For example, in some embodiments, the imaging interface 102 includes a circular polarizer and/or a linear polarizer such that a viewer having polarizing filtering spectacles can view a three-dimensional image. In some embodiments, the imaging interface 102 is configured to display alternating left and right images such that a viewer having spectacles with shutters that alternate in conjunction with the displayed image views the image as a three-dimensional image. In some embodiments, the imaging interface 102 may utilize an autostereoscopy method such that 3-D spectacles are not necessary for use by a viewer to view the three-dimensional image.
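As one concrete instance of the anaglyph option above (a sketch only; the application does not prescribe an implementation, and the function name and array layout are assumptions), a red-cyan composite can be formed from separately rendered left- and right-eye views:

    import numpy as np

    def red_cyan_anaglyph(left_rgb, right_rgb):
        # HxWx3 float arrays in [0, 1]: red channel from the left-eye
        # view, green/blue from the right-eye view, so red/cyan
        # spectacles route each rendering to the matching eye.
        out = np.empty_like(left_rgb)
        out[..., 0] = left_rgb[..., 0]
        out[..., 1:] = right_rgb[..., 1:]
        return out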
[0052] In some embodiments, portable electronic device 100 may display information (e.g., text and/or graphics) in addition to (e.g., graphically overlaid on top of or adjacent to) an image of a targeted object, such as, for example, text and/or graphics identifying the structure(s) identified in the image (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts). In some embodiments, portable electronic device 100 may include one or more processors for identifying structure(s) identified in the image based at least in part on stored data (e.g., data stored in random access memory or other storage device of portable electronic device 100). For example, data stored within device 100 may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100. In some embodiments, data stored within device 100 may identify characteristics of particular disease(s) such as cancer or other abnormalities for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100. In some embodiments, the image, text, graphics, and/or other information displayed on the imaging interface 102 may be adjusted through user interaction with one or more inputs (e.g., touch screen, buttons, touch-sensitive areas, or the like) of the portable electronic device 100.
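The stored-characteristics idea above could be realized in many ways; one minimal sketch (the function name, library layout, and nearest-neighbor rule are all assumptions, not the application's method) compares a feature vector extracted from the rendered image against stored exemplars:

    import numpy as np

    def identify_structure(features, library):
        # library maps a structure label (e.g., "kidney") to a stored
        # exemplar feature vector (NumPy array); return the nearest label.
        best_label, best_dist = None, float("inf")
        for label, exemplar in library.items():
            d = np.linalg.norm(features - exemplar)
            if d < best_dist:
                best_label, best_dist = label, d
        return best_label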
[0053] FIG. 2B illustrates a back view of portable electronic device 100 including imaging elements 104 according to some embodiments of the present disclosure. The imaging elements 104 may be configured as sources (emitters) and/or sensors of ultrasound radiation and/or other radiation. In some embodiments, the imaging elements 104 may be of substantially the same size and/or may be arranged in an array as shown in FIG. 2B. In some embodiments, the imaging elements 104 may be of different sizes and/or arranged in an irregular or scattered configuration. In some embodiments, one or more (e.g., all) of the imaging elements 104 may be arranged in the same plane. In other embodiments, at least some of the imaging elements may be arranged in at least two different planes. In some embodiments, all of the imaging elements 104 included in the portable electronic device 100 may be either emitting elements or sensing elements. In some embodiments, the imaging elements 104 may include both emitting elements and sensing elements. The embodiment shown in FIG. 2B includes a 4x6 array of imaging elements 104 by way of illustration only, and is not intended to be limiting. In other embodiments, any other suitable numbers of imaging elements may be provided (e.g., 10, 20, 30, 40, 50, 100, 200, 500, 1000, or any number in between, or more) and may be arranged in any suitable configuration.
[0054] In some embodiments, the imaging elements 104 may be integrated within a circuit board (e.g., a printed circuit board) that includes, for example, processing (e.g., image processing) components of the portable electronic device 100. In some embodiments, the imaging elements 104 may be provided on a separate circuit board, or a separate layer of a circuit board, from the processing components of the portable electronic device 100, and may be in communication with the processing circuitry through a suitable communications link (e.g., an internal bus, USB link, or other port). In some embodiments, as will be described in connection with FIGs. 14A-14K below, the imaging elements 104 may be microfabricated on a semiconductor chip having the processing circuitry.
[0055] The imaging elements 104 according to some embodiments of the present disclosure may include their own dedicated processing circuitry, such as a graphic processing unit (GPU), digital signal processor (DSP), and/or central processing unit (CPU), and/or may utilize processing circuitry of the portable electronic device 100. For example, in some embodiments, the CPU and/or GPU of the portable electronic device 100 may be utilized for image acquisition/reconstruction and image rendering. In some embodiments, the CPU of portable electronic device 100 may be utilized to process computations based on received signals (e.g., back-scattered signals and/or transmissive signals) in order to generate an image or topography, while the GPU may be utilized to render an image based on the information received from the CPU to generate a real-time or substantially real-time image display. In some embodiments, portable electronic device 100 may include one or more components for processing, filtering, amplification, and/or rendering images.
[0056] FIG. 3 illustrates a transmissive imaging system and method 301 according to some embodiments of the present disclosure. As shown in FIG. 3, the transmissive imaging system 301 includes two portable electronic devices 100A and 100B that are on opposing or generally opposing sides of an imaging target 306. In other embodiments, devices 100A and 100B may be positioned in any other relationship with respect to one another. In some embodiments, devices 100A and/or 100B may include one or more sensors for determining the relative positions of these devices to aid in the generation of image(s). While shown as a portable electronic device 100B (e.g., smart phone), in some embodiments device 100B may be a dedicated sensing and/or emitting device such as an array of ultrasound elements and associated circuitry. Signals (e.g., waves or beams 308) emitted from the portable electronic device 100B are sensed by the portable electronic device 100A and are utilized to render a 2-D or 3-D image 310 (e.g., real-time or substantially real-time image) of the target 306. In some embodiments, a generated 3-D image may be in the form of a pop-out image or a depth image. In some embodiments, the portable electronic device 100A may be configured to transmit signals (e.g., waves or beams) 308 through the target 306 to be received by the portable electronic device 100B. In some embodiments, the portable electronic device 100B may simultaneously or substantially simultaneously render an image (e.g., back view or alternate view or level of detail of an image rendered by device 100A) based at least in part on processing sensed signals. In some embodiments, the portable electronic devices 100A and/or 100B may communicate the results of the sensed signals to the other in order to generate or improve a rendered image, for example, by providing higher resolution and/or a greater frame rate. For example, a rendering device may send feedback to a signal emission device regarding the rendered image, and in response, the signal emitting device may adjust a power level, signal type, signal frequency, or other signaling parameter in order to improve the image rendered by the rendering device.
[0057] FIG. 4 illustrates a back-scatter or reflective imaging system and method 401 according to some embodiments of the present disclosure. As shown in FIG. 4, a portable electronic device 100 may utilize emission and/or sensing elements 104 in order to render an image 410 based at least in part on reflection (e.g., back-scatter effect) of the signals 408. In some embodiments, portable electronic device 100 is the only device utilized in order to image the target (e.g., to produce an image appearing as a window into a human body). For example, the portable electronic device 100 may include both radiation sources and sensors (e.g., separate sources and sensors, and/or multiple transducers functioning as both sources and sensors), where all or substantially all of the radiation utilized by the sensors to reconstruct image(s) is backscatter radiation or radiation produced through a similar effect.
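Back-scatter imaging of this kind rests on the standard pulse-echo relation, depth = c * t / 2, where t is the echo's round-trip arrival time and c the sound speed (about 1540 m/s in soft tissue); a one-line worked sketch (function name illustrative):

    def echo_depth(t_echo, c=1540.0):
        # The factor of 2 accounts for the round trip to the reflector.
        return c * t_echo / 2.0

    print(echo_depth(65e-6))  # an echo at 65 microseconds -> ~0.05 m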
[0058] FIG. 5 illustrates a transmissive and/or reflective imaging system and method 501 according to some embodiments of the present disclosure. As shown in FIG. 5, a plurality of devices, such as portable electronic devices 500A, 500B, 500C, and/or 500D may be utilized in order to render one or more image(s) 510 of target 506 on portable electronic device 500B. Each of the portable electronic devices 500A-500D may be configured to emit signals (e.g., waves or beams) 508 as shown in FIG. 5. The image 510, or alternate views of the image or imaged structure, may be rendered on the other portable electronic devices (e.g., 500A, 500C, and 500D) through communication with one-another. In some embodiments, each of the devices (e.g., 500A, 500C, and/or 500D) may be configured as emitting and/or sensing devices only. The image 510 that is rendered on portable device 500B may be based at least in part on signals 508 that are emitted by one or more of the devices 500A-500D, and which are sensed through reflection (e.g., back-scatter) and/or transmission by one or more of the devices 500A-500D.
[0059] In some embodiments, one or more portable electronic devices according to the present disclosure may generate and/or render an image based solely on signals received by one or more sensors (e.g., ultrasound transducers) of the device. In some embodiments, one or more portable electronic devices according to the present disclosure may generate and/or render an image based at least in part on information stored in memory (e.g., random access memory) of the portable device(s) identifying detail(s) regarding the structure(s), part(s), composition(s), and/or other characteristic(s) of object(s) to be imaged. For example, in some embodiments, when data received by one or more sensor(s) of the portable electronic devices indicates that the object being imaged is a particular body part or region, the portable electronic devices may use stored data in addition to the received data in order to generate an image of the object and/or its constituent part(s), and/or to provide additional detail or explanation regarding an object and/or its constituent parts. For example, stored data may be compared to the received data in order to determine differences between a previously rendered image or frame and a current image or frame such that an updated image can be generated by changing the output of pixels corresponding to the determined differences.
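A minimal sketch of the frame-differencing update described above (the threshold value, array layout, and function name are illustrative assumptions):

    import numpy as np

    def update_changed_pixels(prev_frame, new_frame, threshold=0.05):
        # Rewrite only pixels whose change exceeds the threshold; the
        # returned mask marks which pixels were redrawn.
        mask = np.abs(new_frame - prev_frame) > threshold
        out = prev_frame.copy()
        out[mask] = new_frame[mask]
        return out, mask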
[0060] In some embodiments of the present disclosure, the generated and/or rendered image may be a real-time or substantially real-time image that is dynamically updated based on movement of a portable electronic device 100 along a surface of an imaging target and/or motion of the imaging target. FIG. 6A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments. FIG. 6B illustrates a three-dimensional internal image 610 of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments. FIG. 6C illustrates a three-dimensional internal image 610 of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments. As shown in FIG. 6B, a three-dimensional internal image 610 of a portion of the human body may be generated and displayed to a viewer 617. The three-dimensional image 610 may appear to the viewer 617 as an image having variations in, for example, topography that correspond to the surfaces and/or other aspects or features of the internal portion of the body at the first position of the portable electronic device 100 as shown in FIG. 6A. The three-dimensional image 610 may be a real-time continuous image (e.g., video image) that is dynamically updated based on movement of the portable electronic device 100 and/or the internal portion of the body that is being analyzed. As shown in FIG. 6C, a different three-dimensional internal image 610 is displayed to the viewer 617 showing different underlying structures and/or aspects (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts). The three-dimensional internal image 610 shown in FIG. 6C corresponds to the internal image of the body portion corresponding to the second position of the portable electronic device 100 as shown in FIG. 6A. As shown in FIG. 6C, the internal image 610 is illustrated as a different image showing different topographical and/or other aspects or features of the body portion than the internal image 610 shown in FIG. 6B. As discussed above, through selection of different aspect ratios and/or zoom settings, as well as through positioning of the portable electronic device 600, different types of internal images of a target may be generated, such as a three-dimensional view of an entire organ or multiple organs.
[0061] In some embodiments, the imaging elements, including sensors and/or sources (e.g., transducers), may be provided on, in, or otherwise coupled to a case for a portable electronic device. FIG. 7A illustrates a front view of a portable electronic device 700 according to some embodiments of the present disclosure. The portable electronic device 700 includes an imaging interface 702. FIG. 7B illustrates a back view of the portable electronic device 700 according to some embodiments of the present disclosure. As shown in FIGs. 7A and 7B, unlike the portable electronic device 100, the portable electronic device 700 does not include imaging elements 104 as part of the main housing or enclosure of device 700. [0062] FIG. 7C illustrates a front view of a case 711 for a portable electronic device according to some embodiments of the present disclosure. FIG. 7D illustrates a back view of the case 711 including imaging elements for a portable electronic device according to some embodiments of the present disclosure. The case 711 may be configured to be attached to the portable electronic device so as to at least partially enclose the portable electronic device 700. In some embodiments, case 711 may simultaneously provide imaging capabilities to portable electronic device 700 and serve as a protective case. The case may be made of any suitable material such as rubber, plastic, leather, and/or the like. As shown in FIG. 7D, an imaging circuit 712 (e.g., an integrated circuit) may be provided on (e.g., directly on), embedded in, and/or otherwise coupled to the back surface and/or other surface(s) of the case 711. Case 711 may be considered part of portable electronic device 700.
[0063] The imaging circuit 712 may include one or more imaging elements 104. As discussed above, the imaging elements 104 may include sources and/or sensors. The imaging circuit 712 may also include a communication device 714 configured to communicate with the portable electronic device 700 via a wired or wireless link. For example, the imaging circuit 712 may include a communication transmitter/receiver which utilizes an infrared signal, a Bluetooth communication signal, a near-field communication signal, and/or the like to communicate with the portable electronic device 700. In some embodiments, the communication device 714 may be in communication with the processing circuitry of a portable electronic device through a wired communications link (e.g., a USB port, or other data port), or combination of wired and wireless links. In some embodiments, the imaging circuit 712 may receive power through wired and/or wireless connection(s) to the portable electronic device. In some embodiments, the imaging circuit 712 may receive power from a separate power source (e.g., a battery) that is coupled to the imaging circuit 712. In some embodiments, when the portable electronic device 700 is coupled to or attached to the case 711, a software application and/or drivers are automatically loaded and/or executed by the portable electronic device 700 in order to render an image based on communication with the imaging circuit 712. The software application and/or drivers may be stored in a memory of the imaging circuit 712 and communicated to the portable electronic device 700 and/or may be retrieved by the portable electronic device through a network (e.g., the internet). [0064] In some embodiments, the portable electronic device 700 receives raw data from the communication device 714 and processes the raw data using processing circuitry (e.g., image signal processor, digital signal processor, filters, and/or the like) included in the portable electronic device 700. In some embodiments, the imaging circuit 712 includes a local imaging processor 716 configured to process signals received by imaging elements 104. The communication device 714 may be configured to communicate data received from the imaging elements 104 (e.g., such as raw sensor data) and/or may communicate processed data that is received from the local imaging processor 716. As shown in FIG. 7A, the portable electronic device 700 includes an interface 702 for displaying an image that is rendered by processing signals received from the communication device 714.
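The two data paths described above (raw data processed on the handset versus data pre-processed by the local imaging processor 716) might be dispatched as in the following hedged sketch; the packet format and pipeline object are illustrative assumptions, not structures defined by the disclosure.

```python
# Sketch of the two data paths in [0064]: the handset either processes raw
# sensor samples itself or displays data already processed by the case's
# local imaging processor 716.

class HandsetPipeline:
    def process(self, samples):
        # Placeholder for the handset's own chain (filtering, beamforming,
        # image formation) applied to raw samples.
        return [abs(s) for s in samples]

def handle_packet(packet, pipeline):
    if packet["kind"] == "raw":
        return pipeline.process(packet["samples"])   # raw-data path
    if packet["kind"] == "processed":
        return packet["image"]                       # pre-processed path
    raise ValueError("unknown packet kind")

image = handle_packet({"kind": "raw", "samples": [1, -2, 3]}, HandsetPipeline())
```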
[0065] In some embodiments, an imaging circuit (e.g., an integrated circuit) may be provided separately such that it can be mounted and/or attached to different cases used by different portable electronic devices. FIG. 8A illustrates a front view of a case 811A for a portable electronic device according to some embodiments of the present disclosure. FIG. 8B illustrates a back view of the case 811A including a retaining mechanism 820 for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure. FIG. 8C illustrates a front view of a case 811B for a portable electronic device according to some embodiments of the present disclosure. FIG. 8D illustrates a back view of the case 811B including a retaining mechanism for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure. FIG. 8E illustrates a modular unit 830 including an imaging circuit 712 according to some embodiments of the present disclosure. As shown in FIGs. 8A-8D, the case 811A has a different shape than the case 811B. The case 811A may be utilized for a first portable electronic device, while the case 811B may be utilized for a second portable electronic device having a different size and/or shape than the first portable electronic device. Each of the cases 811A and 811B includes a retaining mechanism 820 that is configured to retain the modular unit 830.
[0066] The modular unit 830 may include the imaging circuit 712 as discussed above with reference to FIGs. 7A-7D. The imaging circuit 712 may include one or more imaging elements 104, a communication device 714, and/or a local imaging processor 716. The modular unit 830 also includes a coupling mechanism 832 that is configured to engage with the retaining mechanism 820 of the cases 811A and 811B. For example, in some embodiments, the retaining mechanism 820 may correspond to a slot on the case 811A and/or 811B that is configured to receive the modular unit 830. The coupling mechanism 832 may be shaped to correspond to the slot of the case 811A and/or 811B such that the modular unit 830 may be secured by the case 811A and/or 811B. In some embodiments, the retaining mechanism 820 and the coupling mechanism 832 may include corresponding structures for locking the modular unit 830 in place during use. In some embodiments, the retaining mechanism 820 may include one or more magnets having a first polarity, and the coupling mechanism 832 may include one or more magnets having a second polarity that is opposite of the first polarity such that the modular unit 830 can be retained by the case 811A and/or 811B.
[0067] As described with reference to FIGs. 8A-8E, since the modular unit 830 may be incorporated with different cases 811A and/or 811B that are utilized for different portable electronic devices, the modular unit 830 may advantageously provide flexibility in the incorporation of an imaging system with different portable electronic devices. Furthermore, different cases 811A and 811B may be manufactured using any suitable techniques (e.g., 3-D printing, injection molding, or the like). In some embodiments, case 811A and/or case 811B may be manufactured at low cost such that the different cases 811A and 811B may be discarded and/or upgraded while remaining compatible with the modular unit 830. As a result, the modular unit 830 can be integrated into and utilized by a user with a plurality of portable electronic devices even when the design of the portable electronic devices is changed (e.g., updated and/or upgraded).
[0068] Examples of suitable imaging devices that may be integrated within or coupled to a portable electronic device according to some embodiments of the present disclosure are described in connection with FIGs. 9A-9F, 13, and 14A-14K below, and in commonly-owned U.S. Patent Application Serial No. 13/654,337 filed October 17, 2012, and entitled "Transmissive Imaging and Related Apparatus and Methods;" U.S. Provisional Application Serial No. 61/798,851 filed March 15, 2013, and entitled "Monolithic Ultrasonic Imaging Devices, Systems and Methods;" and U.S. Provisional Application Serial No. 61/794,744 filed on March 15, 2013, and entitled "Complementary Metal Oxide Semiconductor (CMOS) Ultrasonic Transducers and Methods for Forming the Same," each of which is incorporated by reference in its entirety.
[0069] FIG. 9A illustrates how, in some embodiments, a single transducer element 104 may fit within a larger transducer array 900. FIGs. 9B-9F show five different examples of how a given transducer element 104 comprised of circular transducer cells 902 within an array 900 might be configured in some embodiments. As shown in FIG. 9B, in some embodiments, each transducer element 104 in an array 900 may include only a single transducer cell 902 (e.g., a single CUT or CMUT). As shown in FIGs. 9C-9F, in other embodiments, each transducer element 104 in an array 900 may include a group of individual transducer cells 902 (e.g., CUTs or CMUTs). Other possible configurations of transducer elements 104 include trapezoidal elements, triangular elements, hexagonal elements, octagonal elements, etc. Similarly, each transducer cell 902 (e.g., CUT or CMUT) making up a given transducer element 104 may itself take on any of the aforementioned geometric shapes, such that a given transducer element 104 may, for example, include one or more square transducer cells 902, rectangular transducer cells 902, circular transducer cells 902, asterisk-shaped transducer cells 902, trapezoidal transducer cells 902, triangular transducer cells 902, hexagonal transducer cells 902, and/or octagonal transducer cells 902, etc.
[0070] In some embodiments, at least two (e.g., all) of the transducer cells 902 within each given transducer element 104 act as a unit and together generate outgoing ultrasonic pulses in response to the output of the same pulser (described below) and/or together receive incident ultrasonic pulses and drive the same analog reception circuitry. When multiple transducer cells 902 are included in each transducer element 104, the individual transducer cells 902 may be arranged in any of numerous patterns, with the particular pattern being chosen so as to optimize the various performance parameters, e.g., directivity, signal-to-noise ratio (SNR), field of view, etc., for a given application. In some embodiments in which CUTs are used as transducer cells 902, an individual transducer cell 902 may, for example, be on the order of about 20-110 μm wide, and have a membrane thickness of about 0.5-1.0 μm, and an individual transducer element 104 may have a depth on the order of about 0.1-2.0 μm, and have a diameter of about 0.1 mm-3 mm, or any values in between. These are only illustrative examples of possible dimensions, however, and greater and lesser dimensions are possible and contemplated.
[0071] Fig. 10A shows an illustrative example of a monolithic ultrasound device 1000 according to some embodiments. As shown, the device 1000 may include one or more transducer arrangements (e.g., arrays) 900, a transmit (TX) control circuit 1004, a receive (RX) control circuit 1006, a timing & control circuit 1008, a signal conditioning/processing circuit 1010, a power management circuit 1018, and/or a high-intensity focused ultrasound (HIFU) controller 1020. In the embodiment shown, all of the illustrated elements are formed on a single semiconductor die 1012. It should be appreciated, however, that in alternative embodiments one or more of the illustrated elements may instead be located off-chip. In addition, although the illustrated example shows both a TX control circuit 1004 and an RX control circuit 1006, in alternative embodiments only a TX control circuit or only an RX control circuit may be employed. For example, such embodiments may be employed in a circumstance where one or more transmission-only devices 1000 are used to transmit acoustic signals and one or more reception-only devices 1000 are used to receive acoustic signals that have been transmitted through or reflected by a subject being ultrasonically imaged.
[0072] Fig. 10B is a block diagram illustrating how, in some embodiments, the TX control circuit 1004 and the RX control circuit 1006 for a given transducer element 104 may be used either to energize the transducer element 104 to emit an ultrasonic pulse, or to receive and process a signal from the transducer element 104 representing an ultrasonic pulse sensed by it. In some implementations, the TX control circuit 1004 may be used during a "transmission" phase, and the RX control circuit may be used during a "reception" phase that is non-overlapping with the transmission phase. In other implementations, one of the TX control circuit 1004 and the RX control circuit 1006 may simply not be used in a given device 1000, such as when a pair of ultrasound units 200 is used for only transmissive imaging. As noted above, in some embodiments, a device 1000 may alternatively employ only a TX control circuit 1004 or only an RX control circuit 1006, and aspects of the present technology do not necessarily require the presence of both such types of circuits. In various embodiments, each TX control circuit 1004 and/or each RX control circuit 1006 may be associated with a single transducer cell 902 (e.g., a CUT or CMUT), a group of two or more transducer cells 902 within a single transducer element 104, a single transducer element 104 comprising a group of transducer cells 902, a group of two or more transducer elements 104 within an array 900, or an entire array 900 of transducer elements 104.
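The configurable granularity of this association can be sketched as a simple grouping function. The function and its string-valued scopes are an illustrative abstraction only; the disclosure defines no such data structure.

```python
# Sketch of the configurable association in [0072]: a TX/RX control-circuit
# pair may be bound to a single cell, a whole element, or the entire array
# (intermediate group scopes are analogous and omitted for brevity).

def group_units(n_elements: int, cells_per_element: int, scope: str):
    """Return the transducer groupings that each TX/RX pair would serve."""
    cells = [(e, c) for e in range(n_elements) for c in range(cells_per_element)]
    if scope == "cell":
        return [[cell] for cell in cells]                 # one pair per cell
    if scope == "element":
        return [cells[e * cells_per_element:(e + 1) * cells_per_element]
                for e in range(n_elements)]               # as in FIG. 10B
    if scope == "array":
        return [cells]                                    # one pair for all
    raise ValueError(scope)

groups = group_units(n_elements=4, cells_per_element=9, scope="element")
```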
[0073] In the example shown in Fig. 10B, there is a separate TX control circuit 1004/RX control circuit 1006 combination for each transducer element 104 in the array(s) 900, but there is only one instance of each of the timing & control circuit 1008 and the signal conditioning/processing circuit 1010. Accordingly, in such an implementation, the timing & control circuit 1008 may be responsible for synchronizing and coordinating the operation of all of the TX control circuit 1004/RX control circuit 1006 combinations on the die 1012, and the signal conditioning/processing circuit 1010 may be responsible for handling inputs from all of the RX control circuits 1006 (see element 1005 in Fig. 10B) on the die 1012.
[0074] As shown in Fig. 10B, in addition to generating and/or distributing clock signals to drive the various digital components in the device 1000, the timing and control circuit 1008 may output either a "TX enable" signal to enable the operation of each TX control circuit 1004, or an "RX enable" signal to enable operation of each RX control circuit 1006. In the example shown, a switch 1003 in the RX control circuit 1006 may always be opened before the TX control circuit 1004 is enabled, so as to prevent an output of the TX control circuit 1004 from driving the RX control circuit 1006. The switch 1003 may be closed when operation of the RX control circuit 1006 is enabled, so as to allow the RX control circuit 1006 to receive and process a signal generated by the transducer element 104.
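The interlock just described (switch 1003 always opens before TX enable is asserted) can be summarized in a small control-sequence sketch. The controller class is a mock written for illustration and is not an interface defined by the disclosure.

```python
# Illustrative sequencing for the interlock in [0074]: the RX-path switch
# is always opened before TX is enabled, so the pulser output can never
# drive the RX circuit; the switch closes again only for reception.

class SwitchController:
    def __init__(self):
        self.rx_switch_closed = False
        self.tx_enabled = False

    def begin_transmit(self):
        assert not self.rx_switch_closed, "open RX switch before enabling TX"
        self.tx_enabled = True

    def end_transmit(self):
        self.tx_enabled = False

    def begin_receive(self):
        assert not self.tx_enabled, "disable TX before closing RX switch"
        self.rx_switch_closed = True   # switch 1003 closes; RX may process

ctrl = SwitchController()
ctrl.begin_transmit()   # RX switch open -> safe to pulse
ctrl.end_transmit()     # transmission phase ends
ctrl.begin_receive()    # now the RX switch may close
```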
[0075] As shown, the TX control circuit 1004 for a respective transducer element 104 may include both a waveform generator 1007 and a pulser 1009. The waveform generator 1007 may, for example, be responsible for generating a waveform that is to be applied to the pulser 1009, so as to cause the pulser 1009 to output a driving signal to the transducer element 104 corresponding to the generated waveform.
[0076] In the example shown in Fig. 10B, the RX control circuit 1006 for a respective transducer element 104 includes an analog processing block 1011, an analog-to-digital converter (ADC) 1013, and a digital processing block 1015. The ADC 1013 may, for example, comprise a 10-bit, 20Msps, 40Msps, or 80Msps ADC. [0077] After undergoing processing in the digital processing block 1015, the outputs of all of the RX control circuits 1006 on the die 1012 (the number of which, in this example, is equal to the number of transducer elements 104 on the chip) are fed to a multiplexer (MUX) 1017 in the signal conditioning/processing circuit 1010. The MUX 1017 multiplexes the digital data from the various RX control circuits 1006, and the output of the MUX 1017 is fed to a multiplexed digital processing block 1019 in the signal conditioning/processing circuit 1010, for final processing before the data is output from the die 1012, e.g., via one or more high-speed serial output ports 1014. Examples of implementations of the various circuit blocks shown in Fig. 10B are discussed further below. As explained in more detail below, various components in the analog processing block 1011 and/or the digital processing block 1015 may serve to decouple waveforms from the received signal and otherwise reduce the amount of data that needs to be output from the die 1012 via a high-speed serial data link or otherwise. In some embodiments, for example, one or more components in the analog processing block 1011 and/or the digital processing block 1015 may thus serve to allow the RX control circuit 1006 to receive transmitted and/or scattered ultrasound pressure waves with an improved signal-to-noise ratio (SNR) and in a manner compatible with a diversity of waveforms. The inclusion of such elements may thus further facilitate and/or enhance the disclosed "ultrasound-on-a-chip" solution in some embodiments.
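The per-element-to-shared dataflow of this stage might be modeled as follows. The round-robin interleaving and int16 packing are plausible illustrative choices, not requirements of the disclosure.

```python
import numpy as np

# Dataflow sketch of [0077]: each RX control circuit produces a digital
# sample stream; the MUX interleaves the streams, and a shared block packs
# the result for the high-speed serial output port.

def multiplex(channels: list[np.ndarray]) -> np.ndarray:
    """Round-robin interleave equal-length per-element sample streams."""
    return np.stack(channels, axis=1).reshape(-1)

def serialize(muxed: np.ndarray) -> bytes:
    """Final packing before the data leaves the die."""
    return muxed.astype(np.int16).tobytes()

chans = [np.full(4, i) for i in range(3)]   # three per-element streams
stream = serialize(multiplex(chans))
```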
[0078] Although particular components that may optionally be included in the analog processing block 1011 are described below, it should be appreciated that digital counterparts to such analog components may additionally or alternatively be employed in the digital processing block 1015. The converse is also true. That is, although particular components that may optionally be included in the digital processing block 1015 are described below, it should be appreciated that analog counterparts to such digital components may additionally or alternatively be employed in the analog processing block 1011.
[0079] Fig. 11 illustrates an example of a technique for biasing the transducer elements 104 in an array 900. As shown, the side of each of the transducer elements 104 that faces the patient may be connected to ground, so as to minimize risk of electric shock. The other side of each transducer element 104 may be connected to the output of the pulser 1009 via a resistor 1102. Accordingly, each transducer element 104 is always biased via the output of the pulser 1009, regardless of whether the switch S1 is open or closed. In some embodiments, e.g., embodiments employing transducer elements 104 comprising one or more CUTs or CMUTs, the bias voltage applied across the element may be on the order of 100V.
[0080] As illustrated in the accompanying timing diagram of Fig. 11, the switch S1 may be closed during a transmit operation and may be open during a receive operation. Conversely, the switch S2 may be closed during a receive operation and may be open during a transmit operation. (Note that there is always a gap between the opening of switch S1 and the closing of switch S2, as well as between the opening of switch S2 and the closing of switch S1, so as to ensure the pulser 1009 does not apply an outgoing pulse to the LNA 1101 in the RX control circuit 1006.)
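The non-overlapping switch sequencing can be sketched as below; the 100 ns guard interval and the switch objects are illustrative assumptions only.

```python
import time

# Sketch of the non-overlapping S1/S2 sequencing in [0080]: a dead time
# separates every opening of one switch from the closing of the other, so
# the pulser output can never reach the LNA.

GUARD_S = 100e-9  # dead time between transitions (assumed value)

class Switch:
    def __init__(self, name):
        self.name, self.closed = name, False
    def open(self):
        self.closed = False
    def close(self):
        self.closed = True

def to_receive_phase(s1, s2):
    s1.open()               # end transmit path
    time.sleep(GUARD_S)     # guard gap: both switches open
    s2.close()              # begin receive path

def to_transmit_phase(s1, s2):
    s2.open()
    time.sleep(GUARD_S)
    s1.close()

s1, s2 = Switch("S1"), Switch("S2")
to_transmit_phase(s1, s2)
to_receive_phase(s1, s2)
```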
[0081] As also shown in the timing diagram, the pulser 1009 may hold the bottom plate of the transducer element 104 at its high output level at all times except when it is applying a waveform pulse to its transducer element 104, and the waveform pulse applied during the transmit phase may be referenced from the high output level of the pulser 1009. Accordingly, each individual pulser 1009 is able to maintain a bias on its corresponding transducer element 104 at all times. As shown in Fig. 11, a capacitor 1104 may be placed between the switch S2 and the LNA 1101 of the RX control circuit 1006 so as to block the DC bias signal (i.e., the high output of the pulser 1009) from reaching the LNA 1101 during receive operations (i.e., when switch S2 is closed).
[0082] Biasing the transducer elements 104 via their respective pulsers 1009 may provide benefits in some embodiments, such as reducing cross-talk that would otherwise occur if the elements 104 were biased via a common bus, for example.
[0083] Fig. 12 shows an example implementation of the RX control circuit 1006 that includes a matched filter 1202 that may, for example, perform waveform removal and improve the signal-to-noise ratio of the reception circuitry. As shown in Fig. 12, the analog processing block 1011 may, for example, include a low-noise amplifier (LNA) 1201, a variable-gain amplifier (VGA) 1204, and a low-pass filter (LPF) 1206. In some embodiments, the VGA 1204 may be adjusted, for example, via a time-gain compensation (TGC) circuit. The LPF 1206 provides for anti-aliasing of the acquired signal. In some embodiments, the LPF 1206 may, for example, comprise a 2nd order low-pass filter having a frequency cutoff on the order of 5MHz. Other implementations are, however, possible and contemplated.
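The time-gain compensation mentioned above amplifies later (deeper) echoes more strongly. The following digital-domain sketch illustrates the idea; the linear 0.5 dB/µs ramp is an assumption for illustration, not a parameter of the disclosure.

```python
import numpy as np

# Digital-domain sketch of time-gain compensation (TGC) as in [0083]:
# gain grows with sample time to offset depth-dependent attenuation.

def apply_tgc(samples: np.ndarray, fs_hz: float,
              db_per_us: float = 0.5) -> np.ndarray:
    t_us = np.arange(samples.size) / fs_hz * 1e6   # sample times in microseconds
    gain = 10.0 ** (db_per_us * t_us / 20.0)       # dB ramp -> linear gain
    return samples * gain

echoes = apply_tgc(np.random.randn(4096), fs_hz=40e6)
```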
[0084] In the example of Fig. 12, the digital processing block 1015 of the RX control circuit 1006 includes a digital quadrature demodulation (DQDM) circuit 1208 and an output buffer 1216. The DQDM circuit 1208 may, for example, be configured to mix down the digitized version of the received signal from center frequency to baseband, and then low-pass filter and decimate the baseband signal. An illustrative embodiment of a circuit suitable for use as the matched filter 1202 is shown in FIG. 13.
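The DQDM stage (mix to baseband, low-pass filter, decimate) can be sketched as follows. The moving-average filter and decimation factor below are simplistic illustrative stand-ins; a real design would use a properly designed anti-aliasing FIR filter.

```python
import numpy as np

# Sketch of the DQDM stage in [0084]: mix the digitized RF signal down from
# the center frequency fc to baseband, low-pass filter, then decimate.

def dqdm(rf: np.ndarray, fs: float, fc: float, decim: int = 4) -> np.ndarray:
    n = np.arange(rf.size)
    baseband = rf * np.exp(-2j * np.pi * fc * n / fs)   # complex mix-down
    # Moving average as a crude anti-alias low-pass filter (illustrative).
    kernel = np.ones(2 * decim) / (2 * decim)
    filtered = np.convolve(baseband, kernel, mode="same")
    return filtered[::decim]                            # decimate

iq = dqdm(np.random.randn(4096), fs=40e6, fc=2.5e6)
```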
[0085] Although labeled a "matched" filter, the filter circuit 1202 may actually operate as either a matched filter or a mismatched filter so as to decouple waveforms from the received signal. The matched filter 1202 may work for either linear frequency modulated (LFM) or non-LFM pulses.
[0086] As shown in FIG. 13, the matched filter 1202 may, for example, include a padding circuit 1302, a fast Fourier transformation (FFT) circuit 1304, a multiplier 1306, a low-pass filter 1308, a decimator circuit 1310, and an inverse FFT circuit 1312. If employed, the padding circuit 1302 may, for example, apply padding to the incoming signal sufficient to avoid artifacts from an FFT implementation of circular convolution.
[0087] To operate as a "matched" filter, the value of "H(ω)" applied to the multiplier 1306 should be a conjugate of the transmission waveform Tx(ω). In some embodiments, the filter 1202 may thus indeed operate as a "matched" filter, by applying a conjugate of the transmission waveform Tx(ω) to the multiplier 1306. In other embodiments, however, the "matched" filter 1202 may instead operate as a mismatched filter, in which case some value other than a conjugate of the transmission waveform Tx(ω) may be applied to the multiplier 1306.
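The FFT-domain structure of FIG. 13 can be illustrated with a minimal sketch, assuming the low-pass filtering and decimation stages are omitted for brevity; substituting any other H(ω) for the conjugate yields the mismatched variant.

```python
import numpy as np

# FFT-domain matched filter sketch following [0086]-[0087]: zero-pad to the
# full linear-convolution length (the role of the padding circuit 1302),
# transform, multiply by H(w) = conj(Tx(w)), and inverse-transform.

def matched_filter(rx: np.ndarray, tx_waveform: np.ndarray) -> np.ndarray:
    n = rx.size + tx_waveform.size - 1           # pad to avoid circular artifacts
    RX = np.fft.fft(rx, n)
    H = np.conj(np.fft.fft(tx_waveform, n))      # matched: conjugate of Tx
    return np.fft.ifft(RX * H).real

tx = np.sin(2 * np.pi * 2.5e6 * np.arange(64) / 40e6)    # illustrative pulse
rx = np.concatenate([np.zeros(100), tx, np.zeros(100)])  # delayed echo
peak = np.argmax(matched_filter(rx, tx))                 # locates echo at lag 100
```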
[0088] A process for forming an ultrasonic transducer (e.g., transducer cells 902) having a membrane above a cavity in a CMOS wafer is now described. Referring to FIG. 14A, the process may begin with a CMOS wafer 1400 including a substrate 1402, a dielectric or insulating layer 1404, a first metallization layer 1406 and a second metallization layer 1408, which in some embodiments may be a top metallization layer of the CMOS wafer 1400. [0089] The substrate 1402 may be silicon or any other suitable CMOS substrate. In some embodiments, the CMOS wafer 1400 may include CMOS integrated circuitry (IC), and thus the substrate 1402 may be a suitable substrate for supporting such circuitry.
[0090] The insulating layer 1404 may be formed of SiO2 or any other suitable dielectric insulating material. In some embodiments, the insulating layer 1404 may be formed via tetraethyl orthosilicate (TEOS), though alternative processes may be used.
[0091] While the CMOS wafer 1400 is shown as including two metallization layers 1406 and 1408, it should be appreciated that CMOS wafers according to the various aspects of the present application are not limited to having two metallization layers, but rather may have any suitable number of metallization layers, including more than two in some embodiments. Such metallization layers may be used for wiring (e.g., as wiring layers) in some embodiments, though not all embodiments are limited in this respect.
[0092] The first and second metallization layers 1406 and 1408 may have any suitable construction. In the embodiment illustrated, at least the second metallization layer 1408 may have a multi-layer construction, including a middle conductive layer 1412 (e.g., formed of aluminum or other suitable conductive material) and upper and lower liner layers 1410 and 1414, respectively. The liner layers 1410 and 1414 may be formed of titanium nitride (TiN) or other suitable conductive material (e.g., metals other than TiN, such as tantalum, or other suitable metals for acting as a liner). In some embodiments, the upper liner layer 1410 may be used as an etch stop, for example during one or more etch steps used as part of a process for forming a cavity for an ultrasonic transducer. Thus, the liner layer 1410 may be formed of a material suitable to act as an etch stop in some embodiments. Moreover, while not shown, the first and second metallization layers 1406 and 1408, as well as any other metallization layers described herein, may optionally include silicon oxynitride (SiON) as an upper layer (e.g., on top of liner layer 1410) to serve as an anti-reflective coating during lithography stages.
[0093] In some embodiments, it may be desirable to form, from the second metallization layer 1408, an electrode of an ultrasonic transducer. Also, the second metallization layer 1408 may be used to make electrical contact to a membrane of a CUT to be formed on the CMOS wafer. Accordingly, as shown in FIG. 14B, the second metallization layer 1408 may be suitably patterned to form an electrode 1416 and one or more contacts 1418.
[0094] While FIG. 14B illustrates a configuration in which an electrode and electrical contacts are formed on a CMOS wafer from a metallization layer, it should be appreciated that other manners of forming an electrode (e.g., electrode 1416) and/or electrical contacts (e.g., electrical contacts 1418) may be implemented. For example, conductive materials other than metals but suitable to act as electrodes and/or electrical contacts may be suitably processed on the CMOS wafer to form the illustrated electrode and/or electrical contacts.
[0095] An insulating layer 1420 may then be deposited as shown in FIG. 14C. The insulating layer 1420 may be SiO2 or any other suitable insulator, and may be formed in any suitable manner. In some embodiments, the insulating layer 1420 may be formed by high density plasma (HDP) deposition. The insulating layer 1420 may then be planarized (not shown), for example using chemical mechanical polishing (CMP) or other suitable planarization technique.
[0096] In FIG. 14D, the insulating layer 1420 may be etched as shown to expose the upper surface of the electrode 1416 and electrical contacts 1418. In some embodiments, the upper liner layer 1410 may be used as an etch stop for a selective etch used to etch the insulating layer 1420. As an example, the liner layer 1410 may be formed of TiN and may be used as an etch stop, though not all embodiments are limited in this respect.
[0097] A further insulating layer 1422 may be deposited as shown in FIG. 14E to cover the upper surfaces of the electrode 1416 and electrical contacts 1418 and may then be patterned as shown in FIG. 14F to open contact holes 1424 for the electrical contacts 1418. The insulating layer 1422 may be SiO2 or any other suitable insulator.
[0098] As shown in FIG. 14G, a conductive layer 1426 may be deposited. The conductive layer may be used to form electrical contacts to a membrane of an ultrasonic transducer, as will be shown in connection with FIG. 14J. Also, the conductive layer 1426 may be patterned to form a cavity therein for a CUT, with a remaining portion of the conductive layer 1426 defining one or more sidewalls of the cavity. In some embodiments, then, the conductive layer 1426 may also represent a spacer in that a membrane may be separated from the surface of the CMOS wafer 1400 by the height of the conductive layer 1426. Thus, the conductive layer 1426 may serve one or more of multiple possible functions.
[0099] The conductive layer 1426 may be formed of any suitable conductive material. In some embodiments, the conductive layer 1426 may be formed of a metal. For example, the conductive layer 1426 may be TiN in some embodiments.
[0100] The conductive layer 1426 may be planarized (not shown) using CMP or other suitable planarization technique, and then may be patterned as shown in FIG. 14H to form contacts 1428. It can be seen that at this stage a cavity 1430 has been formed in the CMOS wafer with the contacts 1428 serving to at least partially define the cavity. The contacts 1428 (which in some embodiments may represent a single contact forming a closed contour) function as sidewalls of the cavity 1430 in the embodiment illustrated and, as will be further appreciated from consideration of FIG. 14K, create a standoff between the electrode 1416 and a membrane overlying the cavity 1430.
[0101] As shown in FIGs. 14I-14J, a second wafer 1431 may be bonded to the CMOS wafer. In general, the second wafer may be any suitable type of wafer, such as a bulk silicon wafer, a silicon-on-insulator (SOI) wafer, or an engineered substrate including a polysilicon or amorphous silicon layer with an insulating layer between a single crystal silicon layer and the polysilicon or amorphous silicon layer. In the embodiment illustrated, the second wafer 1431 may include four layers including a base layer or handle layer 1432, insulating layer 1434, layer 1436, and layer 1438. The second wafer 1431 may be used to transfer layers 1436 and 1438 to the CMOS wafer for forming a membrane over cavity 1430, and thus may be referred to herein as a transfer wafer.
[0102] As a non-limiting example of suitable materials making up the second wafer 1431, the base layer 1432 may be a silicon layer (e.g., single crystal silicon), the insulating layer 1434 may be SiO2 and may represent a buried oxide (BOX) layer, and layer 1436 may be silicon. In some embodiments, the layer 1436 may be degenerately doped silicon phosphide (SiP+). In some embodiments, the layer 1436 may be polysilicon or amorphous silicon, though other embodiments may utilize single crystal silicon. The layer 1438 may be formed of a material suitable for bonding to the contacts 1428 on the CMOS wafer. For example, the contacts 1428 and layer 1438 may be formed of the same material. In some embodiments, the contacts 1428 and layer 1438 may be formed of TiN.
[0103] The process used for bonding the second wafer 1431 to the CMOS wafer 1400 may be a low temperature bonding process, for example not exceeding 450° C. In some embodiments, the temperature of the bonding process may be between approximately 200° C and 450° C, between approximately 300° C and approximately 400° C, any temperature(s) within those ranges, any other temperature described herein for low temperature bonding, or any other suitable temperature. Thus, damage to the metallization layers on the CMOS wafer, and any ICs on the CMOS wafer, may be avoided.
[0104] The wafer bonding process may be one of various types. In some embodiments, the wafer bonding may be direct bonding (i.e., fusion bonding). Thus, the wafer bonding may involve energizing respective surfaces of the CMOS and second wafers and then pressing the wafers together with suitable pressure to create the bond. A low temperature anneal may be performed. While fusion bonding represents one example of a suitable bonding technique, other bonding techniques may alternatively be used, including for example bonding two wafers through the use of one or more intermediate layers (e.g., adhesive(s)). In some embodiments, anodic or plasma assisted bonding may be used.
[0105] The bonding illustrated in FIGs. 14I-14J may result in the second wafer 1431 being monolithically integrated with the CMOS wafer 1400. Thus, the two may form a unitary body in some situations.
[0106] A membrane may then be formed from the second wafer 1431. The second wafer 1431 may be thinned from the backside. Such thinning may be performed in stages. For example, mechanical grinding providing coarse thickness control (e.g., 10 micron control) may initially be implemented to remove a relatively large amount of the bulk wafer. In some embodiments, the thickness control of the mechanical grinding may vary from coarse to fine as the thinning process progresses. Then, CMP may be performed on the backside, for example to get to a point close to the layer 1436. Next, a selective etch, such as a selective chemical etch, may be performed to stop on the layer 1436. Other manners of thinning are also possible. [0107] Thus, as shown in FIG. 14K, the base layer or handle layer 1432 and insulating layer 1434 may be removed. A membrane 1440 formed of the layer 1436 and layer 1438 may remain. The membrane may be any suitable thickness TM, non-limiting examples of which are described below. In some embodiments, the layer 1436 may be etched or otherwise thinned to provide a desired membrane thickness.
[0108] Various features of the structure illustrated in FIG. 14K are noted. First, the structure includes a sealed cavity 1430 which is sealed by the membrane 1440. Also, the sidewalls of the cavity are conductive, i.e., the contacts 1428 are conductive and form the sidewalls of the sealed cavity. In this respect, the contacts 1428 represent a conductive standoff for the membrane 1440 from the surface of the CMOS wafer. The contacts 1428 may be relatively large area electrical contacts and make contact with a relatively large area of the membrane, thus providing a low resistivity electrical path to/from the membrane. For example, the contacts may provide an electrical connection between the membrane and an IC on the CMOS wafer (e.g., disposed beneath the cavity) which may interact with the membrane to provide/receive electrical signals and thus in some embodiments control operation of the membrane.
[0109] Moreover, it is noted that the membrane 1440 has a first side 1442 proximate the cavity 1430 and a second side 1444 distal the cavity, and that direct electrical contact is made to the first side 1442 via the contacts 1428. The first side 1442 may be referred to as a bottom side of the membrane and the second side 1444 may be referred to as a top side of the membrane. Local connection to the membrane 1440 may be made in this manner, and the membrane 1440 may be connected to integrated circuitry in the CMOS wafer via this connection (e.g., via contact 1418). In some embodiments, an IC may be positioned beneath the cavity 1430 and the conductive path configuration illustrated may facilitate making connection between the integrated circuitry beneath the cavity and the membrane 1440. The configuration of FIG. 14K provides a non-limiting example of an embedded contact to the membrane, in that electrical contact is provided by way of a conductive path in the CMOS wafer (e.g., to contact 1418) rather than a contact made on the second side 1444. Such a configuration may be preferable to making electrical contact on the second side 1444 since any contact on the second side 1444 may (negatively) impact vibration of the membrane 1440.
[0110] Also, it is noted that in the embodiment of FIG. 14K the electrode 1416 is narrower than the cavity 1430. Namely, the electrode 1416 has a width W1 less than a width W2 of the cavity 1430. Such a configuration may be desirable at least in those embodiments in which the cavity has conductive sidewalls (e.g., the contacts 1428) to provide electrical isolation between the sidewalls and the electrode.
[0111] Moreover, it is noted that the structure of FIG. 14K may be altered by not including the layer 1438 in an embodiment. Thus, in an embodiment a direct bond may be formed between contacts 1428 (e.g., formed of TiN) and layer 1436 (e.g., silicon).
[0112] The structure illustrated in FIG. 14K may have any suitable dimensions. Non- limiting examples of dimensions for the membrane 1440 and cavity 1430 are described further below.
[0113] As non-limiting examples, the width W2 of the cavity 1430 may be between approximately 5 microns and approximately 500 microns, between approximately 20 microns and approximately 100 microns, may be approximately 30 microns, approximately 40 microns, approximately 50 microns, any width or range of widths in between, or any other suitable width. In some embodiments, the width may be selected to maximize the void fraction, i.e., the amount of area consumed by the cavity compared to the amount of area consumed by surrounding structures. The width dimension may also be used to identify the aperture size of the cavity, and thus the cavities may have apertures of any of the values described above or any other suitable values.
[0114] The depth D1 of the cavity 1430 may be between approximately 0.05 microns and approximately 10 microns, between approximately 0.1 microns and approximately 5 microns, between approximately 0.5 microns and approximately 1.5 microns, any depth or range of depths in between, or any other suitable depth. If the contacts 1428 are formed of TiN, it may be preferable in such embodiments for D1 to be less than 5 microns, since TiN is commonly formed as a thin film. In some embodiments, the cavity dimensions and/or the membrane thickness of any membrane overlying the cavity may impact the frequency behavior of the membrane, and thus may be selected to provide a desired frequency behavior (e.g., a desired resonance frequency of the membrane). For example, it may be desired in some embodiments to have an ultrasonic transducer with a center resonance frequency of between approximately 20 kHz and approximately 200 MHz, between approximately 1 MHz and approximately 10 MHz, between approximately 2 MHz and approximately 5 MHz, between approximately 50 kHz and approximately 200 kHz, of approximately 2.5 MHz, approximately 4 MHz, any frequency or range of frequencies in between, or any other suitable frequency. For example, it may be desired to use the devices in air, gas, water, or other environments, for example for medical imaging, materials analysis, or for other reasons for which various frequencies of operation may be desired. The dimensions of the cavity and/or membrane may be selected accordingly.
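As a rough illustration of how membrane geometry maps to resonance frequency, the textbook clamped circular plate approximation can be used. This formula and the nominal silicon material constants below are standard engineering estimates, not values taken from the present disclosure.

```python
import math

# Back-of-envelope resonance estimate using the clamped circular plate
# formula f0 = (10.21 / (2*pi*a^2)) * sqrt(D / (rho*t)), with flexural
# rigidity D = E*t^3 / (12*(1 - nu^2)). Nominal silicon constants assumed.

E, RHO, NU = 170e9, 2329.0, 0.28   # Young's modulus (Pa), density (kg/m^3), Poisson

def plate_f0(thickness_m: float, radius_m: float) -> float:
    D = E * thickness_m**3 / (12.0 * (1.0 - NU**2))
    return (10.21 / (2.0 * math.pi * radius_m**2)) * math.sqrt(D / (RHO * thickness_m))

# Example: a 1 um thick membrane over a 40 um wide (20 um radius) cavity
print(f"{plate_f0(1e-6, 20e-6) / 1e6:.1f} MHz")   # ~10 MHz for this geometry
```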
[0115] The membrane thickness TM (e.g., as measured in the direction generally parallel to the depth D1) may be less than 100 microns, less than 50 microns, less than 40 microns, less than 30 microns, less than 20 microns, less than 10 microns, less than 5 microns, less than 1 micron, less than 0.1 microns, any range of thicknesses in between, or any other suitable thickness. The thickness may be selected in some embodiments based on a desired acoustic behavior of the membrane, such as a desired resonance frequency of the membrane.
[0116] Also, it should be appreciated that the cavity 1430, and more generally the cavities of any embodiments described herein, may have various shapes, and that when multiple cavities are formed not all cavities need have the same shape or size. For example, FIGs. 22A-22D illustrate various potential shapes for cavity 1430 and the other cavities described herein. Specifically, FIGs. 22A-22D illustrate top views of a portion 2200 of a CMOS wafer having cavities 1430 formed therein of various shapes. FIG. 22A illustrates that the cavities 1430 may have a square aperture. FIG. 22B illustrates the cavities 1430 may have a circular aperture. FIG. 22C illustrates the cavities may have a hexagonal aperture. FIG. 22D illustrates the cavities 1430 may have an octagonal aperture. Other shapes are also possible.
[0117] While the portion 2200 is shown as including four cavities, it should be appreciated that aspects of the present application provide for one or more such cavities to be formed in a CMOS wafer. In some embodiments a single substrate (e.g., a single CMOS wafer) may have tens, hundreds, thousands, tens of thousands, hundreds of thousands, or millions of CUTs (and corresponding cavities) formed therein. [0118] FIG. 14K illustrates an ultrasonic transducer which has a membrane 1440 overlying the cavity 1430, wherein the membrane has a substantially uniform thickness. In some embodiments, it may be desirable for the membrane to have a non-uniform thickness. For example, it may be desirable for the membrane to be configured as a piston, with a center portion having a greater thickness than an outer portion of the membrane, non-limiting examples of which are described below.
[0119] Ultrasonic transducers such as that illustrated in FIG. 14K may be used to send and/or receive acoustic signals. The operation of the transducer in terms of power generated, frequencies of operation (e.g., bandwidth), and voltages needed to control vibration of the membrane may depend on the shape and size of the membrane. A membrane shaped as a piston with a center mass-like portion that is connected to a CMOS wafer by a thinner peripheral portion may provide various beneficial operating characteristics.
[0120] Accordingly, an aspect of the present application provides ultrasonic transducers having piston membranes. Such transducers may be formed by wafer bonding processes according to some embodiments of the present application. In general, the thicker center portion of such membranes may be formed on the top side or bottom side of the membrane, and may be formed prior to or after wafer bonding.
[0121] Having thus described several aspects and embodiments of the technology described herein, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the technology described in the present disclosure. For example, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described. In addition, any combination of two or more features, systems, articles, materials, kits, and/or methods described herein, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
[0122] The above-described embodiments can be implemented in any of numerous ways. One or more aspects and embodiments of the present disclosure involving the performance of processes or methods may utilize program instructions executable by a device (e.g., a computer, a processor, or other device) to perform, or control performance of, the processes or methods. In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a non-transitory computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above. In some embodiments, computer readable media may be non-transitory media.
[0123] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present application need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present application.
[0124] Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments. [0125] Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
[0126] When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
[0127] Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
[0128] Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
[0129] Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks. [0130] Also, as described, some aspects may be embodied as one or more methods. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
[0131] All definitions, as defined and used herein, should be understood to control over dictionary definitions and/or ordinary meanings of the defined terms.
[0132] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."
[0133] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
[0134] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
[0135] Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," or "having," "containing," "involving," and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.

Claims

What is claimed is:
1. A device, comprising:
a processor configured to generate an image of an internal feature of a target when the device is positioned at an external surface of the target; and
a display configured to display the image, wherein the device is a portable electronic device.
2. The device of claim 1, wherein the target is a human body or a portion of a human body, and wherein the image is a real-time continuous image of the internal feature of the target.
3. The device of claims 1 or 2, wherein the image is an ultrasound image.
4. The device of any of the preceding claims, wherein the device comprises at least one transducer configured to receive radiation from the target, and wherein the image is generated based at least in part on the radiation received by the at least one transducer.
5. The device of claim 4, wherein the at least one transducer is an ultrasound transducer.
6. The device of any of claims 1-5, wherein the device comprises a body and a removable case that at least partially encloses the body.
7. The portable electronic device of claim 6, wherein the body comprises the imaging interface.
8. The portable electronic device of claims 6 or 7, wherein the body comprises at least one of the one or more processors.
9. The portable electronic device of any of claims 6-8, wherein the case comprises at least one of the one or more processors.
10. The portable electronic device of any of claims 6-9, wherein the case comprises the plurality of imaging elements.
11. The portable electronic device of one of claims 6-10, wherein the case comprises a modular unit comprising the plurality of imaging elements, wherein the case and the modular unit are configured to allow the case to retain the modular unit and wherein the modular unit is also configured for retention by a second case for a second portable electronic device having a different size and/or shape than the first portable electronic device.
12. A portable ultrasound device, comprising:
a plurality of ultrasound elements configured to receive ultrasound radiation reflected by or passing through a target when the ultrasound device is pointed at the target; and
a display configured to display an image of an internal feature of the target based at least in part on the ultrasound radiation received by the plurality of ultrasound elements.
13. The portable ultrasound device of claim 12, further comprising at least one processor configured to render the image based at least in part on the ultrasound radiation received by the plurality of ultrasound elements.
14. A method, comprising:
pointing a portable electronic device at an external surface of a subject; and viewing, on a display of the portable electronic device, an image of an internal feature of the subject while pointing the portable electronic device at the external surface of the subject.
15. The method of claim 14, wherein the portable electronic device comprises a radiation sensor, and wherein the method further comprises receiving, with the radiation sensor, radiation reflected by or passing through the subject, and creating the image of the internal feature based at least in part on the radiation received by the radiation sensor.
16. The method of claim 15, wherein the radiation comprises ultrasound signals.
17. A portable electronic device that renders within a window on a display of the device an image of an inside of a human body when the device is directed at the body.
18. The portable electronic device of claim 17, wherein the device renders the image when the device is within approximately one meter of the body.
19. The portable electronic device of claim 17 or 18, wherein the image changes to show images representing different body parts as the device is moved relative to the body.
20. The portable electronic device of any of claims 17-19, wherein the image is a three-dimensional image.
21. A portable electronic device comprising:
a plurality of imaging elements configured to receive radiation signals transmitted through or reflected by an imaging target;
an imaging interface; and
one or more processors configured to receive one or more sensing signals from at least one of the plurality of imaging elements, and to render an image of the imaging target for display through the imaging interface based at least in part on the one or more sensing signals.
22. The portable electronic device of claim 21, wherein the plurality of imaging elements are ultrasound imaging elements that are configured to receive ultrasound signals.
23. The portable electronic device of claim 21 or 22, wherein the plurality of imaging elements comprise a plurality of ultrasound transducers.
24. The portable electronic device of any of claims 21-23, wherein the plurality of imaging elements comprise an array of ultrasound transducers.
25. The portable electronic device of any of claims 21-24, wherein the plurality of imaging elements are configured to receive a plurality of radiation signals transmitted through an imaging target.
26. The portable electronic device of claim 25, further comprising a second plurality of imaging elements configured to transmit the plurality of radiation signals to the plurality of imaging elements.
27. The portable electronic device of any of claims 21-26, wherein the plurality of imaging elements are configured to receive radiation signals reflected by an imaging target.
28. The portable electronic device of any of claims 21-27, wherein the portable electronic device comprises a smart phone.
29. The portable electronic device of any of claims 21-28, wherein the portable electronic device comprises a tablet computer.
30. The portable electronic device of any of claims 21-29, wherein the one or more processors are configured to render text, a graphic, and/or other information associated with the image of the imaging target for simultaneous display with the image.
31. The portable electronic device of any of claims 21-30, further comprising memory configured to store data for use by the one or more processors to identify or characterize one or more structures or objects within the image.
32. The portable electronic device of claim 31, wherein the data comprises image data associated with at least one of an organ, artery, vein, tissue, bone, and/or other bodily content or part.
33. The portable electronic device of claim 31 or 32, wherein the data comprises image data associated with at least one of a shape, color, texture, cellular characteristic, and/or tissue characteristic of an imaged structure or object.
34. The portable electronic device of any of claims 31-33, wherein the data comprises data associated with one or more abnormalities in an imaged structure or object.
35. The portable electronic device of any of claims 21-34, wherein the portable electronic device comprises a body and a removable case that at least partially encloses the body.
36. The portable electronic device of claim 35, wherein the body comprises the imaging interface.
37. The portable electronic device of claim 35 or 36, wherein the body comprises at least one of the one or more processors.
38. The portable electronic device of any of claims 35-37, wherein the case comprises at least one of the one or more processors.
39. The portable electronic device of any of claims 35-38, wherein the case comprises the plurality of imaging elements.
40. The portable electronic device of any of claims 35-39, wherein the case comprises a modular unit comprising the plurality of imaging elements, wherein the case and the modular unit are configured to allow the case to retain the modular unit, and wherein the modular unit is also configured for retention by a second case for a second portable electronic device having a different size and/or shape than the portable electronic device.
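By way of non-limiting illustration of the signal path recited in claims 21-40 above, the minimal Python sketch below shows a plurality of imaging elements producing sensing signals, one or more processors rendering an image from those signals, and the image being displayed through an imaging interface. All names and the toy reconstruction step are hypothetical assumptions for this sketch, not the claimed implementation.

    from dataclasses import dataclass
    from typing import Callable, List, Sequence

    @dataclass
    class ImagingElement:
        # One element of the plurality; read() returns a received radiation sample.
        read: Callable[[], float]

    def render_image(sensing_signals: Sequence[float]) -> List[List[float]]:
        # Toy stand-in for image rendering: a real device would beamform the
        # received ultrasound signals into an image of the target's internals.
        side = max(1, int(len(sensing_signals) ** 0.5))
        return [list(sensing_signals[i:i + side])
                for i in range(0, len(sensing_signals), side)]

    def display_through_interface(image: List[List[float]]) -> None:
        # Stand-in for displaying the rendered image through the imaging interface.
        for row in image:
            print(" ".join(f"{sample:5.2f}" for sample in row))

    elements = [ImagingElement(read=lambda v=v: v) for v in (0.1, 0.4, 0.3, 0.9)]
    signals = [element.read() for element in elements]  # one or more sensing signals
    display_through_interface(render_image(signals))    # image of the imaging target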
PCT/US2014/032803 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities WO2014165662A2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201480031564.XA CN105263419A (en) 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities
EP14725300.9A EP2981215A2 (en) 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities
CA2908631A CA2908631C (en) 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities
KR1020157031515A KR20150145236A (en) 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities
JP2016506609A JP6786384B2 (en) 2013-04-03 2014-04-03 Portable electronics with integrated imaging capabilities

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/856,252 2013-04-03
US13/856,252 US9667889B2 (en) 2013-04-03 2013-04-03 Portable electronic devices with integrated imaging capabilities

Publications (2)

Publication Number Publication Date
WO2014165662A2 (en) 2014-10-09
WO2014165662A3 (en) 2014-12-31

Family

Family ID: 50736175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/032803 WO2014165662A2 (en) 2013-04-03 2014-04-03 Portable electronic devices with integrated imaging capabilities

Country Status (7)

Country Link
US (3) US9667889B2 (en)
EP (1) EP2981215A2 (en)
JP (2) JP6786384B2 (en)
KR (1) KR20150145236A (en)
CN (1) CN105263419A (en)
CA (1) CA2908631C (en)
WO (1) WO2014165662A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US10980511B2 (en) 2013-07-23 2021-04-20 Butterfly Network, Inc. Interconnectable ultrasound transducer probes and related methods and apparatus
EP3777700A4 (en) * 2018-04-13 2021-06-02 FUJIFILM Corporation Ultrasonic system and method of controlling ultrasonic system
US11439364B2 (en) 2013-03-15 2022-09-13 Bfly Operations, Inc. Ultrasonic imaging devices, systems and methods
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8804457B2 (en) * 2011-03-31 2014-08-12 Maxim Integrated Products, Inc. Transmit/receive systems for imaging devices
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
US20150038844A1 (en) * 2013-08-01 2015-02-05 Travis Blalock Portable Ultrasound System Comprising Ultrasound Front-End Directly Connected to a Mobile Device
US10042078B2 (en) * 2015-02-27 2018-08-07 The United States of America, as Represented by the Secretary of Homeland Security System and method for viewing images on a portable image viewing device related to image screening
CN104800974A (en) * 2015-04-16 2015-07-29 修清 Heart failure treatment device
US9933470B2 (en) * 2015-08-31 2018-04-03 The Boeing Company Energy spectrum visualization system
US20190336101A1 (en) * 2016-11-16 2019-11-07 Teratech Corporation Portable ultrasound system
US11793488B2 (en) * 2018-02-16 2023-10-24 Koninklijke Philips N.V. Ergonomic display and activation in handheld medical ultrasound imaging device
US11250050B2 (en) * 2018-03-01 2022-02-15 The Software Mackiev Company System for multi-tagging images
US20230101257A1 (en) * 2020-03-05 2023-03-30 Koninklijke Philips N.V. Handheld ultrasound scanner with display retention and associated devices, systems, and methods
US20220211346A1 (en) * 2021-01-04 2022-07-07 Bfly Operations, Inc. Methods and apparatuses for displaying ultrasound displays on a foldable processing device

Family Cites Families (443)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1050654A (en) 1974-04-25 1979-03-13 Varian Associates Reconstruction system and method for ultrasonic imaging
US4149247A (en) 1975-12-23 1979-04-10 Varian Associates, Inc. Tomographic apparatus and method for reconstructing planar slices from non-absorbed and non-scattered radiation
US4100916A (en) 1976-04-27 1978-07-18 King Donald L Three-dimensional ultrasonic imaging of animal soft tissue
US4075883A (en) 1976-08-20 1978-02-28 General Electric Company Ultrasonic fan beam scanner for computerized time-of-flight tomography
US4317369A (en) 1978-09-15 1982-03-02 University Of Utah Ultrasound imaging apparatus and method
US4281550A (en) 1979-12-17 1981-08-04 North American Philips Corporation Curved array of sequenced ultrasound transducers
EP0031614B2 (en) 1979-12-17 1990-07-18 North American Philips Corporation Curved array of sequenced ultrasound transducers
US5488952A (en) 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
JPS595947A (en) 1982-07-02 1984-01-12 Toshiba Corp Ultrasonic scanning apparatus
US4594662A (en) 1982-11-12 1986-06-10 Schlumberger Technology Corporation Diffraction tomography systems and methods with fixed detector arrays
US4662222A (en) 1984-12-21 1987-05-05 Johnson Steven A Apparatus and method for acoustic imaging using inverse scattering techniques
DE3601983A1 (en) 1986-01-23 1987-07-30 Siemens Ag METHOD AND DEVICE FOR CONTACTLESS DETERMINATION OF TEMPERATURE DISTRIBUTION IN AN EXAMINATION OBJECT
US5065761A (en) 1989-07-12 1991-11-19 Diasonics, Inc. Lithotripsy system
US5206637A (en) 1991-01-31 1993-04-27 Meridian Incorporated Removable file programming unit
GB9109881D0 (en) 1991-05-08 1991-07-03 Advanced Tech Lab Transesophageal echocardiography scanner with rotating image plane
US5206165A (en) 1991-07-09 1993-04-27 Michigan Cancer Foundation Immortal human mammary epithelial cell sublines
US5269307A (en) 1992-01-31 1993-12-14 Tetrad Corporation Medical ultrasonic imaging system with dynamic focusing
US5409010A (en) 1992-05-19 1995-04-25 Board Of Regents Of The University Of Washington Vector doppler medical devices for blood velocity studies
US5382521A (en) 1992-07-14 1995-01-17 Michigan Cancer Foundation Method of determining metastatic potential of bladder tumor cells
DE4229817C2 (en) 1992-09-07 1996-09-12 Siemens Ag Method for the non-destructive and / or non-invasive measurement of a temperature change in the interior of a living object in particular
US5291893A (en) 1992-10-09 1994-03-08 Acoustic Imaging Technologies Corporation Endo-luminal ultrasonic instrument and method for its use
US6587540B1 (en) 1992-10-14 2003-07-01 Techniscan, Inc. Apparatus and method for imaging objects with wavefields
US6005916A (en) 1992-10-14 1999-12-21 Techniscan, Inc. Apparatus and method for imaging with wavefields using inverse scattering techniques
US5335663A (en) 1992-12-11 1994-08-09 Tetrad Corporation Laparoscopic probes and probe sheaths useful in ultrasonic imaging applications
CN2170735Y (en) * 1993-05-29 1994-07-06 清华大学 High resolving power, low dose, portable B ultrasonic image display instrument
US5471988A (en) 1993-12-24 1995-12-05 Olympus Optical Co., Ltd. Ultrasonic diagnosis and therapy system in which focusing point of therapeutic ultrasonic wave is locked at predetermined position within observation ultrasonic scanning range
US5471515A (en) 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US6570617B2 (en) 1994-01-28 2003-05-27 California Institute Of Technology CMOS active pixel sensor type imaging system on a chip
US5834442A (en) 1994-07-07 1998-11-10 Barbara Ann Karmanos Cancer Institute Method for inhibiting cancer metastasis by oral administration of soluble modified citrus pectin
US5677491A (en) 1994-08-08 1997-10-14 Diasonics Ultrasound, Inc. Sparse two-dimensional transducer array
US5619476A (en) 1994-10-21 1997-04-08 The Board Of Trustees Of The Leland Stanford Jr. Univ. Electrostatic ultrasonic transducer
US5894452A (en) 1994-10-21 1999-04-13 The Board Of Trustees Of The Leland Stanford Junior University Microfabricated ultrasonic immersion transducer
US5520188A (en) 1994-11-02 1996-05-28 Focus Surgery Inc. Annular array transducer
US5611025A (en) 1994-11-23 1997-03-11 General Electric Company Virtual internal cavity inspection system
US6517487B1 (en) 1995-03-01 2003-02-11 Lunar Corporation Ultrasonic densitometer with opposed single transducer and transducer array
US5873902A (en) 1995-03-31 1999-02-23 Focus Surgery, Inc. Ultrasound intensity determining method and apparatus
US7841982B2 (en) 1995-06-22 2010-11-30 Techniscan, Inc. Apparatus and method for imaging objects with wavefields
US5990506A (en) 1996-03-20 1999-11-23 California Institute Of Technology Active pixel sensors with substantially planarized color filtering elements
US5893363A (en) 1996-06-28 1999-04-13 Sonosight, Inc. Ultrasonic array transducer transceiver for a hand held ultrasonic diagnostic instrument
US5722412A (en) * 1996-06-28 1998-03-03 Advanced Technology Laboratories, Inc. Hand held ultrasonic diagnostic instrument
US6569101B2 (en) 2001-04-19 2003-05-27 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
US5817024A (en) 1996-06-28 1998-10-06 Sonosight, Inc. Hand held ultrasonic diagnostic instrument with digital beamformer
US6575908B2 (en) 1996-06-28 2003-06-10 Sonosite, Inc. Balance body ultrasound system
US6135961A (en) 1996-06-28 2000-10-24 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6416475B1 (en) 1996-06-28 2002-07-09 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6383139B1 (en) 1996-06-28 2002-05-07 Sonosite, Inc. Ultrasonic signal processor for power doppler imaging in a hand held ultrasonic diagnostic instrument
US6962566B2 (en) 2001-04-19 2005-11-08 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
US6203498B1 (en) 1996-06-28 2001-03-20 Sonosite, Inc. Ultrasonic imaging device with integral display
US5782769A (en) 1996-06-28 1998-07-21 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic image flash suppression technique
US7819807B2 (en) 1996-06-28 2010-10-26 Sonosite, Inc. Balance body ultrasound system
AU4169497A (en) 1996-08-29 1998-04-14 David T. Borup Apparatus and method for imaging with wavefields using inverse scattering techniques
DE19635593C1 (en) 1996-09-02 1998-04-23 Siemens Ag Ultrasound transducer for diagnostic and therapeutic use
US5769790A (en) 1996-10-25 1998-06-23 General Electric Company Focused ultrasound surgery system guided by ultrasound imaging
US5740805A (en) 1996-11-19 1998-04-21 Analogic Corporation Ultrasound beam softening compensation system
US5820564A (en) 1996-12-16 1998-10-13 Albatross Technologies, Inc. Method and apparatus for surface ultrasound imaging
US7476411B1 (en) 1997-02-24 2009-01-13 Cabot Corporation Direct-write deposition of phosphor powders
US6193908B1 (en) 1997-02-24 2001-02-27 Superior Micropowders Llc Electroluminescent phosphor powders, methods for making phosphor powders and devices incorporating same
US6197218B1 (en) 1997-02-24 2001-03-06 Superior Micropowders Llc Photoluminescent phosphor powders, methods for making phosphor powders and devices incorporating same
WO1998037165A1 (en) 1997-02-24 1998-08-27 Superior Micropowders Llc Oxygen-containing phosphor powders, methods for making phosphor powders and devices incorporating same
CA2287386A1 (en) * 1997-04-24 1998-10-29 Wilk Patent Development Corporation Medical imaging device and associated method
US6093883A (en) 1997-07-15 2000-07-25 Focus Surgery, Inc. Ultrasound intensity determining method and apparatus
US6049159A (en) 1997-10-06 2000-04-11 Albatros Technologies, Inc. Wideband acoustic transducer
US6050943A (en) 1997-10-14 2000-04-18 Guided Therapy Systems, Inc. Imaging, therapy, and temperature monitoring ultrasonic system
DE19840405B4 (en) 1997-10-14 2005-06-02 Siemens Ag Device for fixing the female breast in medical technology applications
DE19745400C1 (en) 1997-10-14 1999-04-15 Siemens Ag Ultrasonic breast tumour therapy process
EP1025520B1 (en) 1997-10-30 2002-08-28 Dr. Baldeweg Aktiengesellschaft Method and device for processing imaged objects
US6007499A (en) 1997-10-31 1999-12-28 University Of Washington Method and apparatus for medical procedures using high-intensity focused ultrasound
DE69941447D1 (en) 1998-01-05 2009-11-05 Univ Washington INCREASED TRANSPORT USING MEMBRANE-DAMAGED SUBSTANCES
CN1058905C (en) 1998-01-25 2000-11-29 重庆海扶(Hifu)技术有限公司 High-intensity focus supersonic tumor scanning therapy system
DE19807242C2 (en) 1998-02-20 2002-07-11 Siemens Ag Medical-technical system workstation
US6385474B1 (en) 1999-03-19 2002-05-07 Barbara Ann Karmanos Cancer Institute Method and apparatus for high-resolution detection and characterization of medical pathologies
EP1063920B1 (en) 1998-03-20 2006-11-29 Barbara Ann Karmanos Cancer Institute Multidimensional detection and characterization of pathologic tissues
US6685640B1 (en) 1998-03-30 2004-02-03 Focus Surgery, Inc. Ablation system
US5982709A (en) 1998-03-31 1999-11-09 The Board Of Trustees Of The Leland Stanford Junior University Acoustic transducers and method of microfabrication
US6036646A (en) 1998-07-10 2000-03-14 Guided Therapy Systems, Inc. Method and apparatus for three dimensional ultrasound imaging
AU6021299A (en) 1998-08-27 2000-03-21 Superior Micropowders Llc Phosphor powders, methods for making phosphor powders and devices incorporating same
US6014897A (en) 1998-09-02 2000-01-18 Mo; Larry Y. L. Method and apparatus for improving sidelobe performance of sparse array using harmonic imaging
US6042556A (en) 1998-09-04 2000-03-28 University Of Washington Method for determining phase advancement of transducer elements in high intensity focused ultrasound
US7722539B2 (en) 1998-09-18 2010-05-25 University Of Washington Treatment of unwanted tissue by the selective destruction of vasculature providing nutrients to the tissue
US7686763B2 (en) 1998-09-18 2010-03-30 University Of Washington Use of contrast agents to increase the effectiveness of high intensity focused ultrasound therapy
US6425867B1 (en) 1998-09-18 2002-07-30 University Of Washington Noise-free real time ultrasonic imaging of a treatment site undergoing high intensity focused ultrasound therapy
US6645145B1 (en) 1998-11-19 2003-11-11 Siemens Medical Solutions Usa, Inc. Diagnostic medical ultrasound systems and transducers utilizing micro-mechanical components
US6605043B1 (en) 1998-11-19 2003-08-12 Acuson Corp. Diagnostic medical ultrasound systems and transducers utilizing micro-mechanical components
US6224556B1 (en) 1998-11-25 2001-05-01 Acuson Corporation Diagnostic medical ultrasound system and method for using a sparse array
EP1176637A4 (en) 1999-01-22 2006-09-13 Hitachi Ltd Semiconductor integrated circuit and manufacture thereof
US6447451B1 (en) 1999-05-04 2002-09-10 Sonosite, Inc. Mobile ultrasound diagnostic instrument and docking stand
US6364839B1 (en) 1999-05-04 2002-04-02 Sonosite, Inc. Ultrasound diagnostic instrument having software in detachable scanhead
US6471651B1 (en) 1999-05-05 2002-10-29 Sonosite, Inc. Low power portable ultrasonic diagnostic instrument
US6371918B1 (en) 1999-05-05 2002-04-16 Sonosite Inc. Transducer connector
US6666835B2 (en) 1999-05-14 2003-12-23 University Of Washington Self-cooled ultrasonic applicator for medical applications
US6217530B1 (en) 1999-05-14 2001-04-17 University Of Washington Ultrasonic applicator for medical applications
US20040015079A1 (en) 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US6238346B1 (en) 1999-06-25 2001-05-29 Agilent Technologies, Inc. System and method employing two dimensional ultrasound array for wide field of view imaging
US7510536B2 (en) 1999-09-17 2009-03-31 University Of Washington Ultrasound guided high intensity focused ultrasound treatment of nerves
US7520856B2 (en) 1999-09-17 2009-04-21 University Of Washington Image guided high intensity focused ultrasound device for therapy in obstetrics and gynecology
US6262946B1 (en) 1999-09-29 2001-07-17 The Board Of Trustees Of The Leland Stanford Junior University Capacitive micromachined ultrasonic transducer arrays with reduced cross-coupling
US6430109B1 (en) 1999-09-30 2002-08-06 The Board Of Trustees Of The Leland Stanford Junior University Array of capacitive micromachined ultrasonic transducer elements with through wafer via connections
US6440071B1 (en) 1999-10-18 2002-08-27 Guided Therapy Systems, Inc. Peripheral ultrasound imaging system
US6552841B1 (en) 2000-01-07 2003-04-22 Imperium Advanced Ultrasonic Imaging Ultrasonic imager
US6432053B1 (en) 2000-02-18 2002-08-13 Advanced Diagnostics, Inc. Process for non-invasively determining the dimensions of a lesion
US7499745B2 (en) 2000-02-28 2009-03-03 Barbara Ann Karmanos Cancer Institute Multidimensional bioelectrical tissue analyzer
US6613004B1 (en) 2000-04-21 2003-09-02 Insightec-Txsonics, Ltd. Systems and methods for creating longer necrosed volumes using a phased array focused ultrasound system
US6419648B1 (en) 2000-04-21 2002-07-16 Insightec-Txsonics Ltd. Systems and methods for reducing secondary hot spots in a phased array focused ultrasound system
US6543272B1 (en) 2000-04-21 2003-04-08 Insightec-Txsonics Ltd. Systems and methods for testing and calibrating a focused ultrasound transducer array
US6443901B1 (en) 2000-06-15 2002-09-03 Koninklijke Philips Electronics N.V. Capacitive micromachined ultrasonic transducers
US6506171B1 (en) 2000-07-27 2003-01-14 Insightec-Txsonics, Ltd System and methods for controlling distribution of acoustic energy around a focal point using a focused ultrasound system
JP2002052018A (en) 2000-08-11 2002-02-19 Canon Inc Image display device, image display method and storage medium
US6669641B2 (en) 2000-08-17 2003-12-30 Koninklijke Philips Electronics N.V. Method of and system for ultrasound imaging
US6755788B2 (en) 2000-08-17 2004-06-29 Koninklijke Philips Electronics N. V. Image orientation display for a three dimensional ultrasonic imaging system
US6443896B1 (en) 2000-08-17 2002-09-03 Koninklijke Philips Electronics N.V. Method for creating multiplanar ultrasonic images of a three dimensional object
US6709394B2 (en) 2000-08-17 2004-03-23 Koninklijke Philips Electronics N.V. Biplane ultrasonic imaging
US6761689B2 (en) 2000-08-17 2004-07-13 Koninklijke Philips Electronics N.V. Biplane ultrasonic imaging
US7037264B2 (en) 2000-08-17 2006-05-02 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging with steered image plane
US6612988B2 (en) 2000-08-29 2003-09-02 Brigham And Women's Hospital, Inc. Ultrasound therapy
US6450960B1 (en) 2000-08-29 2002-09-17 Barbara Ann Karmanos Cancer Institute Real-time three-dimensional acoustoelectronic imaging and characterization of objects
US6419633B1 (en) 2000-09-15 2002-07-16 Koninklijke Philips Electronics N.V. 2D ultrasonic transducer array for two dimensional and three dimensional imaging
US6613005B1 (en) 2000-11-28 2003-09-02 Insightec-Txsonics, Ltd. Systems and methods for steering a focused ultrasound array
US20100081893A1 (en) 2008-09-19 2010-04-01 Physiosonics, Inc. Acoustic palpation using non-invasive ultrasound techniques to identify and localize tissue eliciting biological responses and target treatments
US7022077B2 (en) 2000-11-28 2006-04-04 Allez Physionix Ltd. Systems and methods for making noninvasive assessments of cardiac tissue and parameters
US20100087728A1 (en) 2000-11-28 2010-04-08 Physiosonics, Inc. Acoustic palpation using non-invasive ultrasound techniques to identify and localize tissue eliciting biological responses
US6666833B1 (en) 2000-11-28 2003-12-23 Insightec-Txsonics Ltd Systems and methods for focussing an acoustic energy beam transmitted through non-uniform tissue medium
US6506154B1 (en) 2000-11-28 2003-01-14 Insightec-Txsonics, Ltd. Systems and methods for controlling a phased array focused ultrasound system
AU2002239360A1 (en) 2000-11-28 2002-06-11 Allez Physionix Limited Systems and methods for making non-invasive physiological assessments
US6645162B2 (en) 2000-12-27 2003-11-11 Insightec - Txsonics Ltd. Systems and methods for ultrasound assisted lipolysis
US6626854B2 (en) 2000-12-27 2003-09-30 Insightec - Txsonics Ltd. Systems and methods for ultrasound assisted lipolysis
US6540679B2 (en) 2000-12-28 2003-04-01 Guided Therapy Systems, Inc. Visual imaging system for ultrasonic probe
US6600325B2 (en) 2001-02-06 2003-07-29 Sun Microsystems, Inc. Method and apparatus for probing an integrated circuit through capacitive coupling
USD456509S1 (en) 2001-04-19 2002-04-30 Sonosite, Inc. Combined module and cable for ultrasound system
USD461895S1 (en) 2001-04-19 2002-08-20 Sonosite, Inc. Handheld medical diagnostic ultrasound instrument
US6559644B2 (en) 2001-05-30 2003-05-06 Insightec - Txsonics Ltd. MRI-based temperature mapping with error compensation
US6735461B2 (en) 2001-06-19 2004-05-11 Insightec-Txsonics Ltd Focused ultrasound system with MRI synchronization
DE10134014A1 (en) 2001-07-12 2003-01-30 Driescher Eltech Werk Power converter
US6880137B1 (en) 2001-08-03 2005-04-12 Inovys Dynamically reconfigurable precision signal delay test system for automatic test equipment
US6694817B2 (en) 2001-08-21 2004-02-24 Georgia Tech Research Corporation Method and apparatus for the ultrasonic actuation of the cantilever of a probe-based instrument
US6795374B2 (en) 2001-09-07 2004-09-21 Siemens Medical Solutions Usa, Inc. Bias control of electrostatic transducers
CA2406684A1 (en) 2001-10-05 2003-04-05 Queen's University At Kingston Ultrasound transducer array
US20040238732A1 (en) 2001-10-19 2004-12-02 Andrei State Methods and systems for dynamic virtual convergence and head mountable display
US7175596B2 (en) 2001-10-29 2007-02-13 Insightec-Txsonics Ltd System and method for sensing and locating disturbances in an energy path of a focused ultrasound system
US7115093B2 (en) 2001-11-21 2006-10-03 Ge Medical Systems Global Technology Company, Llc Method and system for PDA-based ultrasound system
US6790180B2 (en) 2001-12-03 2004-09-14 Insightec-Txsonics Ltd. Apparatus, systems, and methods for measuring power output of an ultrasound transducer
US6522142B1 (en) 2001-12-14 2003-02-18 Insightec-Txsonics Ltd. MRI-guided temperature mapping of tissue undergoing thermal treatment
US6659954B2 (en) 2001-12-19 2003-12-09 Koninklijke Philips Electronics Nv Micromachined ultrasound transducer and method for fabricating same
US7371218B2 (en) 2002-01-17 2008-05-13 Siemens Medical Solutions Usa, Inc. Immersive portable ultrasound system and method
US6604630B1 (en) 2002-01-30 2003-08-12 Sonosite, Inc. Carrying case for lightweight ultrasound device
US6648826B2 (en) 2002-02-01 2003-11-18 Sonosite, Inc. CW beam former in an ASIC
US7128711B2 (en) 2002-03-25 2006-10-31 Insightec, Ltd. Positioning systems and methods for guided ultrasound therapy systems
US20030187371A1 (en) 2002-03-27 2003-10-02 Insightec-Txsonics Ltd. Systems and methods for enhanced focused ultrasound ablation using microbubbles
US7534211B2 (en) 2002-03-29 2009-05-19 Sonosite, Inc. Modular apparatus for diagnostic ultrasound
US6716168B2 (en) 2002-04-30 2004-04-06 Siemens Medical Solutions Usa, Inc. Ultrasound drug delivery enhancement and imaging systems and methods
US7285092B2 (en) 2002-12-18 2007-10-23 Barbara Ann Karmanos Cancer Institute Computerized ultrasound risk evaluation system
EP1551303A4 (en) 2002-05-16 2009-03-18 Karmanos B A Cancer Inst Method and system for combined diagnostic and therapeutic ultrasound system incorporating noninvasive thermometry, ablation control and automation
US6783497B2 (en) 2002-05-23 2004-08-31 Volumetrics Medical Imaging, Inc. Two-dimensional ultrasonic array with asymmetric apertures
WO2003101530A2 (en) 2002-05-30 2003-12-11 University Of Washington Solid hydrogel coupling for ultrasound imaging and therapy
US20030230488A1 (en) 2002-06-13 2003-12-18 Lawrence Lee Microfluidic device preparation system
US6705994B2 (en) 2002-07-08 2004-03-16 Insightec - Image Guided Treatment Ltd Tissue inhomogeneity correction in ultrasound imaging
US6958255B2 (en) 2002-08-08 2005-10-25 The Board Of Trustees Of The Leland Stanford Junior University Micromachined ultrasonic transducers and method of fabrication
US6835177B2 (en) 2002-11-06 2004-12-28 Sonosite, Inc. Ultrasonic blood vessel measurement apparatus and method
US6831394B2 (en) 2002-12-11 2004-12-14 General Electric Company Backing material for micromachined ultrasonic transducer devices
US6926672B2 (en) 2002-12-18 2005-08-09 Barbara Ann Karmanos Cancer Institute Electret acoustic transducer array for computerized ultrasound risk evaluation system
US6837854B2 (en) 2002-12-18 2005-01-04 Barbara Ann Karmanos Cancer Institute Methods and systems for using reference images in acoustic image processing
US8088067B2 (en) 2002-12-23 2012-01-03 Insightec Ltd. Tissue aberration corrections in ultrasound therapy
US6836020B2 (en) 2003-01-22 2004-12-28 The Board Of Trustees Of The Leland Stanford Junior University Electrical through wafer interconnects
US7313053B2 (en) 2003-03-06 2007-12-25 General Electric Company Method and apparatus for controlling scanning of mosaic sensor array
US7257051B2 (en) 2003-03-06 2007-08-14 General Electric Company Integrated interface electronics for reconfigurable sensor array
US7353056B2 (en) 2003-03-06 2008-04-01 General Electric Company Optimized switching configurations for reconfigurable arrays of sensor elements
US6865140B2 (en) 2003-03-06 2005-03-08 General Electric Company Mosaic arrays using micromachined ultrasound transducers
US7443765B2 (en) 2003-03-06 2008-10-28 General Electric Company Reconfigurable linear sensor arrays for reduced channel count
US7280435B2 (en) 2003-03-06 2007-10-09 General Electric Company Switching circuitry for reconfigurable arrays of sensor elements
US20120035473A1 (en) 2003-03-10 2012-02-09 Focus Surgery, Inc. Laparoscopic hifu probe
US6980419B2 (en) 2003-03-12 2005-12-27 Zonare Medical Systems, Inc. Portable ultrasound unit and docking station
US7771360B2 (en) 2003-04-09 2010-08-10 Techniscan, Inc. Breast scanning system
US7303530B2 (en) 2003-05-22 2007-12-04 Siemens Medical Solutions Usa, Inc. Transducer arrays with an integrated sensor and methods of use
US7611462B2 (en) 2003-05-22 2009-11-03 Insightec-Image Guided Treatment Ltd. Acoustic beam forming in phased arrays including large numbers of transducer elements
JP4332372B2 (en) * 2003-05-27 2009-09-16 アロカ株式会社 Ultrasonic diagnostic equipment
US7377900B2 (en) 2003-06-02 2008-05-27 Insightec - Image Guided Treatment Ltd. Endo-cavity focused ultrasound transducer
EP1636609A1 (en) * 2003-06-10 2006-03-22 Koninklijke Philips Electronics N.V. User interface for a three-dimensional colour ultrasound imaging system
US7549961B1 (en) 2003-07-31 2009-06-23 Sonosite, Inc. System and method supporting imaging and monitoring applications
US20050049495A1 (en) * 2003-09-03 2005-03-03 Siemens Medical Solutions Usa, Inc. Remote assistance for medical diagnostic ultrasound
WO2005037060A2 (en) 2003-10-03 2005-04-28 University Of Washington Transcutaneous localization of arterial bleeding by ultrasonic imaging
US7972271B2 (en) 2003-10-28 2011-07-05 The Board Of Trustees Of The Leland Stanford Junior University Apparatus and method for phased subarray imaging
ATE426345T1 (en) 2003-11-04 2009-04-15 Univ Washington TOOTHBRUSH USING AN ACOUSTIC WAVEGUIDE
JP4773366B2 (en) 2003-12-04 2011-09-14 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Ultrasonic transducer and method for performing flip-chip two-dimensional array technology on curved array
US20110040171A1 (en) 2003-12-16 2011-02-17 University Of Washington Image guided high intensity focused ultrasound treatment of nerves
US7030536B2 (en) 2003-12-29 2006-04-18 General Electric Company Micromachined ultrasonic transducer cells having compliant support structure
US7125383B2 (en) 2003-12-30 2006-10-24 General Electric Company Method and apparatus for ultrasonic continuous, non-invasive blood pressure monitoring
US7425199B2 (en) 2003-12-30 2008-09-16 General Electric Company Method and apparatus for ultrasonic continuous, non-invasive blood pressure monitoring
US7285897B2 (en) 2003-12-31 2007-10-23 General Electric Company Curved micromachined ultrasonic transducer arrays and related methods of manufacture
US7052464B2 (en) 2004-01-01 2006-05-30 General Electric Company Alignment method for fabrication of integrated ultrasonic transducer array
US7588539B2 (en) 2004-01-21 2009-09-15 Siemens Medical Solutions Usa, Inc. Integrated low-power pw/cw transmitter
JP2007528153A (en) 2004-02-06 2007-10-04 ジョージア テック リサーチ コーポレイション CMUT device and manufacturing method
US7691063B2 (en) 2004-02-26 2010-04-06 Siemens Medical Solutions Usa, Inc. Receive circuit for minimizing channels in ultrasound imaging
US8008835B2 (en) 2004-02-27 2011-08-30 Georgia Tech Research Corporation Multiple element electrode cMUT devices and fabrication methods
EP1761998A4 (en) 2004-02-27 2011-05-11 Georgia Tech Res Inst Harmonic cmut devices and fabrication methods
US7646133B2 (en) 2004-02-27 2010-01-12 Georgia Tech Research Corporation Asymmetric membrane cMUT devices and fabrication methods
US7662114B2 (en) 2004-03-02 2010-02-16 Focus Surgery, Inc. Ultrasound phased arrays
US7530952B2 (en) 2004-04-01 2009-05-12 The Board Of Trustees Of The Leland Stanford Junior University Capacitive ultrasonic transducers with isolation posts
US20050219695A1 (en) 2004-04-05 2005-10-06 Vesely Michael A Horizontal perspective display
US7274623B2 (en) 2004-04-06 2007-09-25 Board Of Trustees Of The Leland Stanford Junior University Method and system for operating capacitive membrane ultrasonic transducers
US7321181B2 (en) 2004-04-07 2008-01-22 The Board Of Trustees Of The Leland Stanford Junior University Capacitive membrane ultrasonic transducers with reduced bulk wave generation and method
US20060009693A1 (en) 2004-04-08 2006-01-12 Techniscan, Inc. Apparatus for imaging and treating a breast
US8213467B2 (en) 2004-04-08 2012-07-03 Sonosite, Inc. Systems and methods providing ASICs for use in multiple applications
US7470232B2 (en) 2004-05-04 2008-12-30 General Electric Company Method and apparatus for non-invasive ultrasonic fetal heart rate monitoring
US20060058667A1 (en) 2004-05-06 2006-03-16 Lemmerhirt David F Integrated circuit for an ultrasound system
US20070219448A1 (en) 2004-05-06 2007-09-20 Focus Surgery, Inc. Method and Apparatus for Selective Treatment of Tissue
US8235909B2 (en) 2004-05-12 2012-08-07 Guided Therapy Systems, L.L.C. Method and system for controlled scanning, imaging and/or therapy
US8199685B2 (en) 2004-05-17 2012-06-12 Sonosite, Inc. Processing of medical signals
US20050264857A1 (en) 2004-06-01 2005-12-01 Vesely Michael A Binaural horizontal perspective display
US7545075B2 (en) 2004-06-04 2009-06-09 The Board Of Trustees Of The Leland Stanford Junior University Capacitive micromachined ultrasonic transducer array with through-substrate electrical connection and method of fabricating same
JP4820365B2 (en) * 2004-06-30 2011-11-24 ギブン イメージング リミテッド In-vivo detection system apparatus and method for real-time display
US7955264B2 (en) 2004-07-07 2011-06-07 General Electric Company System and method for providing communication between ultrasound scanners
JP4746291B2 (en) 2004-08-05 2011-08-10 オリンパス株式会社 Capacitive ultrasonic transducer and manufacturing method thereof
US7699780B2 (en) 2004-08-11 2010-04-20 Insightec—Image-Guided Treatment Ltd. Focused ultrasound system with adaptive anatomical aperture shaping
US7996688B2 (en) 2004-08-24 2011-08-09 Sonosite, Inc. Ultrasound system power management
US7867168B2 (en) 2004-08-24 2011-01-11 Sonosite, Inc. Ultrasonic transducer having distributed weight properties
US8409099B2 (en) 2004-08-26 2013-04-02 Insightec Ltd. Focused ultrasound system for surrounding a body tissue mass and treatment method
US7888709B2 (en) 2004-09-15 2011-02-15 Sonetics Ultrasound, Inc. Capacitive micromachined ultrasonic transducer and manufacturing method
US8658453B2 (en) 2004-09-15 2014-02-25 Sonetics Ultrasound, Inc. Capacitive micromachined ultrasonic transducer
US8309428B2 (en) 2004-09-15 2012-11-13 Sonetics Ultrasound, Inc. Capacitive micromachined ultrasonic transducer
EP1789136B1 (en) 2004-09-16 2010-12-29 University of Washington Interference-free ultrasound imaging during hifu therapy, using software tools
US7393325B2 (en) 2004-09-16 2008-07-01 Guided Therapy Systems, L.L.C. Method and system for ultrasound treatment with a multi-directional transducer
US7824348B2 (en) 2004-09-16 2010-11-02 Guided Therapy Systems, L.L.C. System and method for variable depth ultrasound treatment
US20120165668A1 (en) 2010-08-02 2012-06-28 Guided Therapy Systems, Llc Systems and methods for treating acute and/or chronic injuries in soft tissue
US7530958B2 (en) 2004-09-24 2009-05-12 Guided Therapy Systems, Inc. Method and system for combined ultrasound treatment
US20080255452A1 (en) 2004-09-29 2008-10-16 Koninklijke Philips Electronics, N.V. Methods and Apparatus For Performing Enhanced Ultrasound Diagnostic Breast Imaging
US20060111744A1 (en) 2004-10-13 2006-05-25 Guided Therapy Systems, L.L.C. Method and system for treatment of sweat glands
US8133180B2 (en) 2004-10-06 2012-03-13 Guided Therapy Systems, L.L.C. Method and system for treating cellulite
US7530356B2 (en) 2004-10-06 2009-05-12 Guided Therapy Systems, Inc. Method and system for noninvasive mastopexy
WO2006042201A1 (en) 2004-10-06 2006-04-20 Guided Therapy Systems, L.L.C. Method and system for ultrasound tissue treatment
US7758524B2 (en) 2004-10-06 2010-07-20 Guided Therapy Systems, L.L.C. Method and system for ultra-high frequency ultrasound treatment
US20060079868A1 (en) 2004-10-07 2006-04-13 Guided Therapy Systems, L.L.C. Method and system for treatment of blood vessel disorders
US7375420B2 (en) 2004-12-03 2008-05-20 General Electric Company Large area transducer array
US7037746B1 (en) 2004-12-27 2006-05-02 General Electric Company Capacitive micromachined ultrasound transducer fabricated with epitaxial silicon membrane
US7293462B2 (en) 2005-01-04 2007-11-13 General Electric Company Isolation of short-circuited sensor cells for high-reliability operation of sensor array
CN100542635C (en) 2005-01-10 2009-09-23 重庆海扶(Hifu)技术有限公司 High intensity focused ultrasound therapy device and method
CN100574809C (en) 2005-01-10 2009-12-30 重庆海扶(Hifu)技术有限公司 A kind of high-strength focusing ultrasonic therapy fluorocarbon emulsion analog assistant and application thereof
CN100574811C (en) 2005-01-10 2009-12-30 重庆海扶(Hifu)技术有限公司 A kind of particle analog assistant for high-intensity focusing ultrasonic therapy and application thereof
CN100574810C (en) 2005-01-10 2009-12-30 重庆海扶(Hifu)技术有限公司 A kind of grain analog assistant for high-intensity focusing ultrasonic therapy and application thereof
CN100506323C (en) 2005-01-10 2009-07-01 重庆海扶(Hifu)技术有限公司 Integral ultrasonic therapy energy converter
US7563228B2 (en) * 2005-01-24 2009-07-21 Siemens Medical Solutions Usa, Inc. Stereoscopic three or four dimensional ultrasound imaging
CN100563752C (en) 2005-01-31 2009-12-02 重庆融海超声医学工程研究中心有限公司 MRI guided ultrasonic treatment device
CN100450563C (en) 2005-01-31 2009-01-14 重庆海扶(Hifu)技术有限公司 Device for delivering medium to body cavity, vascular-cavity leading-in and ultrasonic blocking
CN1814323B (en) 2005-01-31 2010-05-12 重庆海扶(Hifu)技术有限公司 Focusing ultrasonic therapeutical system
US8388544B2 (en) 2005-03-17 2013-03-05 General Electric Company System and method for measuring blood viscosity
EP1875327A2 (en) 2005-04-25 2008-01-09 Guided Therapy Systems, L.L.C. Method and system for enhancing computer peripheral saftey
US8066642B1 (en) 2005-05-03 2011-11-29 Sonosite, Inc. Systems and methods for ultrasound beam forming data control
US7914458B2 (en) 2005-05-05 2011-03-29 Volcano Corporation Capacitive microfabricated ultrasound transducer-based intravascular ultrasound probes
WO2006121957A2 (en) 2005-05-09 2006-11-16 Michael Vesely Three dimensional horizontal perspective workstation
CN101223633A (en) 2005-05-18 2008-07-16 科隆科技公司 Micro-electro-mechanical transducers
EP1882127A2 (en) 2005-05-18 2008-01-30 Kolo Technologies, Inc. Micro-electro-mechanical transducers
US8038631B1 (en) 2005-06-01 2011-10-18 Sanghvi Narendra T Laparoscopic HIFU probe
CA2509590A1 (en) * 2005-06-06 2006-12-06 Solar International Products Inc. Portable imaging apparatus
US7589456B2 (en) 2005-06-14 2009-09-15 Siemens Medical Solutions Usa, Inc. Digital capacitive membrane transducer
CA2608164A1 (en) 2005-06-17 2006-12-21 Kolo Technologies, Inc. Micro-electro-mechanical transducer having an insulation extension
US20070016039A1 (en) 2005-06-21 2007-01-18 Insightec-Image Guided Treatment Ltd. Controlled, non-linear focused ultrasound treatment
US7775979B2 (en) 2005-06-29 2010-08-17 General Electric Company Transmit and receive interface array for highly integrated ultrasound scanner
US20070010805A1 (en) 2005-07-08 2007-01-11 Fedewa Russell J Method and apparatus for the treatment of tissue
US7880565B2 (en) 2005-08-03 2011-02-01 Kolo Technologies, Inc. Micro-electro-mechanical transducer having a surface plate
WO2007015219A2 (en) 2005-08-03 2007-02-08 Kolo Technologies, Inc. Micro-electro-mechanical transducer having a surface plate
WO2007021958A2 (en) 2005-08-12 2007-02-22 University Of Washington Method and apparatus for preparing organs and tissues for laparoscopic surgery
US7591996B2 (en) 2005-08-17 2009-09-22 University Of Washington Ultrasound target vessel occlusion using microbubbles
US7621873B2 (en) 2005-08-17 2009-11-24 University Of Washington Method and system to synchronize acoustic therapy with ultrasound imaging
US7804595B2 (en) 2005-09-14 2010-09-28 University Of Washington Using optical scattering to measure properties of ultrasound contrast agent shells
US8264683B2 (en) 2005-09-14 2012-09-11 University Of Washington Dynamic characterization of particles with flow cytometry
US8016757B2 (en) 2005-09-30 2011-09-13 University Of Washington Non-invasive temperature estimation technique for HIFU therapy monitoring using backscattered ultrasound
US7878977B2 (en) 2005-09-30 2011-02-01 Siemens Medical Solutions Usa, Inc. Flexible ultrasound transducer array
JP4880275B2 (en) 2005-10-03 2012-02-22 オリンパスメディカルシステムズ株式会社 Capacitive ultrasonic transducer
US7441447B2 (en) 2005-10-07 2008-10-28 Georgia Tech Research Corporation Methods of imaging in probe microscopy
US7449640B2 (en) 2005-10-14 2008-11-11 Sonosite, Inc. Alignment features for dicing multi element acoustic arrays
CN101309645B (en) * 2005-11-15 2010-12-08 株式会社日立医药 Ultrasonic diagnosis device
CN101313354B (en) 2005-11-23 2012-02-15 因赛泰克有限公司 Hierarchical switching in ultra-high density ultrasound array
US7546769B2 (en) 2005-12-01 2009-06-16 General Electric Company Ultrasonic inspection system and method
US8465431B2 (en) 2005-12-07 2013-06-18 Siemens Medical Solutions Usa, Inc. Multi-dimensional CMUT array with integrated beamformation
US8038620B2 (en) 2005-12-20 2011-10-18 General Electric Company Fresnel zone imaging system and method
US7622848B2 (en) 2006-01-06 2009-11-24 General Electric Company Transducer assembly with z-axis interconnect
US20070239011A1 (en) 2006-01-13 2007-10-11 Mirabilis Medica, Inc. Apparatus for delivering high intensity focused ultrasound energy to a treatment site internal to a patient's body
US20070239020A1 (en) 2006-01-19 2007-10-11 Kazuhiro Iinuma Ultrasonography apparatus
US20070180916A1 (en) 2006-02-09 2007-08-09 General Electric Company Capacitive micromachined ultrasound transducer and methods of making the same
US20070239019A1 (en) 2006-02-13 2007-10-11 Richard William D Portable ultrasonic imaging probe than connects directly to a host computer
GB2454603B (en) 2006-02-24 2010-05-05 Wolfson Microelectronics Plc Mems device
US7615834B2 (en) 2006-02-28 2009-11-10 The Board Of Trustees Of The Leland Stanford Junior University Capacitive micromachined ultrasonic transducer(CMUT) with varying thickness membrane
US7699793B2 (en) 2006-03-07 2010-04-20 Brainlab Ag Method and device for detecting and localising an impingement of joint components
US7764003B2 (en) 2006-04-04 2010-07-27 Kolo Technologies, Inc. Signal control in micromachined ultrasonic transducer
CA2649119A1 (en) 2006-04-13 2007-12-13 Mirabilis Medica, Inc. Methods and apparatus for the treatment of menometrorrhagia, endometrial pathology, and cervical neoplasia using high intensity focused ultrasound energy
US20110263997A1 (en) * 2006-04-20 2011-10-27 Engineered Vigilance, Llc System and method for remotely diagnosing and managing treatment of restrictive and obstructive lung disease and cardiopulmonary disorders
US7745973B2 (en) 2006-05-03 2010-06-29 The Board Of Trustees Of The Leland Stanford Junior University Acoustic crosstalk reduction for capacitive micromachined ultrasonic transducers in immersion
WO2007131163A2 (en) 2006-05-05 2007-11-15 Worcester Polytechnic Institute Reconfigurable wireless ultrasound diagnostic system
US7767484B2 (en) 2006-05-31 2010-08-03 Georgia Tech Research Corporation Method for sealing and backside releasing of microelectromechanical systems
US7874991B2 (en) * 2006-06-23 2011-01-25 Teratech Corporation Ultrasound 3D imaging system
JP5432708B2 (en) 2006-06-23 2014-03-05 コーニンクレッカ フィリップス エヌ ヴェ Timing control device for photoacoustic and ultrasonic composite imager
US8360986B2 (en) * 2006-06-30 2013-01-29 University Of Louisville Research Foundation, Inc. Non-contact and passive measurement of arterial pulse through thermal IR imaging, and analysis of thermal IR imagery
CN100486521C (en) 2006-07-19 2009-05-13 西门子(中国)有限公司 Device transmitting magnetic resonance signal in MRI guided medical equipment
US7741686B2 (en) 2006-07-20 2010-06-22 The Board Of Trustees Of The Leland Stanford Junior University Trench isolated capacitive micromachined ultrasonic transducer arrays with a supporting frame
US7535794B2 (en) 2006-08-01 2009-05-19 Insightec, Ltd. Transducer surface mapping
EP2049365B1 (en) 2006-08-01 2017-12-13 3M Innovative Properties Company Illumination device
US20080033278A1 (en) 2006-08-01 2008-02-07 Insightec Ltd. System and method for tracking medical device using magnetic resonance detection
US7652410B2 (en) 2006-08-01 2010-01-26 Insightec Ltd Ultrasound transducer with non-uniform elements
US20080033292A1 (en) 2006-08-02 2008-02-07 Insightec Ltd Ultrasound patient interface device
US7903830B2 (en) 2006-08-10 2011-03-08 Siemens Medical Solutions Usa, Inc. Push-pull capacitive micro-machined ultrasound transducer array
CN101126800B (en) 2006-08-16 2010-05-12 西门子(中国)有限公司 HIFU compatibe MRI radio frequency signal receiving coil and its receiving method
CN100574829C (en) 2006-08-24 2009-12-30 重庆融海超声医学工程研究中心有限公司 A kind of high-strength focus supersonic therapeutic system of image documentation equipment guiding
CN100574828C (en) 2006-08-24 2009-12-30 重庆融海超声医学工程研究中心有限公司 A kind of apparatus for ultrasonic therapeutic treatment and contain the supersonic therapeutic system of this apparatus for ultrasonic therapeutic treatment
DE102006040420A1 (en) 2006-08-29 2008-03-13 Siemens Ag Thermal ablation e.g. microwave ablation, implementing and monitoring device for treating tumor of patient, has magnet resonance system producing images composed of voxel, where geometry of voxel is adapted to form of ultrasonic focus
CN101140354B (en) 2006-09-04 2012-01-25 重庆融海超声医学工程研究中心有限公司 Resonant vibration type supersonic transducer
US20080097207A1 (en) 2006-09-12 2008-04-24 Siemens Medical Solutions Usa, Inc. Ultrasound therapy monitoring with diagnostic ultrasound
US9566454B2 (en) 2006-09-18 2017-02-14 Guided Therapy Systems, Llc Method and system for non-ablative acne treatment and prevention
ES2579765T3 (en) 2006-09-19 2016-08-16 Guided Therapy Systems, L.L.C. System for the treatment of muscle, tendon, ligamentous and cartilaginous tissue
US7825383B2 (en) 2006-09-21 2010-11-02 Siemens Medical Solutions Usa, Inc. Mobile camera for organ targeted imaging
US7559905B2 (en) 2006-09-21 2009-07-14 Focus Surgery, Inc. HIFU probe for treating tissue with in-line degassing of fluid
US8242665B2 (en) 2006-09-25 2012-08-14 Koninklijke Philips Electronics N.V. Flip-chip interconnection through chip vias
WO2008040015A2 (en) 2006-09-28 2008-04-03 University Of Washington 3d micro-scale engineered tissue model systems
CN101164637B (en) 2006-10-16 2011-05-18 重庆融海超声医学工程研究中心有限公司 Ultrasonic therapeutic system capable of reducing electromagnetic interference to imaging equipment
US20080183077A1 (en) 2006-10-19 2008-07-31 Siemens Corporate Research, Inc. High intensity focused ultrasound path determination
USD558351S1 (en) 2006-10-31 2007-12-25 Sonosite, Inc. Ultrasound display apparatus
US20100056925A1 (en) 2006-11-28 2010-03-04 Chongqing Ronghai Medical Ultrasound Industry Ltd. Ultrasonic Therapeutic Device Capable of Multipoint Transmitting
DE102006056885B4 (en) 2006-12-01 2016-06-30 Siemens Healthcare Gmbh Method and device for positioning a bearing device of a magnetic resonance apparatus
US7451651B2 (en) 2006-12-11 2008-11-18 General Electric Company Modular sensor assembly and methods of fabricating the same
CN101204700B (en) 2006-12-19 2012-08-08 重庆融海超声医学工程研究中心有限公司 Electromagnetic ultrasonic transducer and array thereof
US8672850B1 (en) 2007-01-11 2014-03-18 General Electric Company Focusing of a two-dimensional array to perform four-dimensional imaging
JP5211487B2 (en) 2007-01-25 2013-06-12 株式会社ニコン Exposure method, exposure apparatus, and microdevice manufacturing method
US20100298711A1 (en) 2007-01-29 2010-11-25 Worcester Polytechnic Institute Wireless ultrasound transducer using ultrawideband
CN101234234B (en) 2007-01-30 2011-11-16 西门子公司 Automatic selection method for region of interest of covering heating region
US7687976B2 (en) 2007-01-31 2010-03-30 General Electric Company Ultrasound imaging system
US7920731B2 (en) 2007-03-27 2011-04-05 Siemens Medical Solutions Usa, Inc. Bleeding detection using a blanket ultrasound device
CN101273890B (en) 2007-03-29 2010-10-06 西门子(中国)有限公司 Method and device for reducing folding artifact in HIFU therapy imaging monitored by MR
JP4885779B2 (en) 2007-03-29 2012-02-29 オリンパスメディカルシステムズ株式会社 Capacitance type transducer device and intracorporeal ultrasound diagnostic system
CN101273891B (en) 2007-03-29 2010-09-29 西门子(中国)有限公司 Method and device for accelerating magnetic resonance temperature imaging
US7824335B2 (en) 2007-04-26 2010-11-02 General Electric Company Reconfigurable array with multi-level transmitters
US7892176B2 (en) 2007-05-02 2011-02-22 General Electric Company Monitoring or imaging system with interconnect structure for large area sensor array
US8870771B2 (en) 2007-05-04 2014-10-28 Barbara Ann Karmanos Cancer Institute Method and apparatus for categorizing breast density and assessing cancer risk utilizing acoustic parameters
TWI526233B (en) 2007-05-07 2016-03-21 指導治療系統股份有限公司 Methods and systems for modulating medicants using acoustic energy
US8764687B2 (en) 2007-05-07 2014-07-01 Guided Therapy Systems, Llc Methods and systems for coupling and focusing acoustic energy using a coupler member
US20080296708A1 (en) 2007-05-31 2008-12-04 General Electric Company Integrated sensor arrays and method for making and using such arrays
WO2008146206A2 (en) 2007-06-01 2008-12-04 Koninklijke Philips Electronics, N.V. Wireless ultrasound probe asset tracking
US20090018446A1 (en) 2007-07-10 2009-01-15 Insightec, Ltd. Transrectal ultrasound ablation probe
EP2170531A2 (en) 2007-07-31 2010-04-07 Koninklijke Philips Electronics N.V. Cmuts with a high-k dielectric
US8052604B2 (en) 2007-07-31 2011-11-08 Mirabilis Medica Inc. Methods and apparatus for engagement and coupling of an intracavitory imaging and high intensity focused ultrasound probe
US7978461B2 (en) 2007-09-07 2011-07-12 Sonosite, Inc. Enhanced ultrasound system
USD591423S1 (en) 2007-09-07 2009-04-28 Sonosite, Inc. Ultrasound platform
US8235902B2 (en) 2007-09-11 2012-08-07 Focus Surgery, Inc. System and method for tissue change monitoring during HIFU treatment
US8277380B2 (en) 2007-09-11 2012-10-02 Siemens Medical Solutions Usa, Inc. Piezoelectric and CMUT layered ultrasound transducer array
US8137278B2 (en) 2007-09-12 2012-03-20 Sonosite, Inc. System and method for spatial compounding using phased arrays
US10092270B2 (en) 2007-09-17 2018-10-09 Koninklijke Philips Electronics N.V. Pre-collapsed CMUT with mechanical collapse retention
US8327521B2 (en) 2007-09-17 2012-12-11 Koninklijke Philips Electronics N.V. Method for production and using a capacitive micro-machined ultrasonic transducer
US20100256488A1 (en) 2007-09-27 2010-10-07 University Of Southern California High frequency ultrasonic convex array transducers and tissue imaging
CN101396280A (en) 2007-09-27 2009-04-01 重庆融海超声医学工程研究中心有限公司 Ultrasonic therapy intestinal tract pushing device
US8251908B2 (en) 2007-10-01 2012-08-28 Insightec Ltd. Motion compensated image-guided focused ultrasound therapy system
US7843022B2 (en) 2007-10-18 2010-11-30 The Board Of Trustees Of The Leland Stanford Junior University High-temperature electrostatic transducers and fabrication method
US7745248B2 (en) 2007-10-18 2010-06-29 The Board Of Trustees Of The Leland Stanford Junior University Fabrication of capacitive micromachined ultrasonic transducers by local oxidation
US8439907B2 (en) 2007-11-07 2013-05-14 Mirabilis Medica Inc. Hemostatic tissue tunnel generator for inserting treatment apparatus into tissue of a patient
US8187270B2 (en) 2007-11-07 2012-05-29 Mirabilis Medica Inc. Hemostatic spark erosion tissue tunnel generator with integral treatment providing variable volumetric necrotization of tissue
CN100560157C (en) 2007-11-13 2009-11-18 重庆市生力医疗设备有限公司 Ultrasonic medicine plaster
US7786584B2 (en) 2007-11-26 2010-08-31 Infineon Technologies Ag Through substrate via semiconductor components
US8767514B2 (en) 2007-12-03 2014-07-01 Kolo Technologies, Inc. Telemetric sensing using micromachined ultrasonic transducer
EP2215855A1 (en) 2007-12-03 2010-08-11 Kolo Technologies, Inc. Capacitive micromachined ultrasonic transducer with voltage feedback
JP2011505205A (en) 2007-12-03 2011-02-24 コロ テクノロジーズ インコーポレイテッド Ultrasonic scanner constructed with capacitive micromachined ultrasonic transducer (CMUTS)
US8483014B2 (en) 2007-12-03 2013-07-09 Kolo Technologies, Inc. Micromachined ultrasonic transducers
WO2009073692A1 (en) 2007-12-03 2009-06-11 Kolo Technologies, Inc. Packaging and connecting electrostatic transducer arrays
US8559274B2 (en) 2007-12-03 2013-10-15 Kolo Technologies, Inc. Dual-mode operation micromachined ultrasonic transducer
US8345513B2 (en) 2007-12-03 2013-01-01 Kolo Technologies, Inc. Stacked transducing devices
EP2227835A1 (en) 2007-12-03 2010-09-15 Kolo Technologies, Inc. Variable operating voltage in micromachined ultrasonic transducer
EP2218094A1 (en) 2007-12-03 2010-08-18 Kolo Technologies, Inc. Through-wafer interconnections in electrostatic transducer and array
US8787116B2 (en) 2007-12-14 2014-07-22 Koninklijke Philips N.V. Collapsed mode operable cMUT including contoured substrate
EP2231011A1 (en) * 2007-12-31 2010-09-29 Real Imaging Ltd. System and method for registration of imaging data
CN101513554B (en) 2008-02-21 2011-09-07 Chongqing Haifu (HIFU) Technology Co., Ltd. Intelligent tissue-mimicking ultrasonic phantom and preparation method thereof
US20110055447A1 (en) 2008-05-07 2011-03-03 Signostics Limited Docking system for medical diagnostic scanning using a handheld device
GB2459862B (en) 2008-05-07 2010-06-30 Wolfson Microelectronics Plc Capacitive transducer circuit and method
JP2009291514A (en) 2008-06-09 2009-12-17 Canon Inc Method for manufacturing capacitive transducer, and capacitive transducer
JP5063515B2 (en) * 2008-07-25 2012-10-31 Hitachi Aloka Medical, Ltd. Ultrasonic diagnostic equipment
US7898905B2 (en) 2008-07-28 2011-03-01 General Electric Company Reconfigurable array with locally determined switch configuration
US8216161B2 (en) 2008-08-06 2012-07-10 Mirabilis Medica Inc. Optimization and feedback control of HIFU power deposition through the frequency analysis of backscattered HIFU signals
US9248318B2 (en) 2008-08-06 2016-02-02 Mirabilis Medica Inc. Optimization and feedback control of HIFU power deposition through the analysis of detected signal characteristics
US8133182B2 (en) 2008-09-09 2012-03-13 Siemens Medical Solutions Usa, Inc. Multi-dimensional transducer array and beamforming for ultrasound imaging
US9050449B2 (en) 2008-10-03 2015-06-09 Mirabilis Medica, Inc. System for treating a volume of tissue with high intensity focused ultrasound
EP2331207B1 (en) 2008-10-03 2013-12-11 Mirabilis Medica Inc. Apparatus for treating tissues with hifu
US8237601B2 (en) 2008-10-14 2012-08-07 Sonosite, Inc. Remote control device
US20100173437A1 (en) 2008-10-21 2010-07-08 Wygant Ira O Method of fabricating CMUTs that generate low-frequency and high-intensity ultrasound
EP2349482B1 (en) 2008-10-24 2016-07-27 Mirabilis Medica Inc. Apparatus for feedback control of hifu treatments
FR2939003B1 (en) 2008-11-21 2011-02-25 Commissariat Energie Atomique CMUT cell formed of a membrane of nanotubes, nanowires, or nanobeams, and ultra-high-frequency acoustic imaging device comprising a plurality of such cells
US20100160781A1 (en) 2008-12-09 2010-06-24 University Of Washington Doppler and image guided device for negative feedback phased array hifu treatment of vascularized lesions
US8176787B2 (en) 2008-12-17 2012-05-15 General Electric Company Systems and methods for operating a two-dimensional transducer array
CN102281818B (en) 2009-01-16 2013-11-06 Hitachi Medical Corporation Ultrasonic probe manufacturing method and ultrasonic probe
US8108647B2 (en) 2009-01-29 2012-01-31 International Business Machines Corporation Digital data architecture employing redundant links in a daisy chain of component modules
US8398408B1 (en) 2009-02-25 2013-03-19 Sonosite, Inc. Charging station for cordless ultrasound cart
US8402831B2 (en) 2009-03-05 2013-03-26 The Board Of Trustees Of The Leland Stanford Junior University Monolithic integrated CMUTs fabricated by low-temperature wafer bonding
KR20110127736A (en) 2009-03-06 2011-11-25 Mirabilis Medica Inc. Ultrasound treatment and imaging applicator
US8315125B2 (en) 2009-03-18 2012-11-20 Sonetics Ultrasound, Inc. System and method for biasing CMUT elements
US8408065B2 (en) 2009-03-18 2013-04-02 Bp Corporation North America Inc. Dry-coupled permanently installed ultrasonic sensor linear array
WO2010109363A2 (en) 2009-03-23 2010-09-30 Koninklijke Philips Electronics, N.V. Gas sensing using ultrasound
ES2416182T3 (en) 2009-03-26 2013-07-30 Norwegian University Of Science And Technology (NTNU) Wafer-bonded CMUT array with conductive vias
EP2413802A1 (en) 2009-04-01 2012-02-08 Analogic Corporation Ultrasound probe
US8355554B2 (en) 2009-04-14 2013-01-15 Sonosite, Inc. Systems and methods for adaptive volume imaging
US8992426B2 (en) 2009-05-04 2015-03-31 Siemens Medical Solutions Usa, Inc. Feedback in medical ultrasound imaging for high intensity focused ultrasound
US8276433B2 (en) 2009-05-18 2012-10-02 The Board Of Trustees Of The Leland Stanford Junior University Sensor for measuring properties of liquids and gases
US8207652B2 (en) 2009-06-16 2012-06-26 General Electric Company Ultrasound transducer with improved acoustic performance
US8451693B2 (en) 2009-08-25 2013-05-28 The Board Of Trustees Of The Leland Stanford Junior University Micromachined ultrasonic transducer having compliant post structure
US8409095B1 (en) 2009-09-03 2013-04-02 Sonosite, Inc. Systems and methods for hands free control of medical devices
US20110060221A1 (en) 2009-09-04 2011-03-10 Siemens Medical Solutions Usa, Inc. Temperature prediction using medical diagnostic ultrasound
US8345508B2 (en) 2009-09-20 2013-01-01 General Electric Company Large area modular sensor array assembly and method for making the same
US8563345B2 (en) 2009-10-02 2013-10-22 National Semiconductor Corporation Integration of structurally-stable isolated capacitive micromachined ultrasonic transducer (CMUT) array cells and array elements
US8222065B1 (en) 2009-10-02 2012-07-17 National Semiconductor Corporation Method and system for forming a capacitive micromachined ultrasonic transducer
US8324006B1 (en) 2009-10-28 2012-12-04 National Semiconductor Corporation Method of forming a capacitive micromachined ultrasonic transducer (CMUT)
US8081301B2 (en) 2009-10-08 2011-12-20 The United States Of America As Represented By The Secretary Of The Army LADAR transmitting and receiving system and method
US8819591B2 (en) 2009-10-30 2014-08-26 Accuray Incorporated Treatment planning in a virtual environment
US8368401B2 (en) 2009-11-10 2013-02-05 Insightec Ltd. Techniques for correcting measurement artifacts in magnetic resonance thermometry
US8715186B2 (en) 2009-11-24 2014-05-06 Guided Therapy Systems, Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
DE102009060317B4 (en) 2009-12-23 2013-04-04 Siemens Aktiengesellschaft A contrast agent for use in an imaging method for diagnosing a metastatic tumor disease and a method for imaging a metastatic tumor tissue
US20110178407A1 (en) 2010-01-20 2011-07-21 Siemens Medical Solutions Usa, Inc. Hard and Soft Backing for Medical Ultrasound Transducer Array
EP2528509B1 (en) 2010-01-29 2021-10-13 University Of Virginia Patent Foundation Ultrasound for locating anatomy or probe guidance
US8717360B2 (en) 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
US8876716B2 (en) 2010-02-12 2014-11-04 Delphinus Medical Technologies, Inc. Method of characterizing breast tissue using multiple ultrasound renderings
WO2011100691A1 (en) 2010-02-12 2011-08-18 Delphinus Medical Technologies, Inc. Method of characterizing the pathological response of tissue to a treatment plan
US20130278631A1 (en) 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US20110218436A1 (en) 2010-03-06 2011-09-08 Dewey Russell H Mobile ultrasound system with computer-aided detection
JP5394299B2 (en) 2010-03-30 2014-01-22 Fujifilm Corporation Ultrasonic diagnostic equipment
US8876740B2 (en) 2010-04-12 2014-11-04 University Of Washington Methods and systems for non-invasive treatment of tissue using high intensity focused ultrasound therapy
US8439840B1 (en) 2010-05-04 2013-05-14 Sonosite, Inc. Ultrasound imaging system and method with automatic adjustment and/or multiple sample volumes
KR101999078B1 (en) 2010-06-09 2019-07-10 Regents of the University of Minnesota Dual mode ultrasound transducer (DMUT) system and method for controlling delivery of ultrasound therapy
US9465090B2 (en) 2010-06-09 2016-10-11 Siemens Aktiengesellschaft Method of magnetic resonance-based temperature mapping
US8647279B2 (en) 2010-06-10 2014-02-11 Siemens Medical Solutions Usa, Inc. Volume mechanical transducer for medical diagnostic ultrasound
US8512247B2 (en) * 2010-06-25 2013-08-20 John C. Hill System for non-invasive determination of glycogen stores
US8527033B1 (en) 2010-07-01 2013-09-03 Sonosite, Inc. Systems and methods for assisting with internal positioning of instruments
US20120005624A1 (en) 2010-07-02 2012-01-05 Vesely Michael A User Interface Elements for Use within a Three Dimensional Scene
JP5702966B2 (en) 2010-08-02 2015-04-15 Canon Inc. Electromechanical transducer and method for manufacturing the same
EP2605830A4 (en) 2010-08-18 2015-12-02 Mirabilis Medica Inc Hifu applicator
US7954387B1 (en) 2010-08-18 2011-06-07 General Electric Company Ultrasonic transducer device
US8425425B2 (en) 2010-09-20 2013-04-23 M. Dexter Hagy Virtual image formation method for an ultrasound device
US8716816B2 (en) 2010-10-12 2014-05-06 Micralyne Inc. SOI-based CMUT device with buried electrodes
US9354718B2 (en) 2010-12-22 2016-05-31 Zspace, Inc. Tightly coupled interactive stereo display
KR20120073887A (en) * 2010-12-27 2012-07-05 Samsung Electronics Co., Ltd. Image processing apparatus and method for processing image thereof
US8128050B1 (en) 2011-02-08 2012-03-06 Sonosite, Inc. Ultrasound scanner support devices
DE102011011530B4 (en) 2011-02-17 2013-05-08 Karlsruher Institut für Technologie Method for reducing ultrasound data
US8891334B2 (en) 2011-03-04 2014-11-18 Georgia Tech Research Corporation Compact, energy-efficient ultrasound imaging probes using CMUT arrays with integrated electronics
USD657361S1 (en) 2011-03-25 2012-04-10 Sonosite, Inc. Housing for an electronic device
US8804457B2 (en) 2011-03-31 2014-08-12 Maxim Integrated Products, Inc. Transmit/receive systems for imaging devices
US20120250454A1 (en) 2011-04-04 2012-10-04 Robert Nicholas Rohling Method and system for shaping a cmut membrane
US9736466B2 (en) 2011-05-27 2017-08-15 Zspace, Inc. Optimizing stereo video display
US9161025B2 (en) 2011-08-29 2015-10-13 Zspace, Inc. Extended overdrive tables and use
CN102981156A (en) 2011-09-06 2013-03-20 Institute of Acoustics, Chinese Academy of Sciences Ultrasonic imaging post-processing method and device thereof
WO2013059358A2 (en) 2011-10-17 2013-04-25 Butterfly Network, Inc. Transmissive imaging and related apparatus and methods
US9533873B2 (en) 2013-02-05 2017-01-03 Butterfly Network, Inc. CMOS ultrasonic transducers and related apparatus and methods
KR20220097541A (en) 2013-03-15 2022-07-07 Butterfly Network, Inc. Monolithic ultrasonic imaging devices, systems and methods
EP3639937A1 (en) 2013-03-15 2020-04-22 Butterfly Network, Inc. Complementary metal oxide semiconductor (cmos) ultrasonic transducers and methods for forming the same
CA2919183A1 (en) 2013-07-23 2015-01-29 Butterfly Network, Inc. Interconnectable ultrasound transducer probes and related methods and apparatus

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11439364B2 (en) 2013-03-15 2022-09-13 Bfly Operations, Inc. Ultrasonic imaging devices, systems and methods
US10980511B2 (en) 2013-07-23 2021-04-20 Butterfly Network, Inc. Interconnectable ultrasound transducer probes and related methods and apparatus
US11039812B2 (en) 2013-07-23 2021-06-22 Butterfly Network, Inc. Interconnectable ultrasound transducer probes and related methods and apparatus
US11647985B2 (en) 2013-07-23 2023-05-16 Bfly Operations, Inc. Interconnectable ultrasound transducer probes and related methods and apparatus
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound
US11553896B2 (en) 2017-03-23 2023-01-17 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
US10681357B2 (en) 2017-03-27 2020-06-09 Vave Health, Inc. Dynamic range compression of ultrasound images
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
EP3777700A4 (en) * 2018-04-13 2021-06-02 FUJIFILM Corporation Ultrasonic system and method of controlling ultrasonic system
US11690596B2 (en) 2018-04-13 2023-07-04 Fujifilm Corporation Ultrasound system and method for controlling ultrasound system

Also Published As

Publication number Publication date
US20140300720A1 (en) 2014-10-09
JP2016515903A (en) 2016-06-02
JP2019030736A (en) 2019-02-28
JP6786384B2 (en) 2020-11-18
CA2908631A1 (en) 2014-10-09
EP2981215A2 (en) 2016-02-10
WO2014165662A3 (en) 2014-12-31
CN105263419A (en) 2016-01-20
KR20150145236A (en) 2015-12-29
CA2908631C (en) 2021-08-24
US20170228862A1 (en) 2017-08-10
US20170309023A1 (en) 2017-10-26
US9667889B2 (en) 2017-05-30

Similar Documents

Publication Publication Date Title
CA2908631C (en) Portable electronic devices with integrated imaging capabilities
EP3471622B1 (en) Universal ultrasound device
US11540805B2 (en) Universal ultrasound device and related apparatus and methods
US11833542B2 (en) CMOS ultrasonic transducers and related apparatus and methods
EP3236855B1 (en) Device and system for monitoring internal organs of a human or animal
CN106999149B (en) Multi-sensor ultrasound probe and related methods
US20110077517A1 (en) Ultrasonic diagnostic apparatus
KR101915255B1 (en) Method of manufacturing an ultrasonic probe, and ultrasonic probe
CN113950295A (en) Method and apparatus for collecting ultrasound data along different elevational steering angles
KR20150020945A (en) Acoustic probe and Method for manufacturing the same
US20140221840A1 (en) Ultrasound transducer, ultrasound probe including the same, and ultrasound diagnostic equipment including the ultrasound probe
CN110087539A (en) Ultrasound-guided positioning of therapeutic equipment
CN104739445B (en) Ultrasonic diagnostic equipment management system and method for controlling the same
US11857372B2 (en) System and method for graphical user interface with filter for ultrasound image presets
KR20180023604A (en) Ultrasonic probe and ultrasonic imaging apparatus
KR20150084635A (en) Ultrasonic probe and Method for manufacturing the same

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 201480031564.X; Country of ref document: CN

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 14725300; Country of ref document: EP; Kind code of ref document: A2

ENP Entry into the national phase
Ref document number: 2908631; Country of ref document: CA
Ref document number: 2016506609; Country of ref document: JP; Kind code of ref document: A

WWE Wipo information: entry into national phase
Ref document number: 2014725300; Country of ref document: EP

ENP Entry into the national phase
Ref document number: 20157031515; Country of ref document: KR; Kind code of ref document: A