US20170309023A1 - Portable Electronic Devices With Integrated Imaging Capabilities - Google Patents
- Publication number
- US20170309023A1 (application US 15/644,456)
- Authority
- US
- United States
- Prior art keywords
- dimensional image
- image
- imaging device
- following
- combination
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06T7/0012—Biomedical image inspection (image analysis; inspection of images, e.g. flaw detection)
- A61B8/4411—Ultrasonic diagnostic device being modular
- A61B8/4427—Ultrasonic diagnostic device being portable or laptop-like
- A61B8/4455—Features of the external shape of the probe, e.g. ergonomic aspects
- A61B8/4477—Ultrasonic diagnostic device using several separate ultrasound transducers or probes
- A61B8/4494—Ultrasound transducer characterised by the arrangement of the transducer elements
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- G01S15/899—Short-range pulse-echo acoustic imaging systems combined with ancillary equipment
- G06T19/006—Mixed reality
- G06T7/40—Analysis of texture
- G06T7/60—Analysis of geometric attributes
- G06T7/90—Determination of colour characteristics
- H04N5/30—Transforming light or analogous information into electric information
- G06T2207/10136—3D ultrasound image (image acquisition modality)
- G06T2207/30008—Bone (biomedical image processing)
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular (biomedical image processing)
- G06T2219/016—Exploded view
Definitions
- the portable electronic devices and methods described herein may include, be coupled to (e.g., via a suitable communications connection or port such as a USB link), or otherwise utilize one or more radiation sources, sensors, and/or transducers (e.g., array(s) of ultrasound transducers), front-end processing circuitry and associated processing techniques, and/or image reconstruction devices and/or methods, in order to generate and/or render images to a user according to the non-limiting embodiments described in detail throughout the present disclosure.
- one or more of the devices described in FIGS. 1A-8E herein may include or be coupled to one or more ultrasound imaging elements (e.g., one or more arrays of ultrasound sources, sensors, and/or transducers).
- One or more computers or processors within the portable electronic device may perform image analysis and/or image rendering based at least in part on radiation signals received by an imaging device.
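- The acquisition, analysis, and rendering stages described above can be pictured as a simple pipeline. The Python sketch below is purely illustrative (the disclosure does not specify an implementation); the stage callables and the placeholder data are assumptions.

```python
import numpy as np

def imaging_pipeline(acquire, reconstruct, render, frames=1):
    """Chain the stages described above: receive radiation signals from the
    imaging elements, reconstruct an image from them, and hand the result to
    the on-screen "window". All three callables are hypothetical stand-ins."""
    for _ in range(frames):
        raw = acquire()            # signals received by the imaging elements
        image = reconstruct(raw)   # image analysis / reconstruction
        render(image)              # drawn on the device's imaging interface

# Example wiring with trivial placeholders:
rng = np.random.default_rng(0)
imaging_pipeline(
    acquire=lambda: rng.normal(size=(24, 2048)),   # 24 elements x 2048 samples
    reconstruct=lambda raw: np.abs(raw),
    render=lambda img: None,
)
```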
- FIG. 1A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a human body or a portion of a human body 106 according to some embodiments.
- FIG. 1B illustrates a three-dimensional internal image 110 of a portion of a human body that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
- the portable electronic device 100 may be positioned in an area near (e.g. in contact with the surface of or within about one meter from the surface of) a portion of a human body that is to be imaged and/or analyzed.
- the portable electronic device 100 may include imaging elements 104 that are configured to transmit and/or receive radiation signals.
- An internal image 110 as shown in FIG. 1B may be generated by the portable electronic device 100 .
- the internal image 110 may be a three-dimensional internal image of a portion of the human body that appears to a viewer 117 to project upward from a surface of the portable electronic device 100 , giving the viewer the perception of a viewing window into the underlying body.
- the portable electronic device 100 may provide a window into the internal areas of the human body that are below the surface.
- the generated images may be real-time continuous images such that the images are dynamically updated based on movement of the portable electronic device 100 and/or the image target (e.g., internal organs of the human body).
- FIG. 2A illustrates a front view of portable electronic device 100 including an imaging interface 102 according to some embodiments of the present disclosure.
- the imaging interface 102 of the portable electronic device 100 may include a display that is configured to output a two-dimensional (2-D) or three-dimensional (3-D) image of an imaging target.
- the imaging interface 102 is interactive and is capable of receiving user input, for example through a touch-screen.
- An image that is displayed via the imaging interface 102 may be adjusted based on the received inputs, for example, to adjust zoom level, centering position, level of detail, depth of an underlying object to be imaged, resolution, brightness, color and/or the like of the image.
- imaging interface 102 may be configured to allow a user to selectively traverse various layers and imaging depths of the underlying object using, for example, the touch screen.
- Portable electronic device 100 may render a three-dimensional image of the imaging target using any suitable method or combination of methods (e.g., anaglyph, polarization, eclipse, interference filtering, and/or autostereoscopy).
- the imaging interface 102 includes a circular polarizer and/or a linear polarizer such that a viewer having polarizing filtering spectacles can view a three-dimensional image.
- the imaging interface 102 is configured to display alternating left and right images such that a viewer wearing spectacles with shutters that alternate in synchronization with the displayed images perceives a three-dimensional image.
- the imaging interface 102 may utilize an autostereoscopy method such that 3-D spectacles are not necessary for use by a viewer to view the three-dimensional image.
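- As one illustration of the listed rendering approaches, an anaglyph can be produced by merging separately rendered left-eye and right-eye views into a single frame that a viewer with red-cyan spectacles perceives in three dimensions. The sketch below assumes 8-bit RGB images and is not drawn from the disclosure.

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Left view supplies the red channel; right view supplies green and blue."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]
    out[..., 1:] = right_rgb[..., 1:]
    return out

# Toy example with two solid-colour 2x2 "renderings":
left = np.full((2, 2, 3), [200, 50, 50], dtype=np.uint8)
right = np.full((2, 2, 3), [50, 120, 200], dtype=np.uint8)
frame = red_cyan_anaglyph(left, right)
```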
- portable electronic device 100 may display information (e.g., text and/or graphics) in addition to (e.g., graphically overlaid on top of or adjacent to) an image of a targeted object, such as, for example, text and/or graphics identifying the structure(s) identified in the image (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts).
- portable electronic device 100 may include one or more processors for identifying structure(s) identified in the image based at least in part on stored data (e.g., data stored in random access memory or other storage device of portable electronic device 100 ).
- data stored within device 100 may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100 .
- data stored within device 100 may identify characteristics of particular disease(s) such as cancer or other abnormalities for use by personal electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100 .
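- A minimal sketch of how stored characteristic data might be matched against features measured from a rendered image to predict a structure type. The reference values and feature choices below are invented for illustration; the disclosure only states that such stored characteristics may be used.

```python
# Hypothetical lookup table of stored characteristics per structure type.
REFERENCE_FEATURES = {
    "bone":   {"echogenicity": 0.90, "area_cm2": 4.0},
    "vessel": {"echogenicity": 0.15, "area_cm2": 0.8},
    "organ":  {"echogenicity": 0.50, "area_cm2": 30.0},
}

def predict_structure(echogenicity, area_cm2):
    """Return the stored structure type whose characteristics best match a
    segmented image region (simple normalised nearest-match)."""
    def distance(ref):
        return abs(ref["echogenicity"] - echogenicity) + abs(ref["area_cm2"] - area_cm2) / 30.0
    return min(REFERENCE_FEATURES, key=lambda name: distance(REFERENCE_FEATURES[name]))

label = predict_structure(echogenicity=0.18, area_cm2=1.1)   # -> "vessel"
```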
- the image, text, graphics, and/or other information displayed on the user interface 104 may be adjusted through user interaction with one or more inputs (e.g., touch screen, buttons, touch-sensitive areas, or the like) of the portable electronic device 100 .
- FIG. 2B illustrates a back view of portable electronic device 100 including imaging elements 104 according to some embodiments of the present disclosure.
- the imaging elements 104 may be configured as sources (emitters) and/or sensors of ultrasound radiation and/or other radiation.
- the imaging elements 104 may be of substantially the same size and/or may be arranged in an array as shown in FIG. 2B .
- the imaging elements 104 may be of different sizes and/or arranged in an irregular or scattered configuration.
- one or more (e.g., all) of the imaging elements 104 may be arranged in the same plane. In other embodiments, at least some of imaging elements may be arranged in at least two different planes.
- all of the imaging elements 104 included in the portable electronic device 100 may be either emitting elements or sensing elements. In some embodiments, the imaging elements 104 may include both emitting elements and sensing elements.
- The embodiment shown in FIG. 2B includes a 4×6 array of imaging elements 104 by way of illustration only; it is not intended to be limiting. In other embodiments, any other suitable number of imaging elements may be provided (e.g., 10, 20, 30, 40, 50, 100, 200, 500, 1000, or any number in between, or more) and may be arranged in any suitable configuration.
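- For illustration, the element positions of a regular planar array such as the 4×6 arrangement mentioned above can be generated as below. The 3 mm pitch is an assumed value, not a figure from the disclosure.

```python
import numpy as np

def planar_array_positions(rows=4, cols=6, pitch_mm=3.0):
    """Centre positions (x, y, z) in millimetres of the elements of a regular
    rows x cols planar array lying in the z = 0 plane of the device's back."""
    ys = (np.arange(rows) - (rows - 1) / 2) * pitch_mm
    xs = (np.arange(cols) - (cols - 1) / 2) * pitch_mm
    xx, yy = np.meshgrid(xs, ys)
    return np.stack([xx.ravel(), yy.ravel(), np.zeros(rows * cols)], axis=1)

positions = planar_array_positions()   # shape (24, 3)
```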
- the imaging elements 104 may be integrated within a circuit board (e.g., a printed circuit board) that includes, for example, processing (e.g., image processing) components of the portable electronic device 100 .
- the imaging elements 104 may be provided on a separate circuit board or layer of a circuit board than the processing components of the portable electronic device 100 , and may be in communication with the processing circuitry through a suitable communications link (e.g., an internal bus, USB link, or other port).
- the imaging elements 104 may include their own dedicated processing circuitry, such as a graphic processing unit (GPU), digital signal processor (DSP), and/or central processing unit (CPU), and/or may utilize processing circuitry of the portable electronic device 100 .
- the CPU and/or GPU of the portable electronic device 100 may be utilized for image acquisition/reconstruction and image rendering.
- the CPU of portable electronic device 100 may be utilized to process computations based on received signals (e.g., back-scattered signals and/or transmissive signals) in order to generate an image or topography, while the GPU may be utilized to render an image based on the information received from the CPU to generate a real-time or substantially real-time image display.
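- A sketch of the CPU/GPU division of labour described above, with a toy log-magnitude computation standing in for the CPU-side reconstruction and a normalisation step standing in for what the GPU would do when drawing the frame; both stand-ins are assumptions rather than the disclosed processing.

```python
import numpy as np

def reconstruct_on_cpu(raw_rf):
    """CPU stage: compute an image/topography from the received signals
    (toy log-magnitude stand-in for the real back-scatter/transmission maths)."""
    return 20 * np.log10(np.abs(raw_rf) + 1e-6)

def render_on_gpu(image):
    """Display stage: on the device this would upload the image as a texture and
    draw the 3-D window; here it just rescales to displayable 8-bit values."""
    lo, hi = image.min(), image.max()
    return ((image - lo) / (hi - lo + 1e-12) * 255).astype(np.uint8)

raw = np.random.default_rng(1).normal(size=(24, 2048))   # one frame of element data
display_frame = render_on_gpu(reconstruct_on_cpu(raw))
```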
- portable electronic device 100 may include one or more components for processing, filtering, amplification, and/or rendering images.
- FIG. 3 illustrates a transmissive imaging system and method 301 according to some embodiments of the present disclosure.
- the transmissive imaging system 301 includes two portable electronic devices 100 A and 100 B that are on opposing or generally opposing sides of an imaging target 306 .
- devices 100 A and 100 B may be positioned in any other relationship with respect to one another.
- devices 100 A and/or 100 B may include one or more sensors for determining the relative positions of these devices to aid in the generation of image(s).
- device 100 B may be a dedicated sensing and/or emitting device such as an array of ultrasound elements and associated circuitry.
- Signals (e.g., waves or beams 308 ) emitted from the portable electronic device 100 B are sensed by the portable electronic device 100 A and are utilized to render a 2-D or 3-D image 312 (e.g., real-time or substantially real-time image) of the target 306 .
- a generated 3-D image may be in the form of a pop-out image or a depth image.
- the portable electronic device 100 A may be configured to transmit signals (e.g., waves or beams) 308 through the target 306 to be received by the portable electronic device 100 B.
- the portable electronic device 100 B may simultaneously or substantially simultaneously render an image (e.g., back view or alternate view or level of detail of an image rendered by device 100 A) based at least in part on processing sensed signals. In some embodiments, the portable electronic devices 100 A and/or 100 B may communicate the results of the sensed signals to the other in order to generate or improve a rendered image.
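- In a transmissive arrangement like FIG. 3, the simplest image is a per-ray attenuation map in which each sensing element compares what it received with what the opposite device emitted. The sketch below assumes matching 4×6 emitter and sensor grids and amplitude ratios; those details are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def attenuation_db(emitted_amplitude, received_amplitude):
    """Per-ray attenuation in dB for signals transmitted through the target."""
    received = np.maximum(np.asarray(received_amplitude, dtype=float), 1e-12)
    return 20 * np.log10(np.asarray(emitted_amplitude, dtype=float) / received)

# With aligned emitter/sensor grids, the per-element attenuation already forms
# a 2-D "shadow" image of the target placed between the two devices.
emitted = np.ones((4, 6))
received = np.random.default_rng(2).uniform(0.05, 0.9, size=(4, 6))
shadow_image = attenuation_db(emitted, received)
```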
- FIG. 4 illustrates a back-scatter or reflective imaging system and method 401 according to some embodiments of the present disclosure.
- a portable electronic device 100 may utilize emission and/or sensing elements 104 in order to render an image 410 based at least in part on reflection (e.g., back-scatter effect) of the signals 408 .
- In some embodiments, portable electronic device 100 is the only device utilized to image the target (e.g., to produce an image appearing as a window into a human body).
- the portable electronic device 100 may include both radiation sources and sensors (e.g., separate sources and sensors, and/or multiple transducers functioning as both sources and sensors), where all or substantially all of the radiation utilized by the sensors to reconstruct image(s) is backscatter radiation or radiation produced through a similar effect.
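- For the back-scatter case of FIG. 4, a classical way to turn per-element echoes into an image is delay-and-sum beamforming. The sketch below assumes a single plane-wave transmit leaving the array face at t = 0, a one-dimensional element layout, and a sound speed of 1540 m/s; none of these specifics come from the disclosure.

```python
import numpy as np

def delay_and_sum(rf, element_x_mm, pixel_x_mm, pixel_z_mm, fs_hz, c_m_s=1540.0):
    """Toy delay-and-sum beamformer for pulse-echo (back-scatter) data.

    rf: (n_elements, n_samples) echoes after one plane-wave transmit.
    The echo from a pixel at depth z arrives at an element after
    (z + distance(pixel, element)) / c seconds.
    """
    n_elem, n_samp = rf.shape
    image = np.zeros((len(pixel_z_mm), len(pixel_x_mm)))
    for zi, z in enumerate(pixel_z_mm):
        for xi, x in enumerate(pixel_x_mm):
            rx_mm = np.sqrt((element_x_mm - x) ** 2 + z ** 2)      # return path per element
            t_s = (z + rx_mm) * 1e-3 / c_m_s                       # transmit + return, mm -> m
            idx = np.clip(np.round(t_s * fs_hz).astype(int), 0, n_samp - 1)
            image[zi, xi] = np.abs(rf[np.arange(n_elem), idx].sum())
    return image

# Example: 24 elements at 3 mm pitch, 20 MHz sampling, a small 30 x 20 pixel grid.
rng = np.random.default_rng(3)
rf = rng.normal(size=(24, 4096))
elements = (np.arange(24) - 11.5) * 3.0
img = delay_and_sum(rf, elements, pixel_x_mm=np.linspace(-30, 30, 20),
                    pixel_z_mm=np.linspace(5, 60, 30), fs_hz=20e6)
```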
- FIG. 5 illustrates a transmissive and/or reflective imaging system and method 501 according to some embodiments of the present disclosure.
- a plurality of devices such as portable electronic devices 500 A, 500 B, 500 C, and/or 500 D may be utilized in order to render one or more image(s) 510 of target 506 on portable electronic device 500 B.
- Each of the portable electronic devices 500 A- 500 D may be configured to emit signals (e.g., waves or beams) 508 as shown in FIG. 5 .
- the image 510, or alternate views of the image or imaged structure, may be rendered on the other portable electronic devices (e.g., 500 A, 500 C, and 500 D) through communication with one another.
- each of the devices may be configured as emitting and/or sensing devices only.
- the image 510 that is rendered on portable device 500 B may be based at least in part on signals 508 that are emitted by one or more of the devices 500 A- 500 D, and which are sensed through reflection (e.g., back-scatter) and/or transmission by one or more of the devices 500 A- 500 D.
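- When several devices contribute co-registered estimates (reflective and/or transmissive) as in FIG. 5, one simple way to combine them is a weighted average. Registration and weighting are left open by the disclosure, so the sketch below just assumes equally sized images that are already aligned.

```python
import numpy as np

def fuse_views(images, weights=None):
    """Weighted average of already co-registered image estimates from several devices."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    w = np.ones(len(images)) if weights is None else np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), stack, axes=1)

reflective = np.random.default_rng(4).random((32, 32))
transmissive = np.random.default_rng(5).random((32, 32))
fused = fuse_views([reflective, transmissive], weights=[2.0, 1.0])
```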
- one or more portable electronic devices according to the present disclosure may generate and/or render an image based solely on signals received by one or more sensors (e.g., ultrasound transducers) of the device. In some embodiments, one or more portable electronic devices according to the present disclosure may generate and/or render an image based at least in part on information stored in memory (e.g., random access memory) of the portable device(s) identifying detail(s) regarding the structure(s), part(s), composition(s), and/or other characteristic(s) of object(s) to be imaged.
- the portable electronic devices may use stored data in addition to the received data in order to generate an image of the object and/or its constituent part(s), and/or to provide additional detail or explanation regarding an object and/or its constituent parts.
- the generated and/or rendered image may be a real-time or substantially real-time image that is dynamically updated based on movement of a portable electronic device 100 along a surface of an imaging target and/or motion of the imaging target.
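- The dynamic-update behaviour can be pictured as a loop that re-acquires and re-renders whenever the device has moved far enough, at a roughly fixed refresh rate. The motion source, threshold, and frame rate in the sketch below are assumptions for illustration only.

```python
import time
import numpy as np

def run_live_window(get_offset_mm, acquire, render,
                    min_move_mm=2.0, period_s=1 / 30, max_frames=300):
    """Re-render the on-screen "window" whenever the device (or target) has
    moved by at least `min_move_mm`, polling at roughly 1/period_s Hz."""
    last = None
    for _ in range(max_frames):
        pos = np.asarray(get_offset_mm(), dtype=float)
        if last is None or np.linalg.norm(pos - last) >= min_move_mm:
            render(acquire())
            last = pos
        time.sleep(period_s)

# Example wiring with stand-in callables:
run_live_window(
    get_offset_mm=lambda: (0.0, 0.0),            # e.g., derived from inertial sensors
    acquire=lambda: np.zeros((128, 128)),
    render=lambda frame: None,
    max_frames=3,
)
```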
- FIG. 6A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments.
- FIG. 6B illustrates a three-dimensional internal image 610 of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
- FIG. 6C illustrates a three-dimensional internal image 610 of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments.
- A three-dimensional internal image 610 of a portion of the human body may be generated and displayed to a viewer 617.
- the three-dimensional image 610 may appear to the viewer 617 as an image having variations in, for example, topography that correspond to the surfaces and/or other aspects or features of the internal portion of the body at the first position of the portable electronic device 100 as shown in FIG. 6A .
- the three-dimensional image 610 may be a real-time continuous image that is dynamically updated based on movement of the portable electronic device 100 and/or the internal portion of the body that is being analyzed. As shown in FIG. 6C , a different three-dimensional internal image 610 is displayed to the viewer 617 showing different underlying structures and/or aspects (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts).
- The three-dimensional internal image 610 shown in FIG. 6C corresponds to the internal image of the body portion at the second position of the portable electronic device 100 shown in FIG. 6A. As shown in FIG. 6C, the internal image 610 shows different topographical and/or other aspects or features of the body portion than the internal image 610 shown in FIG. 6B.
- different types of internal images of a target may be generated, such as a three-dimensional view of an entire organ or multiple organs.
- the imaging elements including sensors and/or sources (e.g., transducers), may be provided on, in, or otherwise coupled to a case for a portable electronic device.
- FIG. 7A illustrates a front view of a portable electronic device 700 according to some embodiments of the present disclosure.
- the portable electronic device 700 includes an imaging interface 702 .
- FIG. 7B illustrates a back view of the portable electronic device 700 according to some embodiments of the present disclosure.
- the portable electronic device 700 does not include imaging elements 104 as part of the main housing or enclosure of device 700 .
- FIG. 7C illustrates a front view of a case 711 for a portable electronic device according to some embodiments of the present disclosure.
- FIG. 7D illustrates a back view of the case 711 including imaging elements for a portable electronic device according to some embodiments of the present disclosure.
- the case 711 may be configured to at least partially enclose the portable electronic device 700 .
- case 711 may simultaneously provide imaging capabilities to portable electronic device 700 and serve as a protective case.
- The case may be made of any suitable material such as rubber, plastic, leather, and/or the like. As shown in FIG. 7D, an imaging circuit 712 (e.g., an integrated circuit) may be provided on (e.g., directly on), embedded in, and/or otherwise coupled to the back surface and/or other surface(s) of the case 711.
- Case 711 may be considered part of portable electronic device 700 .
- the imaging circuit 712 may include one or more imaging elements 104 . As discussed above, the imaging elements 104 may include sources and/or sensors.
- the imaging circuit 712 may also include a communication device 714 configured to communicate with the portable electronic device 700 via a wired or wireless link.
- the imaging circuit 712 may include a communication transmitter/receiver which utilizes an infrared signal, a Bluetooth communication signal, a near-field communication signal, and/or the like to communicate with the portable electronic device 700 .
- the communication device 714 may be in communication with the processing circuitry of a portable electronic device through a wired communications link (e.g., a USB port, or other data port), or combination of wired and wireless links.
- the imaging circuit 712 may receive power through wired and/or wireless connection(s) to the portable electronic device. In some embodiments, the imaging circuit 712 may receive power from a separate power source (e.g., a battery) that is coupled to the imaging circuit 712 . In some embodiments, when the portable electronic device 700 is coupled to or attached to the case 711 , a software application and/or drivers are automatically loaded and/or executed by the portable electronic device 700 in order to render an image based on communication with the imaging circuit 712 . The software application and/or drivers may be stored in a memory of the imaging circuit 712 and communicated to the portable electronic device 700 and/or may be retrieved by the portable electronic device through a network (e.g., the internet).
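- Whatever physical link is used (Bluetooth, near-field, USB, or another wired port), the imaging circuit ultimately has to ship raw frames to the handset in some framed form. The sketch below illustrates one such framing over an ordinary socket-like object as a stand-in for the real transport; the header layout and float32 payload are assumptions, not details from the disclosure.

```python
import struct
import numpy as np

FRAME_HEADER = struct.Struct("!HHI")   # rows, cols, payload length in bytes

def send_raw_frame(link, frame):
    """Imaging-circuit side: serialise one raw sensor frame onto the link
    (`link` is any object with sendall/recv, e.g. a socket standing in for
    the Bluetooth/USB transport)."""
    frame = np.asarray(frame, dtype=np.float32)
    payload = frame.tobytes()
    link.sendall(FRAME_HEADER.pack(frame.shape[0], frame.shape[1], len(payload)) + payload)

def recv_raw_frame(link):
    """Handset side: reassemble the frame before local processing and rendering."""
    rows, cols, nbytes = FRAME_HEADER.unpack(_read_exact(link, FRAME_HEADER.size))
    data = _read_exact(link, nbytes)
    return np.frombuffer(data, dtype=np.float32).reshape(rows, cols)

def _read_exact(link, n):
    buf = b""
    while len(buf) < n:
        chunk = link.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("link closed mid-frame")
        buf += chunk
    return buf
```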
- the portable electronic device 700 receives raw data from the communication device 714 and processes the raw data using processing circuitry (e.g., image signal processor, digital signal processor, filters, and/or the like) included in the portable electronic device 700 .
- the imaging circuit 712 includes a local imaging processor 716 configured to process signals received by imaging elements 104 .
- the communication device 714 may be configured to communicate data received from the imaging elements 104 (e.g., such as raw sensor data) and/or may communicate processed data that is received from the local imaging processor 716 .
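- A common first pass on raw echo data, whether it runs on the handset or on a local imaging processor, is envelope detection followed by log compression. The sketch below uses an FFT-based Hilbert transform; it is a generic signal-processing illustration rather than the processing chain claimed in the disclosure.

```python
import numpy as np

def envelope(rf):
    """Analytic-signal envelope of each row of `rf`, via an FFT-based Hilbert transform."""
    n = rf.shape[-1]
    spec = np.fft.fft(rf, axis=-1)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(spec * h, axis=-1))

def log_compress(env, dynamic_range_db=60.0):
    """Map the envelope into a 0..1 display range (B-mode style)."""
    db = 20 * np.log10(env / env.max() + 1e-12)
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)

raw = np.random.default_rng(6).normal(size=(24, 2048))
display = log_compress(envelope(raw))
```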
- the portable electronic device 700 includes an interface 702 for displaying an image that is rendered by processing signals received from the communication device 714 .
- In some embodiments, an imaging circuit (e.g., an integrated circuit) may be provided in a modular unit that can be utilized with cases for different portable electronic devices.
- FIG. 8A illustrates a front view of a case 811 A for a portable electronic device according to some embodiments of the present disclosure.
- FIG. 8B illustrates a back view of the case 811 A including a retaining mechanism 820 for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure.
- FIG. 8C illustrates a front view of a case 811 B for a portable electronic device according to some embodiments of the present disclosure.
- FIG. 8D illustrates a back view of the case 811 B including a retaining mechanism for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure.
- FIG. 8E illustrates a modular unit 830 including an imaging circuit 712 according to some embodiments of the present disclosure.
- the case 811 A has a different shape than the case 811 B.
- the case 811 A may be utilized for a first portable electronic device, while the case 811 B may be utilized for a second portable electronic device having a different size and/or shape than the first portable electronic device.
- Each of the cases 811 A and 811 B includes a retaining mechanism 820 that is configured to retain the modular unit 830 .
- the modular unit 830 may include the imaging circuit 712 as discussed above with reference to FIGS. 7A-7D .
- the imaging circuit 712 may include one or more imaging elements 104 , a communication device 714 , and/or a local imaging processor 716 .
- the modular unit 830 also includes a coupling mechanism 832 that is configured to engage with the retaining mechanism 820 of the cases 811 A and 811 B.
- the retaining mechanism 820 may correspond to a slot on the case 811 A and/or 811 B that is configured to receive the modular unit 830 .
- the coupling mechanism 832 may be shaped to correspond to the slot of the case 811 A and/or 811 B such that the modular unit 830 may be secured by the case 811 A and/or 811 B.
- the retaining mechanism 820 and the coupling mechanism 832 may include corresponding structures for locking the modular unit 830 in place during use.
- the retaining mechanism 820 may include one or more magnets having a first polarity
- the coupling mechanism 832 may include one or more magnets having a second polarity that is opposite of the first polarity such that the modular unit 830 can be retained by the case 811 A and/or 811 B.
- the modular unit 830 may be incorporated with different cases 811 A and/or 811 B that are utilized for different portable electronic devices, the modular unit 830 may advantageously provide flexibility in the incorporation of an imaging system with different portable electronic devices.
- different cases 811 A and 811 B may be manufactured using any suitable techniques (e.g., 3-D printing, injection molding, or the like).
- case 811 A and/or case 811 B may be manufactured at low cost such that the different cases 811 A and 811 B may be discarded and/or upgraded while remaining compatible with the modular unit 830 .
- the modular unit 830 can be integrated into and utilized by a user with a plurality of portable electronic devices even when the design of the portable electronic devices is changed (e.g., updated and/or upgraded).
- One or more aspects and embodiments of the present disclosure involving the performance of processes or methods may utilize program instructions executable by a device (e.g., a computer, a processor, or other device) to perform, or control performance of, the processes or methods.
- inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a non-transitory computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above.
- the computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above.
- computer readable media may be non-transitory media.
- The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that, according to one aspect, one or more computer programs that when executed perform methods of the present application need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present application.
- Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
- program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- functionality of the program modules may be combined or distributed as desired in various embodiments.
- data structures may be stored in computer-readable media in any suitable form.
- data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields.
- any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
- the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
- a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
- a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
- Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet.
- networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
- some aspects may be embodied as one or more methods.
- the acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
- a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
- the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
- This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
- “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
Abstract
Description
- The present disclosure relates generally to imaging devices and methods (e.g., ultrasound imaging devices and methods).
- Imaging technologies are used at various stages of medical care. For example, imaging technologies are used to non-invasively diagnose patients, to monitor the performance of medical (e.g., surgical) procedures, and/or to monitor post-treatment progress or recovery.
- Conventional imaging devices and methods, including magnetic resonance imaging (MRI) technology, are typically configured for and limited to use within a fixed location in a hospital setting. MRI technology is also generally slow, and suffers from other drawbacks including high cost, loud noise, and the use of potentially harmful magnetic fields.
- In view of the foregoing, it would be desirable to provide portable electronic devices and associated methods with integrated imaging capabilities.
- Some embodiments of the present disclosure relate to a portable electronic device (e.g., smart phone and/or tablet computer) for generating and displaying an image (e.g., 2-dimensional or 3-dimensional image) of what appears to be a window into an underlying object, such as a human body, when placed in proximity to (e.g., on or close to) the object. The window and corresponding image displayed on a display screen of the portable electronic device change as the portable electronic device is moved over various portions of the body (e.g., abdomen, thorax, etc.). The image displayed by the portable electronic device may identify, for example, organs, arteries, veins, tissues, bone, and/or other bodily contents or parts. In various embodiments, the image may be presented in 3 dimensions such that it appears to the viewer as if the viewer is looking into the body, or as if the body parts have been projected up (e.g., exploded view) from the body.
- The present disclosure provides numerous embodiments of systems, apparatus, computer readable media, and methods for providing imaging functionality using a portable electronic device, such as, for example, a smart phone or a tablet computer. In some embodiments, the portable electronic device is configured to generate and display an image of what appears to be an exploded view (e.g., 3-dimensional, upwardly projected image) of an object or its constituent parts. In some embodiments, movement of the portable electronic device results in the rendering of a different internal image of the target (e.g., different portion(s) of a human body). In some embodiments, the generated window of the underlying object (e.g., the portion of the human body) may provide an internal view of the object (e.g., a three-dimensional rendering of an organ or a portion of an organ).
- In some embodiments according to one aspect of the present disclosure, a portable electronic device is provided that includes a processor configured to generate an image (e.g., ultrasound image) of an internal feature of a target when the device is positioned at an external surface of the target, and a display configured to display the image.
- In some embodiments according to another aspect of the present disclosure, a portable ultrasound device is provided that includes multiple ultrasound elements configured to receive ultrasound radiation reflected by or passing through a target when the ultrasound device is pointed at the target. The portable ultrasound device also includes a display configured to display an image of an internal feature of the target based at least in part on the ultrasound radiation received by the plurality of ultrasound elements.
- In some embodiments according to another aspect of the present disclosure, a method is provided that includes pointing a portable electronic device at an external surface of a subject, and viewing, on a display of the portable electronic device, an image of an internal feature of the subject while pointing the portable electronic device at the external surface of the subject. In some embodiments, the portable electronic device includes a radiation sensor, and the method further includes receiving, with the radiation sensor, radiation reflected by or passing through the subject, and creating the image of the internal feature based at least in part on the radiation received by the radiation sensor.
- In some embodiments according to yet another aspect of the present disclosure, a portable electronic device is provided that renders within a window on a display of the device an image (e.g., 3-dimensional image) of an inside of a human body when the device is directed at the body (e.g., within about one meter or less of the body). In some embodiments, the image changes to reflect additional body parts as the device is moved relative to the body.
- In some embodiments according to another aspect of the present disclosure, a portable electronic device is provided that includes multiple imaging elements configured to receive radiation signals transmitted through or reflected by an imaging target and an imaging interface. The portable electronic device also includes one or more processors configured to receive one or more sensing signals from at least one of the plurality of imaging elements, and to render an image of the imaging target for display through the imaging interface based at least in part on the one or more sensing signals.
- Aspects and embodiments of the present disclosure will be described with reference to the following figures. It should be appreciated that the figures are not necessarily drawn to scale. Items appearing in multiple figures are indicated by the same reference number in all the figures in which they appear.
-
FIG. 1A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a human body or a portion of a human body according to some embodiments of the present disclosure. -
FIG. 1B illustrates a three-dimensional internal image of a portion of a human body that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure. -
FIG. 2A illustrates a front view of portable electronic device including an imaging interface according to some embodiments of the present disclosure. -
FIG. 2B illustrates a back view of portable electronic device including imaging elements according to some embodiments of the present disclosure. -
FIG. 3 illustrates a transmissive imaging system and method according to some embodiments of the present disclosure. -
FIG. 4 illustrates a reflective imaging system and method according to some embodiments of the present disclosure. -
FIG. 5 illustrates a transmissive and/or reflective imaging system and method according to some embodiments of the present disclosure. -
FIG. 6A illustrates a portable electronic device including an imaging interface for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments of the present disclosure. -
FIG. 6B illustrates a three-dimensional internal image of a portion of a human body at the first position shown inFIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure. -
FIG. 6C illustrates a three-dimensional internal image of a portion of a human body at the second position shown inFIG. 6A that is generated and/or rendered by a portable electronic device according to some embodiments of the present disclosure. -
FIG. 7A illustrates a front view of a portable electronic device according to some embodiments of the present disclosure. -
FIG. 7B illustrates a back view of a portable electronic device according to some embodiments of the present disclosure. -
FIG. 7C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure. -
FIG. 7D illustrates a back view of a case including imaging elements for a portable electronic device according to some embodiments of the present disclosure. -
FIG. 8A illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure. -
FIG. 8B illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure. -
FIG. 8C illustrates a front view of a case for a portable electronic device according to some embodiments of the present disclosure. -
FIG. 8D illustrates a back view of a case including a retaining mechanism for a modular unit utilized with a portable electronic device according to some embodiments of the present disclosure. -
FIG. 8E illustrates a modular unit including an imaging circuit according to some embodiments of the present disclosure. - According to some embodiments of the present disclosure, a portable electronic device is provided that includes an imaging interface and one or more imaging elements. For example, the portable electronic device may be a cellular phone, personal digital assistant, smart phone, tablet device, digital camera, laptop computer, or the like. An image may be generated and/or rendered utilizing the portable electronic device. For example, the portable electronic device may be utilized to simulate a “window” into an imaging target, such as a human body or portion of the body. The simulated “window” may provide a view of the inside of a human body or portion of the body, including organs, arteries, veins, tissues, bone, and/or other bodily contents or parts. For example, an image (e.g., ultrasound or sonographic image) may be generated that illustrates and/or simulates internal features of the imaging target for a user. In some embodiments, a real-time continuous or substantially real-time continuous image may be generated and/or rendered such that movement of the portable electronic device results in a substantially real-time updated image of the area that corresponds to the new position of the portable electronic device. In some embodiments, internal movement of the target object (e.g., such as expansion and/or contraction of organs) may be rendered in real-time by the portable electronic device.
- In some embodiments, the portable electronic devices and methods described herein may include, be coupled to (e.g., via a suitable communications connection or port such as a USB link), or otherwise utilize one or more radiation sources, sensors, and/or transducers (e.g., array(s) of ultrasound transducers), front-end processing circuitry and associated processing techniques, and/or image reconstruction devices and/or methods, in order to generate and/or render images to a user according to the non-limiting embodiments described in detail throughout the present disclosure.
- In some embodiments of the present disclosure, one or more of the devices described in
FIGS. 1A-8E herein may include or be coupled to one or more ultrasound imaging elements (e.g., one or more arrays of ultrasound sources, sensors, and/or transducers). One or more computers or processors within the portable electronic device may perform image analysis and/or image rendering based at least in part on radiation signals received by an imaging device.
- FIG. 1A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a human body or a portion of a human body 106 according to some embodiments. FIG. 1B illustrates a three-dimensional internal image 110 of a portion of a human body that is generated and/or rendered by a portable electronic device 100 according to some embodiments. As shown in FIG. 1A, the portable electronic device 100 may be positioned in an area near (e.g., in contact with the surface of, or within about one meter from the surface of) a portion of a human body that is to be imaged and/or analyzed. The portable electronic device 100 may include imaging elements 104 that are configured to transmit and/or receive radiation signals. The imaging elements 104, along with other components and functions of the portable electronic device 100 according to some embodiments of the present disclosure, will be described in greater detail below with reference to FIGS. 2A and 2B. An internal image 110 as shown in FIG. 1B may be generated by the portable electronic device 100. The internal image 110 may be a three-dimensional internal image of a portion of the human body that appears to a viewer 117 to project upward from a surface of the portable electronic device 100, giving the viewer the perception of a viewing window into the underlying body. Through generation of the internal image, the portable electronic device 100 may provide a window into the internal areas of the human body that are below the surface. As will be described in greater detail below with reference to FIGS. 6A-6C, the generated images may be real-time continuous images such that the images are dynamically updated based on movement of the portable electronic device 100 and/or the image target (e.g., internal organs of the human body).
- FIG. 2A illustrates a front view of portable electronic device 100 including an imaging interface 102 according to some embodiments of the present disclosure. The imaging interface 102 of the portable electronic device 100 may include a display that is configured to output a two-dimensional (2-D) or three-dimensional (3-D) image of an imaging target. In some embodiments, the imaging interface 102 is interactive and is capable of receiving user input, for example through a touch screen. An image that is displayed via the imaging interface 102 may be adjusted based on the received inputs, for example, to adjust zoom level, centering position, level of detail, depth of an underlying object to be imaged, resolution, brightness, color, and/or the like of the image. For example, in some embodiments, imaging interface 102 may be configured to allow a user to selectively traverse various layers and imaging depths of the underlying object using, for example, the touch screen.
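- By way of illustration only, the following is a minimal Python sketch of how touch input received through an imaging interface such as imaging interface 102 might be mapped to imaging parameters such as zoom level, imaging depth, and brightness. The gesture names, parameter ranges, and clamping rules are assumptions introduced for this example and are not taken from the present disclosure.

```python
# Illustrative sketch only: mapping touch gestures to imaging parameters.
# Gesture names and parameter ranges are assumptions, not part of the disclosure.
from dataclasses import dataclass


@dataclass
class ImagingParameters:
    zoom: float = 1.0        # display magnification
    depth_cm: float = 8.0    # imaging depth into the underlying object
    brightness: float = 0.5  # display brightness, 0..1


def clamp(value, low, high):
    return max(low, min(high, value))


def apply_gesture(params: ImagingParameters, gesture: str, amount: float) -> ImagingParameters:
    """Update imaging parameters in response to a touch gesture."""
    if gesture == "pinch":            # pinching in/out adjusts zoom
        params.zoom = clamp(params.zoom * (1.0 + amount), 0.5, 8.0)
    elif gesture == "vertical_drag":  # dragging traverses layers / imaging depth
        params.depth_cm = clamp(params.depth_cm + amount, 1.0, 24.0)
    elif gesture == "horizontal_drag":
        params.brightness = clamp(params.brightness + amount, 0.0, 1.0)
    return params


# Example: a 20% pinch-out followed by a 2 cm increase in imaging depth.
p = ImagingParameters()
p = apply_gesture(p, "pinch", 0.2)
p = apply_gesture(p, "vertical_drag", 2.0)
```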
- Portable electronic device 100 may render a three-dimensional image of the imaging target using any suitable method or combination of methods (e.g., anaglyph, polarization, eclipse, interference filtering, and/or autostereoscopy). For example, in some embodiments, the imaging interface 102 includes a circular polarizer and/or a linear polarizer such that a viewer having polarizing filter spectacles can view a three-dimensional image. In some embodiments, the imaging interface 102 is configured to display alternating left and right images such that a viewer having spectacles with shutters that alternate in synchronization with the displayed images can view a three-dimensional image. In some embodiments, the imaging interface 102 may utilize an autostereoscopy method such that 3-D spectacles are not necessary for a viewer to view the three-dimensional image.
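- As a non-limiting illustration of one of the rendering methods mentioned above, the sketch below composes a red-cyan anaglyph from left- and right-eye views of a rendered volume. The channel assignment and the synthetic parallax used in the example are assumptions and are not part of the present disclosure.

```python
# Illustrative sketch only: composing a red-cyan anaglyph from left- and right-eye
# views of a rendered volume. Channel assignment is an assumption of this example.
import numpy as np


def make_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """left, right: 2-D grayscale views in [0, 1]. Returns an (H, W, 3) RGB image
    in which the left view drives the red channel and the right view drives the
    green and blue (cyan) channels, for viewing with red-cyan spectacles."""
    h, w = left.shape
    rgb = np.zeros((h, w, 3))
    rgb[..., 0] = left    # red channel   <- left-eye view
    rgb[..., 1] = right   # green channel <- right-eye view
    rgb[..., 2] = right   # blue channel  <- right-eye view
    return rgb


# Example with synthetic views shifted by a few pixels to mimic parallax.
base = np.random.rand(64, 64)
left_view = np.roll(base, 2, axis=1)
right_view = np.roll(base, -2, axis=1)
anaglyph = make_anaglyph(left_view, right_view)
```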
- In some embodiments, portable electronic device 100 may display information (e.g., text and/or graphics) in addition to (e.g., graphically overlaid on top of or adjacent to) an image of a targeted object, such as, for example, text and/or graphics identifying the structure(s) identified in the image (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts). In some embodiments, portable electronic device 100 may include one or more processors for identifying structure(s) in the image based at least in part on stored data (e.g., data stored in random access memory or another storage device of portable electronic device 100). For example, data stored within device 100 may identify characteristic(s) of structure(s) (e.g., one or more shapes, colors, textures, cellular characteristics, tissue characteristics, and/or other distinctive and/or surrounding features or structures) that may be present within different areas of the human body for use by portable electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100. In some embodiments, data stored within device 100 may identify characteristics of particular disease(s), such as cancer or other abnormalities, for use by portable electronic device 100 to identify and/or predict the type(s) of structures depicted in an image rendered by device 100. In some embodiments, the image, text, graphics, and/or other information displayed on the imaging interface 102 may be adjusted through user interaction with one or more inputs (e.g., touch screen, buttons, touch-sensitive areas, or the like) of the portable electronic device 100.
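- The following is a minimal, hypothetical sketch of how stored characteristic data might be compared against features measured from a rendered image in order to label a structure. The feature descriptors, signature values, and nearest-match rule are invented for the example and are not taken from the present disclosure.

```python
# Illustrative sketch only: labeling an imaged structure by comparing measured
# features against stored characteristic data. Feature names, signature values,
# and the nearest-match rule are hypothetical.
import math

# Hypothetical stored data: (mean echogenicity 0..1, approximate diameter in cm).
STORED_SIGNATURES = {
    "artery": (0.15, 0.8),
    "vein": (0.10, 1.1),
    "bone": (0.95, 2.5),
    "organ tissue": (0.45, 6.0),
}


def identify_structure(measured):
    """Return the stored label whose signature is closest to the measured features."""
    def distance(signature):
        return math.hypot(signature[0] - measured[0], signature[1] - measured[1])
    return min(STORED_SIGNATURES, key=lambda name: distance(STORED_SIGNATURES[name]))


# Example: a bright, moderately sized structure is labeled as bone.
print(identify_structure((0.9, 2.2)))
```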
- FIG. 2B illustrates a back view of portable electronic device 100 including imaging elements 104 according to some embodiments of the present disclosure. The imaging elements 104 may be configured as sources (emitters) and/or sensors of ultrasound radiation and/or other radiation. In some embodiments, the imaging elements 104 may be of substantially the same size and/or may be arranged in an array as shown in FIG. 2B. In some embodiments, the imaging elements 104 may be of different sizes and/or arranged in an irregular or scattered configuration. In some embodiments, one or more (e.g., all) of the imaging elements 104 may be arranged in the same plane. In other embodiments, at least some of the imaging elements may be arranged in at least two different planes. In some embodiments, all of the imaging elements 104 included in the portable electronic device 100 may be either emitting elements or sensing elements. In some embodiments, the imaging elements 104 may include both emitting elements and sensing elements. The embodiment shown in FIG. 2B includes a 4×6 array of imaging elements 104 by way of illustration only and is not intended to be limiting. In other embodiments, any other suitable number of imaging elements may be provided (e.g., 10, 20, 30, 40, 50, 100, 200, 500, 1000, any number in between, or more) and may be arranged in any suitable configuration.
- In some embodiments, the imaging elements 104 may be integrated within a circuit board (e.g., a printed circuit board) that includes, for example, processing (e.g., image processing) components of the portable electronic device 100. In some embodiments, the imaging elements 104 may be provided on a separate circuit board, or on a separate layer of a circuit board, from the processing components of the portable electronic device 100, and may be in communication with the processing circuitry through a suitable communications link (e.g., an internal bus, USB link, or other port).
- The imaging elements 104 according to some embodiments of the present disclosure may include their own dedicated processing circuitry, such as a graphics processing unit (GPU), digital signal processor (DSP), and/or central processing unit (CPU), and/or may utilize processing circuitry of the portable electronic device 100. For example, in some embodiments, the CPU and/or GPU of the portable electronic device 100 may be utilized for image acquisition/reconstruction and image rendering. In some embodiments, the CPU of portable electronic device 100 may be utilized to process computations based on received signals (e.g., back-scattered signals and/or transmissive signals) in order to generate an image or topography, while the GPU may be utilized to render an image based on the information received from the CPU to generate a real-time or substantially real-time image display. In some embodiments, portable electronic device 100 may include one or more components for processing, filtering, amplifying, and/or rendering images.
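- By way of illustration only, the sketch below separates image formation into a reconstruction step (the kind of numeric computation that might be assigned to a CPU) and a rendering step (the kind of display mapping that might be handed to a GPU). The envelope-detection and log-compression choices are assumptions made for the example and are not taken from the present disclosure.

```python
# Illustrative sketch only: splitting image formation into a reconstruction step
# (CPU-style numeric work) and a rendering step (display mapping that could be
# offloaded to a GPU). Envelope detection and log compression are assumptions.
import numpy as np


def reconstruct(echo_lines: np.ndarray) -> np.ndarray:
    """echo_lines: (num_lines, num_samples) received signals.
    Returns a log-compressed intensity map in dB, suitable for display mapping."""
    envelope = np.abs(echo_lines)                  # crude envelope detection
    envelope /= envelope.max() + 1e-12             # normalize to the strongest echo
    return 20.0 * np.log10(envelope + 1e-6)        # convert to a dB scale


def render(intensity_db: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """Map a dB-scaled image to 8-bit display values over a fixed dynamic range."""
    clipped = np.clip(intensity_db, -dynamic_range_db, 0.0)
    return np.uint8(255 * (clipped + dynamic_range_db) / dynamic_range_db)


# Example: reconstruct and render one synthetic frame of 128 lines.
frame = render(reconstruct(np.random.randn(128, 2048)))
```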
- FIG. 3 illustrates a transmissive imaging system and method 301 according to some embodiments of the present disclosure. As shown in FIG. 3, the transmissive imaging system 301 includes two portable electronic devices 100A and 100B that are on opposing or generally opposing sides of an imaging target 306. In other embodiments, devices 100A and 100B may be positioned in any other relationship with respect to one another. In some embodiments, devices 100A and/or 100B may include one or more sensors for determining the relative positions of these devices to aid in the generation of image(s). While shown as a portable electronic device 100B (e.g., a smart phone), in some embodiments device 100B may be a dedicated sensing and/or emitting device, such as an array of ultrasound elements and associated circuitry. Signals (e.g., waves or beams 308) emitted from the portable electronic device 100B are sensed by the portable electronic device 100A and are utilized to render a 2-D or 3-D image 312 (e.g., a real-time or substantially real-time image) of the target 306. In some embodiments, a generated 3-D image may be in the form of a pop-out image or a depth image. In some embodiments, the portable electronic device 100A may be configured to transmit signals (e.g., waves or beams) 308 through the target 306 to be received by the portable electronic device 100B. In some embodiments, the portable electronic device 100B may simultaneously or substantially simultaneously render an image (e.g., a back view or an alternate view or level of detail of an image rendered by device 100A) based at least in part on processing the sensed signals. In some embodiments, the portable electronic devices 100A and/or 100B may communicate the results of the sensed signals to the other in order to generate or improve a rendered image.
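- As a non-limiting illustration of the transmissive arrangement described for FIG. 3, the sketch below converts amplitudes received on the far side of a target into a simple per-element attenuation map. The calibration against a no-target reference amplitude is an assumption introduced for the example and is not part of the present disclosure.

```python
# Illustrative sketch only: forming a simple transmissive (shadowgram-style)
# attenuation map from amplitudes received on the far side of the target.
# Calibration against a no-target reference amplitude is an assumption.
import numpy as np


def attenuation_map(received: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """received, reference: (rows, cols) peak amplitudes per receiving element,
    measured with and without the target in place. Returns attenuation in dB
    (larger values mean more attenuation along that transmission path)."""
    ratio = np.clip(received / (reference + 1e-12), 1e-6, None)
    return -20.0 * np.log10(ratio)


# Example with a 4x6 receiving array and synthetic amplitudes.
reference = np.full((4, 6), 1.0)                       # no-target calibration
received = reference * np.random.uniform(0.05, 0.9, size=(4, 6))
print(attenuation_map(received, reference).round(1))
```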
- FIG. 4 illustrates a back-scatter or reflective imaging system and method 401 according to some embodiments of the present disclosure. As shown in FIG. 4, a portable electronic device 100 may utilize emission and/or sensing elements 104 in order to render an image 410 based at least in part on reflection (e.g., a back-scatter effect) of the signals 408. In some embodiments, portable electronic device 100 is the only device utilized in order to image the target (e.g., to produce an image appearing as a window into a human body). For example, the portable electronic device 100 may include both radiation sources and sensors (e.g., separate sources and sensors, and/or multiple transducers functioning as both sources and sensors), where all or substantially all of the radiation utilized by the sensors to reconstruct image(s) is backscatter radiation or radiation produced through a similar effect.
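- By way of illustration only, the following sketch shows one plausible way an image could be reconstructed from back-scattered (pulse-echo) data using delay-and-sum beamforming. The array geometry, sampling rate, plane-wave transmit assumption, and speed of sound are stand-in values and are not taken from the present disclosure.

```python
# Illustrative sketch only: delay-and-sum reconstruction from pulse-echo
# (back-scatter) data, as one plausible way a device such as the one described
# for FIG. 4 might form an image. All numeric values below are assumptions.
import numpy as np

C = 1540.0   # assumed speed of sound in soft tissue (m/s)
FS = 20e6    # assumed sampling rate of the receive electronics (Hz)


def delay_and_sum(rf, element_x, pixel_x, pixel_z):
    """rf: (num_elements, num_samples) echo traces recorded after a shared transmit.
    element_x: lateral element positions (m); pixel_x, pixel_z: image grid (m).
    Returns a (len(pixel_z), len(pixel_x)) beamformed intensity image."""
    num_elements, num_samples = rf.shape
    image = np.zeros((len(pixel_z), len(pixel_x)))
    for iz, z in enumerate(pixel_z):
        for ix, x in enumerate(pixel_x):
            tx_time = z / C  # assume a plane wave travelling straight into the target
            rx_time = np.sqrt((element_x - x) ** 2 + z ** 2) / C  # echo back to each element
            samples = np.round((tx_time + rx_time) * FS).astype(int)
            valid = samples < num_samples
            image[iz, ix] = np.abs(rf[np.arange(num_elements)[valid], samples[valid]].sum())
    return image


# Example with synthetic data: 24 elements over 24 mm, a 20 x 20 pixel grid.
rf = np.random.randn(24, 4096)
elements = np.linspace(-0.012, 0.012, 24)
img = delay_and_sum(rf, elements, np.linspace(-0.01, 0.01, 20), np.linspace(0.005, 0.05, 20))
```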
- FIG. 5 illustrates a transmissive and/or reflective imaging system and method 501 according to some embodiments of the present disclosure. As shown in FIG. 5, a plurality of devices, such as portable electronic devices 500A-500D, may be utilized to image a target, with an image 510 rendered on portable electronic device 500B. Each of the portable electronic devices 500A-500D may be configured to emit signals (e.g., waves or beams) 508 as shown in FIG. 5. The image 510, or alternate views of the image or imaged structure, may be rendered on the other portable electronic devices (e.g., 500A, 500C, and 500D) through communication with one another. In some embodiments, each of the devices (e.g., 500A, 500C, and/or 500D) may be configured as emitting and/or sensing devices only. The image 510 that is rendered on portable device 500B may be based at least in part on signals 508 that are emitted by one or more of the devices 500A-500D, and which are sensed through reflection (e.g., back-scatter) and/or transmission by one or more of the devices 500A-500D.
- In some embodiments, one or more portable electronic devices according to the present disclosure may generate and/or render an image based solely on signals received by one or more sensors (e.g., ultrasound transducers) of the device. In some embodiments, one or more portable electronic devices according to the present disclosure may generate and/or render an image based at least in part on information stored in memory (e.g., random access memory) of the portable device(s) identifying detail(s) regarding the structure(s), part(s), composition(s), and/or other characteristic(s) of object(s) to be imaged. For example, in some embodiments, when data received by one or more sensor(s) of the portable electronic devices indicates that the object being imaged is a particular body part or region, the portable electronic devices may use stored data in addition to the received data in order to generate an image of the object and/or its constituent part(s), and/or to provide additional detail or explanation regarding an object and/or its constituent parts.
- In some embodiments of the present disclosure, the generated and/or rendered image may be a real-time or substantially real-time image that is dynamically updated based on movement of a portable electronic device 100 along a surface of an imaging target and/or motion of the imaging target. FIG. 6A illustrates a portable electronic device 100 including an imaging interface 102 for generating and/or rendering an internal image of a portion of a human body at a first position and at a second position according to some embodiments. FIG. 6B illustrates a three-dimensional internal image 610 of a portion of a human body at the first position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments. FIG. 6C illustrates a three-dimensional internal image 610 of a portion of a human body at the second position shown in FIG. 6A that is generated and/or rendered by a portable electronic device 100 according to some embodiments. As shown in FIG. 6B, a three-dimensional internal image 610 of a portion of the human body may be generated and displayed to a viewer 617. The three-dimensional image 610 may appear to the viewer 617 as an image having variations in, for example, topography that correspond to the surfaces and/or other aspects or features of the internal portion of the body at the first position of the portable electronic device 100 as shown in FIG. 6A. The three-dimensional image 610 may be a real-time continuous image that is dynamically updated based on movement of the portable electronic device 100 and/or the internal portion of the body that is being analyzed. As shown in FIG. 6C, a different three-dimensional internal image 610 is displayed to the viewer 617 showing different underlying structures and/or aspects (e.g., organs, arteries, veins, tissues, bone, and/or other bodily contents or parts). The three-dimensional internal image 610 shown in FIG. 6C corresponds to the internal image of the body portion corresponding to the second position of the portable electronic device 100 as shown in FIG. 6A. As shown in FIG. 6C, the internal image 610 is illustrated as a different image showing different topographical and/or other aspects or features of the body portion than the internal image 610 shown in FIG. 6B. As discussed above, through selection of different aspect ratios and/or zoom settings, as well as through positioning of the portable electronic device 100, different types of internal images of a target may be generated, such as a three-dimensional view of an entire organ or multiple organs. One way such a position-driven update might be organized is sketched below, following the discussion of FIGS. 7A and 7B.
- In some embodiments, the imaging elements, including sensors and/or sources (e.g., transducers), may be provided on, in, or otherwise coupled to a case for a portable electronic device. FIG. 7A illustrates a front view of a portable electronic device 700 according to some embodiments of the present disclosure. The portable electronic device 700 includes an imaging interface 702. FIG. 7B illustrates a back view of the portable electronic device 700 according to some embodiments of the present disclosure. As shown in FIGS. 7A and 7B, unlike the portable electronic device 100, the portable electronic device 700 does not include imaging elements 104 as part of the main housing or enclosure of device 700.
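- Referring back to the dynamic updating described with reference to FIGS. 6A-6C, the following is a minimal sketch of a position-driven update loop in which an image is continuously re-acquired and re-rendered for the region the device currently covers. The position source, frame rate, and helper functions are hypothetical stand-ins and are not part of the present disclosure.

```python
# Illustrative sketch only: a position-driven real-time update loop of the kind
# described with reference to FIGS. 6A-6C. The position source, frame rate, and
# helper functions are hypothetical stand-ins, not part of the disclosure.
import time


def read_device_position():
    """Hypothetical stand-in for a motion/position sensor reading (x, y in cm)."""
    return (0.0, 0.0)


def acquire_and_reconstruct(position):
    """Hypothetical stand-in for acquiring echoes at this position and
    reconstructing the corresponding internal image."""
    return {"position": position, "image": None}


def display(frame):
    """Stand-in for pushing the frame to the imaging interface."""
    pass


def run_realtime_loop(frame_period_s: float = 0.05, max_frames: int = 100):
    for _ in range(max_frames):
        position = read_device_position()            # where the device currently sits
        frame = acquire_and_reconstruct(position)    # image for the region now covered
        display(frame)                               # updated frame by frame, so moving
        time.sleep(frame_period_s)                   # the device shows a new region
```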
- FIG. 7C illustrates a front view of a case 711 for a portable electronic device according to some embodiments of the present disclosure. FIG. 7D illustrates a back view of the case 711 including imaging elements for a portable electronic device according to some embodiments of the present disclosure. The case 711 may be configured to at least partially enclose the portable electronic device 700. In some embodiments, case 711 may simultaneously provide imaging capabilities to portable electronic device 700 and serve as a protective case. The case may be made of any suitable material such as rubber, plastic, leather, and/or the like. As shown in FIG. 7D, an imaging circuit 712 (e.g., an integrated circuit) may be provided on (e.g., directly on), embedded in, and/or otherwise coupled to the back surface and/or other surface(s) of the case 711. Case 711 may be considered part of portable electronic device 700.
- The imaging circuit 712 may include one or more imaging elements 104. As discussed above, the imaging elements 104 may include sources and/or sensors. The imaging circuit 712 may also include a communication device 714 configured to communicate with the portable electronic device 700 via a wired or wireless link. For example, the imaging circuit 712 may include a communication transmitter/receiver which utilizes an infrared signal, a Bluetooth communication signal, a near-field communication signal, and/or the like to communicate with the portable electronic device 700. In some embodiments, the communication device 714 may be in communication with the processing circuitry of a portable electronic device through a wired communications link (e.g., a USB port or other data port), or a combination of wired and wireless links. In some embodiments, the imaging circuit 712 may receive power through wired and/or wireless connection(s) to the portable electronic device. In some embodiments, the imaging circuit 712 may receive power from a separate power source (e.g., a battery) that is coupled to the imaging circuit 712. In some embodiments, when the portable electronic device 700 is coupled to or attached to the case 711, a software application and/or drivers are automatically loaded and/or executed by the portable electronic device 700 in order to render an image based on communication with the imaging circuit 712. The software application and/or drivers may be stored in a memory of the imaging circuit 712 and communicated to the portable electronic device 700, and/or may be retrieved by the portable electronic device through a network (e.g., the internet).
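- By way of illustration only, the sketch below shows one plausible framed message format that a communication device such as communication device 714 could use to carry imaging data to the portable electronic device over a wired or wireless link. The magic value, header layout, and payload-type codes are invented for this example and are not part of the present disclosure.

```python
# Illustrative sketch only: a framed message format for carrying imaging data from
# an external imaging circuit to the portable electronic device over a serial or
# wireless link. The magic value, header layout, and payload-type codes are
# invented for this example.
import struct

MAGIC = 0xB10C
PAYLOAD_RAW_SENSOR = 1       # unprocessed samples from the imaging elements
PAYLOAD_PROCESSED_FRAME = 2  # data already handled by a local imaging processor

HEADER = struct.Struct("<HBI")  # magic (uint16), payload type (uint8), length (uint32)


def pack_message(payload_type: int, payload: bytes) -> bytes:
    """Prepend a fixed header so the receiver can validate and size the payload."""
    return HEADER.pack(MAGIC, payload_type, len(payload)) + payload


def unpack_message(message: bytes):
    """Validate the header and return (payload_type, payload)."""
    magic, payload_type, length = HEADER.unpack_from(message)
    if magic != MAGIC:
        raise ValueError("not an imaging-circuit message")
    payload = message[HEADER.size:HEADER.size + length]
    return payload_type, payload


# Example round trip.
sent = pack_message(PAYLOAD_RAW_SENSOR, b"\x00\x01\x02\x03")
kind, data = unpack_message(sent)
```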
- In some embodiments, the portable electronic device 700 receives raw data from the communication device 714 and processes the raw data using processing circuitry (e.g., an image signal processor, digital signal processor, filters, and/or the like) included in the portable electronic device 700. In some embodiments, the imaging circuit 712 includes a local imaging processor 716 configured to process signals received by imaging elements 104. The communication device 714 may be configured to communicate data received from the imaging elements 104 (e.g., raw sensor data) and/or may communicate processed data that is received from the local imaging processor 716. As shown in FIG. 7A, the portable electronic device 700 includes an interface 702 for displaying an image that is rendered by processing signals received from the communication device 714.
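- As a further non-limiting illustration, the sketch below separates the two data paths described above: raw sensor data is routed through the handset's own reconstruction chain, while frames already handled by a local imaging processor are displayed more directly. The function names and payload objects are hypothetical.

```python
# Illustrative sketch only: handling the two data paths described above. Raw
# sensor data goes through the handset's own processing chain, while frames
# already processed by a local imaging processor are displayed more directly.
# Function names and payload objects are hypothetical.
def reconstruct_on_device(raw_samples):
    """Stand-in for handset-side image signal processing of raw sensor data."""
    return {"image": raw_samples, "source": "reconstructed on portable device"}


def handle_incoming(payload, is_raw: bool):
    """Dispatch an incoming payload to the appropriate processing path."""
    if is_raw:
        return reconstruct_on_device(payload)                      # filtering, beamforming, etc.
    return {"image": payload, "source": "local imaging processor"}  # already processed externally
```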
- In some embodiments, an imaging circuit (e.g., an integrated circuit) may be provided separately such that it can be mounted and/or attached to different cases used by different portable electronic devices. FIG. 8A illustrates a front view of a case 811A for a portable electronic device according to some embodiments of the present disclosure. FIG. 8B illustrates a back view of the case 811A including a retaining mechanism 820 for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure. FIG. 8C illustrates a front view of a case 811B for a portable electronic device according to some embodiments of the present disclosure. FIG. 8D illustrates a back view of the case 811B including a retaining mechanism for a modular unit 830 utilized with a portable electronic device according to some embodiments of the present disclosure. FIG. 8E illustrates a modular unit 830 including an imaging circuit 712 according to some embodiments of the present disclosure. As shown in FIGS. 8A-8D, the case 811A has a different shape than the case 811B. The case 811A may be utilized for a first portable electronic device, while the case 811B may be utilized for a second portable electronic device having a different size and/or shape than the first portable electronic device. Each of the cases 811A and 811B may include a retaining mechanism 820 that is configured to retain the modular unit 830.
- The modular unit 830 may include the imaging circuit 712 as discussed above with reference to FIGS. 7A-7D. The imaging circuit 712 may include one or more imaging elements 104, a communication device 714, and/or a local imaging processor 716. The modular unit 830 also includes a coupling mechanism 832 that is configured to engage with the retaining mechanism 820 of the cases 811A and 811B. In some embodiments, the retaining mechanism 820 may correspond to a slot on the case 811A and/or 811B that is configured to receive the modular unit 830. The coupling mechanism 832 may be shaped to correspond to the slot of the case 811A and/or 811B such that the modular unit 830 may be secured by the case 811A and/or 811B. In some embodiments, the retaining mechanism 820 and the coupling mechanism 832 may include corresponding structures for locking the modular unit 830 in place during use. In some embodiments, the retaining mechanism 820 may include one or more magnets having a first polarity, and the coupling mechanism 832 may include one or more magnets having a second polarity that is opposite the first polarity, such that the modular unit 830 can be retained by the case 811A and/or 811B.
- As described with reference to FIGS. 8A-8E, since the modular unit 830 may be incorporated with different cases 811A and/or 811B that are utilized for different portable electronic devices, the modular unit 830 may advantageously provide flexibility in incorporating an imaging system with different portable electronic devices. Furthermore, different cases, such as case 811A and/or case 811B, may be manufactured at low cost such that each of the different cases can be used with the same modular unit 830. As a result, the modular unit 830 can be integrated into and utilized by a user with a plurality of portable electronic devices, even when the design of the portable electronic devices is changed (e.g., updated and/or upgraded).
- Examples of suitable imaging devices that may be integrated within or coupled to a portable electronic device according to some embodiments of the present disclosure are described in commonly-owned U.S. patent application Ser. No. 13/654,337, filed Oct. 17, 2012, and entitled "Transmissive Imaging and Related Apparatus and Methods;" U.S. Provisional Application Ser. No. 61/798,851, filed Mar. 15, 2013, and entitled "Monolithic Ultrasonic Imaging Devices, Systems and Methods;" and U.S. Provisional Application Ser. No. 61/794,744, filed Mar. 15, 2013, and entitled "Complementary Metal Oxide Semiconductor (CMOS) Ultrasonic Transducers and Methods for Forming the Same," each of which is incorporated by reference in its entirety.
- Having thus described several aspects and embodiments of the technology described herein, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the technology described in the present disclosure. For example, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described. In addition, any combination of two or more features, systems, articles, materials, kits, and/or methods described herein, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
- The above-described embodiments can be implemented in any of numerous ways. One or more aspects and embodiments of the present disclosure involving the performance of processes or methods may utilize program instructions executable by a device (e.g., a computer, a processor, or other device) to perform, or control performance of, the processes or methods. In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a non-transitory computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement one or more of the various embodiments described above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various ones of the aspects described above. In some embodiments, computer readable media may be non-transitory media.
- The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects as described above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present application need not reside on a single computer or processor, but may be distributed in a modular fashion among a number of different computers or processors to implement various aspects of the present application.
- Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
- Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
- When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
- Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer, as non-limiting examples. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
- Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible formats.
- Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology, may operate according to any suitable protocol, and may include wireless networks, wired networks, or fiber optic networks.
- Also, as described, some aspects may be embodied as one or more methods. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
- All definitions, as defined and used herein, should be understood to control over dictionary definitions and/or ordinary meanings of the defined terms.
- The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
- The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
- As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
- Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having,” “containing,” “involving,” and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
Claims (37)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/644,456 US20170309023A1 (en) | 2013-04-03 | 2017-07-07 | Portable Electronic Devices With Integrated Imaging Capabilities |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/856,252 US9667889B2 (en) | 2013-04-03 | 2013-04-03 | Portable electronic devices with integrated imaging capabilities |
US15/581,429 US20170228862A1 (en) | 2013-04-03 | 2017-04-28 | Portable Electronic Devices With Integrated Imaging Capabilities |
US15/644,456 US20170309023A1 (en) | 2013-04-03 | 2017-07-07 | Portable Electronic Devices With Integrated Imaging Capabilities |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/581,429 Continuation US20170228862A1 (en) | 2013-04-03 | 2017-04-28 | Portable Electronic Devices With Integrated Imaging Capabilities |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170309023A1 true US20170309023A1 (en) | 2017-10-26 |
Family
ID=50736175
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/856,252 Active 2033-12-28 US9667889B2 (en) | 2013-04-03 | 2013-04-03 | Portable electronic devices with integrated imaging capabilities |
US15/581,429 Abandoned US20170228862A1 (en) | 2013-04-03 | 2017-04-28 | Portable Electronic Devices With Integrated Imaging Capabilities |
US15/644,456 Abandoned US20170309023A1 (en) | 2013-04-03 | 2017-07-07 | Portable Electronic Devices With Integrated Imaging Capabilities |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/856,252 Active 2033-12-28 US9667889B2 (en) | 2013-04-03 | 2013-04-03 | Portable electronic devices with integrated imaging capabilities |
US15/581,429 Abandoned US20170228862A1 (en) | 2013-04-03 | 2017-04-28 | Portable Electronic Devices With Integrated Imaging Capabilities |
Country Status (7)
Country | Link |
---|---|
US (3) | US9667889B2 (en) |
EP (1) | EP2981215A2 (en) |
JP (2) | JP6786384B2 (en) |
KR (1) | KR20150145236A (en) |
CN (1) | CN105263419A (en) |
CA (1) | CA2908631C (en) |
WO (1) | WO2014165662A2 (en) |
US6419633B1 (en) | 2000-09-15 | 2002-07-16 | Koninklijke Philips Electronics N.V. | 2D ultrasonic transducer array for two dimensional and three dimensional imaging |
US6613005B1 (en) | 2000-11-28 | 2003-09-02 | Insightec-Txsonics, Ltd. | Systems and methods for steering a focused ultrasound array |
US20100081893A1 (en) | 2008-09-19 | 2010-04-01 | Physiosonics, Inc. | Acoustic palpation using non-invasive ultrasound techniques to identify and localize tissue eliciting biological responses and target treatments |
US7022077B2 (en) | 2000-11-28 | 2006-04-04 | Allez Physionix Ltd. | Systems and methods for making noninvasive assessments of cardiac tissue and parameters |
US20100087728A1 (en) | 2000-11-28 | 2010-04-08 | Physiosonics, Inc. | Acoustic palpation using non-invasive ultrasound techniques to identify and localize tissue eliciting biological responses |
US6666833B1 (en) | 2000-11-28 | 2003-12-23 | Insightec-Txsonics Ltd | Systems and methods for focussing an acoustic energy beam transmitted through non-uniform tissue medium |
US6506154B1 (en) | 2000-11-28 | 2003-01-14 | Insightec-Txsonics, Ltd. | Systems and methods for controlling a phased array focused ultrasound system |
AU2002239360A1 (en) | 2000-11-28 | 2002-06-11 | Allez Physionix Limited | Systems and methods for making non-invasive physiological assessments |
US6645162B2 (en) | 2000-12-27 | 2003-11-11 | Insightec - Txsonics Ltd. | Systems and methods for ultrasound assisted lipolysis |
US6626854B2 (en) | 2000-12-27 | 2003-09-30 | Insightec - Txsonics Ltd. | Systems and methods for ultrasound assisted lipolysis |
US6540679B2 (en) | 2000-12-28 | 2003-04-01 | Guided Therapy Systems, Inc. | Visual imaging system for ultrasonic probe |
US6600325B2 (en) | 2001-02-06 | 2003-07-29 | Sun Microsystems, Inc. | Method and apparatus for probing an integrated circuit through capacitive coupling |
USD456509S1 (en) | 2001-04-19 | 2002-04-30 | Sonosite, Inc. | Combined module and cable for ultrasound system |
USD461895S1 (en) | 2001-04-19 | 2002-08-20 | Sonosite, Inc. | Handheld medical diagnostic ultrasound instrument |
US6559644B2 (en) | 2001-05-30 | 2003-05-06 | Insightec - Txsonics Ltd. | MRI-based temperature mapping with error compensation |
US6735461B2 (en) | 2001-06-19 | 2004-05-11 | Insightec-Txsonics Ltd | Focused ultrasound system with MRI synchronization |
DE10134014A1 (en) | 2001-07-12 | 2003-01-30 | Driescher Eltech Werk | Power converter |
US6880137B1 (en) | 2001-08-03 | 2005-04-12 | Inovys | Dynamically reconfigurable precision signal delay test system for automatic test equipment |
US6694817B2 (en) | 2001-08-21 | 2004-02-24 | Georgia Tech Research Corporation | Method and apparatus for the ultrasonic actuation of the cantilever of a probe-based instrument |
US6795374B2 (en) | 2001-09-07 | 2004-09-21 | Siemens Medical Solutions Usa, Inc. | Bias control of electrostatic transducers |
CA2406684A1 (en) | 2001-10-05 | 2003-04-05 | Queen's University At Kingston | Ultrasound transducer array |
US20040238732A1 (en) | 2001-10-19 | 2004-12-02 | Andrei State | Methods and systems for dynamic virtual convergence and head mountable display |
US7175596B2 (en) | 2001-10-29 | 2007-02-13 | Insightec-Txsonics Ltd | System and method for sensing and locating disturbances in an energy path of a focused ultrasound system |
US7115093B2 (en) | 2001-11-21 | 2006-10-03 | Ge Medical Systems Global Technology Company, Llc | Method and system for PDA-based ultrasound system |
US6790180B2 (en) | 2001-12-03 | 2004-09-14 | Insightec-Txsonics Ltd. | Apparatus, systems, and methods for measuring power output of an ultrasound transducer |
US6522142B1 (en) | 2001-12-14 | 2003-02-18 | Insightec-Txsonics Ltd. | MRI-guided temperature mapping of tissue undergoing thermal treatment |
US6659954B2 (en) | 2001-12-19 | 2003-12-09 | Koninklijke Philips Electronics Nv | Micromachined ultrasound transducer and method for fabricating same |
US7371218B2 (en) | 2002-01-17 | 2008-05-13 | Siemens Medical Solutions Usa, Inc. | Immersive portable ultrasound system and method |
US6604630B1 (en) | 2002-01-30 | 2003-08-12 | Sonosite, Inc. | Carrying case for lightweight ultrasound device |
US6648826B2 (en) | 2002-02-01 | 2003-11-18 | Sonosite, Inc. | CW beam former in an ASIC |
US7128711B2 (en) | 2002-03-25 | 2006-10-31 | Insightec, Ltd. | Positioning systems and methods for guided ultrasound therapy systems |
US20030187371A1 (en) | 2002-03-27 | 2003-10-02 | Insightec-Txsonics Ltd. | Systems and methods for enhanced focused ultrasound ablation using microbubbles |
US7534211B2 (en) | 2002-03-29 | 2009-05-19 | Sonosite, Inc. | Modular apparatus for diagnostic ultrasound |
US6716168B2 (en) | 2002-04-30 | 2004-04-06 | Siemens Medical Solutions Usa, Inc. | Ultrasound drug delivery enhancement and imaging systems and methods |
US7285092B2 (en) | 2002-12-18 | 2007-10-23 | Barbara Ann Karmanos Cancer Institute | Computerized ultrasound risk evaluation system |
EP1551303A4 (en) | 2002-05-16 | 2009-03-18 | Karmanos B A Cancer Inst | Method and system for combined diagnostic and therapeutic ultrasound system incorporating noninvasive thermometry, ablation control and automation |
US6783497B2 (en) | 2002-05-23 | 2004-08-31 | Volumetrics Medical Imaging, Inc. | Two-dimensional ultrasonic array with asymmetric apertures |
WO2003101530A2 (en) | 2002-05-30 | 2003-12-11 | University Of Washington | Solid hydrogel coupling for ultrasound imaging and therapy |
US20030230488A1 (en) | 2002-06-13 | 2003-12-18 | Lawrence Lee | Microfluidic device preparation system |
US6705994B2 (en) | 2002-07-08 | 2004-03-16 | Insightec - Image Guided Treatment Ltd | Tissue inhomogeneity correction in ultrasound imaging |
US6958255B2 (en) | 2002-08-08 | 2005-10-25 | The Board Of Trustees Of The Leland Stanford Junior University | Micromachined ultrasonic transducers and method of fabrication |
US6835177B2 (en) | 2002-11-06 | 2004-12-28 | Sonosite, Inc. | Ultrasonic blood vessel measurement apparatus and method |
US6831394B2 (en) | 2002-12-11 | 2004-12-14 | General Electric Company | Backing material for micromachined ultrasonic transducer devices |
US6926672B2 (en) | 2002-12-18 | 2005-08-09 | Barbara Ann Karmanos Cancer Institute | Electret acoustic transducer array for computerized ultrasound risk evaluation system |
US6837854B2 (en) | 2002-12-18 | 2005-01-04 | Barbara Ann Karmanos Cancer Institute | Methods and systems for using reference images in acoustic image processing |
US8088067B2 (en) | 2002-12-23 | 2012-01-03 | Insightec Ltd. | Tissue aberration corrections in ultrasound therapy |
US6836020B2 (en) | 2003-01-22 | 2004-12-28 | The Board Of Trustees Of The Leland Stanford Junior University | Electrical through wafer interconnects |
US7313053B2 (en) | 2003-03-06 | 2007-12-25 | General Electric Company | Method and apparatus for controlling scanning of mosaic sensor array |
US7257051B2 (en) | 2003-03-06 | 2007-08-14 | General Electric Company | Integrated interface electronics for reconfigurable sensor array |
US7353056B2 (en) | 2003-03-06 | 2008-04-01 | General Electric Company | Optimized switching configurations for reconfigurable arrays of sensor elements |
US6865140B2 (en) | 2003-03-06 | 2005-03-08 | General Electric Company | Mosaic arrays using micromachined ultrasound transducers |
US7443765B2 (en) | 2003-03-06 | 2008-10-28 | General Electric Company | Reconfigurable linear sensor arrays for reduced channel count |
US7280435B2 (en) | 2003-03-06 | 2007-10-09 | General Electric Company | Switching circuitry for reconfigurable arrays of sensor elements |
US20120035473A1 (en) | 2003-03-10 | 2012-02-09 | Focus Surgery, Inc. | Laparoscopic hifu probe |
US6980419B2 (en) | 2003-03-12 | 2005-12-27 | Zonare Medical Systems, Inc. | Portable ultrasound unit and docking station |
US7771360B2 (en) | 2003-04-09 | 2010-08-10 | Techniscan, Inc. | Breast scanning system |
US7303530B2 (en) | 2003-05-22 | 2007-12-04 | Siemens Medical Solutions Usa, Inc. | Transducer arrays with an integrated sensor and methods of use |
US7611462B2 (en) | 2003-05-22 | 2009-11-03 | Insightec-Image Guided Treatment Ltd. | Acoustic beam forming in phased arrays including large numbers of transducer elements |
JP4332372B2 (en) * | 2003-05-27 | 2009-09-16 | Aloka Co., Ltd. | Ultrasonic diagnostic equipment |
US7377900B2 (en) | 2003-06-02 | 2008-05-27 | Insightec - Image Guided Treatment Ltd. | Endo-cavity focused ultrasound transducer |
US7549961B1 (en) | 2003-07-31 | 2009-06-23 | Sonosite, Inc. | System and method supporting imaging and monitoring applications |
US20050049495A1 (en) * | 2003-09-03 | 2005-03-03 | Siemens Medical Solutions Usa, Inc. | Remote assistance for medical diagnostic ultrasound |
WO2005037060A2 (en) | 2003-10-03 | 2005-04-28 | University Of Washington | Transcutaneous localization of arterial bleeding by ultrasonic imaging |
US7972271B2 (en) | 2003-10-28 | 2011-07-05 | The Board Of Trustees Of The Leland Stanford Junior University | Apparatus and method for phased subarray imaging |
ATE426345T1 (en) | 2003-11-04 | 2009-04-15 | Univ Washington | Toothbrush using an acoustic waveguide |
JP4773366B2 (en) | 2003-12-04 | 2011-09-14 | Koninklijke Philips Electronics N.V. | Ultrasonic transducer and method for performing flip-chip two-dimensional array technology on curved array |
US20110040171A1 (en) | 2003-12-16 | 2011-02-17 | University Of Washington | Image guided high intensity focused ultrasound treatment of nerves |
US7030536B2 (en) | 2003-12-29 | 2006-04-18 | General Electric Company | Micromachined ultrasonic transducer cells having compliant support structure |
US7125383B2 (en) | 2003-12-30 | 2006-10-24 | General Electric Company | Method and apparatus for ultrasonic continuous, non-invasive blood pressure monitoring |
US7425199B2 (en) | 2003-12-30 | 2008-09-16 | General Electric Company | Method and apparatus for ultrasonic continuous, non-invasive blood pressure monitoring |
US7285897B2 (en) | 2003-12-31 | 2007-10-23 | General Electric Company | Curved micromachined ultrasonic transducer arrays and related methods of manufacture |
US7052464B2 (en) | 2004-01-01 | 2006-05-30 | General Electric Company | Alignment method for fabrication of integrated ultrasonic transducer array |
US7588539B2 (en) | 2004-01-21 | 2009-09-15 | Siemens Medical Solutions Usa, Inc. | Integrated low-power pw/cw transmitter |
JP2007528153A (en) | 2004-02-06 | 2007-10-04 | Georgia Tech Research Corporation | CMUT device and manufacturing method |
US7691063B2 (en) | 2004-02-26 | 2010-04-06 | Siemens Medical Solutions Usa, Inc. | Receive circuit for minimizing channels in ultrasound imaging |
US8008835B2 (en) | 2004-02-27 | 2011-08-30 | Georgia Tech Research Corporation | Multiple element electrode cMUT devices and fabrication methods |
EP1761998A4 (en) | 2004-02-27 | 2011-05-11 | Georgia Tech Res Inst | Harmonic cmut devices and fabrication methods |
US7646133B2 (en) | 2004-02-27 | 2010-01-12 | Georgia Tech Research Corporation | Asymmetric membrane cMUT devices and fabrication methods |
US7662114B2 (en) | 2004-03-02 | 2010-02-16 | Focus Surgery, Inc. | Ultrasound phased arrays |
US7530952B2 (en) | 2004-04-01 | 2009-05-12 | The Board Of Trustees Of The Leland Stanford Junior University | Capacitive ultrasonic transducers with isolation posts |
US20050219695A1 (en) | 2004-04-05 | 2005-10-06 | Vesely Michael A | Horizontal perspective display |
US7274623B2 (en) | 2004-04-06 | 2007-09-25 | Board Of Trustees Of The Leland Stanford Junior University | Method and system for operating capacitive membrane ultrasonic transducers |
US7321181B2 (en) | 2004-04-07 | 2008-01-22 | The Board Of Trustees Of The Leland Stanford Junior University | Capacitive membrane ultrasonic transducers with reduced bulk wave generation and method |
US20060009693A1 (en) | 2004-04-08 | 2006-01-12 | Techniscan, Inc. | Apparatus for imaging and treating a breast |
US8213467B2 (en) | 2004-04-08 | 2012-07-03 | Sonosite, Inc. | Systems and methods providing ASICs for use in multiple applications |
US7470232B2 (en) | 2004-05-04 | 2008-12-30 | General Electric Company | Method and apparatus for non-invasive ultrasonic fetal heart rate monitoring |
US20060058667A1 (en) | 2004-05-06 | 2006-03-16 | Lemmerhirt David F | Integrated circuit for an ultrasound system |
US20070219448A1 (en) | 2004-05-06 | 2007-09-20 | Focus Surgery, Inc. | Method and Apparatus for Selective Treatment of Tissue |
US8235909B2 (en) | 2004-05-12 | 2012-08-07 | Guided Therapy Systems, L.L.C. | Method and system for controlled scanning, imaging and/or therapy |
US8199685B2 (en) | 2004-05-17 | 2012-06-12 | Sonosite, Inc. | Processing of medical signals |
US20050264857A1 (en) | 2004-06-01 | 2005-12-01 | Vesely Michael A | Binaural horizontal perspective display |
US7545075B2 (en) | 2004-06-04 | 2009-06-09 | The Board Of Trustees Of The Leland Stanford Junior University | Capacitive micromachined ultrasonic transducer array with through-substrate electrical connection and method of fabricating same |
US7955264B2 (en) | 2004-07-07 | 2011-06-07 | General Electric Company | System and method for providing communication between ultrasound scanners |
JP4746291B2 (en) | 2004-08-05 | 2011-08-10 | Olympus Corporation | Capacitive ultrasonic transducer and manufacturing method thereof |
US7699780B2 (en) | 2004-08-11 | 2010-04-20 | Insightec—Image-Guided Treatment Ltd. | Focused ultrasound system with adaptive anatomical aperture shaping |
US7996688B2 (en) | 2004-08-24 | 2011-08-09 | Sonosite, Inc. | Ultrasound system power management |
US7867168B2 (en) | 2004-08-24 | 2011-01-11 | Sonosite, Inc. | Ultrasonic transducer having distributed weight properties |
US8409099B2 (en) | 2004-08-26 | 2013-04-02 | Insightec Ltd. | Focused ultrasound system for surrounding a body tissue mass and treatment method |
US7888709B2 (en) | 2004-09-15 | 2011-02-15 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer and manufacturing method |
US8658453B2 (en) | 2004-09-15 | 2014-02-25 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer |
US8309428B2 (en) | 2004-09-15 | 2012-11-13 | Sonetics Ultrasound, Inc. | Capacitive micromachined ultrasonic transducer |
EP1789136B1 (en) | 2004-09-16 | 2010-12-29 | University of Washington | Interference-free ultrasound imaging during hifu therapy, using software tools |
US7393325B2 (en) | 2004-09-16 | 2008-07-01 | Guided Therapy Systems, L.L.C. | Method and system for ultrasound treatment with a multi-directional transducer |
US7824348B2 (en) | 2004-09-16 | 2010-11-02 | Guided Therapy Systems, L.L.C. | System and method for variable depth ultrasound treatment |
US20120165668A1 (en) | 2010-08-02 | 2012-06-28 | Guided Therapy Systems, Llc | Systems and methods for treating acute and/or chronic injuries in soft tissue |
US7530958B2 (en) | 2004-09-24 | 2009-05-12 | Guided Therapy Systems, Inc. | Method and system for combined ultrasound treatment |
US20080255452A1 (en) | 2004-09-29 | 2008-10-16 | Koninklijke Philips Electronics, N.V. | Methods and Apparatus For Performing Enhanced Ultrasound Diagnostic Breast Imaging |
US20060111744A1 (en) | 2004-10-13 | 2006-05-25 | Guided Therapy Systems, L.L.C. | Method and system for treatment of sweat glands |
US8133180B2 (en) | 2004-10-06 | 2012-03-13 | Guided Therapy Systems, L.L.C. | Method and system for treating cellulite |
US7530356B2 (en) | 2004-10-06 | 2009-05-12 | Guided Therapy Systems, Inc. | Method and system for noninvasive mastopexy |
WO2006042201A1 (en) | 2004-10-06 | 2006-04-20 | Guided Therapy Systems, L.L.C. | Method and system for ultrasound tissue treatment |
US7758524B2 (en) | 2004-10-06 | 2010-07-20 | Guided Therapy Systems, L.L.C. | Method and system for ultra-high frequency ultrasound treatment |
US20060079868A1 (en) | 2004-10-07 | 2006-04-13 | Guided Therapy Systems, L.L.C. | Method and system for treatment of blood vessel disorders |
US7375420B2 (en) | 2004-12-03 | 2008-05-20 | General Electric Company | Large area transducer array |
US7037746B1 (en) | 2004-12-27 | 2006-05-02 | General Electric Company | Capacitive micromachined ultrasound transducer fabricated with epitaxial silicon membrane |
US7293462B2 (en) | 2005-01-04 | 2007-11-13 | General Electric Company | Isolation of short-circuited sensor cells for high-reliability operation of sensor array |
CN100542635C (en) | 2005-01-10 | 2009-09-23 | Chongqing Haifu (HIFU) Technology Co., Ltd. | High intensity focused ultrasound therapy device and method |
CN100574809C (en) | 2005-01-10 | 2009-12-30 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Fluorocarbon emulsion analog assistant for high-intensity focused ultrasound therapy and application thereof |
CN100574811C (en) | 2005-01-10 | 2009-12-30 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Particle analog assistant for high-intensity focused ultrasound therapy and application thereof |
CN100574810C (en) | 2005-01-10 | 2009-12-30 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Grain analog assistant for high-intensity focused ultrasound therapy and application thereof |
CN100506323C (en) | 2005-01-10 | 2009-07-01 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Integrated ultrasonic therapy transducer |
US7563228B2 (en) * | 2005-01-24 | 2009-07-21 | Siemens Medical Solutions Usa, Inc. | Stereoscopic three or four dimensional ultrasound imaging |
CN100563752C (en) | 2005-01-31 | 2009-12-02 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | MRI-guided ultrasonic treatment device |
CN100450563C (en) | 2005-01-31 | 2009-01-14 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Device for delivering medium to body cavity, vascular-cavity leading-in and ultrasonic blocking |
CN1814323B (en) | 2005-01-31 | 2010-05-12 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Focused ultrasound therapy system |
US8388544B2 (en) | 2005-03-17 | 2013-03-05 | General Electric Company | System and method for measuring blood viscosity |
EP1875327A2 (en) | 2005-04-25 | 2008-01-09 | Guided Therapy Systems, L.L.C. | Method and system for enhancing computer peripheral safety |
US8066642B1 (en) | 2005-05-03 | 2011-11-29 | Sonosite, Inc. | Systems and methods for ultrasound beam forming data control |
US7914458B2 (en) | 2005-05-05 | 2011-03-29 | Volcano Corporation | Capacitive microfabricated ultrasound transducer-based intravascular ultrasound probes |
WO2006121957A2 (en) | 2005-05-09 | 2006-11-16 | Michael Vesely | Three dimensional horizontal perspective workstation |
CN101223633A (en) | 2005-05-18 | 2008-07-16 | 科隆科技公司 | Micro-electro-mechanical transducers |
EP1882127A2 (en) | 2005-05-18 | 2008-01-30 | Kolo Technologies, Inc. | Micro-electro-mechanical transducers |
US8038631B1 (en) | 2005-06-01 | 2011-10-18 | Sanghvi Narendra T | Laparoscopic HIFU probe |
CA2509590A1 (en) * | 2005-06-06 | 2006-12-06 | Solar International Products Inc. | Portable imaging apparatus |
US7589456B2 (en) | 2005-06-14 | 2009-09-15 | Siemens Medical Solutions Usa, Inc. | Digital capacitive membrane transducer |
CA2608164A1 (en) | 2005-06-17 | 2006-12-21 | Kolo Technologies, Inc. | Micro-electro-mechanical transducer having an insulation extension |
US20070016039A1 (en) | 2005-06-21 | 2007-01-18 | Insightec-Image Guided Treatment Ltd. | Controlled, non-linear focused ultrasound treatment |
US7775979B2 (en) | 2005-06-29 | 2010-08-17 | General Electric Company | Transmit and receive interface array for highly integrated ultrasound scanner |
US20070010805A1 (en) | 2005-07-08 | 2007-01-11 | Fedewa Russell J | Method and apparatus for the treatment of tissue |
US7880565B2 (en) | 2005-08-03 | 2011-02-01 | Kolo Technologies, Inc. | Micro-electro-mechanical transducer having a surface plate |
WO2007015219A2 (en) | 2005-08-03 | 2007-02-08 | Kolo Technologies, Inc. | Micro-electro-mechanical transducer having a surface plate |
WO2007021958A2 (en) | 2005-08-12 | 2007-02-22 | University Of Washington | Method and apparatus for preparing organs and tissues for laparoscopic surgery |
US7591996B2 (en) | 2005-08-17 | 2009-09-22 | University Of Washington | Ultrasound target vessel occlusion using microbubbles |
US7621873B2 (en) | 2005-08-17 | 2009-11-24 | University Of Washington | Method and system to synchronize acoustic therapy with ultrasound imaging |
US7804595B2 (en) | 2005-09-14 | 2010-09-28 | University Of Washington | Using optical scattering to measure properties of ultrasound contrast agent shells |
US8264683B2 (en) | 2005-09-14 | 2012-09-11 | University Of Washington | Dynamic characterization of particles with flow cytometry |
US8016757B2 (en) | 2005-09-30 | 2011-09-13 | University Of Washington | Non-invasive temperature estimation technique for HIFU therapy monitoring using backscattered ultrasound |
US7878977B2 (en) | 2005-09-30 | 2011-02-01 | Siemens Medical Solutions Usa, Inc. | Flexible ultrasound transducer array |
JP4880275B2 (en) | 2005-10-03 | 2012-02-22 | Olympus Medical Systems Corp. | Capacitive ultrasonic transducer |
US7441447B2 (en) | 2005-10-07 | 2008-10-28 | Georgia Tech Research Corporation | Methods of imaging in probe microscopy |
US7449640B2 (en) | 2005-10-14 | 2008-11-11 | Sonosite, Inc. | Alignment features for dicing multi element acoustic arrays |
CN101309645B (en) * | 2005-11-15 | 2010-12-08 | Hitachi Medical Corporation | Ultrasonic diagnosis device |
CN101313354B (en) | 2005-11-23 | 2012-02-15 | Insightec Ltd. | Hierarchical switching in ultra-high density ultrasound array |
US7546769B2 (en) | 2005-12-01 | 2009-06-16 | General Electric Company | Ultrasonic inspection system and method |
US8465431B2 (en) | 2005-12-07 | 2013-06-18 | Siemens Medical Solutions Usa, Inc. | Multi-dimensional CMUT array with integrated beamformation |
US8038620B2 (en) | 2005-12-20 | 2011-10-18 | General Electric Company | Fresnel zone imaging system and method |
US7622848B2 (en) | 2006-01-06 | 2009-11-24 | General Electric Company | Transducer assembly with z-axis interconnect |
US20070239011A1 (en) | 2006-01-13 | 2007-10-11 | Mirabilis Medica, Inc. | Apparatus for delivering high intensity focused ultrasound energy to a treatment site internal to a patient's body |
US20070239020A1 (en) | 2006-01-19 | 2007-10-11 | Kazuhiro Iinuma | Ultrasonography apparatus |
US20070180916A1 (en) | 2006-02-09 | 2007-08-09 | General Electric Company | Capacitive micromachined ultrasound transducer and methods of making the same |
US20070239019A1 (en) | 2006-02-13 | 2007-10-11 | Richard William D | Portable ultrasonic imaging probe that connects directly to a host computer |
GB2454603B (en) | 2006-02-24 | 2010-05-05 | Wolfson Microelectronics Plc | Mems device |
US7615834B2 (en) | 2006-02-28 | 2009-11-10 | The Board Of Trustees Of The Leland Stanford Junior University | Capacitive micromachined ultrasonic transducer(CMUT) with varying thickness membrane |
US7699793B2 (en) | 2006-03-07 | 2010-04-20 | Brainlab Ag | Method and device for detecting and localising an impingement of joint components |
US7764003B2 (en) | 2006-04-04 | 2010-07-27 | Kolo Technologies, Inc. | Signal control in micromachined ultrasonic transducer |
CA2649119A1 (en) | 2006-04-13 | 2007-12-13 | Mirabilis Medica, Inc. | Methods and apparatus for the treatment of menometrorrhagia, endometrial pathology, and cervical neoplasia using high intensity focused ultrasound energy |
US20110263997A1 (en) * | 2006-04-20 | 2011-10-27 | Engineered Vigilance, Llc | System and method for remotely diagnosing and managing treatment of restrictive and obstructive lung disease and cardiopulmonary disorders |
US7745973B2 (en) | 2006-05-03 | 2010-06-29 | The Board Of Trustees Of The Leland Stanford Junior University | Acoustic crosstalk reduction for capacitive micromachined ultrasonic transducers in immersion |
WO2007131163A2 (en) | 2006-05-05 | 2007-11-15 | Worcester Polytechnic Institute | Reconfigurable wireless ultrasound diagnostic system |
US7767484B2 (en) | 2006-05-31 | 2010-08-03 | Georgia Tech Research Corporation | Method for sealing and backside releasing of microelectromechanical systems |
JP5432708B2 (en) | 2006-06-23 | 2014-03-05 | Koninklijke Philips N.V. | Timing control device for photoacoustic and ultrasonic composite imager |
US8360986B2 (en) * | 2006-06-30 | 2013-01-29 | University Of Louisville Research Foundation, Inc. | Non-contact and passive measurement of arterial pulse through thermal IR imaging, and analysis of thermal IR imagery |
CN100486521C (en) | 2006-07-19 | 2009-05-13 | Siemens (China) Co., Ltd. | Device for transmitting magnetic resonance signals in MRI-guided medical equipment |
US7741686B2 (en) | 2006-07-20 | 2010-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Trench isolated capacitive micromachined ultrasonic transducer arrays with a supporting frame |
US7535794B2 (en) | 2006-08-01 | 2009-05-19 | Insightec, Ltd. | Transducer surface mapping |
EP2049365B1 (en) | 2006-08-01 | 2017-12-13 | 3M Innovative Properties Company | Illumination device |
US20080033278A1 (en) | 2006-08-01 | 2008-02-07 | Insightec Ltd. | System and method for tracking medical device using magnetic resonance detection |
US7652410B2 (en) | 2006-08-01 | 2010-01-26 | Insightec Ltd | Ultrasound transducer with non-uniform elements |
US20080033292A1 (en) | 2006-08-02 | 2008-02-07 | Insightec Ltd | Ultrasound patient interface device |
US7903830B2 (en) | 2006-08-10 | 2011-03-08 | Siemens Medical Solutions Usa, Inc. | Push-pull capacitive micro-machined ultrasound transducer array |
CN101126800B (en) | 2006-08-16 | 2010-05-12 | Siemens (China) Co., Ltd. | HIFU-compatible MRI radio frequency signal receiving coil and its receiving method |
CN100574829C (en) | 2006-08-24 | 2009-12-30 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Image-guided high-intensity focused ultrasound therapy system |
CN100574828C (en) | 2006-08-24 | 2009-12-30 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Ultrasound therapy apparatus and ultrasound therapy system containing the same |
DE102006040420A1 (en) | 2006-08-29 | 2008-03-13 | Siemens Ag | Thermal ablation e.g. microwave ablation, implementing and monitoring device for treating a tumor of a patient, has magnetic resonance system producing images composed of voxels, where the voxel geometry is adapted to the form of the ultrasonic focus |
CN101140354B (en) | 2006-09-04 | 2012-01-25 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Resonant vibration type ultrasonic transducer |
US20080097207A1 (en) | 2006-09-12 | 2008-04-24 | Siemens Medical Solutions Usa, Inc. | Ultrasound therapy monitoring with diagnostic ultrasound |
US9566454B2 (en) | 2006-09-18 | 2017-02-14 | Guided Therapy Systems, Llc | Method and system for non-ablative acne treatment and prevention |
ES2579765T3 (en) | 2006-09-19 | 2016-08-16 | Guided Therapy Systems, L.L.C. | System for the treatment of muscle, tendon, ligamentous and cartilaginous tissue |
US7825383B2 (en) | 2006-09-21 | 2010-11-02 | Siemens Medical Solutions Usa, Inc. | Mobile camera for organ targeted imaging |
US7559905B2 (en) | 2006-09-21 | 2009-07-14 | Focus Surgery, Inc. | HIFU probe for treating tissue with in-line degassing of fluid |
US8242665B2 (en) | 2006-09-25 | 2012-08-14 | Koninklijke Philips Electronics N.V. | Flip-chip interconnection through chip vias |
WO2008040015A2 (en) | 2006-09-28 | 2008-04-03 | University Of Washington | 3d micro-scale engineered tissue model systems |
CN101164637B (en) | 2006-10-16 | 2011-05-18 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Ultrasonic therapeutic system capable of reducing electromagnetic interference to imaging equipment |
US20080183077A1 (en) | 2006-10-19 | 2008-07-31 | Siemens Corporate Research, Inc. | High intensity focused ultrasound path determination |
USD558351S1 (en) | 2006-10-31 | 2007-12-25 | Sonosite, Inc. | Ultrasound display apparatus |
US20100056925A1 (en) | 2006-11-28 | 2010-03-04 | Chongqing Ronghai Medical Ultrasound Industry Ltd. | Ultrasonic Therapeutic Device Capable of Multipoint Transmitting |
DE102006056885B4 (en) | 2006-12-01 | 2016-06-30 | Siemens Healthcare Gmbh | Method and device for positioning a bearing device of a magnetic resonance apparatus |
US7451651B2 (en) | 2006-12-11 | 2008-11-18 | General Electric Company | Modular sensor assembly and methods of fabricating the same |
CN101204700B (en) | 2006-12-19 | 2012-08-08 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Electromagnetic ultrasonic transducer and array thereof |
US8672850B1 (en) | 2007-01-11 | 2014-03-18 | General Electric Company | Focusing of a two-dimensional array to perform four-dimensional imaging |
JP5211487B2 (en) | 2007-01-25 | 2013-06-12 | Nikon Corporation | Exposure method, exposure apparatus, and microdevice manufacturing method |
US20100298711A1 (en) | 2007-01-29 | 2010-11-25 | Worcester Polytechnic Institute | Wireless ultrasound transducer using ultrawideband |
CN101234234B (en) | 2007-01-30 | 2011-11-16 | Siemens AG | Automatic selection method for a region of interest covering the heating region |
US7687976B2 (en) | 2007-01-31 | 2010-03-30 | General Electric Company | Ultrasound imaging system |
US7920731B2 (en) | 2007-03-27 | 2011-04-05 | Siemens Medical Solutions Usa, Inc. | Bleeding detection using a blanket ultrasound device |
CN101273890B (en) | 2007-03-29 | 2010-10-06 | Siemens (China) Co., Ltd. | Method and device for reducing folding artifacts in HIFU therapy imaging monitored by MR |
JP4885779B2 (en) | 2007-03-29 | 2012-02-29 | Olympus Medical Systems Corp. | Capacitance type transducer device and intracorporeal ultrasound diagnostic system |
CN101273891B (en) | 2007-03-29 | 2010-09-29 | Siemens (China) Co., Ltd. | Method and device for accelerating magnetic resonance temperature imaging |
US7824335B2 (en) | 2007-04-26 | 2010-11-02 | General Electric Company | Reconfigurable array with multi-level transmitters |
US7892176B2 (en) | 2007-05-02 | 2011-02-22 | General Electric Company | Monitoring or imaging system with interconnect structure for large area sensor array |
US8870771B2 (en) | 2007-05-04 | 2014-10-28 | Barbara Ann Karmanos Cancer Institute | Method and apparatus for categorizing breast density and assessing cancer risk utilizing acoustic parameters |
TWI526233B (en) | 2007-05-07 | 2016-03-21 | Guided Therapy Systems, LLC | Methods and systems for modulating medicants using acoustic energy |
US8764687B2 (en) | 2007-05-07 | 2014-07-01 | Guided Therapy Systems, Llc | Methods and systems for coupling and focusing acoustic energy using a coupler member |
US20080296708A1 (en) | 2007-05-31 | 2008-12-04 | General Electric Company | Integrated sensor arrays and method for making and using such arrays |
WO2008146206A2 (en) | 2007-06-01 | 2008-12-04 | Koninklijke Philips Electronics, N.V. | Wireless ultrasound probe asset tracking |
US20090018446A1 (en) | 2007-07-10 | 2009-01-15 | Insightec, Ltd. | Transrectal ultrasound ablation probe |
EP2170531A2 (en) | 2007-07-31 | 2010-04-07 | Koninklijke Philips Electronics N.V. | Cmuts with a high-k dielectric |
US8052604B2 (en) | 2007-07-31 | 2011-11-08 | Mirabilis Medica Inc. | Methods and apparatus for engagement and coupling of an intracavitory imaging and high intensity focused ultrasound probe |
US7978461B2 (en) | 2007-09-07 | 2011-07-12 | Sonosite, Inc. | Enhanced ultrasound system |
USD591423S1 (en) | 2007-09-07 | 2009-04-28 | Sonosite, Inc. | Ultrasound platform |
US8235902B2 (en) | 2007-09-11 | 2012-08-07 | Focus Surgery, Inc. | System and method for tissue change monitoring during HIFU treatment |
US8277380B2 (en) | 2007-09-11 | 2012-10-02 | Siemens Medical Solutions Usa, Inc. | Piezoelectric and CMUT layered ultrasound transducer array |
US8137278B2 (en) | 2007-09-12 | 2012-03-20 | Sonosite, Inc. | System and method for spatial compounding using phased arrays |
US10092270B2 (en) | 2007-09-17 | 2018-10-09 | Koninklijke Philips Electronics N.V. | Pre-collapsed CMUT with mechanical collapse retention |
US8327521B2 (en) | 2007-09-17 | 2012-12-11 | Koninklijke Philips Electronics N.V. | Method for production and using a capacitive micro-machined ultrasonic transducer |
US20100256488A1 (en) | 2007-09-27 | 2010-10-07 | University Of Southern California | High frequency ultrasonic convex array transducers and tissue imaging |
CN101396280A (en) | 2007-09-27 | 2009-04-01 | Chongqing Ronghai Ultrasound Medical Engineering Research Center Co., Ltd. | Ultrasonic therapy intestinal tract pushing device |
US8251908B2 (en) | 2007-10-01 | 2012-08-28 | Insightec Ltd. | Motion compensated image-guided focused ultrasound therapy system |
US7843022B2 (en) | 2007-10-18 | 2010-11-30 | The Board Of Trustees Of The Leland Stanford Junior University | High-temperature electrostatic transducers and fabrication method |
US7745248B2 (en) | 2007-10-18 | 2010-06-29 | The Board Of Trustees Of The Leland Stanford Junior University | Fabrication of capacitive micromachined ultrasonic transducers by local oxidation |
US8439907B2 (en) | 2007-11-07 | 2013-05-14 | Mirabilis Medica Inc. | Hemostatic tissue tunnel generator for inserting treatment apparatus into tissue of a patient |
US8187270B2 (en) | 2007-11-07 | 2012-05-29 | Mirabilis Medica Inc. | Hemostatic spark erosion tissue tunnel generator with integral treatment providing variable volumetric necrotization of tissue |
CN100560157C (en) | 2007-11-13 | 2009-11-18 | Chongqing Shengli Medical Equipment Co., Ltd. | Ultrasonic medicine plaster |
US7786584B2 (en) | 2007-11-26 | 2010-08-31 | Infineon Technologies Ag | Through substrate via semiconductor components |
US8767514B2 (en) | 2007-12-03 | 2014-07-01 | Kolo Technologies, Inc. | Telemetric sensing using micromachined ultrasonic transducer |
EP2215855A1 (en) | 2007-12-03 | 2010-08-11 | Kolo Technologies, Inc. | Capacitive micromachined ultrasonic transducer with voltage feedback |
JP2011505205A (en) | 2007-12-03 | 2011-02-24 | Kolo Technologies, Inc. | Ultrasonic scanner constructed with capacitive micromachined ultrasonic transducers (CMUTs) |
US8483014B2 (en) | 2007-12-03 | 2013-07-09 | Kolo Technologies, Inc. | Micromachined ultrasonic transducers |
WO2009073692A1 (en) | 2007-12-03 | 2009-06-11 | Kolo Technologies, Inc. | Packaging and connecting electrostatic transducer arrays |
US8559274B2 (en) | 2007-12-03 | 2013-10-15 | Kolo Technologies, Inc. | Dual-mode operation micromachined ultrasonic transducer |
US8345513B2 (en) | 2007-12-03 | 2013-01-01 | Kolo Technologies, Inc. | Stacked transducing devices |
EP2227835A1 (en) | 2007-12-03 | 2010-09-15 | Kolo Technologies, Inc. | Variable operating voltage in micromachined ultrasonic transducer |
EP2218094A1 (en) | 2007-12-03 | 2010-08-18 | Kolo Technologies, Inc. | Through-wafer interconnections in electrostatic transducer and array |
US8787116B2 (en) | 2007-12-14 | 2014-07-22 | Koninklijke Philips N.V. | Collapsed mode operable cMUT including contoured substrate |
EP2231011A1 (en) * | 2007-12-31 | 2010-09-29 | Real Imaging Ltd. | System and method for registration of imaging data |
CN101513554B (en) | 2008-02-21 | 2011-09-07 | Chongqing Haifu (HIFU) Technology Co., Ltd. | Intelligent tissue-mimicking ultrasonic phantom and preparation method thereof |
US20110055447A1 (en) | 2008-05-07 | 2011-03-03 | Signostics Limited | Docking system for medical diagnostic scanning using a handheld device |
GB2459862B (en) | 2008-05-07 | 2010-06-30 | Wolfson Microelectronics Plc | Capacitive transducer circuit and method |
JP2009291514A (en) | 2008-06-09 | 2009-12-17 | Canon Inc | Method for manufacturing capacitive transducer, and capacitive transducer |
JP5063515B2 (en) * | 2008-07-25 | 2012-10-31 | Hitachi Aloka Medical, Ltd. | Ultrasonic diagnostic equipment |
US7898905B2 (en) | 2008-07-28 | 2011-03-01 | General Electric Company | Reconfigurable array with locally determined switch configuration |
US8216161B2 (en) | 2008-08-06 | 2012-07-10 | Mirabilis Medica Inc. | Optimization and feedback control of HIFU power deposition through the frequency analysis of backscattered HIFU signals |
US9248318B2 (en) | 2008-08-06 | 2016-02-02 | Mirabilis Medica Inc. | Optimization and feedback control of HIFU power deposition through the analysis of detected signal characteristics |
US8133182B2 (en) | 2008-09-09 | 2012-03-13 | Siemens Medical Solutions Usa, Inc. | Multi-dimensional transducer array and beamforming for ultrasound imaging |
US9050449B2 (en) | 2008-10-03 | 2015-06-09 | Mirabilis Medica, Inc. | System for treating a volume of tissue with high intensity focused ultrasound |
EP2331207B1 (en) | 2008-10-03 | 2013-12-11 | Mirabilis Medica Inc. | Apparatus for treating tissues with hifu |
US8237601B2 (en) | 2008-10-14 | 2012-08-07 | Sonosite, Inc. | Remote control device |
US20100173437A1 (en) | 2008-10-21 | 2010-07-08 | Wygant Ira O | Method of fabricating CMUTs that generate low-frequency and high-intensity ultrasound |
EP2349482B1 (en) | 2008-10-24 | 2016-07-27 | Mirabilis Medica Inc. | Apparatus for feedback control of hifu treatments |
FR2939003B1 (en) | 2008-11-21 | 2011-02-25 | Commissariat Energie Atomique | CMUT cell formed of a membrane of nano-tubes or nano-threads or nano-beams and ultra high-frequency acoustic imaging device comprising a plurality of such cells |
US20100160781A1 (en) | 2008-12-09 | 2010-06-24 | University Of Washington | Doppler and image guided device for negative feedback phased array hifu treatment of vascularized lesions |
US8176787B2 (en) | 2008-12-17 | 2012-05-15 | General Electric Company | Systems and methods for operating a two-dimensional transducer array |
CN102281818B (en) | 2009-01-16 | 2013-11-06 | Hitachi Medical Corporation | Ultrasonic probe manufacturing method and ultrasonic probe |
US8108647B2 (en) | 2009-01-29 | 2012-01-31 | International Business Machines Corporation | Digital data architecture employing redundant links in a daisy chain of component modules |
US8398408B1 (en) | 2009-02-25 | 2013-03-19 | Sonosite, Inc. | Charging station for cordless ultrasound cart |
US8402831B2 (en) | 2009-03-05 | 2013-03-26 | The Board Of Trustees Of The Leland Stanford Junior University | Monolithic integrated CMUTs fabricated by low-temperature wafer bonding |
KR20110127736A (en) | 2009-03-06 | 2011-11-25 | Mirabilis Medica, Inc. | Ultrasound treatment and imaging applicator |
US8315125B2 (en) | 2009-03-18 | 2012-11-20 | Sonetics Ultrasound, Inc. | System and method for biasing CMUT elements |
US8408065B2 (en) | 2009-03-18 | 2013-04-02 | Bp Corporation North America Inc. | Dry-coupled permanently installed ultrasonic sensor linear array |
WO2010109363A2 (en) | 2009-03-23 | 2010-09-30 | Koninklijke Philips Electronics, N.V. | Gas sensing using ultrasound |
ES2416182T3 (en) | 2009-03-26 | 2013-07-30 | Norwegian University Of Science And Technology (Ntnu) | CMUT matrix of wave junction with conductive pathways |
EP2413802A1 (en) | 2009-04-01 | 2012-02-08 | Analogic Corporation | Ultrasound probe |
US8355554B2 (en) | 2009-04-14 | 2013-01-15 | Sonosite, Inc. | Systems and methods for adaptive volume imaging |
US8992426B2 (en) | 2009-05-04 | 2015-03-31 | Siemens Medical Solutions Usa, Inc. | Feedback in medical ultrasound imaging for high intensity focused ultrasound |
US8276433B2 (en) | 2009-05-18 | 2012-10-02 | The Board Of Trustees Of The Leland Stanford Junior University | Sensor for measuring properties of liquids and gases |
US8207652B2 (en) | 2009-06-16 | 2012-06-26 | General Electric Company | Ultrasound transducer with improved acoustic performance |
US8451693B2 (en) | 2009-08-25 | 2013-05-28 | The Board Of Trustees Of The Leland Stanford Junior University | Micromachined ultrasonic transducer having compliant post structure |
US8409095B1 (en) | 2009-09-03 | 2013-04-02 | Sonosite, Inc. | Systems and methods for hands free control of medical devices |
US20110060221A1 (en) | 2009-09-04 | 2011-03-10 | Siemens Medical Solutions Usa, Inc. | Temperature prediction using medical diagnostic ultrasound |
US8345508B2 (en) | 2009-09-20 | 2013-01-01 | General Electric Company | Large area modular sensor array assembly and method for making the same |
US8563345B2 (en) | 2009-10-02 | 2013-10-22 | National Semiconductor Corporation | Integration of structurally-stable isolated capacitive micromachined ultrasonic transducer (CMUT) array cells and array elements |
US8222065B1 (en) | 2009-10-02 | 2012-07-17 | National Semiconductor Corporation | Method and system for forming a capacitive micromachined ultrasonic transducer |
US8324006B1 (en) | 2009-10-28 | 2012-12-04 | National Semiconductor Corporation | Method of forming a capacitive micromachined ultrasonic transducer (CMUT) |
US8081301B2 (en) | 2009-10-08 | 2011-12-20 | The United States Of America As Represented By The Secretary Of The Army | LADAR transmitting and receiving system and method |
US8819591B2 (en) | 2009-10-30 | 2014-08-26 | Accuray Incorporated | Treatment planning in a virtual environment |
US8368401B2 (en) | 2009-11-10 | 2013-02-05 | Insightec Ltd. | Techniques for correcting measurement artifacts in magnetic resonance thermometry |
US8715186B2 (en) | 2009-11-24 | 2014-05-06 | Guided Therapy Systems, Llc | Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy |
DE102009060317B4 (en) | 2009-12-23 | 2013-04-04 | Siemens Aktiengesellschaft | A contrast agent for use in an imaging method for diagnosing a metastatic tumor disease and a method for imaging a metastatic tumor tissue |
US20110178407A1 (en) | 2010-01-20 | 2011-07-21 | Siemens Medical Solutions Usa, Inc. | Hard and Soft Backing for Medical Ultrasound Transducer Array |
EP2528509B1 (en) | 2010-01-29 | 2021-10-13 | University Of Virginia Patent Foundation | Ultrasound for locating anatomy or probe guidance |
US8717360B2 (en) | 2010-01-29 | 2014-05-06 | Zspace, Inc. | Presenting a view within a three dimensional scene |
US8876716B2 (en) | 2010-02-12 | 2014-11-04 | Delphinus Medical Technologies, Inc. | Method of characterizing breast tissue using multiple ultrasound renderings |
WO2011100691A1 (en) | 2010-02-12 | 2011-08-18 | Delphinus Medical Technologies, Inc. | Method of characterizing the pathological response of tissue to a treatment plan |
US20130278631A1 (en) | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
US20110218436A1 (en) | 2010-03-06 | 2011-09-08 | Dewey Russell H | Mobile ultrasound system with computer-aided detection |
JP5394299B2 (en) | 2010-03-30 | 2014-01-22 | FUJIFILM Corporation | Ultrasonic diagnostic equipment |
US8876740B2 (en) | 2010-04-12 | 2014-11-04 | University Of Washington | Methods and systems for non-invasive treatment of tissue using high intensity focused ultrasound therapy |
US8439840B1 (en) | 2010-05-04 | 2013-05-14 | Sonosite, Inc. | Ultrasound imaging system and method with automatic adjustment and/or multiple sample volumes |
KR101999078B1 (en) | 2010-06-09 | 2019-07-10 | Regents Of The University Of Minnesota | Dual mode ultrasound transducer (DMUT) system and method for controlling delivery of ultrasound therapy |
US9465090B2 (en) | 2010-06-09 | 2016-10-11 | Siemens Aktiengesellschaft | Method of magnetic resonance-based temperature mapping |
US8647279B2 (en) | 2010-06-10 | 2014-02-11 | Siemens Medical Solutions Usa, Inc. | Volume mechanical transducer for medical diagnostic ultrasound |
US8527033B1 (en) | 2010-07-01 | 2013-09-03 | Sonosite, Inc. | Systems and methods for assisting with internal positioning of instruments |
US20120005624A1 (en) | 2010-07-02 | 2012-01-05 | Vesely Michael A | User Interface Elements for Use within a Three Dimensional Scene |
JP5702966B2 (en) | 2010-08-02 | 2015-04-15 | Canon Inc. | Electromechanical transducer and method for manufacturing the same |
EP2605830A4 (en) | 2010-08-18 | 2015-12-02 | Mirabilis Medica Inc | Hifu applicator |
US7954387B1 (en) | 2010-08-18 | 2011-06-07 | General Electric Company | Ultrasonic transducer device |
US8425425B2 (en) | 2010-09-20 | 2013-04-23 | M. Dexter Hagy | Virtual image formation method for an ultrasound device |
US8716816B2 (en) | 2010-10-12 | 2014-05-06 | Micralyne Inc. | SOI-based CMUT device with buried electrodes |
US9354718B2 (en) | 2010-12-22 | 2016-05-31 | Zspace, Inc. | Tightly coupled interactive stereo display |
KR20120073887A (en) * | 2010-12-27 | 2012-07-05 | Samsung Electronics Co., Ltd. | Image processing apparatus and method for processing image thereof |
US8128050B1 (en) | 2011-02-08 | 2012-03-06 | Sonosite, Inc. | Ultrasound scanner support devices |
DE102011011530B4 (en) | 2011-02-17 | 2013-05-08 | Karlsruher Institut für Technologie | Method for reducing ultrasound data |
US8891334B2 (en) | 2011-03-04 | 2014-11-18 | Georgia Tech Research Corporation | Compact, energy-efficient ultrasound imaging probes using CMUT arrays with integrated electronics |
USD657361S1 (en) | 2011-03-25 | 2012-04-10 | Sonosite, Inc. | Housing for an electronic device |
US8804457B2 (en) | 2011-03-31 | 2014-08-12 | Maxim Integrated Products, Inc. | Transmit/receive systems for imaging devices |
US20120250454A1 (en) | 2011-04-04 | 2012-10-04 | Robert Nicholas Rohling | Method and system for shaping a cmut membrane |
US9736466B2 (en) | 2011-05-27 | 2017-08-15 | Zspace, Inc. | Optimizing stereo video display |
US9161025B2 (en) | 2011-08-29 | 2015-10-13 | Zspace, Inc. | Extended overdrive tables and use |
CN102981156A (en) | 2011-09-06 | 2013-03-20 | Institute of Acoustics, Chinese Academy of Sciences | Ultrasonic imaging post-processing method and device thereof |
WO2013059358A2 (en) | 2011-10-17 | 2013-04-25 | Butterfly Network, Inc. | Transmissive imaging and related apparatus and methods |
US9533873B2 (en) | 2013-02-05 | 2017-01-03 | Butterfly Network, Inc. | CMOS ultrasonic transducers and related apparatus and methods |
KR20220097541A (en) | 2013-03-15 | 2022-07-07 | Butterfly Network, Inc. | Monolithic ultrasonic imaging devices, systems and methods |
EP3639937A1 (en) | 2013-03-15 | 2020-04-22 | Butterfly Network, Inc. | Complementary metal oxide semiconductor (cmos) ultrasonic transducers and methods for forming the same |
CA2919183A1 (en) | 2013-07-23 | 2015-01-29 | Butterfly Network, Inc. | Interconnectable ultrasound transducer probes and related methods and apparatus |
2013
- 2013-04-03 US US13/856,252 patent/US9667889B2/en active Active

2014
- 2014-04-03 EP EP14725300.9A patent/EP2981215A2/en not_active Withdrawn
- 2014-04-03 CN CN201480031564.XA patent/CN105263419A/en active Pending
- 2014-04-03 JP JP2016506609A patent/JP6786384B2/en not_active Expired - Fee Related
- 2014-04-03 WO PCT/US2014/032803 patent/WO2014165662A2/en active Application Filing
- 2014-04-03 CA CA2908631A patent/CA2908631C/en active Active
- 2014-04-03 KR KR1020157031515A patent/KR20150145236A/en not_active Application Discontinuation

2017
- 2017-04-28 US US15/581,429 patent/US20170228862A1/en not_active Abandoned
- 2017-07-07 US US15/644,456 patent/US20170309023A1/en not_active Abandoned

2018
- 2018-11-02 JP JP2018207508A patent/JP2019030736A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060173326A1 (en) * | 2003-06-10 | 2006-08-03 | Koninklijke Philips Electronics N.V. | User interface for a three-dimensional colour ultrasound imaging system |
US20080262304A1 (en) * | 2004-06-30 | 2008-10-23 | Micha Nisani | In-Vivo Sensing System Device and Method for Real Time Viewing |
US8551000B2 (en) * | 2006-06-23 | 2013-10-08 | Teratech Corp. | Ultrasound 3D imaging system |
US20110319736A1 (en) * | 2010-06-25 | 2011-12-29 | MuscleSound, LLC | System for non-invasive determination of glycogen stores |
Also Published As
Publication number | Publication date |
---|---|
US20140300720A1 (en) | 2014-10-09 |
JP2016515903A (en) | 2016-06-02 |
WO2014165662A2 (en) | 2014-10-09 |
JP2019030736A (en) | 2019-02-28 |
JP6786384B2 (en) | 2020-11-18 |
CA2908631A1 (en) | 2014-10-09 |
EP2981215A2 (en) | 2016-02-10 |
WO2014165662A3 (en) | 2014-12-31 |
CN105263419A (en) | 2016-01-20 |
KR20150145236A (en) | 2015-12-29 |
CA2908631C (en) | 2021-08-24 |
US20170228862A1 (en) | 2017-08-10 |
US9667889B2 (en) | 2017-05-30 |
Similar Documents
Publication | Title |
---|---|
US20170309023A1 (en) | Portable Electronic Devices With Integrated Imaging Capabilities
US20070016025A1 (en) | Medical diagnostic imaging three dimensional navigation device and methods
US20190021698A1 (en) | Methods for acquiring ultrasonic data
US8563932B2 (en) | Device and method for diffusion optical tomography
JP2016515903A5 (en) |
WO2005117711A3 (en) | Processing and displaying breast ultrasound information
US20190272667A1 (en) | Systems and methods for generating b-mode images from 3d ultrasound data
WO2015039302A1 (en) | Method and system for guided ultrasound image acquisition
US20140171799A1 (en) | Systems and methods for providing ultrasound probe location and image information
US20160299565A1 (en) | Eye tracking for registration of a haptic device with a holograph
US12130666B2 (en) | Housing structures and input-output devices for electronic devices
CN107233134B (en) | Method and device for displaying internal marking points of three-dimensional medical model and medical equipment
CN103443799B (en) | 3D image navigation method
US20140047378A1 (en) | Image processing device, image display apparatus, image processing method, and computer program medium
US10269453B2 (en) | Method and apparatus for providing medical information
US9589387B2 (en) | Image processing apparatus and image processing method
US9911224B2 (en) | Volume rendering apparatus and method using voxel brightness gain values and voxel selecting model
US10441249B2 (en) | Ultrasound diagnosis apparatus and method of operating the same
KR102695456B1 (en) | Ultrasound diagnostic apparatus for displaying shear wave data of the object and method for operating the same
CN109313818B (en) | System and method for illumination in rendered images
JP2023004884A (en) | Rendering device for displaying graphical representation of augmented reality
KR102321642B1 (en) | Input apparatus and medical image apparatus comprising the same
Caviedes et al. | User Interfaces to Interact with Tensor Fields. A State-of-the-Art Analysis
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: BUTTERFLY NETWORK, INC., CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROTHBERG, NOAH ZACHARY;REEL/FRAME:043342/0342 Effective date: 20140211
 | AS | Assignment | Owner name: BUTTERFLY NETWORK, INC., CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROTHBERG, NOAH ZACHARY;REEL/FRAME:043008/0733 Effective date: 20140211
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION