WO2015075612A1 - Ultrasound system with navigation assistance and method of operation thereof - Google Patents

Ultrasound system with navigation assistance and method of operation thereof

Info

Publication number
WO2015075612A1
Authority
WO
WIPO (PCT)
Prior art keywords
displacement
determined
cue
current frame
frame
Prior art date
Application number
PCT/IB2014/066049
Other languages
English (en)
Inventor
Robert Joseph SCHNEIDER
Michael Daniel Cardinale
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V.
Publication of WO2015075612A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053 - Display arrangements
    • G01S7/52057 - Cathode ray tube displays
    • G01S7/52073 - Production of cursor lines, markers or indicia by electronic means
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 - Displaying means of special interest
    • A61B8/463 - Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 - Diagnostic techniques
    • A61B8/483 - Diagnostic techniques involving the acquisition of a 3D volume of data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 - Sonar systems specially adapted for specific applications
    • G01S15/89 - Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906 - Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8934 - Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
    • G01S15/8936 - Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in three dimensions
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52085 - Details related to the ultrasound signal acquisition, e.g. scan sequences
    • G01S7/52087 - Details related to the ultrasound signal acquisition, e.g. scan sequences using synchronization techniques
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0883 - Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 - Displaying means of special interest
    • A61B8/465 - Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5292 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters

Definitions

  • the present system relates to an ultrasound imaging system and, more particularly, to an ultrasound imaging system which provides navigation assistance to a user, and a method of operation thereof.
  • the controller may be configured to determine whether the ultrasound image data is static. It is also envisioned that the controller may be configured to determine a cycle of the anatomical object, when it is determined that the ultrasound image data is not static (e.g., is dynamic). It is also envisioned that the controller may be configured to select the reference frame from the one or more previous frames in accordance with the determined cycle of the anatomical object. It is further envisioned that the controller may be configured to select the reference frame from a most recent frame of the one or more previous frames. In accordance with yet other embodiments, it is envisioned that the controller may be configured to form a stationary cue, when it is determined that the determined displacement is less than or equal to the threshold displacement value. In yet further embodiments, it is envisioned that the controller may be configured to adjust characteristics of the cue in accordance with a magnitude and/or direction of the displacement.
  • a method of displaying ultrasound images may be performed by at least one controller of an imaging system and may include one or more acts of: acquiring ultrasound image data of an anatomical object, the ultrasound image data including a current frame and one or more previous frames; comparing the current frame and a reference frame selected from the one or more previous frames; determining a displacement of the current frame relative to the reference frame based upon the results of the comparison of the current frame and the reference frame; determining whether the determined displacement is greater than a threshold displacement value; forming a movement cue that indicates movement in accordance with the determined displacement, when it is determined that the displacement is greater than the threshold displacement value; and/or rendering the movement cue in association with the current frame on a rendering device of the system.
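The claimed sequence of acts maps naturally onto a small processing loop. The following Python sketch is an editorial illustration under stated assumptions, not Philips' implementation: estimate_displacement stands in for the registration step (act 111 below), the correlation-based shift estimate is only a placeholder for a real registration method, and the threshold value echoes the 2 mm example given later in the text.

```python
import numpy as np

TDV = 2.0  # threshold displacement value; the 2 mm example from the text (units assumed)

def estimate_displacement(current, reference):
    """Stand-in for spatial registration / optical flow: a whole-frame shift
    estimated from 1-D cross-correlations of the row and column intensity sums."""
    def lag(a, b):
        c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
        return c.argmax() - (len(b) - 1)
    dy = lag(current.sum(axis=1), reference.sum(axis=1))
    dx = lag(current.sum(axis=0), reference.sum(axis=0))
    return np.array([dx, dy], dtype=float)

def cue_for(current, reference, tdv=TDV):
    """Compare frames, threshold the displacement, and pick a cue."""
    dv = estimate_displacement(current, reference)
    if np.linalg.norm(dv) <= tdv:
        return "stationary", dv
    return ("right" if dv[0] > 0 else "left"), dv

# usage: the newest frame is the current frame; the one before it is a
# simple choice of reference frame (see the reference-frame selection below)
frames = [np.random.rand(64, 64), np.random.rand(64, 64)]
cue, dv = cue_for(frames[-1], frames[-2])
```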
  • the method may further include an act of determining whether the ultrasound image data is static.
  • the method may also include an act of determining a cycle of the anatomical object, when it is determined that the ultrasound image data is not static.
  • the method may include an act of selecting the reference frame from the one or more previous frames in accordance with the determined cycle of the anatomical object.
  • the method may include an act of selecting the reference frame from a most recent frame of the one or more previous frames.
  • the method may include an act of forming a cue corresponding with a stationary cue in accordance with the determined displacement, when it is determined that the displacement is less than or equal to the threshold displacement value.
  • the method may include an act of adjusting characteristics of the cue in accordance with a magnitude of the displacement.
  • a computer program stored on a computer readable memory medium may be configured to render ultrasonic image data obtained from an ultrasonic probe
  • the computer program may include a program portion which may be configured to: acquire ultrasound image data of an anatomical object, the ultrasound image data including a current frame and one or more previous frames; compare the current frame and a reference frame selected from the one or more previous frames; determine a displacement of the current frame relative to the reference frame based upon the results of the comparison of the current frame and the reference frame; determine whether the determined displacement is greater than a threshold displacement value; form a movement cue that indicates movement in accordance with the determined displacement, when it is determined that the displacement is greater than the threshold displacement value; and/or render the movement cue in association with the current frame on a rendering device of the system.
  • FIG. 1 shows a flow diagram that illustrates a process performed by a system in accordance with embodiments of the present system
  • FIG. 2 shows a graph which illustrates a menu including a plurality of motion cue types in accordance with embodiments of the present system
  • FIG. 1 shows a flow diagram that illustrates a process 100 performed by a system in accordance with embodiments of the present system.
  • the process 100 may be performed using one or more computers communicating over a network and may obtain information, such as ultrasound image data, from, and/or store such information to, one or more memories of the system, which may be local and/or remote from each other.
  • the process 100 can include one or more of the following acts. Further, one or more of these acts may be combined and/or separated into sub-acts, if desired. Further, one or more of these acts may be skipped depending upon settings. In operation, the process may start during act 101 and then proceed to act 103.
  • the process may acquire image data of an object (e.g., an anatomical object such as a patient in the current example) using any suitable ultrasound method. Accordingly, the process may acquire image data (e.g., 2D and/or 3D ultrasound image data) of the patient in real time using any suitable ultrasound imaging system, such as a Philips™ EPIQ system operating in accordance with embodiments of the present system, and thereafter convert the image data into corresponding image frames in real time.
  • the most recently acquired frame will be assumed to be a current frame. Thus, assuming the current frame is an ith frame (e.g., frame(i)), a most recently acquired previous frame may be an (i-1)th frame.
  • a static scan may be a scan where substantially static anatomy of the object is being imaged, such as a liver or the like. In a non-static scan (e.g., a dynamic scan), objects in the image data are constantly moving, such as heart valves, etc.
  • the process may determine whether a scan is a static scan based upon a selected scan type (e.g., cardiac scan, liver scan, bladder scan, etc.), which selection may be made by a user prior to starting the current scan using, for example, a menu-based scan type selection.
  • alternatively, the process may determine whether a scan is static or dynamic (e.g., non-static) using image processing methods applied to the acquired image data to, for example, detect movement and/or periodicity.
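The text leaves the detection method open; one plausible reading, sketched below with assumed tolerances and helper names invented for this illustration, classifies the scan from frame-to-frame differences and estimates the cycle length from the autocorrelation of a per-frame summary signal.

```python
import numpy as np

def is_static_scan(frames, motion_tol=0.01):
    # mean absolute frame-to-frame difference; below tolerance -> static anatomy
    diffs = [np.abs(b - a).mean() for a, b in zip(frames, frames[1:])]
    return float(np.mean(diffs)) < motion_tol

def cycle_length(frames):
    # periodicity of a dynamic scan: the lag of the first autocorrelation peak
    # of the per-frame mean intensity approximates one cycle, in frames
    s = np.array([f.mean() for f in frames])
    s = s - s.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]
    lags = np.arange(1, len(ac) - 1)
    peaks = lags[(ac[lags] > ac[lags - 1]) & (ac[lags] >= ac[lags + 1])]
    return int(peaks[0]) if peaks.size else None
```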
  • the process may set the reference frame as a previously acquired reference frame which is a most recently acquired previous frame (e.g., the (i-1)th frame).
  • x may be multiplied by a sensitivity multiplier m.
  • the reference frame may be set to the (i-m*x)th frame.
  • m may be set by the system and/or user and may be an integer and/or a positive number. By increasing the value of m, a larger time interval may pass between the reference frame and a current frame as will be discussed below. This may increase sensitivity if desired.
  • the process may set the reference frame as a selected previously-acquired frame (e.g., an (i-x)th frame) which was obtained at a similar point in a previous cycle as the current (i.e., the ith) frame.
  • the process may determine a point of a cardiac cycle that corresponds with the current frame and may determine a corresponding point in a most recent previous cardiac cycle and a corresponding frame (CF).
  • the process may then set the reference frame equal to this frame (e.g., frame CF).
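Putting the two selection rules side by side, a reference-frame chooser might look like the sketch below; m and x follow the text's naming, while the clamping at frame 0 is an added assumption.

```python
def reference_index(i, static, cycle_len=None, m=1, x=1):
    """Static scans: the (i - m*x)th frame, i.e. the most recent previous frame
    when m = x = 1, with a larger m widening the comparison interval.
    Dynamic scans: the frame one full cycle back, which sits at a similar
    point of the previous cycle as the current frame."""
    if static:
        return max(i - m * x, 0)
    if cycle_len is None:
        raise ValueError("a dynamic scan needs an estimated cycle length")
    return max(i - cycle_len, 0)
```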
  • the process may determine displacement between a current frame (e.g., frame(i), which may be considered a sensed or target image frame) and the reference frame (e.g., frame(i-1), which may be considered a reference or source image frame), which is a previously-acquired frame.
  • the process may determine displacement of the current frame relative to the reference frame using any suitable method such as spatial registration and/or optical flow methods (e.g., performed in real time).
  • the process may form a displacement vector corresponding with the determined displacement. For example, one could use block matching as described in Boukerroui et al.
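A single-block version of that idea, in the spirit of the cited block-matching approach (Boukerroui et al., 2003) but much simplified, is sketched below; the block size, search radius, and SAD criterion are assumptions of this illustration, and the sign convention of the returned vector is likewise assumed.

```python
import numpy as np

def block_match(current, reference, block=16, search=8):
    """Locate the central block of the reference frame inside the current
    frame by minimising the sum of absolute differences (SAD) over a
    +/- search window; the arg-min offset is the displacement vector."""
    h, w = reference.shape
    y0, x0 = (h - block) // 2, (w - block) // 2
    ref_blk = reference[y0:y0 + block, x0:x0 + block]
    best_sad, best_dv = np.inf, (0.0, 0.0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y <= h - block and 0 <= x <= w - block:
                sad = np.abs(current[y:y + block, x:x + block] - ref_blk).sum()
                if sad < best_sad:
                    best_sad, best_dv = sad, (float(dx), float(dy))
    return np.array(best_dv)  # (dx, dy) in pixels
```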
  • the process may determine whether an absolute value of the displacement vector (DV) is greater than a threshold displacement value (TDV). Accordingly, if it is determined that DV is greater than the TDV, the process may continue to act 115. This is illustrative, for example, of motion of an ultrasonic probe that exceeds a preset TDV. However, if it is determined that DV is less than or equal to the TDV, then the process may continue to act 119, which illustrates a case where motion of the probe does not exceed a threshold value.
  • a purpose of utilizing the TDV is to reduce the likelihood that a cue (as described herein) may be produced when detected displacement is minimal, such as 2 mm or less.
  • the probe may be held in a substantially fixed position while some nominal displacement is still detected. In short, if the displacement is, for example, 2 mm or less, then no cue will be produced.
  • the threshold displacement value (TDV) may be set by the system and/or user and stored in a memory of the system for later use.
  • a training method may be used to allow a user to select one or more values for the threshold displacement value (TDV).
  • the training application may request that a user hold an ultrasound probe and/or move the ultrasound probe the minimum amount that the user wishes to set as a threshold movement amount. Then, the training application may select values for the threshold displacement value (TDV) in accordance with the minimum amounts of movement as determined from the movement of the ultrasonic probe.
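One way to realise that training step is sketched below; the helper name, the use of the median, and the safety margin are assumptions of this illustration, not values from the text.

```python
import numpy as np

def calibrate_tdv(displacement_samples, margin=0.9):
    """The user performs the smallest probe movement they want detected while
    per-frame displacement vectors are recorded; the TDV is then set just
    below the typical magnitude of that movement so it still triggers a cue."""
    mags = np.linalg.norm(np.asarray(displacement_samples, dtype=float), axis=1)
    return float(margin * np.median(mags))
```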
  • a motion cue is selected from a cue type (e.g., a cue style) which is set as a default cue type.
  • FIG. 2 shows a graph 200 providing an illustrative menu 202 including a plurality of cue types in accordance with embodiments of the present system.
  • cues of each type (e.g., cue types I through VI) are shown; rows 203 and 205 illustrate right and left motion or movement cues, respectively, for each cue type.
  • the cue characteristics may then be related to the displacement value. For example, a size of a cue may be related to a magnitude of the displacement value. Thus, as the displacement value increases (e.g., indicating more movement), the size of the associated cue (as selected during acts 117, 119, and 121) may increase. The cue characteristics may be further related to the magnitude of the motion vector and/or time; thus, after a cue is selected (e.g., during acts 117, 119, and 121), appropriate cue characteristics may be determined based upon a magnitude of the motion vector and/or time (e.g., stationary time).
  • the process may select a left motion cue from, for example, the default cue type (e.g., see FIG. 2, item 231). The characteristics of the left motion cue may then also be determined in accordance with the magnitude of the determined motion vector as discussed above. After completing act 117, the process may continue to act 123.
  • the process may select a stationary cue from the selected default cue type (e.g., see FIG. 2, item 237). The characteristics of the stationary cue may then also be determined in accordance with the magnitude of the determined motion vector as discussed above. After completing act 119, the process may continue to act 123.
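Acts 115 through 121 thus reduce to a small decision rule. The sketch below scales only the cue size with displacement magnitude; the base size, gain, and linear scaling law are assumed parameters of this illustration.

```python
import numpy as np

def select_cue(dv, tdv, base_size=12.0, gain=2.0):
    """Movement cue (left/right) when |dv| exceeds the TDV, stationary cue
    otherwise; the cue size grows linearly with the excess displacement."""
    mag = float(np.linalg.norm(dv))
    if mag <= tdv:
        return {"type": "stationary", "size": base_size}
    return {"type": "right" if dv[0] > 0 else "left",
            "size": base_size + gain * (mag - tdv)}
```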
  • one or more cues such as those shown in FIG. 2 may be applied as default cues for selection during one or more of acts 117, 119, and 121.
  • the default cue may be selected by the user.
  • other cues, in addition to or besides the cues shown in FIG. 2, may be readily utilized in accordance with embodiments of the present system.
  • the process may associate the cue (e.g., a right cue, a left cue, a stationary cue, etc.) with the corresponding current image frame data and may form display image data which may include one or more corresponding image frames suitable for rendering on a rendering device of the system.
  • the process may determine a location of the selected cues relative to the image data. For example, a location of the selected cue may be determined relative to the current image frame (e.g., the location of a right motion cue and a left motion cue may differ from each other) and then corresponding display image data may be formed.
  • the location of the selected cue may be provided in a same position within an image regardless of the type of cue determined.
  • Other ways of determining a position of the cue within an image may also be suitably applied in accordance with embodiments of the present system.
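Either placement policy is a few lines of arithmetic. The sketch below anchors directional cues on the matching image edge and gives the stationary cue a fixed spot; the inset value and the top-centre choice are assumptions.

```python
def cue_position(cue_type, frame_shape, inset=10):
    """(row, col) pixel anchor for a cue: directional cues sit on the side of
    the image matching their direction; the stationary cue sits top-centre."""
    h, w = frame_shape
    if cue_type == "right":
        return (h // 2, w - inset)
    if cue_type == "left":
        return (h // 2, inset)
    return (inset, w // 2)
```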
  • the process may render the display image data on a user interface (UI) of the system, such as on a display as shown in FIG. 3, which shows a screenshot 300 of a current image 301 of a region-of-interest and an associated right cue 303 in accordance with embodiments of the present system.
  • the right cue 303 indicates motion of the probe to the right (relative to the reference frame), which is illustratively determined to exceed the TDV, as may be seen in the image (e.g., see reference points RPs).
  • the process may repeat act 103 or continue to act 127 and end, if desired.
  • the process may store image data or other information generated, obtained and/or otherwise processed during process 100 in a memory of the system for later use.
  • the process 100 may store the image data or other information that it generates in association with the corresponding selected cue(s).
  • the process may store the generated image data or other information without the corresponding cue information.
  • embodiments of the present system may further illustrate other movement cues that indicate rotational changes, or movement up, down, right, and/or left. Accordingly, the process may be repeated for each of these motion types.
  • the direction vector may include a rotational vector illustrating rotational motion about a common axis (e.g., in degrees) (as opposed to a linear motion directional vector) between the current and reference frames.
  • the right motion cue may then include a clockwise rotational cue (e.g., a right arrow) and the left motion cue may then include a counterclockwise arrow.
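The rotational case then only differs in the quantity being thresholded. A minimal sketch, with the angular threshold as an assumed analogue of the linear TDV and the sign convention likewise assumed:

```python
def rotational_cue(angle_deg, angular_tdv_deg=2.0):
    """Rotation between the current and reference frames (in degrees,
    positive taken as clockwise here) picks the rotational cue."""
    if abs(angle_deg) <= angular_tdv_deg:
        return "stationary"
    return "clockwise" if angle_deg > 0 else "counterclockwise"
```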
  • a medical imaging apparatus which may include an imaging mode that provides visual cues to the user to indicate a directional change of an acquired region such as a region- of-interest which is indicative of a given probe movement.
  • Embodiments of the present system may be configured to continuously run, in real time, spatial registration and/or optical flow algorithms to determine the displacement of a current frame relative to a reference frame (e.g., a frame acquired prior to the current frame).
  • a reference frame may be selected from a previous cycle (e.g., a cardiac cycle, etc.) at a similar part of the cycle as the current frame. It is further envisioned that for applications where static anatomy is imaged, such as a liver, the reference frame may be set to an immediately previously-acquired frame.
  • an indicator of the determined movement of the probe (e.g., direction of movement) may be rendered, and the cue size may be used to indicate a magnitude or direction of motion of a probe: larger cues, such as a larger arrow (e.g., see FIG. 2, item 239), may be used to indicate a large movement, while smaller cues (e.g., of the same type) may indicate a smaller movement.
  • embodiments of the present system may provide guidance for novice ultrasound operators; if operators do not wish to have the cues, they may turn them off by deselecting a cue selection option menu item or the like.
  • Embodiments of the present system provide methods to generate visual cues that may be rendered on a display of the system to indicate the directional change of the acquired region (e.g., the ROI) due to a movement of an ultrasound probe which may enhance user convenience.
  • cues and characteristics thereof such as colors, symbols, transparency, or image shifting effects may be used to portray to the user a direction that the image is shifting.
  • cues and characteristics may be used to portray movements left, up, down, and also rotation. While shown with respect to a single 2D image plane, the method could also be used for xPlane™ imaging and 3D imaging methods.
  • embodiments of the present system provide a navigation-assisted imaging mode which may acquire an ultrasound image, images, or volume using, for example, a live 2D, xPlane™, or 3D imaging mode of a suitable ultrasound system, such as a Philips™ ultrasound system or the like.
  • the system may further perform spatial registration and/or optical flow to determine the displacement of a current image frame relative to an appropriately chosen reference frame. Then, based on the computed displacement of the frame (e.g., movement), the system may choose to display a visual cue to the user (e.g., superimposed over a current image) to indicate the directional change of the acquired region.
  • the system may render the cue.
  • the system may determine characteristics of the cue such as line style, color, intensity, highlighting, transparency, image effect (e.g., blinking) type, etc. based upon the movement (e.g., direction, rate of change, etc.). For example, the characteristics of the cue such as color, shape, size, movement may be based upon a determined magnitude of the detected displacement and/or rotation which form the movement.
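As one concrete reading of that mapping, the sketch below ramps cue colour and opacity with displacement magnitude; the colour endpoints, the saturation magnitude, and the linear ramp are assumptions, not values from the text.

```python
import numpy as np

def cue_style(mag, tdv, mag_max=20.0):
    """Map displacement magnitude to cue colour and transparency: faint green
    just above the TDV, ramping to opaque red for large movements."""
    t = float(np.clip((mag - tdv) / (mag_max - tdv), 0.0, 1.0))
    slow, fast = np.array([0, 200, 0]), np.array([220, 0, 0])
    rgb = ((1 - t) * slow + t * fast).astype(int).tolist()
    alpha = 0.3 + 0.7 * t  # nearly transparent near the threshold
    return {"rgb": rgb, "alpha": round(alpha, 2)}
```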
  • the system may then associate the cue with a current (acquired) image and render them on a display of the system.
  • a portion of the present system 400 may include a processor 410 (e.g., a controller) operationally coupled to a memory 420, a rendering device 430, a probe 440, and a user input device 470.
  • the memory 420 may be any type of device for storing application data as well as other data related to the described operation.
  • the application data and other data are received by the processor 410 for configuring (e.g., programming) the processor 410 to perform operation acts in accordance with the present system.
  • the processor 410 so configured becomes a special purpose machine particularly suited for performing in accordance with embodiments of the present system.
  • the operation acts may include configuring the system 400 by, for example, configuring the processor 410 to obtain information from user inputs, the probe 440, and/or the memory 420 and processing this information in accordance with embodiments of the present system to obtain information related to an acquired image and which may form at least part of image data.
  • the user input portion 470 may include a keyboard, a mouse, a trackball and/or other device, including touch-sensitive displays, which may be stand alone or be a part of a system, such as part of an imaging system, a personal computer, a notebook computer, a netbook, a tablet, a smart phone, a personal digital assistant (PDA), a mobile phone, and/or other device for communicating with the processor 410 via any operable link.
  • the user input portion 470 may be operable for interacting with the processor 410 including enabling interaction within a UI 430 (e.g., the menu 202) as described herein.
  • the processor 410, the memory 420, the UI 430 and/or user input device 470 may all or partly be a portion of a computer system or other device as described herein.
  • Operation acts may include requesting, providing, and/or rendering of image data such as, for example, ultrasound image data of a volume, in whole or in part, related to a patient.
  • the processor 410 may render the information on the rendering device 430 (e.g., a display) of the system providing a user interface as described herein.
  • the probe 440 may include sensors such as ultrasound transducers in an array to provide desired sensor information to the processor 410 for further processing in accordance with embodiments of the present system.
  • the methods of the present system are particularly suited to be carried out by a processor programmed by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system.
  • the processor 410 is operable for providing control signals and/or performing operations in response to input signals from the user input device 470 as well as in response to other devices of a system and executing instructions stored in the memory 420.
  • the processor 410 may obtain imaging information from the probe 440 and may process this information to determine displacement information, a magnitude of displacement, etc.
  • the processor 410 may include one or more of a microprocessor, an application-specific or general-use integrated circuit(s), a logic device, etc.
  • the processor 410 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system.
  • the processor 410 may operate utilizing a program portion, multiple program segments, or may be a hardware device utilizing a dedicated or multi-purpose integrated circuit.
  • by providing a method to render cues illustrating displacement (e.g., motion information) of a region of interest, rather than requiring a user to visually detect motion of the objects within an image to detect a direction of motion of a probe, embodiments of the present system simplify navigation by providing simple visual cues which are rendered to the user to inform the user of a direction of movement of the probe relative to a corresponding image.
  • a plurality of visual cues may be provided and a user may select default visual cues within a UI as described herein. These cues may provide a visual indication on a screen illustrating movement of the probe over time that causes a shift in a region of interest. For example, FIG. 3 illustrates cue 303 displayed when the region of interest shifts to the right due to probe movement to the right.
  • embodiments of the present system provide methods to assist operators during use, to train operators, and to help shorten the learning curve of operators by providing visual cues at, on, or near the image(s) displayed on a display of the system to indicate a directional change over time (e.g., a current image relative to a prior acquired image) of the region-of-interest corresponding to a motion of a probe while acquiring images.
  • a user interface which may provide navigation assistance to an operator of an ultrasound machine to indicate to the operator, on the viewing screen, the direction the probe is moving with respect to the displayed images.
  • Direction may be indicated by visual markers that appear on the side of the image corresponding to the direction of motion, and/or by an actual shift of an image in the direction corresponding to the motion and/or by other visual indicators.
  • embodiments of the present system may provide a method for tracking or otherwise following an object of interest by providing directional information to assist in the manipulation of a probe to capture anatomy or an object of interest in an image of an object obtained using ultrasound imaging methods.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Acoustics & Sound (AREA)
  • Physiology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Methods and systems employing an ultrasound imaging apparatus are provided. The methods and systems may acquire ultrasound image data of an anatomical object (103), the ultrasound image data including a current frame and one or more previous frames. The current frame and a reference frame selected from the previous frames (107, 109) may be compared in order to determine a displacement of the current frame relative to the reference frame (111). If the determined displacement is greater than a threshold displacement value (113), a movement cue indicating movement may be formed (117, 121). The movement cue may be rendered in association with the current frame on a rendering device (123, 125) and may be displayed to assist a user in obtaining ultrasound images.
PCT/IB2014/066049 2013-11-19 2014-11-14 Ultrasound system with navigation assistance and method of operation thereof WO2015075612A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361905882P 2013-11-19 2013-11-19
US61/905,882 2013-11-19

Publications (1)

Publication Number Publication Date
WO2015075612A1 (fr) 2015-05-28

Family

ID=52345472

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/066049 WO2015075612A1 (fr) Ultrasound system with navigation assistance and method of operation thereof

Country Status (1)

Country Link
WO (1) WO2015075612A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018054288A1 (fr) * 2015-09-21 2018-03-29 Edan Instruments, Inc. Signal-to-noise ratio improvement and operator independence using time-varying frame selection for strain estimation
CN110072466A (zh) * 2016-12-15 2019-07-30 Koninklijke Philips N.V. Prenatal ultrasound imaging
TWI691310B (zh) * 2019-01-04 2020-04-21 Acer Incorporated Ultrasonic scanning method and ultrasonic scanning device
CN111479510A (zh) * 2017-11-14 2020-07-31 Koninklijke Philips N.V. Ultrasound tracking and visualization

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000041127A1 (fr) * 1998-12-30 2000-07-13 Acuson Corporation Medical diagnostic ultrasound imaging method and system capable of displaying multi-phase, multi-frame images
WO2009147621A2 (fr) * 2008-06-05 2009-12-10 Koninklijke Philips Electronics, N.V. Extended field of view ultrasound imaging and guided EFOV scanning

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000041127A1 (fr) * 1998-12-30 2000-07-13 Acuson Corporation Medical diagnostic ultrasound imaging method and system capable of displaying multi-phase, multi-frame images
WO2009147621A2 (fr) * 2008-06-05 2009-12-10 Koninklijke Philips Electronics, N.V. Extended field of view ultrasound imaging and guided EFOV scanning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Boukerroui et al., "Velocity Estimation in Ultrasound Images: A Block Matching Approach", 2003
Ni et al., "Volumetric Ultrasound Panorama Based on 3D SIFT", 2008

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018054288A1 (fr) * 2015-09-21 2018-03-29 Edan Instruments, Inc. Signal-to-noise ratio improvement and operator independence using time-varying frame selection for strain estimation
CN110072466A (zh) * 2016-12-15 2019-07-30 Koninklijke Philips N.V. Prenatal ultrasound imaging
CN110072466B (zh) * 2016-12-15 2022-07-19 Koninklijke Philips N.V. Prenatal ultrasound imaging
CN111479510A (zh) * 2017-11-14 2020-07-31 Koninklijke Philips N.V. Ultrasound tracking and visualization
CN111479510B (zh) * 2017-11-14 2023-09-22 Koninklijke Philips N.V. Ultrasound tracking and visualization
TWI691310B (zh) * 2019-01-04 2020-04-21 Acer Incorporated Ultrasonic scanning method and ultrasonic scanning device
US11253228B2 (en) 2019-01-04 2022-02-22 Acer Incorporated Ultrasonic scanning method and ultrasonic scanning device

Similar Documents

Publication Publication Date Title
KR102269467B1 (ko) Measurement point determination in medical diagnostic imaging
US8320989B2 (en) Region of interest methods and systems for ultrasound imaging
US10001844B2 (en) Information processing apparatus, information processing method, and storage medium
CN105303550B (zh) Image processing apparatus and image processing method
JP6222807B2 (ja) Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing program
JP6059261B2 (ja) Intelligent landmark selection to improve registration accuracy in multimodal image fusion
US8090168B2 (en) Method and system for visualizing registered images
US9301733B2 (en) Systems and methods for ultrasound image rendering
JP5538861B2 (ja) Information processing apparatus, information processing method, information processing system, and program
US9449247B2 (en) Contour correction device, method, and program
RU2017135209A (ru) Ultrasound diagnostics of cardiac function using user-controlled segmentation of a heart chamber model
JP2013106870A (ja) Medical image processing apparatus
JP2010264232A (ja) Diagnosis support apparatus, diagnosis support program, and diagnosis support method
CN106133797B (zh) Medical viewing system with a viewing plane determination
WO2015075612A1 (fr) Ultrasound system with navigation assistance and method of operation thereof
US9460538B2 (en) Animation for conveying spatial relationships in multi-planar reconstruction
US20140306961A1 (en) Medical image processing system, recording medium having recorded thereon a medical image processing program and medical image processing method
JP5479138B2 (ja) Medical image display apparatus, medical image display method, and program therefor
US20150320377A1 (en) Medical image display control apparatus, method, and program
US20150320507A1 (en) Path creation using medical imaging for planning device insertion
US20220392607A1 (en) Image acquisition visuals for augmented reality
JP6416350B2 (ja) Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing program
EP4210577A1 (fr) Method of providing a secondary medical imaging source
US20150121276A1 (en) Method of displaying multi medical image and medical image equipment for performing the same
CN107329669B (zh) Method and apparatus for selecting a human sub-organ model in a three-dimensional medical model of a human body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14825436

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14825436

Country of ref document: EP

Kind code of ref document: A1