WO2017137807A1 - Method and system for generating a compound image - Google Patents
Method and system for generating a compound image
- Publication number
- WO2017137807A1 (PCT/IB2016/050746)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8995—Combining images from different aspect angles, e.g. spatial compounding
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
Definitions
- Ultrasound systems exist today that utilize a variety of techniques for processing ultrasound signals to generate information of interest.
- One of the problems to be solved in diagnostic imaging, and in ultrasound imaging in particular, relates to increasing image resolution, eliminating artifacts and shadows, increasing edge detail and suppressing speckle.
- Spatial compounding is an imaging technique in which a number of ultrasound images of a given target that have been obtained from multiple vantage points or angles are combined into a single compounded image by combining the data received from each point in the compound image target which has been received from each angle. Examples of spatial compounding may be found in U.S. Pat. Nos. 4,649,927; 4,319,489; and 4,159,462.
- Real time spatial compound imaging is performed by rapidly acquiring a series of partially overlapping component image frames from substantially independent spatial directions, utilizing an array transducer to implement electronic beam steering and/or electronic translation of the component frames.
- the component frames are combined into a compound image by summation, averaging, peak detection, or other combinational means.
- the acquisition sequence and formation of compound images are repeated continuously at a rate limited by the acquisition frame rate, that is, the time required to acquire the full complement of scanlines over the selected width and depth of imaging.
- Speckle is reduced (i.e. speckle signal to noise ratio is improved) by the square root of N in a compound image with N component frames, provided that the component frames used to create the compound image are substantially independent and are averaged.
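Expressed as a formula (a standard relationship stated here for illustration, assuming the N component frames are fully independent and equally weighted when averaged; it is not quoted from the application text):

```latex
% Speckle signal-to-noise ratio gain from averaging N independent component frames
\mathrm{SNR}_{\text{compound}} \;=\; \sqrt{N}\;\cdot\;\mathrm{SNR}_{\text{single}},
\qquad
\mathrm{SNR} \;=\; \frac{\mu_{\text{speckle}}}{\sigma_{\text{speckle}}}
```

For example, three substantially independent component frames (as in Figs. 3 and 5) would give at most about a 1.7-fold improvement in speckle SNR, i.e. the square root of 3.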
- Standard ultrasound image compounding is generally performed using images acquired with different steering angles, each image using a fixed line of sight (LOS) angle step. The resulting compound image shows discontinuities on both sides due to incomplete overlap of the imaged areas. To avoid this, it is necessary either to reduce the field of view of the output image or to filter it heavily.
- LOS line of sight
- a method for performing compound imaging comprises the operations of: acquiring a main image frame and a secondary image frame of ultrasound data, the two frames at least partially overlapping one another; defining main frame boundaries for the main image frame along opposite lateral sides thereof; defining reception lines of view in the secondary image frame extending from the transducer array; and combining the main and secondary image frames to form a compound image.
- the lines of sight (LOS) of the different images are chosen such that the virtual apex, also referred to as the virtual source of the ultrasound beams, is set at the center of the transducer array.
- the secondary image frames obtained by steering will cover an area having a trapezoidal shape, and the combination of the different image frames can be carried out by considering adjacent boundaries of the primary and secondary image frames in such a way as to avoid the generation of discontinuities in the compound image.
- the virtual apex can be placed in different positions for at least some of the lines of view of the main and secondary image frame.
- the virtual source or sources being moved from one line to the other of the first and/or second image frames.
- an ultrasound system comprising: an ultrasound probe having a transducer array; a beamformer configured to acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another; and a processor configured to execute program instructions to combine the main and secondary image frames to form a compound image.
- Embodiments herein provide improvements to the method, allowing the process to be simplified while keeping the focusing accuracy high and reducing the computational burden, without the need for a particular hardware structure.
- Still another aim, in accordance with at least some embodiments, is to provide a beamforming processor that allows the method according to the embodiments herein to be carried out.
- Fig. 2 illustrates a more detailed block diagram of the ultrasound system of Fig. 1.
- Fig. 3 schematically illustrates three images to be compounded according to the prior art.
- Fig. 4 schematically illustrates the resulting compounded image.
- Fig. 5 schematically illustrates three images to be compounded in connection with embodiments herein.
- Fig. 6 schematically illustrates the resulting compounded image obtained in connection with embodiments herein.
- Fig. 7 illustrates a block diagram of an ultrasound system formed in accordance with an alternative embodiment.
- Fig. 8 illustrates a block diagram of a portion of the digital front-end boards.
- Fig. 9 illustrates a block diagram of the digital processing board.
- Fig. 11 illustrates a block diagram of another compound imaging module formed in accordance with embodiments herein.
- Fig. 12 illustrates a block diagram of another compound imaging module formed in accordance with embodiments herein.
- Fig. 14 schematically shows a field of view of a secondary image frame obtained by a convex transducer array and by offsetting the virtual apex from the center of the array.
- Fig. 1 illustrates a high-level block diagram of an ultrasound system implemented in accordance with embodiments herein. Portions of the system (as defined by various functional blocks) may be implemented with dedicated hardware, analog and/or digital circuitry, and/or one or more processors operating program instructions stored in memory. Additionally or alternatively, all or portions of the system may be implemented utilizing digital components, digital signal processors (DSPs) and/or field programmable gate arrays (FPGAs) and the like.
- DSPs digital signal processors
- FPGAs field programmable gate arrays
- the blocks/modules illustrated in Fig. 1 can be implemented with dedicated hardware (DSPs, FPGAs, memories) and/or in software with one or more processors.
- a transmit section and a receive section 152, 153 are alternately connected to the probe, to provide each individual transducer with an excitation signal for the corresponding ultrasound pulse and to receive the electric signal corresponding to an acoustic pulse that has hit the transducer.
- the transmit signals for the transducers are each sent in an independent manner, through a dedicated channel or by a multiplexer, to a digital-to-analog converter 125 that generates signals at a predetermined sampling rate and provides analog excitation signals to each transducer/channel.
- Digital transmit signals are subjected to processing by a so-called beamforming processor 103 that applies a proper delay to the transmission signals of each channel in order to selectively concentrate ultrasound energy in a narrow line, a zone or possibly the whole body region to be investigated, depending on the adopted image formation scheme.
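As an illustrative aside (a generic textbook formulation, not text taken from this application), the per-channel transmit focusing delays mentioned above can be written as follows, where r_i is the position of element i, r_f the chosen focal point and c the assumed speed of sound:

```latex
% Element i is fired with delay tau_i so that all pulses reach the focal point together;
% the element farthest from the focus fires first (zero delay).
\tau_i \;=\; \frac{\displaystyle \max_j \lVert \mathbf{r}_f - \mathbf{r}_j \rVert \;-\; \lVert \mathbf{r}_f - \mathbf{r}_i \rVert}{c}
```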
- the receive signals of the transducers are each sent in an independent manner, through a dedicated channel or by a multiplexer, to an analog-to-digital converter 124 that samples said signals at a predetermined sampling rate and provides digitized receive signals for each transducer/channel.
- the digitized signals are subjected to processing by the so-called beamforming processor 103, which delays the receive signal of each channel according to the travel time of the signal reflected by a predetermined reflection point, from said reflection point to the corresponding transducer.
- since the individual transducers of the array provided on the probe have positions different from each other, they necessarily have different distances from the reflection point, and therefore the echo signal deriving from such a point reaches each individual transducer at a different moment.
- the focusing process performs the time re-alignment of the contributions of the receive signal of each transducer deriving from the same reflection point, and then sums such contributions together in a coherent manner.
- the focusing process, depending on the transmission scheme adopted, may concern a narrow line, a zone or the whole investigated body region.
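The receive-side focusing described in the last few paragraphs can be sketched in a few lines. The following Python/NumPy delay-and-sum example is a minimal illustration only: the element positions, sampling rate, sound speed and transmit-origin model are hypothetical parameters, not values taken from this application.

```python
import numpy as np

def delay_and_sum(rf, elem_x, fs, c, tx_origin, point):
    """Focus the per-channel receive signals on a single reflection point.

    rf        : raw receive signals, shape (channels, samples), one row per transducer
    elem_x    : lateral positions of the array elements (metres)
    fs        : sampling rate of the A/D converter (Hz)
    c         : assumed speed of sound (m/s)
    tx_origin : (x, z) point from which the transmit wave is assumed to originate
    point     : (x, z) coordinates of the reflection point (metres)
    """
    px, pz = point
    # forward travel time from the transmit origin to the reflection point
    t_tx = np.hypot(px - tx_origin[0], pz - tx_origin[1]) / c
    # return travel time from the reflection point back to each individual element
    t_rx = np.hypot(px - elem_x, pz) / c
    # convert the total travel time of each channel into a sample index
    idx = np.clip(np.round((t_tx + t_rx) * fs).astype(int), 0, rf.shape[1] - 1)
    # time re-alignment of the per-channel contributions followed by a coherent sum
    return np.sum(rf[np.arange(rf.shape[0]), idx])

# Hypothetical usage: 64 elements at 0.3 mm pitch, 50 MHz sampling, point 3 cm deep
elem_x = (np.arange(64) - 31.5) * 0.3e-3
rf = np.random.randn(64, 4000)                     # placeholder channel data
val = delay_and_sum(rf, elem_x, fs=50e6, c=1540.0,
                    tx_origin=(0.0, 0.0), point=(0.0, 0.03))
```

Repeating this for every sample position along every line of sight yields the focused lines that make up an image frame.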
- a TX waveform generator 102 is coupled to the beamformer 103 and generates the transmit signals that are supplied from the beamformer 103 to the probe 101.
- the transmit signals may represent various types of ultrasound TX signals such as used in connection with B- mode imaging, Doppler imaging, color Doppler imaging, pulse-inversion transmit techniques, contrast-based imaging, M-mode imaging and the like. Additionally or alternatively, the transmit signals may include single or multi-line transmit, narrow beams transmit, zone transmit, broad beams transmit, plane-waves transmit, shear waves transmit and the like.
- the beamformer 103 performs beamforming upon received echo signals to form beamformed echo signals in connection with pixel locations distributed across the region of interest.
- the transducer elements generate raw analog receive signals that are supplied to the beamformer.
- the beamformer adjusts the delays to focus the receive signals along one or more select receive beams and at one or more select depths within the region of interest, and weights the receive signals to obtain a desired apodization and profile.
- the beamformer applies weights and delays to the receive signals from individual corresponding transducers of the probe. The delayed, weighted receive signals are then summed to form a coherent receive signal.
- the beamformer 103 includes (or is coupled to) an A/D converter 124 that digitizes the receive signals at a select sampling rate. The digitization process may be performed before or after the summing operation that produces the coherent receive signals.
- the beamformer also includes (or is coupled to) a demodulator 122 that demodulates the receive signals to remove the carrier waveform.
- complex receive signals are generated that include I,Q components (also referred to as I,Q data pairs).
- the I,Q data pairs are saved as image pixels in memory.
- the I,Q data pairs define the image pixels for corresponding individual locations along corresponding lines of sight (LOS) or view lines.
- a collection of image pixels (e.g., I,Q data pairs) are collected over time and saved as 2D image frames and/or 3D volumes of image data.
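As a concrete illustration of the demodulation step that produces the I,Q data pairs described above, the sketch below mixes a real RF line down to baseband with a complex exponential at an assumed carrier frequency and low-pass filters the result; the carrier frequency, sampling rate and filter length are hypothetical choices, not values specified by this application.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def iq_demodulate(rf_line, fs, f_carrier, num_taps=101):
    """Convert a real RF receive line into complex baseband I,Q samples."""
    t = np.arange(rf_line.size) / fs
    # mixing with exp(-j*2*pi*f0*t) removes the carrier waveform
    baseband = rf_line * np.exp(-2j * np.pi * f_carrier * t)
    # low-pass filter to keep only the band around DC
    lowpass = firwin(num_taps, cutoff=f_carrier, fs=fs)
    iq = lfilter(lowpass, 1.0, baseband)
    return iq.real, iq.imag        # the I and Q components of each sample

# Hypothetical usage: a 5 MHz carrier sampled at 50 MHz
fs, f0 = 50e6, 5e6
rf_line = np.cos(2 * np.pi * f0 * np.arange(4000) / fs)   # placeholder RF data
i_data, q_data = iq_demodulate(rf_line, fs, f0)
```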
- the image pixels correspond to tissue and other anatomy within the ROI.
- the sequence controller 110 may be programmed to manage acquisition timing, which can be generalized as a sequence of firings aimed at select reflection points/targets in the ROI.
- the sequence controller 110 manages operation of the TX/RX beamformer 103 in connection with transmitting ultrasound beams and the lines of sight.
- the sequence controller 110 also manages collection of receive signals.
- the beamformer may be configured to acquire a main image frame and a secondary image frame of ultrasound data at the transducer array, the main and secondary image frames at least partially overlapping one another.
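To make the acquisition sequencing concrete, the skeleton below shows one compound cycle in which the main frame and the two secondary frames are acquired line by line. The helper callbacks `fire_line`, `receive_channels` and `beamform_line` are hypothetical placeholders standing in for the platform's transmit, receive and beamforming stages; they are not names from this application.

```python
def acquire_compound_cycle(frame_descriptions, fire_line, receive_channels, beamform_line):
    """Acquire all component frames of one compound image.

    frame_descriptions : list of lists; each inner list holds the lines of sight
                         (e.g. origin and steering angle) of one component frame,
                         such as the main frame and the left/right secondary frames
    """
    frames = []
    for lines_of_sight in frame_descriptions:        # e.g. main, left, right
        frame = []
        for los in lines_of_sight:                   # one firing per line of sight
            fire_line(los)                           # transmit event for this line
            rf = receive_channels()                  # per-channel echo signals
            frame.append(beamform_line(rf, los))     # focused samples along the line
        frames.append(frame)
    return frames
```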
- One or more processors 106 and/or CPU 112 perform various processing operations as described herein.
- the processor 106 executes a B/W module to generate B-mode images.
- the processor 106 and/or CPU 112 executes a Doppler module to generate Doppler images.
- the processor executes a Color flow module (CFM) to generate color flow images.
- the processor 106 and/or CPU 112 may implement additional ultrasound imaging and measurement operations.
- the processor 106 and/or CPU 112 may filter the first and second displacements to eliminate movement-related artifacts.
- An image scan converter 107 performs scan conversion on the image pixels to convert the format of the image pixels from the coordinate system of the ultrasound acquisition signal path (e.g., the beamformer, etc.) to the coordinate system of the display.
- the scan converter 107 may convert the image pixels from polar coordinates to Cartesian coordinates for image frames.
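A minimal nearest-neighbour scan-conversion sketch for a sector-shaped frame is shown below; the grid sizes and the assumption that all lines of sight originate from a single apex are illustrative simplifications, not the converter actually used in the system.

```python
import numpy as np

def scan_convert(frame_polar, depths, angles, nx=512, nz=512):
    """Resample image pixels from (depth, line-of-sight angle) coordinates
    onto a Cartesian display grid using nearest-neighbour lookup."""
    x = np.linspace(depths[-1] * np.sin(angles[0]), depths[-1] * np.sin(angles[-1]), nx)
    z = np.linspace(0.0, depths[-1], nz)
    xx, zz = np.meshgrid(x, z)
    r = np.hypot(xx, zz)                    # radius of each display pixel
    th = np.arctan2(xx, zz)                 # angle of each display pixel from the normal
    ri = np.clip(np.searchsorted(depths, r), 0, len(depths) - 1)
    ti = np.clip(np.searchsorted(angles, th), 0, len(angles) - 1)
    out = frame_polar[ri, ti].astype(float)
    # blank pixels that fall outside the acquired sector
    out[(th < angles[0]) | (th > angles[-1]) | (r > depths[-1])] = 0.0
    return out
```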
- a cine memory 108 stores a collection of image frames over time.
- the image frames may be stored formatted in polar coordinates, Cartesian coordinates or another coordinate system. The cine memory 108 may also store information measured in accordance with embodiments herein, together with the image frames.
- the display 109 displays the ultrasound image with the region of interest shown.
- a control CPU module 112 is configured to perform various tasks such as implementing the user/interface and overall system configuration/control.
- In the case of a fully software implementation of the ultrasound signal path, the processing node usually also hosts the functions of the control CPU.
- a power supply circuit 111 is provided to supply power to the various circuitry, modules, processors, memory components, and the like.
- the power supply 111 may be an A.C. power source and/or a battery power source (e.g., in connection with portable operation).
- the processor 106 and/or CPU 112 may be configured to execute a compound module to generate compound images.
- As discussed above, spatial compounding combines a series of partially overlapping component image frames, acquired from substantially independent spatial directions (e.g., with different steering angles), into a single compound image by summation, averaging, peak detection or other combinational means.
- the compounded image typically shows lower speckle and better specular reflector delineation than conventional ultrasound images obtained from a single viewpoint; with N substantially independent, averaged component frames the speckle signal-to-noise ratio improves by the square root of N.
- as noted above, when the component images are acquired with different steering angles and a fixed line of sight (LOS) angle step, the resulting compound image shows discontinuities on both sides due to incomplete overlap of the imaged areas, unless the field of view of the output image is reduced or the image is heavily filtered.
- processor 106 is configured to execute the program instructions to: combine the main and secondary image frames to form a compound image; and align at least one of the main frame boundaries for the main image frame with one of the lines of view of the secondary image frame.
- Fig. 3 illustrates three differently steered acquired images (steer left, centered, steer right) normally used to obtain a compounded image according to the state of the art.
- Fig. 4 shows the resulting overlapped image with the highlighted discontinuity side artifact.
- the compounded image can be produced with the same field of view as the center one without heavy filtering.
- Fig. 5 illustrates a series of image frames of the ultrasound data that are acquired by an ultrasound probe having a transducer array with a desired configuration. While the transducer array may have a linear, convex or alternative shape, in the example of Fig. 5, the transducer array has a linear configuration.
- a main image frame 304 is illustrated as corresponding to a central portion of a compounded image.
- Left and right secondary image frames 306 and 308 are illustrated as corresponding to left and right portions of a compounded image. Optionally, a compound image may be formed from as few as two image frames, or from more than three image frames.
- the main and secondary image frames 304-308 are combined to form a compound image (e.g., as illustrated in Fig. 6).
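A minimal sketch of this combining step is given below, assuming the component frames have already been resampled onto a common display grid and simply averaging whichever frames cover each pixel; the summation, peak detection or other combinational means mentioned earlier could be substituted in the same place.

```python
import numpy as np

def compound(frames, masks):
    """Combine component image frames into one compound image.

    frames : list of 2-D arrays resampled onto a common display grid
    masks  : list of boolean arrays, True where the corresponding frame has data
    """
    acc = np.zeros(frames[0].shape, dtype=float)
    cnt = np.zeros(frames[0].shape, dtype=float)
    for frame, mask in zip(frames, masks):
        acc[mask] += frame[mask]
        cnt[mask] += 1.0
    # average only where at least one component frame contributes
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)

# e.g. compound([main_304, left_306, right_308], [mask_main, mask_left, mask_right])
```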
- the receive beamformer defines main frame boundaries 310, 312 for the main image frame 304 that are located along opposite lateral sides of a main field of view 314.
- the main field of view 314 includes a profile defined by lateral main frame boundaries 310, 312, a proximal edge 330 and a distal depth 331.
- the receive beamformer also defines secondary frame boundaries 316, 318 for the left secondary image frame 306, and secondary frame boundaries 320, 322 for the right secondary image frame 308.
- the secondary frame boundaries 316, 318 are located along opposite lateral sides of the secondary field of view 324, while the secondary frame boundaries 320, 322 are located along opposite lateral sides of the secondary field of view 326.
- the right and left secondary fields of view 326, 324 include corresponding profiles that are defined by the lateral secondary frame boundaries 320, 322 and 316, 318, proximal edges 334, 332 and distal depths 335, 333, respectively.
- the profiles for the right and left secondary fields of view 326, 324 and secondary image frames 308, 306 correspond to trapezoids with a virtual apex remotely located from the surface of the transducer array in the example of Fig. 3.
- the profile may correspond to alternative shapes.
- the boundaries 310, 312, 318 and 322 are positioned relative to the transducer array and overlap such that the boundaries 310, 312 of the main image frame align with the boundaries 318, 322 of the secondary image frames.
- the aligned boundaries 318, 310 and 322, 312 of the main and secondary fields of view 314, 324 and 326 may be oriented at non-perpendicular angles with respect to the surface of the transducer array, provided that boundaries 318 and 310 align, boundaries 322 and 312 align, and they are oriented at common corresponding angles.
- the receive beamformer defines lines of view within the main and secondary image frames that extend from the transducer array and project into the region of interest.
- a linear surface of the transducer array may correspond to the proximal edges 330, 332, 334 of the main and secondary fields of view 314, 324, 326.
- a portion of the lines of view 340 in the main field of view 314 are illustrated to extend at an angle 342 into the region of interest from the surface of the transducer array at the proximal edge 330.
- the lines of view 340 in the main field of view 314 extend at a common reception steering angle 342 from the surface of the transducer array.
- the lines of view 340 may extend at different angles from the surface of the transducer array.
- the receive beamformer defines the lines of view in the secondary image frames 306, 308 to extend from the surface of the transducer array into the region of interest at different angles from one another relative to the surface of the transducer array.
- lines of view 352, 354 and 356 are each defined to have a corresponding reception steering angle, relative to the surface of the transducer array, and these angles are different from one another.
- At least a portion of the reception steering angles are oriented at a non-perpendicular angle with respect to the surface of the transducer array.
- the reception steering angles associated with the peripheral outermost lines of view in the secondary image frame, e.g., proximate to the secondary frame boundary that does not overlap the main image frame.
- reception angles associated with individual lines of view may be defined in various manners, as explained herein.
- the reception steering angles of adjacent/neighboring lines of view may differ from one another by a predetermined amount, or may be varied as a function of the position along the transducer array as well as a function of the profile of the field of view.
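The two options just mentioned can be sketched as follows; the element pitch, angle increment and maximum angle are hypothetical values chosen only for illustration.

```python
import numpy as np

def reception_angles(elem_x, rule="increment", d_theta=np.deg2rad(0.5),
                     max_angle=np.deg2rad(20.0)):
    """Assign a reception steering angle to the line of view of each element.

    "increment" : neighbouring lines differ by the fixed amount d_theta
    "position"  : the angle varies linearly with position along the array,
                  reaching +/- max_angle at the outermost elements
    """
    n = elem_x.size
    if rule == "increment":
        return (np.arange(n) - (n - 1) / 2.0) * d_theta
    centre = (elem_x.max() + elem_x.min()) / 2.0
    half_width = (elem_x.max() - elem_x.min()) / 2.0
    return max_angle * (elem_x - centre) / half_width

elem_x = (np.arange(64) - 31.5) * 0.3e-3          # hypothetical 64-element array
theta_inc = reception_angles(elem_x, rule="increment")
theta_pos = reception_angles(elem_x, rule="position")
```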
- a combiner module (e.g., a dedicated circuit, firmware and/or a processor executing program instructions) combines the main image frame 304, the left secondary image frame 306 and the right secondary image frame 308.
- the boundaries of the corresponding image frames 304-308 are defined in connection with the acquisition operation and aligned during the combining operation such that one or more of the main frame boundaries 310, 312 substantially correspond to and align with an associated one of the boundaries of the secondary image frames 308, 306 (as well as the line of sight in the secondary image frame corresponding to the associated boundary).
- the main frame boundary 312 may be aligned with the secondary frame boundary 318 of the left secondary image frame 306.
- the main frame boundary 310 may be aligned with the secondary frame boundary 322 of the right secondary image frame 308.
- the main frame boundaries 310, 312 are also aligned with corresponding lines of view within the secondary image frames 308, 306.
- scan conversion is done following the compounding process by a scan converter 107.
- the compound images may be stored in a Cine memory 108 in either estimate or display pixel form. If stored in estimate form the images may be scan converted when replayed from the Cine memory for display.
- the scan converter and Cine memory may also be used to render three dimensional presentations of the spatially compounded images, as described in U.S. Pat. Nos. 5,485,842 and 5,860,924. The compounded images are processed for display by a video processor and displayed on the image display 109.
- boundary of secondary image frame may be perpendicular to transducer while main image boundary extends at a non-perpendicular angle from the surface of the transducer.
- boundaries of main and secondary frames may both be non-perpendicular to the surface of the transducer array but are oriented at a common angle.
- Fig. 7 illustrates a block diagram of an ultrasound system formed in accordance with an alternative embodiment.
- the system of Fig. 7 implements the operations described herein in connection with various embodiments.
- one or more circuits/processors within the system implement the operations of any processes illustrated in connection with the figures and/or described herein.
- the system includes a probe interconnect board 702 that includes one or more probe connection ports 704.
- the connection ports 704 may support various numbers of signal channels (e.g., 128, 192, 256, etc.).
- the connector ports 704 may be configured to be used with different types of probe arrays (e.g., phased array, linear array, curved array, 1D, 1.25D, 1.5D, 1.75D, 2D array, etc.).
- the probes may be configured for different types of applications, such as abdominal, cardiac, maternity, gynecological, urological and cerebrovascular examination, breast examination and the like.
- one or more of the connection ports 704 may support acquisition of 2D image data and/or 3D image data.
- the 3D image data may be acquired through physical movement (e.g., mechanically sweeping or physician movement) of the probe and/or by a probe that electrically or mechanically steers the transducer array.
- Figure 13 relates to a linear transducer array where the steering is carried out by generating lines of sight which are not parallel and cover a trapezoidal field of view or image frame 1301.
- the virtual prolongations 1303 of the lines of sight 1302 intersect at a virtual apex which is offset relative to the center of the linear array, whereas in normal conditions, in which the lines of sight are parallel, the virtual apex or virtual source lies at infinity.
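As a concrete sketch of this geometry (element pitch, apex offset and apex depth are hypothetical values), the line-of-sight angles of such a trapezoidal frame follow from prolonging, through the array, the segment that joins each line's origin on the array surface to a virtual apex placed behind the array and offset from its centre; when the apex recedes to infinity the angles all tend to zero and the lines of sight become parallel again.

```python
import numpy as np

def line_of_sight_angles(elem_x, apex_x, apex_z):
    """Steering angle of each line of sight for a virtual apex at (apex_x, apex_z).

    elem_x : lateral origins of the lines of sight on the array surface (metres)
    apex_x : lateral offset of the virtual apex from the array centre (metres)
    apex_z : depth of the apex; a negative value places it behind the array (metres)
    """
    # the virtual prolongation of every line of sight passes through the apex,
    # so its angle from the array normal follows from simple trigonometry
    return np.arctan2(elem_x - apex_x, -apex_z)

elem_x = (np.arange(128) - 63.5) * 0.2e-3          # hypothetical linear array
# apex offset 1 cm laterally and placed 3 cm behind the array surface
theta = line_of_sight_angles(elem_x, apex_x=0.01, apex_z=-0.03)
# parallel lines of sight correspond to the limit apex_z -> -infinity (theta -> 0)
```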
Abstract
Methods are provided for generating a compound image with an ultrasound system, the method comprising: acquiring a main image frame and a secondary image frame of ultrasound data at an ultrasound probe having a transducer array, the main and secondary image frames at least partially overlapping one another; defining main frame boundaries for the main image frame along opposite lateral sides thereof; defining reception lines of view in the secondary image frame extending from the transducer array, at least a portion of neighbouring lines of view extending into a region of interest at different reception angles with respect to one another, relative to a surface of the transducer array; and combining the main and secondary image frames to form a compound image, the lines of view in the main and/or secondary image frame being oriented such that virtual extensions of the lines of view extend through the transducer array and converge at a virtual source behind the transducer array, considering the transmission direction of the ultrasound beams. A corresponding system is also provided.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2016/050746 WO2017137807A1 (fr) | 2016-02-12 | 2016-02-12 | Procédé et système de génération d'une image composite |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2016/050746 WO2017137807A1 (fr) | 2016-02-12 | 2016-02-12 | Procédé et système de génération d'une image composite |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017137807A1 (fr) | 2017-08-17 |
Family
ID=55453230
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2016/050746 WO2017137807A1 (fr) | 2016-02-12 | 2016-02-12 | Procédé et système de génération d'une image composite |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017137807A1 (fr) |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4159462A (en) | 1977-08-18 | 1979-06-26 | General Electric Company | Ultrasonic multi-sector scanner |
US4319489A (en) | 1980-03-28 | 1982-03-16 | Yokogawa Electric Works, Ltd. | Ultrasonic diagnostic method and apparatus |
US4649927A (en) | 1984-09-25 | 1987-03-17 | Kontron Holding Ag | Real time display of an ultrasonic compound image |
US5485842A (en) | 1994-11-30 | 1996-01-23 | Advanced Technology Laboratories, Inc. | Ultrasonic diagnostic scan conversion for three dimensional display processing |
US5860924A (en) | 1996-11-26 | 1999-01-19 | Advanced Technology Laboratories, Inc. | Three dimensional ultrasonic diagnostic image rendering from tissue and flow images |
US20040054284A1 (en) * | 2002-09-13 | 2004-03-18 | Acuson Corporation | Overlapped scanning for multi-directional compounding of ultrasound images |
US20040193047A1 (en) * | 2003-02-19 | 2004-09-30 | Ultrasonix Medical Corporation | Compound ultrasound imaging method |
EP1681020B1 (fr) | 2005-01-18 | 2008-06-04 | Esaote S.p.A. | Méthode d'imagerie ultrasonique et sonde pour inspection 3D gynécologique |
EP1757954A2 (fr) * | 2005-08-22 | 2007-02-28 | Medison Co., Ltd. | Système et procédé de formation d'un image ultrasonore de composition spatiale |
EP2444821A2 (fr) * | 2010-10-19 | 2012-04-25 | Samsung Medison Co., Ltd. | Fourniture d'une image composée spatiale d'ultrasons fondée sur les lignes centrales des images par ultrasons dans un système par ultrasons |
US20120209107A1 (en) * | 2010-12-27 | 2012-08-16 | General Electric Company | Method and apparatus for enhancing needle visualization in ultrasound imaging |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112401932A (zh) * | 2020-12-08 | 2021-02-26 | 深圳开立生物医疗科技股份有限公司 | 超声扩展空间复合成像方法和相关装置 |
CN112401932B (zh) * | 2020-12-08 | 2023-07-07 | 深圳开立生物医疗科技股份有限公司 | 超声扩展空间复合成像方法和相关装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16707966; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.11.2018) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16707966; Country of ref document: EP; Kind code of ref document: A1 |