EP2863363A1 - Method and apparatus for generating a three-dimensional image of a target object - Google Patents

Method and apparatus for generating a three-dimensional image of a target object

Info

Publication number
EP2863363A1
EP2863363A1 (application number EP14182742A)
Authority
EP
European Patent Office
Prior art keywords
image
target object
light
attribute information
reflection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP20140182742
Other languages
German (de)
English (en)
Inventor
Dong-Hoon Oh
Dong-Gyu Hyun
Han-Jun Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Medison Co Ltd
Original Assignee
Samsung Medison Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020140085306A (KR102377530B1)
Application filed by Samsung Medison Co Ltd filed Critical Samsung Medison Co Ltd
Publication of EP2863363A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8993Three dimensional imaging systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0035Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/46Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/466Displaying means of special interest adapted to display 3D data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/466Displaying means of special interest adapted to display 3D data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • G01S7/52057Cathode ray tube displays
    • G01S7/52068Stereoscopic displays; Three-dimensional displays; Pseudo 3D displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/08Volume rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves  involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • A61B5/7425Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/41Medical

Definitions

  • One or more embodiments of the present invention relate to a method and apparatus for generating a three-dimensional (3D) image of a target object, and more particularly, to a method and apparatus for generating a 3D medical image of a target object.
  • An ultrasound diagnosis apparatus transmits an ultrasonic signal from the surface of a body of a target object toward an inner part of the body by using a probe and obtains an image of a cross-section of soft tissue or a blood flow image by using information about an ultrasonic signal reflected by the inner part of the body.
  • the ultrasound diagnosis apparatus displays information about a target object in real time and is safe due to lack of exposure to X-rays or the like.
  • an ultrasound diagnosis apparatus is widely used together with other image diagnosis apparatuses, for example, an X-ray diagnosis apparatus, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) apparatus, a nuclear medical diagnosis apparatus, and the like.
  • a method of generating a three-dimensional (3D) image of a target object includes: acquiring ultrasound data of the target object; and generating the 3D image of the target object by using the ultrasound data, so that a part of the target object having attribute information different than attribute information of other parts of the target object is shown on the 3D image differently than the other parts.
  • the attribute information may include at least one selected from image attribute information, physical property information, and surface information of the target object.
  • the generating of the 3D image may be performed by using at least one selected from a specular reflection coefficient, a specular light exponent, and a color of the target object.
  • the generating of the 3D image of the target object may include rendering a 3D image of the target object according to the attribute information.
  • the method may further include displaying the 3D image of the target object.
  • the generating of the 3D image of the target object may include generating an image having a reflection light effect.
  • the image having the reflection light effect may be generated by a plurality of light sources.
  • an apparatus for generating a three-dimensional (3D) image of a target object includes: an ultrasound data acquirer which acquires ultrasound data of the target object; and an image generator which generates the 3D image of the target object by using the ultrasound data, so that a part of the target object having attribute information different than attribute information of other parts of the target object is shown on the 3D image differently than the other parts.
  • the attribute information may include at least one selected from image attribute information, physical property information, and surface information of the target object.
  • the image generator may generate the 3D image of the target object by using at least one selected from a specular reflection coefficient, a specular light exponent, and a color of the target object.
  • the image generator may generate the 3D image of the target object by rendering a 3D image of the target object according to the attribute information.
  • the apparatus may further include a display unit which displays the 3D image of the target object.
  • the image generator may generate an image having a reflection light effect.
  • the image having the reflection light effect may be generated by a plurality of light sources.
  • a method of generating a three-dimensional (3D) image of a target object includes: acquiring medical image data of the target object; and generating an image having a reflection light effect based on the medical image data.
  • the method may further include displaying the image having the reflection light effect.
  • the generating of the image having the reflection light effect may include: calculating a representative voxel to be displayed on a screen from among a plurality of voxels on a path of a view vector; calculating a surface normal vector; calculating a reflection light vector by using the surface normal vector; generating a color of the target object, a reflection coefficient for a light source, and a specular light exponent for the light source; and calculating a color of a single point in the image by using at least one selected from the reflection light vector, the color of the target object, the reflection coefficient for the light source, and the specular light exponent for the light source.
  • the generating of the image of the target object having the reflection light effect may include: extracting a depth map of the target object with respect to the screen; calculating a surface normal vector for the depth map; calculating a reflection light vector by using the surface normal vector; generating a color of the target object, a reflection coefficient for a light source, and a specular light exponent for the light source; and calculating a color of a single point in the image by using at least one selected from the reflection light vector, the color of the target object, the reflection coefficient for the light source, and the specular light exponent for the light source.
  • the image having the reflection light effect may be generated via a plurality of renderings including specular rendering.
  • the image having the reflection light effect may be generated by using a plurality of light sources.
  • the medical image data may be acquired by an ultrasound diagnosis apparatus, an X-ray diagnosis apparatus, a computed tomography (CT) scanner, or a magnetic resonance imaging (MRI) apparatus.
  • an apparatus for generating a three-dimensional (3D) image of a target object includes: an ultrasound data acquirer which acquires medical image data of the target object; and an image generator which generates a three-dimensional (3D) image having a reflection light effect based on the medical image data.
  • the apparatus may further include a display unit which displays the image having the reflection light effect.
  • the image generator may receive, from an external source, a color of the target object, a reflection coefficient for a light source, and a specular light exponent for the light source, or may internally calculate them, and may calculate a color of a single point in the image by using at least one selected from the color of the target object, the reflection coefficient for the light source, and the specular light exponent for the light source.
  • the image generator may extract a depth map of the target object on a screen, calculate a reflection light vector, and calculate a color of a single point in the image by using the reflection light vector.
  • the image having the reflection light effect may be generated by a plurality of light sources.
  • the image having the reflection light effect may be generated via a plurality of renderings including specular rendering.
  • the medical image data may be acquired by an ultrasound diagnosis apparatus, an X-ray diagnosis apparatus, a computed tomography (CT) scanner, or a magnetic resonance imaging (MRI) apparatus.
  • an "ultrasound image” refers to an image of a target object which is obtained using ultrasound.
  • the target object may be a part of a body.
  • the target object may be an organ (for example, the liver, the heart, the womb, the brain, a breast, or the abdomen), an embryo, or the like.
  • a "medical image” refers not only to an image of a target object that is obtained using ultrasound but also to an image of a target object that is captured by an X-ray diagnosis apparatus, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) apparatus, or a nuclear medical diagnosis apparatus.
  • a "user” may be, but is not limited to, a medical expert, such as a medic, a nurse, a medical laboratory technologist, a medical image expert, or the like.
  • a "voxel" may be, but is not limited to, a minimal unit of a three-dimensional (3D) image.
  • FIG. 1 illustrates an ultrasound image 10 of a target object and a 3D rendered image 30 thereof.
  • the internal structure of a digestive organ may be easily ascertained by using ultrasound data of the digestive organ including a stomach and the like and a 3D volume rendering technique, for example.
  • a deposit 20 (for example, noodles attached to the stomach) within a target object (for example, the stomach of a patient) may be shown brighter than neighboring tissues on the ultrasound image 10 of the target object.
  • when a certain area 11 is selected from the ultrasound image 10 for precise observation of the deposit 20 and rendering is performed with regard to the certain area 11, the deposit 20 within the target object may be shown like a polyp 31 on the 3D rendered image 30.
  • the deposit 20 and the like may have a different brightness and a different texture than surrounding tissues on a two-dimensional (2D) ultrasound image of the target object.
  • 3D volume rendering is performed based on the attributes of the target object including the deposit 20 and the like to thereby provide a user with a 3D image in which the attributes of the target object are reflected.
  • when a 3D image of a target object is generated using acquired ultrasound data, the 3D image of the target object may be generated so that a part of the target object having attribute information different from that of other parts of the target object is shown on the 3D image differently from the other parts.
  • when the deposit 20 has different attribute information than attribute information of neighboring tissues in FIG. 1 , the deposit 20 may be shown differently from the neighboring tissues on a 3D image.
  • Attribute information used in the present specification may include at least one selected from image attribute information, physical property information, and surface information. The attribute information will now be described in more detail.
  • FIG. 2 is a flowchart of a method of generating a 3D image in which attribute information of a target object is reflected, according to an embodiment of the present invention.
  • the method may include operation S10 of acquiring ultrasound data of the target object, operation S20 of acquiring the attribute information of the target object from the ultrasound data, and operation S30 of generating the 3D image of the target object based on the attribute information.
  • the ultrasound data may include an ultrasound response signal received from the target object irradiated with ultrasound waves.
  • the ultrasound response signal may include at least one selected from an analog signal and a digital signal.
  • the ultrasound data may include an ultrasound response signal that is acquired from a target object and may be expressed in units of voxels.
  • the ultrasound data may include the amplitude and phase values of a returning ultrasound wave that returns from the target object to a probe after ultrasound waves are transmitted toward the target object via the probe and pass through the target object.
  • the attribute information of the target object may include image attribute information, physical property information, and surface information of the target object.
  • the image attribute information may include at least one selected from pieces of attribute information of an ultrasound image of the target object, such as a spatial frequency, an image intensity, an image histogram, a co-occurrence matrix, a local binary pattern (LBP), and homogeneity.
  • the homogeneity of the target object denotes a degree of change of the size or shape of at least one particle included in the target object. For example, when the size or shape of at least one particle (or tissue) included in the target object is different from that of other particles (or tissues), the homogeneity of the target object is small or very low. On the other hand, when the size or shape of at least one particle (or tissue) included in the target object is identical to that of other particles (or tissues), the target object has very high homogeneity.
  • the physical property information of the target object may include information representing the physical properties of the target object, such as strength, hardness, elasticity, plasticity, viscosity, density, ductility, brittleness, malleability, rigidity, and toughness.
  • the surface information of the target object may include information representing a surface property of the target object that can be perceived via a tactile perception organ of a human, such as 'smooth', 'bumpy', 'rough', and 'soft'.
  • the surface information of the target object may also include information representing a visual or three-dimensional property of the target object that can be perceived via a visual organ of a human, such as 'a contrast between brightness and darkness' and 'separation or division between the entire background and a predetermined product'.
  • the attribute information of the target object may be information about each of a plurality of pixels that constitute an image of the target object, or information about each of at least one area that constitutes the image of the target object.
  • the attribute information of the target object may also be information about a frame of the image of the target object.
  • the ultrasound image of the target object may be at least one selected from a brightness (B) mode image, a color (C) mode image, and a Doppler (D) mode image.
  • the ultrasound image may be a 2D or 3D image of the target object.
  • the operation S20 of acquiring the attribute information of the target object from the ultrasound data may include an operation of acquiring a spatial frequency, an image intensity, an image histogram, a co-occurrence matrix, an LBP, or homogeneity based on the ultrasound image generated according to the ultrasound response signal.
  • the operation S20 of acquiring the attribute information of the target object from the ultrasound data may include an operation of analyzing the image attribute information of the target object and an operation of determining the attribute information of the target object based on a result of the analysis.
  • FIG. 3 illustrates a case where the image attribute information of a target object is a spatial frequency, according to an embodiment of the present invention.
  • a spatial frequency represents a rate at which a pixel value varies in a horizontal or vertical direction on a screen on which an image is shown. For example, when a pixel value changes slowly (or remains almost unchanged) or a correlation between objects included in an image is high, the spatial frequency is low. On the other hand, when a pixel value varies greatly (or changes discontinuously, like at an edge) or a correlation between objects included in an image is low, the spatial frequency is high.
  • a first or second ultrasound image 40 or 50 of the target object may be acquired as illustrated in FIG. 3 .
  • the first or second ultrasound image 40 or 50 of the target object may be a portion of the ultrasound image 10 of the target object (for example, a portion of the ultrasound image 10 corresponding to a selected area 11 of FIG. 1 ).
  • sizes of the tissues 41, 42, and 43 of the target object may be different from one another.
  • "tissue" used in the present specification may refer to a particle of an organ of a target object or a particle of a deposit.
  • the target object may include the tissue 42, which has the largest size, the tissue 43, which has a medium size, and the tissue 41, which has the smallest size. Shapes of the tissues 41, 42, and 43 of the target object may also be different from one another.
  • Brightness values of the tissues 41, 42, and 43 of the target object may also be different from one another.
  • the target object may include the tissue 43, which is the brightest on the first or second ultrasound image 40 or 50, the tissue 42, which is moderately bright on the first or second ultrasound image 40 or 50, and the tissue 41, which is the darkest on the first or second ultrasound image 40 or 50. Since different ultrasound response signals may be generated according to the depths, densities, elasticity, and the like of tissues, different tissues have different brightness values.
  • the tissue 41, which is located at the greatest depth from among the tissues 41, 42, and 43, may be the darkest tissue on the first or second ultrasound image 40 or 50. In other words, as the depth of a tissue increases, the brightness value thereof may decrease.
  • the brightness values of a plurality of pixels on a line k located at a k-th position from the top of the first ultrasound image 40 may be expressed as the amplitude values of the pixels as illustrated in FIG. 3 .
  • the tissue 42, which is moderately bright, may be expressed as a pixel amplitude value of 0.5.
  • the tissue 41, which is the darkest, may be expressed as a pixel amplitude value of -1.
  • the brightness values of a plurality of pixels on a line k located at a k-th position from the top of the second ultrasound image 50 may also be expressed as the amplitude values of the pixels as illustrated in FIG. 3 .
  • the second ultrasound image 50 includes tissues 51 having identical sizes and identical brightness values. Accordingly, as illustrated in FIG. 3 , the brightness values of the pixels on the line k on the second ultrasound image 50 may be all expressed as a pixel amplitude value of 1.
  • the first ultrasound image 40 and the second ultrasound image 50 are characterized by different aspects in terms of the brightness values of a plurality of pixels in the same k-th line.
  • a pixel amplitude value (or a pixel brightness value) of the first ultrasound image 40 variously changes from -1 to 0.5
  • a pixel amplitude value of the second ultrasound image 50 only changes from 0 to 1.
  • the spatial frequency of the first ultrasound image 40 may be determined to be high. In other words, correlations between the tissues 41, 42, and 43 included in the first ultrasound image 40 are low, and thus the target object corresponding to the first ultrasound image 40 may be determined to have a 'bumpy' or 'rough' property.
  • the spatial frequency of the second ultrasound image 50 may be determined to be low.
  • correlations between the objects (for example, the tissues 51) included in the second ultrasound image 50 are very high, and thus the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property.
  • the spatial frequency of an ultrasound image of a target object may be extracted as image attribute information of the target object, and attribute information of the target object may be determined (or acquired) based on the spatial frequency.
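  • The extract does not fix a particular estimator for the spatial frequency; the sketch below is a minimal illustration, assuming a power-weighted mean frequency of one scan line's FFT (function and variable names are illustrative). Line k of the first ultrasound image 40, whose amplitude jumps between -1 and 0.5, scores higher than the nearly constant line k of the second ultrasound image 50.

```python
import numpy as np

def spatial_frequency(line: np.ndarray) -> float:
    """Power-weighted mean frequency (cycles per pixel) of one scan
    line: low for slowly varying lines, high for edge-rich lines."""
    line = line - line.mean()               # drop the DC component
    power = np.abs(np.fft.rfft(line)) ** 2  # power per frequency bin
    freqs = np.fft.rfftfreq(line.size)      # 0 .. 0.5 cycles/pixel
    if power.sum() == 0:
        return 0.0                          # perfectly flat line
    return float((freqs * power).sum() / power.sum())

line_k_first = np.array([-1.0, 0.5, -1.0, 0.5, 0.5, -1.0, 0.5, -1.0])
line_k_second = np.ones(8)                  # constant brightness
print(spatial_frequency(line_k_first) > spatial_frequency(line_k_second))  # True
```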
  • FIG. 4 illustrates a case where the image attribute information is image intensity, according to an embodiment of the present invention.
  • An image intensity according to an embodiment of the present invention may be expressed as the brightness values of a plurality of pixels of an ultrasound image of a target object.
  • tissues 41 through 43 having brightness values in different ranges may be shown on the first ultrasound image 40.
  • Tissues 51 having brightness values in the same range may be shown on the second ultrasound image 50.
  • the tissues 41 through 43 in the first ultrasound image 40 may be determined to be of different types. Even when the tissues 41 through 43 are of the same kind, the brightness values of the tissues 41 through 43 are in different brightness sections, and thus the tissues 41 through 43 may be determined to be located at different depths.
  • the target object corresponding to the first ultrasound image 40 may be determined to have a 'bumpy' or 'rough' property.
  • the tissues 51 of the second ultrasound image 50 may be determined to be of the same kind.
  • the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property.
  • FIG. 5 illustrates a case where the image attribute information is an image histogram, according to an embodiment of the present invention.
  • An image histogram of the first or second ultrasound image 40 or 50 may be acquired as the image attribute information as illustrated in FIG. 5 .
  • the image histogram of the first or second ultrasound image 40 or 50 represents the number of pixels versus pixel values (for example, brightness values) of the first or second ultrasound image 40 or 50.
  • the tissues 41 through 43 having different brightness values may be shown in the first ultrasound image 40.
  • the numbers of pixels of the tissues 41 through 43 of the first ultrasound image 40 versus pixel values thereof may be evenly distributed between a pixel value of 0 and a pixel value of 192.
  • the number of pixels having brightness values in the first section 401 may be represented as a first graph 501
  • the number of pixels having brightness values in the second section 402 may be represented as a second graph 503
  • the number of pixels having brightness values in the third section 403 may be represented as a third graph 505.
  • the tissues 41 through 43 included in the first ultrasound image 40 may be determined to be of different types having various brightness values. Even when the tissues 41 through 43 are of the same kind, a histogram distribution of the tissues 41 through 43 is relatively wide, and thus the tissues 41 through 43 may be determined to be located at different depths.
  • the target object corresponding to the first ultrasound image 40 may be determined to have a 'bumpy' or 'rough' property.
  • Tissues 51 having the same or similar brightness values may be shown in the second ultrasound image 50.
  • the second ultrasound image 50 may be determined to include the tissues 51 of the same kind.
  • the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property.
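  • As a small illustration of the histogram criterion, the sketch below counts pixels per brightness bin and scores the spread of the distribution; the bin count and the spread measure are illustrative choices, not taken from the patent.

```python
import numpy as np

def brightness_histogram(image: np.ndarray, bins: int = 64) -> np.ndarray:
    """Number of pixels per brightness bin for an 8-bit ultrasound image."""
    counts, _ = np.histogram(image, bins=bins, range=(0, 256))
    return counts

def histogram_spread(counts: np.ndarray) -> float:
    """Fraction of occupied bins: large when tissues span many
    brightness levels, small when they share one level."""
    return float((counts > 0).mean())

first = np.random.randint(0, 193, size=(64, 64))   # brightness 0..192, many bins
second = np.full((64, 64), 128)                    # a single occupied bin
print(histogram_spread(brightness_histogram(first)))   # wide distribution
print(histogram_spread(brightness_histogram(second)))  # narrow distribution
```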
  • FIG. 6 illustrates a case where the image attribute information is a co-occurrence matrix, according to an embodiment of the present invention.
  • the co-occurrence matrix may be used to determine repeatability of pixel values. For example, when many values correspond to a predetermined pattern, repetitiveness of this pattern may be determined to be high.
  • the first ultrasound image 40 may be simply represented as a 4x4 matrix 610 which represents the pixel values of the first ultrasound image 40.
  • a co-occurrence matrix 630 may be acquired using a pattern indicated in units of an inter-pixel horizontal distance of 1 with respect to the 4x4 matrix 610.
  • the inter-pixel direction of the pattern indication is not limited to a horizontal direction, and the pattern indication may be performed in a vertical direction, a diagonal direction, or the like.
  • the number of horizontal patterns of (1, 0), namely, 3, is shown in the element 631 of the co-occurrence matrix 630 that corresponds to the pattern of (1, 0).
  • Each element of a matrix is represented by the number of patterns corresponding thereto to thereby acquire the co-occurrence matrix 630.
  • the second ultrasound image 50 may also be represented as a 4x4 matrix 620.
  • a co-occurrence matrix 640 for the second ultrasound image 50 may be acquired in the same manner as the co-occurrence matrix 630 for the first ultrasound image 40.
  • the value of a specific element 641 is overwhelmingly high.
  • the element 641 corresponding to the pattern of (1, 1) may be expressed as 6.
  • the value of a specific element of the co-occurrence matrix 640 for the second ultrasound image 50 is overwhelmingly high, and thus the co-occurrence matrix 640 is relatively monotonous.
  • the reason for this monotony is that a predetermined pattern (for example, (1, 1) or (3, 3)) is highly repeated on the second ultrasound image 50.
  • the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property compared with that corresponding to the first ultrasound image 40.
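  • The sketch below reproduces the horizontal-pair counting described above for the 4x4 example matrices; it is a minimal illustration counting only the distance-1 horizontal pattern, though the text also allows vertical and diagonal directions.

```python
import numpy as np

def cooccurrence(img: np.ndarray, levels: int) -> np.ndarray:
    """Count horizontal pixel-value pairs (v, w) at an inter-pixel
    distance of 1, one count per element of the co-occurrence matrix."""
    glcm = np.zeros((levels, levels), dtype=int)
    for row in img:
        for v, w in zip(row[:-1], row[1:]):
            glcm[v, w] += 1
    return glcm

# Two uniform bands of 1s and 3s: the counts concentrate in the (1, 1)
# and (3, 3) elements, the monotony read above as a 'smooth' property.
img = np.array([[1, 1, 1, 1],
                [1, 1, 1, 1],
                [3, 3, 3, 3],
                [3, 3, 3, 3]])
print(cooccurrence(img, levels=4)[1, 1])  # 6, as for element 641
```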
  • FIG. 7 illustrates a case where the image attribute information is an LBP, according to an embodiment of the present invention.
  • the LBP is a technique for determining similarity between pixels by expressing a difference between a pixel value at a current position and the value of each neighboring pixel in a binary system.
  • the similarity between pixels may be determined by acquiring an LBP histogram in which a difference between each pixel of an image and each neighboring pixel within a predetermined radius is expressed as a binary value and by comparing a plurality of binary values obtained for each pixel with one another.
  • pixels that are highly similar to each other may have the same or similar brightness values.
  • patterns of an image may be relatively accurately predicted using an LBP.
  • a 3x3 matrix 710 having an element 711 at the center thereof is selected from the 4x4 matrix 610, which represents the pixel values of the first ultrasound image 40.
  • a binary expression matrix 730 may be obtained by comparing each element of the 3x3 matrix 710 with the center element 711; the values of the elements of the binary expression matrix 730 may be interpreted clockwise starting from the element 711 to thereby obtain a binary value 750 of '11101001'. In this way, a binary value of '11111111' may be determined by starting from an element 712.
  • a binary value 760 of '00111110' may be acquired based on a 3x3 matrix 720 having an element 721 at the center thereof selected from the 4x4 matrix 620, which represents the pixel value of the second ultrasound image 50. In this way, a binary value of '00111110' may be acquired for an element 722.
  • the binary values of '11101001' and '11111111' determined from the first ultrasound image 40 are greatly different, but the binary values of '00111110' and '00111110' determined from the second ultrasound image 50 have no difference therebetween. In other words, pixel values of the second ultrasound image 50 change little between neighboring pixels, compared with the first ultrasound image 40.
  • the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property compared with the first ultrasound image 40.
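  • A minimal sketch of the binary expression step follows, assuming the common >=-threshold rule and a clockwise read starting at the top-left neighbour (the extract fixes neither the threshold rule nor the start position).

```python
import numpy as np

def lbp_code(img: np.ndarray, r: int, c: int) -> str:
    """8-bit local binary pattern at (r, c): each of the eight
    neighbours, read clockwise from the top-left, contributes '1' if
    its value is >= the centre value and '0' otherwise."""
    center = img[r, c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    return ''.join('1' if img[r + dr, c + dc] >= center else '0'
                   for dr, dc in offsets)

# Identical codes at every pixel, as for the second ultrasound image,
# indicate little inter-pixel change, i.e. a 'smooth' surface.
uniform = np.ones((3, 3), dtype=int)
print(lbp_code(uniform, 1, 1))  # '11111111'
```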
  • FIG. 8 illustrates a case where the image attribute information is homogeneity, according to an embodiment of the present invention.
  • the homogeneity denotes homogeneity of the sizes or shapes of the tissues 41, 42, 43, and 51 in the first or second ultrasound image 40 or 50.
  • a homogeneity value corresponding to complete homogeneity is expressed as '10' and a homogeneity value when the sizes or shapes of tissues are not homogeneous is expressed as '1'.
  • the first ultrasound image 40 includes a plurality of tissues 41 through 43 having different sizes. Accordingly, the homogeneity of the first ultrasound image 40 may be determined to be low. For example, the homogeneity of the first ultrasound image 40 may be expressed as '1', and a particle (or tissue) included in the target object corresponding to the first ultrasound image 40 may be determined to be uneven. In other words, the target object corresponding to the first ultrasound image 40 may be determined to have a 'bumpy' or 'rough' property.
  • the second ultrasound image 50 includes a plurality of tissues 51 having the same size. Accordingly, the homogeneity of the second ultrasound image 50 may be determined to be relatively high, compared with the first ultrasound image 40.
  • the homogeneity of the second ultrasound image 50 may be expressed as '10', and a particle (or tissue) included in the target object corresponding to the second ultrasound image 50 may be determined to be even.
  • the target object corresponding to the second ultrasound image 50 may be determined to have a 'smooth' or 'soft' property, compared with the target object corresponding to the first ultrasound image 40.
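  • A minimal sketch of one way to place an image on the 1-to-10 homogeneity scale used above: label connected tissue regions in a binary mask and map the relative spread of their sizes onto the scale. The mapping and the use of scipy's connected-component labelling are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy import ndimage

def homogeneity_score(mask: np.ndarray) -> float:
    """Map the spread of particle sizes in a binary tissue mask onto
    the 1 (uneven) .. 10 (completely homogeneous) scale used above."""
    labels, n = ndimage.label(mask)        # connected tissue regions
    if n < 2:
        return 10.0                        # nothing to be uneven about
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    cv = sizes.std() / sizes.mean()        # relative size spread
    return float(np.clip(10.0 - 9.0 * cv, 1.0, 10.0))
```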
  • FIG. 9 is a flowchart of a method of displaying and rendering a 3D image in which attribute information of a target object is reflected, according to an embodiment of the present invention.
  • operation S30 of generating the 3D image of the target object based on the attribute information may include operation S31 of rendering the 3D image of the target object based on the attribute information.
  • the 3D image may be rendered using the attribute information generated according to the aforementioned method on the target object.
  • a 3D volume image in which not only a structure of the target object but also attributes of the target object, such as a texture of the target object, are reflected may be constructed using attribute (for example, a texture) information corresponding to each pixel, which is acquired (or determined) via the above-described analysis of the image attribute information.
  • the method according to the present embodiment may further include operation S40 of displaying the 3D image in which the attribute information is reflected.
  • FIG. 10 is a flowchart of a method of generating an image in which attribute information of a window area of a target object is reflected, according to an embodiment of the present invention.
  • the method may include operation S100 of acquiring an ultrasound image of the target object based on ultrasound data about the target object, operation S200 of setting the window area from the ultrasound image, operation S300 of generating attribute information of the window area from the ultrasound data, and operation S400 of generating an image of the window area based on the attribute information.
  • Operation S200 of setting the window area from the ultrasound image may include operation S210 of selecting the window area from the ultrasound image according to an external input.
  • the window area may be automatically set to have a predetermined size to include a predetermined part of the target object, although there are no external inputs.
  • a predetermined window area having a predetermined size may be automatically set to include an upper part or a bottom curved side of the stomach of a patient.
  • Image attribute information may include at least one selected from a spatial frequency, image intensity, an image histogram, a co-occurrence matrix, an LBP, and homogeneity of an ultrasound image of a target object, and the attribute information may include a texture of the target object.
  • FIG. 11 is a flowchart of an operation of generating the image of the window area based on the attribute information, according to an embodiment of the present invention.
  • the operation S400 may include operation S410 of rendering the 3D image of the window area according to the attribute information.
  • a 3D volume image in which not only a structure of the target object but also attributes of the target object, such as a texture of the target object, are reflected may be acquired using attribute information corresponding to each pixel, which is acquired (or determined) via analysis of the attribute information.
  • FIG. 12 is a flowchart of an operation of generating the image of the window area based on the attribute information, according to another embodiment of the present invention.
  • the operation S400 may include operation S420 of performing rendering to acquire a rendered 3D image of the window area of the target object and operation S440 of adding the attribute information to the rendered 3D image.
  • the rendered 3D image is acquired by performing 3D volume rendering on the window area, and the attribute information is added to the rendered 3D image, thereby acquiring the 3D image in which the attribute information is reflected.
  • a first image is generated by performing 3D volume rendering on the window area
  • a second image is generated based on attribute information that corresponds to the window area and is extracted from the ultrasound data
  • the first image and the second image overlap with each other so that the attribute information may be added to the first image (namely, the rendered 3D image).
  • Attribute information obtained by filtering via a predetermined image processing filter or the like may be added to the rendered 3D image to thereby acquire the 3D image in which the attribute information has been reflected.
  • FIG. 13 is a block diagram of an apparatus 1300 for generating a 3D image in which attribute information of a target object is reflected, according to an embodiment of the present invention.
  • the apparatus 1300 may include an ultrasound data acquirer 1310 acquiring ultrasound data of the target object, an attribute information generator 1350 generating the attribute information of the target object from the ultrasound data, and an image generator 1370 generating the 3D image of the target object based on the attribute information.
  • the apparatus 1300 may further include a display unit 1390 displaying the 3D image.
  • the attribute information may include at least one image attribute information selected from a spatial frequency, image intensity, an image histogram, a co-occurrence matrix, an LBP, and homogeneity of an ultrasound image of the target object.
  • the attribute information may include at least one selected from image attribute information, physical property information, and surface information of the target object.
  • the image generator 1370 may render the 3D image of the target object according to the attribute information generated by the attribute information generator 1350.
  • FIG. 14 is a block diagram of an apparatus 1300 for generating an image in which attribute information of a window area of a target object is reflected, according to an embodiment of the present invention.
  • the apparatus 1300 may include an ultrasound data acquirer 1310 acquiring ultrasound data of the target object, an ultrasound image acquirer 1320 acquiring an ultrasound image of the target object based on the ultrasound data, a window area setter 1360 setting the window area on the ultrasound image, an attribute information generator 1350 generating the attribute information of the target object from the ultrasound data, and an image generator 1370 generating the image of the window area based on the attribute information.
  • the window area setter 1360 may automatically set the window area to have a predetermined size to include a predetermined part of the target object.
  • the predetermined size of area may be previously stored in a storage (not shown).
  • a predetermined window area having a predetermined size may be automatically set to include an upper part or a bottom curved side of the stomach of a patient.
  • the apparatus 1300 may further include an external input receiver 1340.
  • the window area setter 1360 may set a predetermined area on the ultrasound image acquired by the ultrasound image acquirer 1320, based on an external input received by the external input receiver 1340.
  • Image attribute information according to an embodiment of the present invention may include at least one selected from a spatial frequency, an image intensity, an image histogram, a co-occurrence matrix, an LBP, and homogeneity of an ultrasound image of a target object.
  • Attribute information according to an embodiment of the present invention may include at least one selected from a texture of the target object and homogeneity of the target object.
  • the image generator 1370 may include an image renderer 1371 performing rendering to acquire the 3D image of the window area according to the attribute information generated by the attribute information generator 1350.
  • FIG. 15 is a block diagram of an apparatus 1300 for generating an image in which attribute information of a window area of a target object is reflected, according to another embodiment of the present invention.
  • an image generator 1370 may include an image renderer 1371 performing rendering to acquire a rendered 3D image of the window area and an attribute information adder 1372 adding the attribute information generated by the attribute information generator 1350 to the rendered 3D image.
  • the image generator 1370 may generate an attribute information image whereon the attribute information is displayed.
  • the rendered 3D image is acquired by performing 3D volume rendering on the window area, and the attribute information is added to the rendered 3D image, thereby acquiring the 3D image in which the attribute information is reflected.
  • the attribute information adder 1372 may overlap a first image generated by the image renderer 1371 with a second image (for example, the attribute information image) generated by the image generator 1370 to add the attribute information to the first image (namely, the rendered 3D image).
  • the attribute information adder 1372 may include a predetermined image processing filter (not shown) or the like. In other words, the attribute information adder 1372 may add attribute information obtained by filtering via the predetermined image processing filter or the like to the rendered 3D image to thereby acquire the 3D image in which the attribute information is reflected.
  • FIGS. 16A-16C and 17A-17C are views for describing a reflection light effect as attribute information of a target object, according to an embodiment of the present invention.
  • an image of FIG. 16C may be generated by synthesizing an image of FIG. 16A and an image of FIG. 16B .
  • a reflection light effect as shown in FIG. 16B may vary according to the properties of a target object.
  • the apparatus 1300 of FIG. 13 may acquire the 3D image of FIG. 16A and attribute information corresponding to the image of FIG. 16B .
  • the apparatus 1300 of FIG. 13 may generate the image of FIG. 16C by adding the image of FIG. 16B to the 3D image of FIG. 16A .
  • the apparatus 1300 of FIG. 13 may additionally render a reflection light effect that is the attribute information corresponding to the image of FIG. 16B , to the 3D image of FIG. 16A .
  • An image of FIG. 17C may be generated by synthesizing an image of FIG. 17A and an image of FIG. 17B .
  • a reflection light effect as shown in FIG. 17B may vary according to the properties of a target object.
  • the apparatus 1300 of FIG. 13 may acquire the 3D image of FIG. 17A and attribute information corresponding to the image of FIG. 17B .
  • the apparatus 1300 of FIG. 13 may generate the image of FIG. 17C by adding the image of FIG. 17B to the 3D image of FIG. 17A .
  • the apparatus 1300 of FIG. 13 may additionally render the reflection light effect that is the attribute information corresponding to the image of FIG. 17B , to the 3D image of FIG. 17A .
  • FIG. 18 is a schematic diagram for describing a method of obtaining the images of FIGS. 16A-16C and 17A-17C .
  • a reflection light effect may be generated via 3D rendering.
  • Specular rendering for generating a reflection light effect may be implemented by using ray tracing rendering.
  • Ray tracing rendering uses a general lighting model.
  • ray tracing rendering is a method of calculating a lighting effect due to reflection, refraction, absorption, and self-emission occurring when light strikes a surface of a target object, by tracing a path of light of all pixels on a screen when it is assumed that light is radiated from a camera.
  • in ray tracing rendering, a local lighting model is used to calculate a lighting effect due to reflection between a light source and a target object.
  • the local lighting model calculates lighting effects due to an ambient light effect, diffuse reflection, and a reflection light effect.
  • the reflection light effect among them is the lighting effect produced when a target object is highlighted by light regularly reflected from its surface. The intensity of such a highlight varies according to the position of a camera.
  • the reflection light effect may be calculated by Equation 1.
  • Equation 1: C_o = C_p · K_s · O_s · (R · S)^n, where n denotes the specular light exponent.
  • the specular light color Cp may denote the color of light for providing a reflection light effect.
  • the specular reflection coefficient K_s may denote the reflection degree of reflection light.
  • the reflection light vector R may represent both the direction of reflection light and the intensity thereof.
  • the view vector S may represent a unit vector for the direction of a view.
  • when the direction of the reflection light vector R and that of the view vector S are made the same in Equation 1, Equation 2 may be obtained.
  • a reflection light effect may also be calculated by using Equation 2 below. Supposing that the position and direction of a light source are not changed, the position and direction of the light source may be considered identical to those of a camera.
  • Equation 2: C_o = C_p · K_s · O_s · R_z^N, where R_z denotes the component of the reflection light vector along the view direction and N denotes the specular light exponent.
  • the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s in Equations 1 and 2 may be changed by using the attribute information such as a spatial frequency, an image intensity, an image histogram, a co-occurrence matrix, an LBP, or homogeneity of the ultrasound data described above with reference to FIGS. 3-8 .
  • the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s in Equations 1 and 2 may also be changed by using attribute information such as a mean, a dispersion, a standard deviation, skewness, or kurtosis of ultrasound data B, C, D, and E.
  • the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s in Equations 1 and 2 may be statistical values of a distance between voxels of a target object.
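  • As a worked illustration of Equations 1 and 2, the sketch below evaluates the specular term for one pixel. It is a minimal sketch, assuming unit vectors R and S and per-channel colors; clamping the dot product at zero (so back-facing reflections contribute nothing) is a standard addition not stated in the extract.

```python
import numpy as np

def specular_eq1(C_p, K_s, O_s, R, S, n):
    """Equation 1: C_o = C_p * K_s * O_s * (R . S)^n.
    R is the reflection light vector, S the unit view vector, n the
    specular light exponent; the dot product is clamped at zero."""
    return C_p * K_s * O_s * max(float(np.dot(R, S)), 0.0) ** n

def specular_eq2(C_p, K_s, O_s, R_z, N):
    """Equation 2: C_o = C_p * K_s * O_s * R_z^N, the special case in
    which the light direction coincides with the view direction."""
    return C_p * K_s * O_s * max(float(R_z), 0.0) ** N

# A larger exponent N narrows the highlight (FIGS. 19A-19B); a larger
# coefficient K_s strengthens it (FIGS. 20A-20B).
C_p, O_s = np.array([1.0, 1.0, 1.0]), np.array([0.8, 0.7, 0.6])
print(specular_eq2(C_p, 0.19, O_s, R_z=0.95, N=10))
print(specular_eq2(C_p, 0.5, O_s, R_z=0.95, N=40))
```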
  • FIGS. 19A and 19B illustrate images in which different pieces of attribute information of a window area of a target object are respectively reflected according to specular light exponents, according to an embodiment of the present invention.
  • FIG. 19A is an ultrasound image of the target object when the specular light exponent N in Equation 2 is 10
  • FIG. 19B is an ultrasound image of the target object when the specular light exponent N in Equation 2 is 40.
  • the ultrasound image of FIG. 19A has a small specular light exponent, and thus a highlighted portion is wide on the ultrasound image of FIG. 19A .
  • FIGS. 20A and 20B illustrate images in which different pieces of attribute information of a window area of a target object have been respectively reflected according to specular reflection coefficients, according to an embodiment of the present invention.
  • FIG. 20A is an ultrasound image of the target object when the specular reflection coefficient K_s in Equation 2 is 0.19
  • FIG. 20B is an ultrasound image of the target object when the specular reflection coefficient K_s in Equation 2 is 0.5.
  • the ultrasound image of FIG. 20A has a small specular reflection coefficient, and thus a highlighted portion on the ultrasound image of FIG. 20A is relatively weak.
  • FIG. 21 illustrates a change in an image according to a specular light exponent and a specular reflection coefficient.
  • as the specular light exponent increases, the size of a portion of the image from which light is reflected decreases.
  • FIGS. 22A and 22B illustrate ultrasound images for explaining a reflection light effect when two light sources are used, according to an embodiment of the present invention.
  • FIG. 22A is an ultrasound image obtained by using only a single white light source.
  • FIG. 22B is an ultrasound image obtained by using a white light source and a red light source. Compared with the ultrasound image of FIG. 22A , highlighted parts on the ultrasound image of FIG. 22B are redder.
  • the pixel color C_o may be calculated by using Equation 3:
  • Equation 3: C_o = O_s1 · C_p1 · k_s1 · R_z1^n1 + ... + O_sj · C_pj · k_sj · R_zj^nj
  • alternatively, with a single target object color O_s shared by all light sources, the pixel color may be calculated by using Equation 4: C_o = O_s · (C_p1 · k_s1 · R_z1^n1 + ... + C_pj · k_sj · R_zj^nj)
  • Equation 3 may be applied according to various embodiments.
  • the specular reflection coefficients K_s1 through K_sj may have the same values, as in the sketch below.
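  • A minimal sketch of Equation 3 for several light sources follows; the white-plus-red configuration mirrors FIGS. 22A-22B, and all numeric values are illustrative only.

```python
import numpy as np

def specular_multi(lights, O_s=None):
    """Equation 3: C_o = sum_j O_sj * C_pj * k_sj * R_zj^nj. When a
    single shared target object color O_s is given, this collapses to
    Equation 4: C_o = O_s * sum_j C_pj * k_sj * R_zj^nj."""
    total = np.zeros(3)
    for (O_sj, C_pj, k_sj, R_zj, n_j) in lights:
        color = O_s if O_s is not None else O_sj
        total = total + color * C_pj * k_sj * max(float(R_zj), 0.0) ** n_j
    return total

# A white source plus a red source: the red source tints only the
# highlighted parts of the image, as in FIG. 22B.
O_s = np.array([0.8, 0.7, 0.6])
white = (O_s, np.array([1.0, 1.0, 1.0]), 0.3, 0.9, 40)
red = (O_s, np.array([1.0, 0.0, 0.0]), 0.3, 0.9, 40)
print(specular_multi([white, red]))
```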
  • FIG. 23 is a flowchart of a 3D reflection light effect rendering method according to an embodiment of the present invention.
  • the image generator 1370 of FIG. 13 may generate an image according to the 3D reflection light effect rendering method of FIG. 23 .
  • the image generator 1370 may obtain a representative voxel V that is to be displayed from among voxels on the path of a view vector projected from a point P on the screen.
  • the image generator 1370 may calculate a surface normal vector for the voxel V and calculate a reflection light vector via the surface normal vector.
  • the image generator 1370 may receive the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s from a user or may internally calculate them.
  • the image generator 1370 may calculate the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s by using the attribute information such as a spatial frequency, an image intensity, an image histogram, a co-occurrence matrix, an LBP, or homogeneity of the ultrasound data described above with reference to FIGS. 3-8 .
  • the image generator 1370 may calculate a pixel color C_o at the point P on the screen via Equation 1 by using the specular light color C_p, the specular reflection coefficient K_s, the target object color O_s, and the reflection light vector.
  • the image renderers 1371 of FIGS. 14 and 15 may generate images according to the 3D reflection light effect rendering method of FIG. 23 .
  • FIG. 24 is a view for illustrating a method of calculating a reflection light vector V_reflect via a normal vector V_normal.
  • the image generator 1370 may calculate the reflection light vector V_reflect from the normal vector V_normal and a light source vector V_light by using Equation 5, sketched below.
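  • Equation 5 itself is not reproduced in this extract; the sketch below uses the standard reflection identity V_reflect = 2(V_normal · V_light)V_normal - V_light, which is consistent with deriving V_reflect from V_normal and V_light as described above, but is an assumption rather than the patent's verbatim formula.

```python
import numpy as np

def reflect(v_normal: np.ndarray, v_light: np.ndarray) -> np.ndarray:
    """Reflect the light vector about the surface normal:
    V_reflect = 2 (V_normal . V_light) V_normal - V_light.
    Both inputs are assumed to be unit vectors."""
    return 2.0 * float(np.dot(v_normal, v_light)) * v_normal - v_light

# Light arriving along the normal reflects straight back.
n = np.array([0.0, 0.0, 1.0])
print(reflect(n, np.array([0.0, 0.0, 1.0])))  # [0. 0. 1.]
```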
  • FIG. 25 is a flowchart of a 3D reflection light effect rendering method according to an embodiment of the present invention.
  • the image generator 1370 of FIG. 13 may generate an image according to the 3D reflection light effect rendering method of FIG. 25 .
  • the image generator 1370 extracts a depth map of the target object with respect to the screen.
  • the image generator 1370 may calculate a surface normal vector and a reflection light vector for the depth map.
  • the image generator 1370 may receive the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s from an external source or may internally calculate them.
  • the image generator 1370 may calculate a pixel color C_o at a single point P on the screen via Equation 2 by using the specular light color C_p, the specular reflection coefficient K_s, the target object color O_s, and the reflection light vector.
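  • The extract does not detail how the surface normal vector is derived from the depth map; a minimal sketch, assuming screen-space finite differences of the depth values (a common choice, not necessarily the patent's), is given below. The resulting normals feed the reflection vector and Equation 2 exactly as in the steps above.

```python
import numpy as np

def normals_from_depth(depth: np.ndarray) -> np.ndarray:
    """Per-pixel unit surface normals for a depth map, computed from
    the screen-space depth gradients (finite differences)."""
    dz_dy, dz_dx = np.gradient(depth.astype(float))   # rows, columns
    n = np.dstack((-dz_dx, -dz_dy, np.ones_like(dz_dx)))
    return n / np.linalg.norm(n, axis=2, keepdims=True)

depth = np.fromfunction(lambda y, x: 0.1 * x, (4, 4))  # plane tilted in x
print(normals_from_depth(depth)[2, 2])  # constant normal across the plane
```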
  • FIG. 26 is a flowchart of a 3D reflection light effect rendering method according to an embodiment of the present invention.
  • the image generator 1370 of FIG. 13 may generate an image according to the 3D reflection light effect rendering method of FIG. 26 .
  • the image generator 1370 extracts a depth map of the target object with respect to the screen.
  • the image generator 1370 may calculate a surface normal vector for the depth map and calculate a reflection light vector for each of a plurality of light sources via the surface normal vector.
  • the image generator 1370 may receive the specular light color C_p, the specular reflection coefficient K_s, and the target object color O_s from an external source or may internally calculate them.
  • the image generator 1370 may calculate a pixel color C_o at a single point P on the screen via Equation 3 or 4 by using the specular light color C_p, the specular reflection coefficient K_s, the target object color O_s, and the reflection light vectors.
  • Each of the 3D reflection light effect rendering methods of FIGS. 23 , 25 , and 26 may be performed in the operation S500 of FIG. 10 .
  • Each of the 3D reflection light effect rendering methods of FIGS. 23 , 25 , and 26 may be performed by the image generator 1370 of FIG. 13 , the image renderer 1371 of FIG. 14 , or the image renderer 1371 of FIG. 15 .
  • Each of the 3D reflection light effect rendering methods of FIGS. 23 , 25 , and 26 may be performed together with grey scale rendering and diffused light rendering.
  • each of the 3D reflection light effect rendering methods of FIGS. 23 , 25 , and 26 may be performed independently from grey scale rendering and diffused light rendering, and rendered images obtained by the 3D reflection light effect rendering method, the grey scale rendering, and the diffused light rendering may be synthesized into an image.
  • FIGS. 27 through 29 are views for describing a method of calculating surface information of a target object, according to an embodiment of the present invention.
  • each of the waves A through H has a base line, which serves as the reference from which the wave is measured.
  • the base line may be calculated from a wave via smoothing and line fitting.
  • the waves express progressively smoother surfaces of the target object in a direction from the wave H to the wave A, and progressively rougher surfaces in a direction from the wave A to the wave H.
  • the surface information of the target object may be calculated from a 2D image (or a cross-section of a 3D ultrasound volume image) of the target object by using a wave and a base line.
  • various shapes of waves and base lines may be used instead of the waves and base lines of FIG. 27 .
  • a base line may be curved as illustrated in FIG. 28 .
  • the waves and base lines of FIG. 27 may be expanded into a 3D ultrasound volume to define a wave surface and a base surface.
  • when the 2D image of FIG. 28 is expanded into a 3D image, the 3D image of FIG. 29 may be obtained.
  • a wave surface and a base surface of the 3D image may be calculated from the waves and base lines of the 2D image.
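To make the smoothing-and-line-fitting step concrete, the sketch below estimates a base line for a sampled wave and scores surface roughness as the wave's deviation from that base line. The moving-average window, the straight-line fit (matching the straight base lines of FIG. 27 rather than the curved one of FIG. 28), and the mean-absolute-deviation measure are all assumptions; the patent does not specify them.

```python
import numpy as np

def base_line(wave: np.ndarray, window: int = 15) -> np.ndarray:
    """Estimate a base line: moving-average smoothing followed by a
    straight-line fit.  Both are assumed concrete choices for the
    "smoothing and line fitting" the text mentions."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(wave, kernel, mode='same')
    x = np.arange(wave.size)
    slope, intercept = np.polyfit(x, smoothed, deg=1)
    return slope * x + intercept

def surface_roughness(wave: np.ndarray) -> float:
    """Mean absolute deviation of the wave from its base line: one
    plausible scalar for how rough the target surface is (wave A,
    the smoothest, deviates least; wave H deviates most)."""
    return float(np.mean(np.abs(wave - base_line(wave))))
```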
  • the embodiments of the present invention can be written as computer programs and can be implemented in general-use digital computers that execute the programs using a computer readable recording medium.
  • Examples of the computer readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.), optical recording media (e.g., CD-ROMs, or DVDs), etc.
EP20140182742 2013-09-30 2014-08-29 Procédé et appareil de génération d'image tridimensionnelle d'objet cible Ceased EP2863363A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20130116899 2013-09-30
KR1020140085306A KR102377530B1 (ko) 2013-09-30 2014-07-08 대상체의 3차원 영상을 생성하기 위한 방법 및 장치

Publications (1)

Publication Number Publication Date
EP2863363A1 true EP2863363A1 (fr) 2015-04-22

Family

ID=51429097

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20140182742 Ceased EP2863363A1 (fr) 2013-09-30 2014-08-29 Procédé et appareil de génération d'image tridimensionnelle d'objet cible

Country Status (2)

Country Link
US (1) US9759814B2 (fr)
EP (1) EP2863363A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101930905B1 (ko) * 2017-02-28 2018-12-19 메디컬아이피 주식회사 의료영상의 영역 분리 방법 및 그 장치
JP6885896B2 (ja) * 2017-04-10 2021-06-16 富士フイルム株式会社 自動レイアウト装置および自動レイアウト方法並びに自動レイアウトプログラム
WO2019045144A1 (fr) 2017-08-31 2019-03-07 (주)레벨소프트 Appareil et procédé de traitement d'image médicale pour dispositif de navigation médicale
JP7078457B2 (ja) * 2018-05-29 2022-05-31 富士フイルムヘルスケア株式会社 血流画像処理装置及び方法
CN108876873B (zh) * 2018-06-22 2022-07-19 上海闻泰电子科技有限公司 图像生成方法、装置、设备和存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2208467A1 (fr) * 2009-01-20 2010-07-21 Kabushiki Kaisha Toshiba Appareil de diagnostic ultrasonore, appareil de traitement d'images ultrasonores, procédé de traitement d'images, programme d'affichage d'images, et produit de programme informatique
EP2253273A1 (fr) * 2009-05-18 2010-11-24 Medison Co., Ltd. Système de diagnostic à ultrasons et procédé d'affichage d'organes
US20130150719A1 (en) * 2011-12-08 2013-06-13 General Electric Company Ultrasound imaging system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BINDER T ET AL: "Three-dimensional imaging of the heart using transesophageal echocardiography", Proceedings of the Computers in Cardiology Conference, London, 5-8 September 1993, IEEE Comp. Soc. Press, Los Alamitos, US, pages 21-24, XP010128894, ISBN: 978-0-8186-5470-1, DOI: 10.1109/CIC.1993.378514 *

Also Published As

Publication number Publication date
US9759814B2 (en) 2017-09-12
US20150093005A1 (en) 2015-04-02

Similar Documents

Publication Publication Date Title
US9659405B2 (en) Image processing method and apparatus
US20110125016A1 (en) Fetal rendering in medical diagnostic ultrasound
Smeets et al. Semi-automatic level set segmentation of liver tumors combining a spiral-scanning technique with supervised fuzzy pixel classification
US10628930B1 (en) Systems and methods for generating fused medical images from multi-parametric, magnetic resonance image data
CN106236133B (zh) 用于显示超声图像的方法和设备
EP3493161B1 (fr) Détermination de la fonction de transfert en imagerie médicale
US9759814B2 (en) Method and apparatus for generating three-dimensional (3D) image of target object
US20070165927A1 (en) Automated methods for pre-selection of voxels and implementation of pharmacokinetic and parametric analysis for dynamic contrast enhanced MRI and CT
CN106573150B (zh) 图像中血管结构的遮盖
US20150371420A1 (en) Systems and methods for extending a field of view of medical images
KR101728044B1 (ko) 의료 영상을 디스플레이 하기 위한 방법 및 장치
JP2016135252A (ja) 医用画像処理装置及び医用画像診断装置
US10342633B2 (en) Medical image data processing system and method
CN111836584A (zh) 超声造影成像方法、超声成像装置和存储介质
JP6564075B2 (ja) 医用画像を表示するための伝達関数の選択
KR20120102447A (ko) 진단장치 및 방법
Birkeland et al. The ultrasound visualization pipeline
KR102377530B1 (ko) 대상체의 3차원 영상을 생성하기 위한 방법 및 장치
US11259782B2 (en) Medical imaging data processing apparatus and method
WO2019040534A1 (fr) Dispositifs, systèmes et procédés de génération d'images 2d synthétiques
Looby et al. Unsupervised clustering method to convert high-resolution magnetic resonance volumes to three-dimensional acoustic models for full-wave ultrasound simulations
WO2012140396A1 (fr) Visualisation biomédicale
KR20160146487A (ko) 초음파 이미지 디스플레이 방법 및 이를 위한 장치
Krishnan et al. Algorithms, architecture, validation of an open source toolkit for segmenting CT lung lesions
US20230070102A1 (en) Volumetric lighting of 3d overlays on 2d images

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140829

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

R17P Request for examination filed (corrected)

Effective date: 20151021

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20170404

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20221019