EP1923839A1 - Ultraschall-Diagnosegerät und Verfahren zur Verarbeitung von Volumendaten - Google Patents

Ultraschall-Diagnosegerät und Verfahren zur Verarbeitung von Volumendaten

Info

Publication number
EP1923839A1
Authority
EP
European Patent Office
Prior art keywords
cross sections
tracing
section
representative cross
manual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP07020529A
Other languages
English (en)
French (fr)
Other versions
EP1923839B1 (de)
Inventor
Masaru Murashita
Masahi Nakamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Aloka Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2007104733A external-priority patent/JP4931676B2/ja
Application filed by Aloka Co Ltd filed Critical Aloka Co Ltd
Publication of EP1923839A1
Application granted granted Critical
Publication of EP1923839B1
Not-in-force legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13 Tomography
    • A61B8/14 Echo-tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G06T2207/10136 3D ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing

Definitions

  • the present invention relates to an ultrasound diagnostic apparatus and a volume data processing method, and more particularly to specification or measurement of an object tissue within a three-dimensional space.
  • a three-dimensional space (a three-dimensional data acquisition space) can be formed by scanning a scanning plane, which is formed by scanning an ultrasound beam, in the direction orthogonal to the scanning plane.
  • A Disk Summation method is conventionally known as a method for extracting an object tissue existing within a three-dimensional space and calculating the volume thereof. With this conventional method, each of a plurality of slice data items which are aligned across the object tissue is displayed as a tomographic image, and the outline of the object tissue is manually traced on each tomographic image, enabling the cross sectional area in each slice to be calculated. Then, the cross sectional area and the thickness (an interval between cross sections) for each slice are used to calculate an element volume of each portion (each disc) of the object tissue.
  • By summing these element volumes, a total volume of the object tissue is obtained.
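  • As a rough illustration of the Disk Summation idea described above, the following is a minimal sketch; it assumes each traced outline is given as a list of (x, y) points in its slice plane and that the slices are parallel with a known spacing, and the function names are made up for this example.

```python
import numpy as np

def polygon_area(points):
    """Cross-sectional area of one traced outline (closed loop), via the shoelace formula."""
    x, y = np.asarray(points, dtype=float).T
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def disk_summation_volume(outlines, slice_spacing):
    """Sum of (cross-sectional area x slice thickness) over all slices (element volumes)."""
    return sum(polygon_area(p) * slice_spacing for p in outlines)

# Example: three traced slices of a roughly circular tissue, 2 mm apart.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
outlines = [np.c_[r * np.cos(theta), r * np.sin(theta)] for r in (9.0, 10.0, 9.5)]
print(disk_summation_volume(outlines, slice_spacing=2.0))  # approximate volume in mm^3
```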
  • the volume of the object tissue cannot be obtained with a high precision unless the number of slice data items (i.e. the number of cross sections to be measured) is increased.
  • The burden on the user increases when manual tracing is required with respect to a great number of slice data items. While application of automatic tracing in place of manual tracing with respect to each cross section may be considered, if the outline includes an unclear portion, automatic tracing cannot be performed or the reliability of the tracing results is decreased.
  • JP2004-159997A describes processing for extracting a tissue (cardiac chamber or the like) existing within a three-dimensional space.
  • JP2002-224116A describes extraction of a tissue by using an outline model.
  • JP11-164834A describes technology of correcting a manual tracing result.
  • JP10-305033A, JP2002-330968A, JP2000-107183A, and JP2000-296129 all describe technology of calculating a volume of a tissue.
  • the present invention advantageously provides an apparatus and a method which enables improving the precision of specification or measurement of an object tissue without imposing a significant burden on a user.
  • An ultrasound diagnostic apparatus includes a data acquisition section which performs transmission and reception of ultrasound with respect to a three-dimensional space including an object tissue to acquire volume data; a representative cross section processing section which, when a plurality of manual tracing lines are formed on a plurality of representative cross sections which are set with regard to the object tissue, applies first correction processing to each manual tracing line; a non-representative cross section processing section which forms, by interpolation processing based on a plurality of manual tracing lines to which the first correction processing has been applied, a plurality of interpolation tracing lines on a plurality of non-representative cross sections which are set with regard to the object tissue, and applies second correction processing to each interpolation tracing line; and a unit which extracts object tissue data from the volume data or calculates a volume of the object tissue, based on rows of tracing lines formed of the plurality of manual tracing lines to which the first correction processing has been applied and the plurality of interpolation tracing lines to which the second correction processing has been applied.
  • a set of cross sections formed of a plurality of cross sections is set with respect to an object tissue (data corresponding to an object tissue) within volume data.
  • the set of cross sections is preferably set so as to cover a whole object tissue.
  • the set of cross sections may correspond to a set of scanning planes formed of a plurality of scanning planes or may be set independently from the set of scanning planes.
  • intervals between individual cross sections may be uniform or non-uniform.
  • In a portion in which the shape of the object tissue changes significantly, for example, the density of cross sections may be increased.
  • the set of cross sections is formed by including a plurality of representative cross sections and a plurality of non-representative cross sections.
  • a manual tracing line is formed on each representative cross section.
  • the first correction processing includes determination of necessity of correction and execution of correction.
  • For each manual tracing line, a determination is made at least as to whether or not correction is necessary.
  • each manual tracing line is automatically corrected as required. For example, automatic tracing can be applied with regard to a portion of an object tissue having a clear outline (boundary).
  • each interpolation tracing line is automatically formed by interpolation processing.
  • a second correction processing is applied to each interpolation tracing line.
  • The second correction processing, similar to the first correction processing described above, includes determination of necessity of correction and execution of correction. Specifically, each interpolation tracing line is corrected based on the actual tissue outline as required, thereby obtaining an interpolation tracing line which has been corrected.
  • a surface shape of the object tissue can be imitated, and a three-dimensional image representing the surface of the object tissue can be formed or a volume of the object tissue can be calculated.
  • the representative cross section processing section includes a selection section which selects the plurality of representative cross sections from a set of cross sections which is set with respect to the object tissue, a unit which receives a tracing input from a user with respect to each representative cross section which is selected, and a first correction processing section which, for the first correction processing, determines whether or not correction can be applied for each point on each manual tracing line and corrects a position of a point which is determined to be correctable based on an actual tissue outline.
  • the representative cross sections are designated automatically or designated by a user.
  • The number of cross sections forming a set of cross sections or the number of representative cross sections in a set of cross sections is designated automatically or designated by a user.
  • Tracing of a tissue outline by a user is normally performed on a tomographic image. While there are cases wherein fine unevenness cannot be sufficiently traced when drawing manual tracing lines, as long as the tissue outline is clear (or distinguishable), the fine unevenness can be traced with high precision by automatically correcting the manual tracing results (i.e. additional application of automatic tracing). Alternatively, when it is not necessary to perform faithful manual tracing with respect to such fine unevenness, the burden on a user can be reduced.
  • the first correction processing section determines whether or not correction can be performed by setting a cross line with regard to each point and performing edge detection on the cross line.
  • In the edge detection, it is preferable that changes in the brightness value (a differential value) or the like are considered.
  • the non-representative cross section processing section includes a unit which forms an interpolation tracing line on each non-representative cross section by interpolation processing based on a plurality of manual tracing lines to which the first correction processing has been applied, and a second correction processing section which, for the second correction processing, determines whether or not correction can be performed for each point on each interpolation tracing line and corrects a position of a point which is determined to be correctable based on an actual tissue outline.
  • the interpolation tracing lines are automatically generated on the respective non-representative cross sections, and then each interpolation tracing line is corrected based on the actual tissue outline, so that the tracing results can be obtained with high precision.
  • the burden on the user can be lessened. Further, because the interpolation tracing lines are not only generated but also corrected based on the actual tissue outline, reliability of the interpolation processing results can be increased.
  • the second correction processing section determines whether or not correction can be performed by setting a cross line with regard to each point and performing edge detection on the cross line.
  • the ultrasound diagnostic apparatus includes a unit which receives designation of a base line extending through the object tissue, and a unit which sets a set of cross sections which are arranged in a direction of the base line and which are orthogonal to the base line, and the plurality of representative cross sections and the plurality of non-representative cross sections are determined from the set of cross sections.
  • a set of cross sections is preset prior to tracing, and some cross sections forming the set are designated as the representative cross sections, respectively, and other cross sections are designated as the non-representative cross sections, respectively.
  • a plurality of representative cross sections and a plurality of non-representative cross sections are set stepwise, so that a set of cross sections is consequently defined.
  • a method includes storing volume data acquired by performing transmission and reception of ultrasound with respect to a three-dimensional space including an object tissue; receiving a plurality of manual tracing operations with respect to a plurality of representative cross sections which are set with regard to the object tissue; performing correction processing by determining whether or not position correction can be performed, based on an actual tissue outline, with regard to each point on each manual tracing line formed on each of the representative cross sections and each point on each interpolation tracing line formed on each of non-representative cross sections which are set with regard to the object tissue and correcting a position of a point which is determined to be correctable; and extracting the object tissue or calculating a volume of the object tissue based on a row of a plurality of interpolation tracing lines and a plurality of manual tracing lines to which the correction processing has been applied.
  • An ultrasound diagnostic apparatus includes a data acquisition section which performs transmission and reception of ultrasound with respect to a three-dimensional space including an object tissue to acquire volume data; a representative cross section processing section which, when a plurality of manual tracing lines are formed on a plurality of representative cross sections which are set with regard to the object tissue, applies first correction processing to each manual tracing line; a similarity calculation section which calculates similarity for each pair of adjacent manual tracing lines in a plurality of manual tracing lines before or after the first correction processing; an additional representative cross section setting section which adds one or a plurality of additional representative cross sections in addition to the plurality of representative cross sections based on the similarity for each pair of adjacent manual tracing lines; an additional representative cross section processing section which applies the first correction processing to one or a plurality of manual tracing lines formed on the one or a plurality of additional representative cross sections; a non-representative cross section processing section which forms, by interpolation processing based on the plurality of manual tracing lines to which the first correction processing has been applied, a plurality of interpolation tracing lines on a plurality of non-representative cross sections which are set with regard to the object tissue, and applies second correction processing to each interpolation tracing line.
  • measurement of an object tissue can be performed by a combination of manual tracing, interpolation processing, and automatic tracing.
  • one or a plurality of additional representative cross sections are set as required between a pair of adjacent representative cross sections.
  • one or a plurality of manual tracing lines can be additionally generated between the pair of adjacent manual tracing lines.
  • The representative cross sections can be set with a high density in a portion of an object tissue where the shape changes significantly; because the manual tracing lines are accordingly formed with a high density there, the precision in measuring the object tissue can be increased.
  • the number of the representative cross sections (the original representative cross sections and the additional representative cross sections) can be minimized, so that an increase in the burden imposed by the manual tracing can be prevented.
  • the calculation of similarity may be performed with regard to a plurality of original manual tracing lines prior to the first correction processing or with regard to a plurality of original manual tracing lines after the first correction processing.
  • the first correction processing is applied to all the manual tracing lines.
  • In this case, the first correction processing is performed stepwise: it is first applied to the plurality of original manual tracing lines and then applied to the one or a plurality of additional manual tracing lines.
  • the similarity calculation section calculates the similarity by performing cross correlation operation between two manual tracing lines forming each pair of adjacent manual tracing lines.
  • Through the cross correlation operation, information representing the degree of similarity of the outline shape between adjacent manual tracing lines can be obtained.
  • the additional representative cross section setting section adds the one or a plurality of additional representative cross sections between a pair of adjacent representative cross sections for which the similarity satisfies a predetermined addition condition.
  • the additional representative cross section setting section determines the number of additional representative cross sections to be added in accordance with the degree of the similarity.
  • A method includes storing volume data obtained by performing transmission and reception of ultrasound with respect to a three-dimensional space including an object tissue; receiving a plurality of manual tracing operations with respect to a plurality of representative cross sections which are set with regard to the object tissue; when one or a plurality of additional representative cross sections are set in addition to the plurality of representative cross sections, receiving one or a plurality of manual tracing operations with respect to the one or a plurality of additional representative cross sections; performing correction processing by determining whether or not position correction can be performed, based on an actual tissue outline, with regard to each point on each manual tracing line formed on each of the representative cross sections, on each manual tracing line formed on each of the additional representative cross sections, and on each interpolation tracing line formed on each of non-representative cross sections which are set with regard to the object tissue and correcting a position of a point which is determined to be correctable; and extracting the object tissue or calculating a volume of the object tissue based on a row of tracing lines formed of the plurality of manual tracing lines and the plurality of interpolation tracing lines to which the correction processing has been applied.
  • Fig. 1 is a block diagram showing the overall structure of an ultrasound diagnosis apparatus according to an embodiment of the present invention.
  • This ultrasound diagnosis apparatus is intended for use in the medical field, and, in particular, has a function of extracting an object tissue within a living body and calculating the volume thereof.
  • the object tissue may include a placenta, a malignant tumor, a gallbladder, a thyroid gland, and so on.
  • processing of extracting a placenta as an object tissue will be described.
  • a 3D (three-dimensional) probe 10 is an ultrasound transmitter/receiver which is used in contact with a body surface or used in a state where it is inserted into cavities of humans.
  • the 3D probe 10 includes a 2D (two-dimensional) array transducer which is composed of a plurality of transducer elements arranged in the first and second directions.
  • An ultrasound beam is formed by the 2D array transducer and is scanned two-dimensionally, so that a three-dimensional echo data capturing space is formed as a three-dimensional space.
  • the three-dimensional space is formed as an assembly of a plurality of scanning planes each formed by scanning an ultrasound beam one dimensionally.
  • a mechanism for mechanically scanning a 1D array transducer may be provided, in place of the 2D array transducer, for forming a similar three-dimensional space.
  • a transmission section 12 functions as a transmitting beam former.
  • the transmission section 12 supplies a plurality of transmitting signals which are delayed to the 2D array transducer, and thus a transmitting beam is formed. Reflected waves from the living body are received by the 2D array transducer, which thereby outputs a plurality of receiving signals to a receiving section 14.
  • the receiving section 14 executes alignment and summation processing with respect to the plurality of receiving signals, thereby outputting a receiving signal (beam data) having been subjected to alignment and summation processing.
  • the receiving signal is then subjected to predetermined signal processing including detection, logarithmic transformation, and so on.
  • The beam data, which is a receiving signal having been subjected to the signal processing, is stored in a 3D data memory 16.
  • the 3D data memory 16 includes a three-dimensional memory space corresponding to a three-dimensional space which is a transmission/reception space within a living body.
  • coordinate transformation with regard to each data is executed.
  • the volume data is an assembly of a plurality of frame data items (a plurality of slice data items) corresponding to a plurality of scanning planes.
  • Each frame data item is composed of a plurality of beam data items, and each beam data item is formed of a plurality of echo data items arranged in the depth direction.
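  • Purely as an illustration of this data organization (the actual memory layout of the apparatus is not specified here, and the array dimensions below are assumed values), the volume data can be pictured as a three-dimensional array indexed by scanning plane, beam, and depth:

```python
import numpy as np

# Assumed dimensions: 64 scanning planes (frames), 96 beams per plane,
# and 512 echo samples per beam along the depth direction.
n_frames, n_beams, n_depth = 64, 96, 512

# Each frame data item is a set of beam data items; each beam data item is a
# row of echo data items along the depth direction.
volume = np.zeros((n_frames, n_beams, n_depth), dtype=np.uint8)

frame = volume[10]          # one slice data item (one scanning plane)
beam = volume[10, 20]       # one beam data item
echo = volume[10, 20, 300]  # a single echo value at a given depth
```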
  • the 3D data memory 16 and each of the elements which will be described after this 3D data memory 16 can be implemented as special-use hardware or as a software function.
  • A three-dimensional image forming section 18 executes image processing, for example by a volume rendering method, based on the volume data stored in the 3D data memory 16, thereby forming a three-dimensional ultrasound image.
  • the image data thus obtained is transmitted to a display processing section 26.
  • An arbitrary tomographic image forming section 20 forms a tomographic image corresponding to an arbitrary cross section within a three-dimensional space which is set by a user.
  • A data array corresponding to the arbitrary cross section is read from the 3D data memory 16, so that, based on the data array thus read, a B mode image is formed as an arbitrary tomographic image.
  • the resulting image data is then transmitted to the display processing section 26.
  • A tissue extraction section 22, which is a module that executes image processing specific to the present embodiment and which will be described in detail below, extracts an object tissue (object tissue data) existing within the three-dimensional space, i.e. the volume data space, by using trace processing.
  • In the trace processing, manual tracing processing and interpolation processing are used in combination, and automatic correction processing is also applied to the respective processing results.
  • the object tissue data thus extracted is transmitted to the display processing section 26 and used for image display of the object tissue, and, in the present embodiment, is also transmitted to a volume calculation section 24.
  • The volume calculation section 24 is a module which obtains a volume of an object tissue by using the volume calculation method (Disk Summation Method) described above. More specifically, as a row of tracing lines formed of a plurality of closed loops is formed for the whole object tissue by the tissue extraction section 22, the volume calculation section 24 obtains the volume of the object tissue in an approximate manner based on these tracing lines. In this case, the distance between adjacent closed loops, i.e. between adjacent cross sections, is also used. The data of the volume values thus calculated is transmitted to the display processing section 26.
  • the volume calculation method includes, in addition to the Disk Summation Method described above, Average Rotation Method, and so on.
  • Each of the modules, i.e. the three-dimensional image forming section 18, the arbitrary tomographic image forming section 20, the tissue extraction section 22, and so on, functions in accordance with an operation mode selected by a user, and data corresponding to the respective modes are input to the display processing section 26.
  • The display processing section 26 performs image combining processing, coloring processing, and so on, with respect to the input data, and outputs image data which is the processing result.
  • A three-dimensional ultrasound image, an arbitrary tomographic image, and so on, as well as a three-dimensional image of the extracted tissue and the volume thereof, are displayed on the display section 28.
  • a control section 30 controls the operation of each section shown in Fig. 1, and particularly controls the tissue extraction processing and the volume calculation described above based on the parameters set by a user through an input section 32.
  • the control section 30 also controls data writing to the 3D data memory 16 and so on.
  • the input section 32 is formed of an operation panel including a keyboard, a trackball, and so on.
  • the control section 30 is formed of a CPU, an operation program, and so on.
  • the three-dimensional image processing, the arbitrary tomographic image forming processing, the tissue extraction processing, and the volume calculation may be executed by a single CPU.
  • FIG. 2 provides a flowchart for the entire process.
  • In step S101, data is collected by using the 3D probe described above.
  • a plurality of scanning planes are formed.
  • Fig. 3 shows sectional data 40 corresponding to one scan plane.
  • The object for observation is a placenta, such as the one whose cross section is shown in Fig. 3 and designated 42A.
  • numeral 41 denotes a fetus
  • numeral 43 denotes a uterus
  • numeral 45 denotes amniotic fluid.
  • the placenta 42A is attached or joined to the inner wall of the uterus 43, and the image on the sectional data 40 at this joining portion, i.e. the boundary portion, shows little difference in brightness and is unclear.
  • Volume data is assembled in the 3D data memory 16 shown in Fig. 1.
  • the sectional data 40 as shown in Fig. 3 is sequentially stored in the 3D data memory 16, whereby a set of volume data is assembled in the memory 16, as shown in Fig. 4.
  • the volume data (or a three-dimensional space) 44 includes a placenta (placental data) 42, with other tissues not shown in Fig. 4.
  • In step S103, while an arbitrary tomographic image is being displayed on the display section, the position of the cross section is approximately adjusted, thereby setting a base cross section 46 as shown in Fig. 5.
  • Because this base cross section is used merely for determining the basis for setting a row of reference cross sections (a set of cross sections), it is sufficient to set the base cross section 46 such that approximately the whole of the placenta 42 is covered; strict positioning of the base cross section 46 is not required.
  • Fig. 6 shows, in an enlarged view, a tomographic image 46A corresponding to the base cross section 46.
  • the placenta 42B is attached to (the cross section of) the uterus 43.
  • In step S104, points at both ends of the placenta 42B, which is the object tissue, are designated by the user on the tomographic image 46A representing the base cross section.
  • the points at both ends are designated with numerals 50 and 52.
  • a straight line connecting these points is a base line 54.
  • In step S104, the number m of cross sections forming a row of reference cross sections, which will be described below, is also set.
  • A row of reference cross sections 56 is then automatically generated with regard to the volume data 44 corresponding to a three-dimensional space, as shown in Fig. 8.
  • the row of reference cross sections 56 is formed as a plurality of cross sections orthogonal to the base line shown in Fig. 7, and is specifically formed of a plurality of cross sections arranged at equal or non-equal intervals from one end point 50 to the other end point 52. More specifically, the row of reference cross sections 56 includes a plurality of manual tracing reference cross sections 58 and a plurality of automatic tracing reference cross sections 60.
  • the manual tracing reference cross sections 58 are formed in a predetermined number n which is in the range of 5 to 10, for example.
  • The manual tracing reference cross sections may additionally be set such that their number exceeds n.
  • the manual tracing reference cross section corresponds to a representative cross section.
  • Because manual tracing by the user is required only on these n cross sections, the burden of the user can be significantly lessened.
  • On the individual automatic tracing reference cross sections 60, on the other hand, tracing lines are automatically generated by interpolation processing, as will be described below.
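  • A sketch of how such a row of reference cross sections could be set up is given below; the function and variable names are hypothetical, and equal spacing along the base line as well as an evenly spread choice of representative cross sections are assumptions made for this example only.

```python
import numpy as np

def reference_cross_sections(p_start, p_end, m, n):
    """Place m planes, each orthogonal to the base line from p_start to p_end, at equal
    intervals between the two end points, and flag n of them as manual tracing
    reference (representative) cross sections."""
    p_start, p_end = np.asarray(p_start, float), np.asarray(p_end, float)
    normal = (p_end - p_start) / np.linalg.norm(p_end - p_start)   # base-line direction
    centers = [p_start + t * (p_end - p_start) for t in np.linspace(0.0, 1.0, m)]
    representative = np.linspace(0, m - 1, n).round().astype(int)  # evenly chosen subset
    return centers, normal, set(representative.tolist())

centers, normal, rep_idx = reference_cross_sections((0, 0, 0), (0, 0, 80.0), m=40, n=8)
# Planes whose index is in rep_idx are traced manually; the remaining planes receive
# interpolation tracing lines later.
```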
  • In step S105 of Fig. 2, the n reference cross sections, which are selected automatically or by the user, are displayed as tomographic images on a screen.
  • Fig. 9 shows n reference cross sections 58.
  • Fig. 10 shows a plurality of tomographic images 58A corresponding to the n reference cross sections 58, which form an image list (an image collection) 64.
  • The plurality of tomographic images 58A may be displayed sequentially one by one or displayed simultaneously.
  • In step S106, manual tracing is performed with respect to the individual tomographic images 58A (see Fig. 10). Specifically, the user, while observing an image, traces the outline, i.e. the boundary, of the placenta by using the input section. The result of tracing is shown in Fig. 11.
  • numeral 42B denotes a cross section of the placenta
  • numeral 42C denotes the outline of the placenta.
  • the outline includes an unclear portion in contact with the uterus and a relatively clear portion in contact with the amniotic fluid. Even an unclear portion can be traced to a certain degree due to human visual judgment and empirical understanding.
  • a manual tracing line denoted by numeral 66 is formed by tracing the outline 42C as a closed loop.
  • The manual tracing line 66 includes a portion 66B located on the uterus 43 side and a portion 66A located on the amniotic fluid 45 side. In both portions 66A and 66B, precise or faithful tracing cannot be achieved with respect to fine unevenness of the outline, and only its approximate shape is imitated as a loop line.
  • In step S107 shown in Fig. 2, automatic correction of the manual tracing line is performed as shown in Fig. 12.
  • This correction processing is performed for each manual tracing line, i.e. for each manual tracing reference cross section serving as a representative cross section.
  • edge detection processing is applied to each point on the manual tracing line with respect to its peripheral portion, and, with regard to a point for which an edge has been detected, processing of shifting that point to the position on the edge is performed. With regard to a point for which no edge is detected, the manual tracing result is maintained.
  • an automatic tracing line portion 68A obtained by faithful tracing along the outline 42C is formed with regard to the line portion on the amniotic fluid 45 side, as shown in Fig. 12.
  • With regard to the portion of the line on the uterus 43 side, because automatic tracing cannot be appropriately performed due to the small difference in brightness, the original tracing result, which is the manual tracing line 68B, is retained.
  • In this case, the manual tracing result is preferentially adopted in consideration of errors which could result from automatic tracing due to the lack of clarity of the image, even though a finely uneven shape cannot be faithfully traced by the manual tracing.
  • Fig. 13 shows a specific method of correcting a point.
  • With regard to a noted point 72 on the manual tracing line 66, a direction 74 which is orthogonal to the tracing line 66 is defined, and edge detection is executed along this direction 74 in both directions from the point 72.
  • the range of edge detection is indicated by numeral 76.
  • In the edge detection, a differential value is calculated at each detection position, and a position is determined to be an edge when the differential value at that position exceeds a predetermined value.
  • When an edge is detected, updating processing for shifting the point 72 to a new position 78 on the edge is performed.
  • a tracing line portion which coincides with the outline can be generated.
  • While the differential processing is applied as described above, various other processing can also be applied as long as an edge can be detected.
  • The direction of edge detection may be any of various directions other than the orthogonal direction. For example, when a center point of a tissue is known, a straight line connecting the noted point and the center point may be defined as a detection direction.
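  • The following is a minimal sketch of this per-point correction, assuming the tomographic image is a 2-D brightness array and the tracing line a closed sequence of (x, y) pixel coordinates; the search range, the threshold, and the simple finite-difference edge measure are illustrative choices rather than the apparatus's actual implementation.

```python
import numpy as np

def correct_tracing_line(image, points, search_range=8, edge_threshold=25.0):
    """For each point, search along the local normal (the cross line) of the tracing line;
    if a brightness differential exceeding the threshold is found, snap the point to that
    edge position, otherwise keep the original (manual or interpolated) position."""
    pts = np.asarray(points, dtype=float)
    corrected = pts.copy()
    n = len(pts)
    for i in range(n):
        tangent = pts[(i + 1) % n] - pts[(i - 1) % n]       # closed-loop neighbours
        tangent /= (np.linalg.norm(tangent) + 1e-9)
        normal = np.array([-tangent[1], tangent[0]])        # cross line direction
        best_offset, best_grad = None, edge_threshold
        for s in range(-search_range, search_range + 1):
            px, py = pts[i] + s * normal
            x, y = int(round(px)), int(round(py))
            if 1 <= x < image.shape[1] - 1 and 1 <= y < image.shape[0] - 1:
                grad = abs(float(image[y, x + 1]) - float(image[y, x - 1])) \
                     + abs(float(image[y + 1, x]) - float(image[y - 1, x]))
                if grad > best_grad:                        # strongest edge on the cross line
                    best_grad, best_offset = grad, s
        if best_offset is not None:
            corrected[i] = pts[i] + best_offset * normal
    return corrected
```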
  • In step S108 shown in Fig. 2, automatic tracing processing is executed with regard to the plurality of automatic tracing reference cross sections shown in Fig. 8.
  • A plurality of composite tracing lines 68, which are the manual tracing lines to which the first correction processing has been applied and which are formed on the plurality of manual tracing reference cross sections 58, are used as a basis.
  • Based on these composite tracing lines, interpolation processing is performed, so that a tracing plane 70, obtained by coupling a plurality of closed loops into a plane shape, is formed.
  • The interpolation processing is executed such that an interpolation tracing line is generated on each automatic tracing reference cross section 60.
  • Fig. 15 shows sectional data 60 on the automatic tracing reference cross section corresponding to a non-representative cross section.
  • On this cross section, an interpolation tracing line 74, which is a closed loop substantially enclosing the outline 42C of the placenta 42B (the object tissue), is automatically set.
  • This interpolation tracing line 74 is not a result of actual tracing, but is generated by interpolation processing based on two or more composite tracing lines provided on the manual tracing cross sections existing before and after the noted automatic tracing reference cross section. Accordingly, the interpolation tracing line 74 may completely or partially deviate from the actual outline 42C.
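  • One possible way of generating such an interpolation tracing line is sketched below: the two neighbouring composite tracing lines are resampled to a common number of points and interpolated point-wise according to the position of the non-representative cross section between them. The resampling rule and the names are assumptions, and a real implementation would also need to align corresponding points and starting positions on the two loops.

```python
import numpy as np

def resample_closed_loop(points, n_samples=128):
    """Resample a closed tracing line to n_samples points, equally spaced along its length."""
    pts = np.asarray(points, dtype=float)
    closed = np.vstack([pts, pts[:1]])                       # close the loop
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])              # arc length at each vertex
    targets = np.linspace(0.0, s[-1], n_samples, endpoint=False)
    return np.c_[np.interp(targets, s, closed[:, 0]),
                 np.interp(targets, s, closed[:, 1])]

def interpolation_tracing_line(line_before, line_after, weight):
    """Closed loop for a non-representative cross section lying between two representative
    ones; weight is 0.0 at the 'before' section and 1.0 at the 'after' section."""
    a = resample_closed_loop(line_before)
    b = resample_closed_loop(line_after)
    return (1.0 - weight) * a + weight * b
```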
  • In step S109 of Fig. 2, the processing for automatically correcting the tracing lines as described with reference to Figs. 12 and 13 is executed. Specifically, as shown in Fig. 16, a line portion in contact with the amniotic fluid is subjected to the automatic correction, whereas a line portion in contact with the uterus 43 is not subjected to the automatic correction, and the tracing result obtained by means of the interpolation processing is maintained there. Consequently, an interpolation tracing line which is partially corrected is formed.
  • the second correction processing as described above is executed with regard to each of the automatic tracing reference cross sections.
  • As a result, a tracing plane which has been subjected to the automatic correction and which encloses or imitates the whole placenta, i.e. the object tissue, can be defined.
  • the tracing plane 72 is specifically formed of a plurality of closed loops, i.e. a plurality of tracing lines which include the plurality of composite tracing lines (the plurality of manual tracing lines having been subjected to the first correction processing) and the plurality of interpolation tracing lines having been subjected to the second correction processing.
  • In step S111, a three-dimensional image of the extracted tissue is displayed on the screen along with the volume value calculated for the tissue.
  • the manual tracing need not be performed with regard to the non-representative cross sections.
  • Instead, the automatic tracing is applied, and the automatic correction is further applied to the automatic tracing results, so that tracing results with high reliability can be obtained.
  • the user's burden can also be reduced and highly precise calculation can be performed.
  • In the embodiment described above, the base cross section, both end points, the number of representative cross sections, the number of cross sections forming a cross section set, and so on, are set by a user.
  • Alternatively, all or a portion of these items may be set automatically.
  • the values of these items may be dynamically varied depending on the situation.
  • While the plurality of cross sections are basically arranged at equal intervals, it is also possible to reduce the interval between cross sections, i.e. to arrange the cross sections close to each other, in a portion of a tissue in which the change in shape is great, and to increase the interval, i.e. to arrange the cross sections farther apart, in other portions.
  • FIG. 18 is a flowchart showing a process of extracting an object tissue. It should be noted that steps similar to those shown in Fig. 2 are designated by the same numerals in Fig. 18, and that these steps will not be described again.
  • the process shown in Fig. 18 further includes step S200 for generating one or a plurality of additional manual tracing reference cross sections (i.e. one or a plurality of additional manual tracing lines) between the steps S107 and S108, as required.
  • This step S200 aims at increasing the density of the manual tracing operations to thus enhance the tissue extraction precision with regard to a portion with a great change in the shape.
  • Fig. 19 conceptually shows the processing content.
  • An additional manual tracing reference cross section 102 is set between specified pairs of adjacent manual tracing lines (see the pairs 100-1 and 100-2 in Fig. 19).
  • a step corresponding to the addition processing step S200 can be provided between the steps S106 and S107.
  • automatic correction processing (S107) is applied with respect to a group of manual tracing lines including these additional manual tracing lines.
  • A series of steps starting from step S201 is performed for each pair of adjacent manual tracing lines, with the object pair of adjacent manual tracing lines being shifted one by one.
  • A pair of adjacent manual tracing lines is formed of two adjacent manual tracing lines (which are two composite tracing lines when automatic correction has already been applied). Assuming that the number of manual tracing reference cross sections prior to the addition processing is n, (n-1) pairs of adjacent manual tracing lines are formed.
  • In step S201, similarity is calculated with regard to the pair of adjacent manual tracing lines under consideration at that time (i.e. the object pair of adjacent manual tracing lines).
  • For this purpose, a cross-correlation operation, for example, is executed. More specifically, information representing the similarity of the shapes of the two manual tracing lines is obtained. This information serves as a criterion when determining whether or not a manual tracing reference cross section should be added, i.e. whether or not the portion has a significant change in shape.
  • Fig. 20 shows a specific example of similarity calculation.
  • sampling processing is applied to each of two manual tracing lines 102A and 102B which are adjacent to each other, to thereby set rows of points 104A and 104B, respectively.
  • a plurality of point pairs are defined between these rows of points 104A and 104B in accordance with a predetermined rule.
  • each point pair is formed of two points which are in a corresponding relationship (see numeral 106).
  • a variety of suitable rules can be selectively adopted in accordance with the situation.
  • a first point pair is specified for two points which are the most closely adjacent to each other, and then pairing is sequentially performed from the first point pair in the predetermined rotating direction, to finally specify the n-th point pair.
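  • A sketch of this pairing rule under the assumptions stated above (both rows already sampled to comparable numbers of points; the names are illustrative):

```python
import numpy as np

def pair_points(row_a, row_b):
    """Pair two rows of sampled points: start from the two most closely adjacent points,
    then pair the remaining points sequentially in the same rotating direction."""
    a, b = np.asarray(row_a, float), np.asarray(row_b, float)
    dist = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    i0, j0 = np.unravel_index(np.argmin(dist), dist.shape)   # first (closest) point pair
    return [((i0 + k) % len(a), (j0 + k) % len(b)) for k in range(len(a))]
```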
  • Fig. 21 shows a result of such processing. Specifically, as described above, two rows of points 104A and 104B are set on the two manual tracing lines 102A and 102B on the two adjacent manual tracing reference cross sections 58A and 58B, respectively. Then, a centroid is obtained with regard to each of the manual tracing lines 102A and 102B, and a distance between the centroid and each point is calculated. As indicated by numeral 108 in Fig. 21, when the distance to the centroid is represented for each sample point number t with respect to each row of points 104A, 104B, graphs (functions) f_A(t) and f_B(t) are obtained.
  • By performing the cross-correlation operation on f_A(t) and f_B(t), the similarity η can be obtained as a cross-correlation coefficient (similarity η: 0.0 to 1.0):

$$\eta = \frac{\int_0^{N-1} f_A(t)\, f_B(t)\, dt}{\sqrt{\int_0^{N-1} f_A^{2}(t)\, dt \cdot \int_0^{N-1} f_B^{2}(t)\, dt}}$$

  • The method of obtaining the similarity η is not limited to the example described above. In any case, information indicating the similarity of shapes between two tracing lines is obtained.
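  • In code form, the above calculation might look as follows; discrete sums stand in for the integrals, and a simple equal-count resampling stands in for the point-pairing rule described above.

```python
import numpy as np

def centroid_distance_profile(points, n_samples=128):
    """f(t): distance from the tracing line's centroid to each of n_samples sampled points."""
    pts = np.asarray(points, dtype=float)
    idx = np.linspace(0, len(pts) - 1, n_samples).round().astype(int)
    sampled = pts[idx]
    centroid = sampled.mean(axis=0)
    return np.linalg.norm(sampled - centroid, axis=1)

def similarity(line_a, line_b, n_samples=128):
    """Normalized cross-correlation of the two centroid-distance profiles (0.0 to 1.0 for
    these non-negative profiles); plays the role of the similarity eta above."""
    f_a = centroid_distance_profile(line_a, n_samples)
    f_b = centroid_distance_profile(line_b, n_samples)
    return float(np.dot(f_a, f_b) / np.sqrt(np.dot(f_a, f_a) * np.dot(f_b, f_b)))
```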
  • In step S202, whether or not the similarity η described above is equal to or less than a predetermined value is determined. If the similarity η exceeds the predetermined value, the two shapes are determined to be similar, the process proceeds to step S205, and no cross section is added. If the similarity η is equal to or less than the predetermined value, on the other hand, the two shapes are determined to have no similarity, i.e. the change in the shape of the object tissue is determined to be great. Consequently, in step S203, one or a plurality of additional manual tracing reference cross sections are set between the two adjacent manual tracing reference cross sections.
  • the number of cross sections to be added may be one fixed number or may be varied depending on the situation.
  • each of the added manual tracing reference cross sections is displayed as an image, and the user performs manual tracing while observing the image. Then, each manual tracing line is automatically corrected.
  • an additional manual tracing operation may be performed after completion of the determination of necessity of addition with respect to all the pairs of adjacent manual tracing lines.
  • In step S205, it is determined whether or not the above-described determination concerning the addition processing, and any necessary addition processing, have been completed with regard to all the pairs of adjacent manual tracing lines; if there are any unprocessed pairs of adjacent manual tracing lines, the above processing is repeated with respect to these unprocessed pairs.
  • In step S203, several methods of adding a manual tracing reference cross section may be applied, as will be described below with reference to Figs. 22 and 23. It should be noted that the process shown in Fig. 18 should be adapted in accordance with the addition method to be employed.
  • Fig. 22 shows three types of addition methods.
  • In the method shown in the upper level of Fig. 22, the single predetermined value (threshold value) described above is used. When the similarity η is equal to or less than the predetermined value, only one manual tracing reference cross section is added. The upper level of Fig. 22 further shows the condition and a specific example of this method: one reference cross section is added in each of the space between cross sections B and C and the space between cross sections C and D (see the hatched portions).
  • In the method shown in the middle level of Fig. 22, a plurality of different threshold values are used. As indicated as conditions in the table, either 1, 2, or 3 is selected as the number of cross sections to be added for each range of the threshold values, and no reference cross section is added if the similarity exceeds the maximum threshold value. In the specific example, one reference cross section is added between B and C, and two reference cross sections are added between C and D.
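  • A small sketch of such a threshold-based rule is given below; the threshold values themselves are placeholders, since the actual values in the table of Fig. 22 are not reproduced here.

```python
def additions_from_thresholds(eta, thresholds=(0.4, 0.6, 0.8)):
    """Map a similarity value to a number of additional cross sections using three
    threshold ranges (placeholder values)."""
    t1, t2, t3 = thresholds
    if eta <= t1:
        return 3          # very dissimilar shapes: add three cross sections
    if eta <= t2:
        return 2
    if eta <= t3:
        return 1
    return 0              # similar enough: no cross section added
```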
  • In the method shown in the lower level of Fig. 22, the number of cross sections to be added is determined in advance as k, and the additional cross sections in the number k are dynamically distributed. This method will be described with reference to Fig. 23. It is first assumed that an increment Δη has been obtained from (η_MAX - η_MIN)/k, where η_MAX indicates the maximum value and η_MIN the minimum value of all the η values in the initial state.
  • First, the cross section pair having the minimum η value is specified among all the cross section pairs, and one additional cross section is inserted there. In the illustrated example, insertion of one additional cross section between C and D is determined.
  • Then, the increment Δη is added to the minimum η (i.e. the original η between C and D), so that the minimum value is raised at this point in time. Subsequently, the η having the minimum value is specified again, and the procedure is repeated until the k additional cross sections have been distributed.
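  • The dynamic distribution described above can be sketched as follows; the similarity list and the returned counts are illustrative only.

```python
def distribute_additional_sections(etas, k):
    """Distribute k additional manual tracing reference cross sections over the pairs of
    adjacent representative cross sections: repeatedly pick the pair whose similarity eta
    is currently smallest and raise its eta by delta = (eta_max - eta_min) / k."""
    etas = list(etas)
    delta = (max(etas) - min(etas)) / k
    additions = [0] * len(etas)
    for _ in range(k):
        i = min(range(len(etas)), key=lambda j: etas[j])  # pair with the least similar shapes
        additions[i] += 1
        etas[i] += delta
    return additions

# Example with the pairs A-B, B-C and C-D of Figs. 22 and 23 and k = 3:
print(distribute_additional_sections([0.9, 0.6, 0.3], k=3))  # -> [0, 1, 2]
```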
  • any appropriate method may be selected in accordance with a particular situation.
  • The method shown in the lower level of the table in Fig. 22 is advantageous in that, within a limited total number of cross sections to be added, more cross sections can be added at the positions where they should preferentially be added. According to the methods shown in the upper and middle levels, on the other hand, simpler processing can be expected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)
EP07020529.9A 2006-11-14 2007-10-19 Ultraschall-Diagnosegerät und Verfahren zur Verarbeitung von Volumendaten Not-in-force EP1923839B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006307663 2006-11-14
JP2007104733A JP4931676B2 (ja) 2006-11-14 2007-04-12 超音波診断装置及びボリュームデータ処理方法

Publications (2)

Publication Number Publication Date
EP1923839A1 true EP1923839A1 (de) 2008-05-21
EP1923839B1 EP1923839B1 (de) 2016-07-27

Family

ID=39111041

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07020529.9A Not-in-force EP1923839B1 (de) 2006-11-14 2007-10-19 Ultraschall-Diagnosegerät und Verfahren zur Verarbeitung von Volumendaten

Country Status (2)

Country Link
US (1) US8216144B2 (de)
EP (1) EP1923839B1 (de)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5632840B2 (ja) * 2008-08-12 2014-11-26 コーニンクレッカ フィリップス エヌ ヴェ 超音波イメージング・システムにおける体積のメッシュ作成及び計算の方法、装置及びコンピュータ・プログラム
JP5683868B2 (ja) * 2009-10-08 2015-03-11 株式会社東芝 超音波診断装置、超音波画像処理装置、超音波画像処理方法、及び超音波画像処理プログラム
KR101805619B1 (ko) * 2011-01-25 2017-12-07 삼성전자주식회사 3차원 의료 영상으로부터 최적의 2차원 의료 영상을 자동으로 생성하는 방법 및 장치
JP5325951B2 (ja) * 2011-08-17 2013-10-23 日立アロカメディカル株式会社 超音波データ処理装置
JP5857061B2 (ja) * 2011-10-03 2016-02-10 株式会社日立製作所 画像処理装置および画像処理方法
WO2013094205A1 (ja) * 2011-12-21 2013-06-27 パナソニック株式会社 超音波診断装置および輪郭抽出方法
JP6200249B2 (ja) * 2013-09-11 2017-09-20 キヤノン株式会社 情報処理装置、情報処理方法
KR20150082945A (ko) * 2014-01-08 2015-07-16 삼성메디슨 주식회사 초음파 진단 장치 및 그 동작방법
KR101665124B1 (ko) 2014-08-25 2016-10-12 삼성메디슨 주식회사 초음파 영상장치 및 그 제어방법
EP3454759B1 (de) * 2016-05-12 2021-08-18 Fujifilm Sonosite, Inc. Systeme und verfahren zur bestimmung der abmessungen von strukturen in medizinischen bildern
CN118356213A (zh) * 2018-11-09 2024-07-19 深圳迈瑞生物医疗电子股份有限公司 一种超声图像获取方法、系统和计算机存储介质
CN112426170A (zh) * 2020-11-19 2021-03-02 深圳开立生物医疗科技股份有限公司 一种胎盘厚度确定方法、装置、设备及存储介质
CN112932556A (zh) * 2021-04-02 2021-06-11 吉林大学第一医院 一种智能辅助羊膜穿刺系统

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10305033A (ja) 1997-05-08 1998-11-17 Toshiba Iyou Syst Eng Kk 超音波画像処理装置及び臓器の容積算出用プログラムを記録した記録媒体
JP3283456B2 (ja) * 1997-12-08 2002-05-20 オリンパス光学工業株式会社 超音波画像診断装置及び超音波画像処理方法
US6545678B1 (en) * 1998-11-05 2003-04-08 Duke University Methods, systems, and computer program products for generating tissue surfaces from volumetric data thereof using boundary traces
JP2000296129A (ja) 1999-04-13 2000-10-24 Olympus Optical Co Ltd 超音波診断装置
JP4614548B2 (ja) 2001-01-31 2011-01-19 パナソニック株式会社 超音波診断装置
JP4149177B2 (ja) 2001-03-05 2008-09-10 松下電器産業株式会社 超音波診断装置及び画像処理装置
JP3944059B2 (ja) 2002-11-14 2007-07-11 アロカ株式会社 超音波診断装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030198372A1 (en) * 1998-09-30 2003-10-23 Yoshito Touzawa System for accurately obtaining a contour and/or quantitative information from an echo image with reduced manual operation
WO2001001864A1 (en) * 1999-07-02 2001-01-11 The Cleveland Clinic Foundation Intravascular ultrasonic analysis using active contour method and system
EP1720039A2 (de) * 2005-04-26 2006-11-08 Biosense Webster, Inc. Darstellung eines zweidimensionalen fächerförmigen Ultraschallfeldes

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BARONI MAURIZIO ET AL: "Contour definition and tracking in cardiac imaging through the integration of knowledge and image evidence.", ANNALS OF BIOMEDICAL ENGINEERING MAY 2004, vol. 32, no. 5, May 2004 (2004-05-01), pages 688 - 695, XP002471673, ISSN: 0090-6964 *
LANDRY ANTHONY ET AL: "Measurement of carotid plaque volume by 3-dimensional ultrasound.", STROKE; A JOURNAL OF CEREBRAL CIRCULATION APR 2004, vol. 35, no. 4, April 2004 (2004-04-01), pages 864 - 869, XP002471674, ISSN: 1524-4628 *
TAMURA S ET AL: "Plan-Based Boundary Extraction and 3-D Reconstruction for orthogonal 2-D Echocardiography", PATTERN RECOGNITION, ELSEVIER, GB, vol. 20, no. 2, 1987, pages 155 - 162, XP009018444, ISSN: 0031-3203 *
VARSHA SAMPATH: "Transrectal ultrasound image processing for brachytherapy applications", 25 September 2006, ROCHESTER, XP002471675 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110028841A1 (en) * 2009-07-30 2011-02-03 Medison Co., Ltd. Setting a Sagittal View In an Ultrasound System
US20110028842A1 (en) * 2009-07-30 2011-02-03 Medison Co., Ltd. Providing A Plurality Of Slice Images In An Ultrasound System
EP2281510A1 (de) 2009-07-30 2011-02-09 Medison Co., Ltd. Bereitstellung mehrere Slice-Bilder in einem Ultraschallsystem
EP2281509A1 (de) 2009-07-30 2011-02-09 Medison Co., Ltd. Einstellung einer Sagittalansicht in einem Ultraschallsystem
KR101117916B1 (ko) 2009-07-30 2012-02-24 삼성메디슨 주식회사 새지털 뷰를 검출하는 초음파 시스템 및 방법
US9216007B2 (en) 2009-07-30 2015-12-22 Samsung Medison Co., Ltd. Setting a sagittal view in an ultrasound system

Also Published As

Publication number Publication date
EP1923839B1 (de) 2016-07-27
US8216144B2 (en) 2012-07-10
US20080114244A1 (en) 2008-05-15

Similar Documents

Publication Publication Date Title
EP1923839B1 (de) Ultraschall-Diagnosegerät und Verfahren zur Verarbeitung von Volumendaten
US9901323B2 (en) Aberration correction using channel data in ultrasound imaging system
JP4931676B2 (ja) 超音波診断装置及びボリュームデータ処理方法
US7520857B2 (en) 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US7744534B2 (en) 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US7041059B2 (en) 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US7632230B2 (en) High resolution elastography using two step strain estimation
EP1973076B1 (de) Ultraschallsystem und Verfahren zum Erzeugen von Ultraschallbildern
US20040127796A1 (en) 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US20060079777A1 (en) Ultrasonic image boundary extracting method, ultrasonic image boundary extracting apparatus, and ultrasonic imaging apparatus
JP2006006932A (ja) 超音波造影用プロトコルを規定する方法及び装置
US9402600B2 (en) 3-dimensional elastic image generation method and ultrasonic diagnostic apparatus
CN102247171A (zh) 超声波诊断装置、超声波图像处理装置以及医用图像诊断装置
KR20080019186A (ko) 영상 처리 시스템 및 방법
CN102415902A (zh) 超声波诊断装置以及超声波图像处理装置
US8300909B2 (en) Ultrasonographic device and ultrasonographic method
US7223240B2 (en) Ultrasonic diagnostic apparatus
CN102549450A (zh) 超声无回声成像
KR101120726B1 (ko) 복수의 슬라이스 단면 영상을 제공하는 초음파 시스템 및 방법
US20120108962A1 (en) Providing a body mark in an ultrasound system
KR101652728B1 (ko) 초음파 영상 화질 개선 방법 및 이를 이용한 초음파 영상 장치
EP0934724A1 (de) Diagnostiches Ultraschallgerät
KR100885435B1 (ko) 영상 처리 시스템 및 방법
JP5460484B2 (ja) 超音波データ処理装置
JP5490649B2 (ja) 超音波データ処理装置

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

17P Request for examination filed

Effective date: 20081024

AKX Designation fees paid

Designated state(s): DE FR GB IT

17Q First examination report despatched

Effective date: 20110411

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: HITACHI ALOKA MEDICAL, LTD.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160308

RIN1 Information on inventor provided before grant (corrected)

Inventor name: MURASHITA, MASARU

Inventor name: NAKAMURA, MASAHI

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB IT

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602007047148

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20161011

Year of fee payment: 10

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602007047148

Country of ref document: DE

Owner name: HITACHI LTD, JP

Free format text: FORMER OWNER: HITACHI ALOKA MEDICAL, LTD., MITAKA-SHI, TOKYO, JP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602007047148

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20161027

26N No opposition filed

Effective date: 20170502

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20170630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161102

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161027

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 602007047148

Country of ref document: DE

Representative's name: WUNDERLICH & HEIM PATENTANWAELTE PARTNERSCHAFT, DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602007047148

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180501