JP5726081B2 - Ultrasonic diagnostic apparatus and elasticity image classification program - Google Patents

Ultrasonic diagnostic apparatus and elasticity image classification program

Info

Publication number
JP5726081B2
JP5726081B2 (application JP2011531908A)
Authority
JP
Japan
Prior art keywords: frame data, elastic, image, plurality, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2011531908A
Other languages
Japanese (ja)
Other versions
JPWO2011034005A1 (en)
Inventor
明子 外村
康治 脇
Original Assignee
株式会社日立メディコ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2009214115
Application filed by 株式会社日立メディコ (Hitachi Medical Corporation)
Priority to PCT/JP2010/065623 (WO2011034005A1)
Priority to JP2011531908A (JP5726081B2)
Publication of JPWO2011034005A1
Application granted
Publication of JP5726081B2
Application status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0858: Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving measuring tissue layers, e.g. skin, interfaces
    • A61B 8/48: Diagnostic techniques
    • A61B 8/485: Diagnostic techniques involving measuring strain or elastic properties

Description

  The present invention relates to an ultrasonic diagnostic apparatus, an elasticity image classification method, and an elasticity image classification program, and in particular to a technique for classifying an elasticity image, generated from elasticity information indicating the hardness or softness of the tissue on a tomographic plane of a subject, into one of a plurality of stages set according to the degree of progression of a disease in that tissue.

  An ultrasonic diagnostic apparatus transmits ultrasonic waves into a subject using an ultrasonic probe having a plurality of ultrasonic transducers, receives reflected echo signals corresponding to the structure of the living tissue inside the subject, and generates and displays a tomographic image, for example a B-mode image, based on the reflected echo signals for diagnosis.

  In recent years, as described in Patent Document 1, ultrasonic reception signals (RF signals) are measured while the subject is compressed with the ultrasonic probe, either manually or mechanically, and an elastic image representing the hardness or softness of the tissue on the tomographic plane is generated. Specifically, the displacement produced in each part of the tissue by the compression is obtained from a pair of RF signal frame data acquired in different compression states, frame data of elasticity information such as strain or elastic modulus is calculated from the resulting displacement frame data, and an elastic image is generated and displayed based on the elastic frame data.

  Elastic images are expected to be useful not only for diagnosing mass lesions such as cancer but also for diagnosing diffuse diseases. In a diffuse disease, local hard tissues such as nodules are scattered within the surrounding soft tissue, and the elastic image reflects the resulting mottled pattern of hard tissue. For example, as the disease progresses from hepatitis to cirrhosis and fibrosis advances, nodules spread through the liver parenchyma and the mottled pattern of sclerotic tissue in the elastic image becomes more complicated. The examiner observes the elasticity image and evaluates the progression of the disease from the state of this mottled pattern of sclerotic tissue. That is, the elastic image is classified into one of a plurality of stages set according to the degree of progression of the disease of the tissue (staging).

  However, when the examiner visually observes the elastic image and classifies its stage, the evaluation results vary among examiners, so an objective evaluation of the progression of the disease is required.

  In this regard, as described in Patent Document 2, it is known to binarize the elasticity information at each measurement point of the elastic frame data to obtain the regions of hardened tissue, to express the complexity of the shape of each region as Complexity = (perimeter of region)^2 / (area of region), and to objectively evaluate the progression of the disease based on this complexity.

Patent Document 1: JP-A-5-317313
Patent Document 2: JP 2008-212522 A

  However, the technique of Patent Document 2 is considered to have room for improvement in stabilizing the result of staging elastic images with high accuracy.

  That is, in order to stage an elasticity image appropriately, it is necessary to use elastic frame data generated while appropriate compression is applied to the tissue on the tomographic plane of the subject. In this regard, the technique of Patent Document 2 selects, from the plurality of elastic frame data acquired in time series, elastic frame data to which the same pressure is applied to the tissue of the tomographic plane, and performs staging for each elastic frame data individually. Therefore, there is a risk that staging is performed based on elastic frame data whose compression state is not appropriate. In addition, since staging is performed based on a single elastic frame data, the classification result may fluctuate due to variation among elastic frame data.

Therefore, an object of the present invention is to stabilize the result of staging an elastic image with high accuracy, and in particular to stabilize the staging result compared with staging performed on a single elastic frame data.

The ultrasonic diagnostic apparatus of the present invention includes an ultrasonic probe; phasing addition means for generating RF signal frame data of a tomographic plane of a subject based on reflected echo signals measured by the ultrasonic probe; displacement measuring means for generating displacement frame data by measuring the displacement of the tissue on the tomographic plane based on a pair of RF signal frame data acquired in different compression states of the tissue on the tomographic plane of the subject; elasticity information calculation means for generating elastic frame data by calculating elasticity information representing the hardness or softness of the tissue on the tomographic plane based on the generated displacement frame data; elastic image construction means for generating an elasticity image based on the elastic frame data; image classification means for classifying the elasticity image, based on the elastic frame data, into one of a plurality of stages set according to the degree of disease progression of the tissue on the tomographic plane of the subject; and an image display for displaying the elasticity image and its classification result. The apparatus further includes elastic frame selection means which, based on the electrocardiographic waveform of the subject, determines a plurality of elastic frame data obtained a preset time after the appearance of a preset characteristic waveform, among the plurality of characteristic waveforms that repeatedly appear in the electrocardiographic waveform of the subject, to be a plurality of analysis elastic frame data whose compression state of the tissue on the tomographic plane is appropriate, and selects the plurality of analysis elastic frame data. The image classification means performs staging of the elasticity image based on the plurality of analysis elastic frame data, and the image display further displays the determination result.

According to the present invention, the result of staging an elastic image can be stabilized with high accuracy, and in particular the staging result can be stabilized compared with staging performed on a single elastic frame data.

FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic apparatus according to a first embodiment.
FIG. 2 is a block diagram of the characteristic configuration of the ultrasonic diagnostic apparatus of the first embodiment.
FIG. 3 is a diagram explaining the method of selecting elastic frame data for analysis by the frame selection unit.
FIG. 4 is a block diagram showing the configuration of the image classification unit.
FIG. 5 is a diagram showing the structure of a simple perceptron as an example of a neural network.
FIG. 6 is a block diagram showing the characteristic configuration of the ultrasonic diagnostic apparatus of a second embodiment.
FIG. 7 is a diagram explaining the method of selecting elastic frame data for analysis by the frame selection unit.

  Hereinafter, embodiments of an ultrasonic diagnostic apparatus, an elastic image classification method, and an elastic image classification program to which the present invention is applied will be described. In the following description, the same functional parts are denoted by the same reference numerals, and redundant description is omitted.

(First embodiment)
FIG. 1 is a block diagram showing the overall configuration of the ultrasonic diagnostic apparatus according to the first embodiment. This ultrasonic diagnostic apparatus generates a tomographic image of the tissue on a tomographic plane of a subject using ultrasonic waves, and generates an elastic image by obtaining elastic information indicating the hardness or softness of the tissue.

  As shown in FIG. 1, the ultrasonic diagnostic apparatus 100 includes an ultrasonic probe 12 used in contact with the subject; a transmitting unit 14 that repeatedly transmits ultrasonic waves to the subject via the ultrasonic probe 12 at time intervals; a receiving unit 16 that receives the time-series reflected echo signals generated from the subject; an ultrasonic transmission/reception control unit 17 that controls the transmitting unit 14 and the receiving unit 16; a phasing addition unit 18 that generates RF signal frame data in time series by phasing and adding the received reflected echoes; a tomographic image construction unit 20 that performs various signal processing on the RF signal frame data output from the phasing addition unit 18 and generates a tomographic image, for example a black-and-white tomographic image; and a black-and-white scan converter 22 that converts the output signal of the tomographic image construction unit 20 to match the display format of the image display 26.

  The ultrasonic diagnostic apparatus 100 also includes an RF signal frame data selection unit 28 that selects, from the RF signal frame data output from the phasing addition unit 18, a pair of RF signal frame data with different acquisition times; a displacement measuring unit 30 that generates displacement frame data by measuring, based on the pair of RF signal frame data, the displacement produced in the tissue of the tomographic plane of the subject (not shown); an elasticity information calculation unit 32 that generates elastic frame data by obtaining, based on the displacement frame data measured by the displacement measuring unit 30, elasticity information (strain or elastic modulus) representing the hardness or softness of the body tissue of the subject during the continuous compression process; an elasticity image construction unit 34 that constructs an elasticity image based on the elasticity information calculated by the elasticity information calculation unit 32; and a color scan converter 36 that converts the output signal of the elasticity image construction unit 34 to match the display format of the image display 26.

  The ultrasonic diagnostic apparatus 100 further includes a cine memory 48 in which the elastic frame data generated by the elastic information calculation unit 32 are stored; a frame selection unit 50 that selects, from the elastic frame data stored in the cine memory 48, analysis elastic frame data used for staging the elastic image; an evaluation data generation unit 52 that generates, based on the analysis elastic frame data, evaluation data used for staging the elastic image; and an image classification unit 54 that performs the staging classification of the elastic image based on the evaluation data. The evaluation data generation unit 52 and the image classification unit 54 constitute the image classification means. Details of the frame selection unit 50, the evaluation data generation unit 52, and the image classification unit 54 will be described later. In this specification, staging an elastic image means classifying the elastic image into one of a plurality of stages set according to the degree of disease progression of the tissue on the tomographic plane of the subject.

  The apparatus further includes a control unit 56, composed of, for example, a CPU (Central Processing Unit), that controls each of the above components, and an interface unit 58, such as a mouse, keyboard, touch panel, or trackball, for giving the control unit 56 instructions to control, for example, the ROI (Region Of Interest) and the frame rate of the elastic image.

  The ultrasonic probe 12 is formed by arranging a large number of transducers in a strip shape and performs beam scanning mechanically or electronically. The ultrasonic probe 12 is brought into contact with the subject, transmits ultrasonic waves into the subject, and receives echo signals from the subject. Although not shown, the ultrasonic probe 12 includes transducers that serve as the source of the ultrasonic waves and receive the reflected echoes. Each transducer generally has the functions of converting an input pulse-wave or continuous-wave transmission signal into an ultrasonic wave and emitting it, and of receiving ultrasonic waves returning from inside the subject, converting them into an electrical reception signal, and outputting it.

  In general, the operation of compressing the subject for ultrasound elasticity imaging is performed in order to effectively apply a stress distribution to the diagnostic site in the body while transmitting and receiving ultrasound with the ultrasonic probe 12. A compression plate is mounted so that it is aligned with the ultrasonic transmission/reception surface of the ultrasonic probe 12, the compression surface formed by the ultrasonic transmission/reception surface and the compression plate is brought into contact with the body surface of the subject, and the subject is compressed by manually moving the compression surface up and down. However, unlike superficial regions such as the mammary gland, which are easy to approach from the body surface of the subject, it can be difficult to produce displacement and strain in abdominal regions such as the liver by compressing the target tissue with the ultrasonic probe 12. Therefore, when targeting an abdominal region such as the liver, the displacement or strain caused by the pulsation of the heart or arteries can be used instead.

  The transmission unit 14 generates a transmission pulse that drives the ultrasonic probe 12 to generate ultrasonic waves, and its built-in transmission phasing/adding unit sets the depth of the convergence point of the transmitted ultrasonic waves.

  The receiving unit 16 amplifies the reflected echo signals received by the ultrasonic probe 12 with a predetermined gain. The amplified received signals, as many as there are transducers, are input to the phasing addition unit 18 as independent received signals. The phasing addition unit 18 controls the phase of the received signals amplified by the receiving unit 16 and forms an ultrasonic beam focused at one or a plurality of convergence points. The ultrasonic transmission/reception control unit 17 controls the timing of ultrasonic transmission and reception.

  The tomographic image construction unit 20 performs various signal processing such as gain correction, log compression, detection, contour enhancement, and filtering on the RF signal frame data from the phasing addition unit 18, and constructs a tomographic image of the subject, for example a black-and-white tomographic image.

  The black-and-white scan converter 22 converts the signal output from the tomographic image construction unit 20 for display on the image display 26, controlling the tomographic scanning means and the readout so that the data are read out at the television-system cycle. It includes, for example, an A/D converter that converts the signal output from the tomographic image construction unit 20 into a digital signal, a plurality of frame memories that store the tomographic image data digitized by the A/D converter in time series, and a controller that controls these operations.

  The RF signal frame data selection unit 28 stores, in its internal frame memory, the RF signal frame data output one after another from the phasing addition unit 18 at the frame rate of the ultrasonic diagnostic apparatus (the most recently stored RF signal frame data is referred to as RF signal frame data N), and selects from the past RF signal frame data N-1, N-2, N-3, ..., N-M one RF signal frame data acquired in a different compression state (referred to as RF signal frame data X). It then outputs the pair of RF signal frame data N and X to the displacement measurement unit 30. Although the signal output from the phasing addition unit 18 is described here as RF signal frame data, it may also be, for example, a signal in the form of I and Q signals obtained by complex demodulation of the RF signal.

  The displacement measurement unit 30 performs one-dimensional or two-dimensional correlation processing on the pair of RF signal frame data selected by the RF signal frame data selection unit 28, measures the displacement or movement vector (direction and magnitude of the displacement) at each measurement point on the tomogram, and generates displacement frame data. Examples of movement vector detection methods include the block matching method and the gradient method. The block matching method divides the image into blocks of, for example, N x N pixels, searches the previous frame for the block that most closely approximates the block of interest in the current frame, and refers to the matched block to perform predictive coding.
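  A rough sketch of block matching for displacement estimation, purely as an illustration of the technique named above (function and parameter names are hypothetical, and a sum-of-absolute-differences criterion is assumed rather than the correlation processing of the apparatus):

```python
import numpy as np

def block_match(frame_prev, frame_curr, y0, x0, block=16, search=8):
    """Estimate the displacement of the block at (y0, x0) between two frames
    by an exhaustive sum-of-absolute-differences (SAD) search within
    +/- `search` pixels. Returns (dy, dx), the offset of the best match."""
    ref = frame_curr[y0:y0 + block, x0:x0 + block].astype(float)
    best_sad, best_off = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + block > frame_prev.shape[0] \
                    or x + block > frame_prev.shape[1]:
                continue  # candidate block would fall outside the frame
            cand = frame_prev[y:y + block, x:x + block].astype(float)
            sad = np.abs(ref - cand).sum()  # dissimilarity of the two blocks
            if sad < best_sad:
                best_sad, best_off = sad, (dy, dx)
    return best_off
```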

  The elasticity information calculation unit 32 calculates the strain and elastic modulus at each measurement point on the tomographic image from the displacement frame data output from the displacement measurement unit 30, generates numerical data of the strain or elastic modulus (elastic frame data), and outputs it to the elastic image construction unit 34. The strain calculated in the elasticity information calculation unit 32 is obtained by spatially differentiating the displacement. That is, if the displacement measured by the displacement measuring unit 30 is ΔL, the strain S is obtained by spatially differentiating ΔL, that is, S = ΔL / ΔX. The Young's modulus Ym, one of the elastic moduli, is obtained by dividing the stress (pressure) at each calculation point by the strain at that point, as shown in the following equation.

  Ym(i, j) = stress(i, j) / strain(i, j)  (i, j = 1, 2, 3, ...), where the indices i and j denote the coordinates within the frame data. The pressure applied to the body surface of the subject can be measured directly with a pressure sensor interposed between the body surface of the subject and the ultrasonic transmission/reception surface of the ultrasonic probe 12. When the displacement or strain is produced in the target tissue by the pulsation of the heart or arteries, the strain is used as the elasticity information. The elasticity information calculation unit 32 may also perform various image processing on the calculated elastic frame data, such as smoothing within the coordinate plane, contrast optimization, and smoothing in the time-axis direction between frames, and output the result as the strain elastic frame data.
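  As an illustration of the strain and Young's modulus calculations just described, the following sketch assumes the displacement and stress are available as 2-D NumPy arrays (the function name and spacing value are hypothetical):

```python
import numpy as np

def elasticity_frame(displacement, dx=1.0, stress=None):
    """Compute elasticity information from one displacement frame.

    displacement : 2-D array of displacement ΔL at each measurement point
    dx           : spacing ΔX between measurement points along the compression axis
    stress       : optional 2-D array of stress; if given, a Young's-modulus
                   frame stress/strain is returned as well
    """
    # Strain S = ΔL/ΔX: spatial derivative of the displacement along axis 0.
    strain = np.gradient(displacement, dx, axis=0)
    if stress is None:
        return strain  # strain alone is used when pulsation drives the tissue
    young = stress / np.where(strain != 0, strain, np.nan)  # avoid division by zero
    return strain, young
```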

  The elastic image construction unit 34 includes a frame memory and an image processing unit; it stores the elastic frame data output in time series from the elastic information calculation unit 32 in the frame memory, and the image processing unit performs image processing on the stored frame data.

  The color scan converter 36 includes a gradation circuit and a hue conversion circuit, and performs conversion processing that assigns hue information such as red, green, and blue to the elastic image frame data output from the elastic image construction unit 34. Like the black-and-white scan converter 22, the color scan converter 36 may also adjust the luminance of regions of the elastic image data according to the measured strain, for example making a region brighter or darker depending on the strain measured there.

  The gradation circuit in the color scan converter 36 generates gradation-converted elastic image frame data by converting the value of each element of the elastic image frame data output from the elastic image construction unit 34 into, for example, 256 levels. The region subjected to gradation is the region of interest (ROI), which can be changed arbitrarily by the examiner via the interface unit 58.

  The switching addition unit 24 receives the monochrome tomographic image data output from the monochrome scan converter 22 and the elastic image frame data output from the color scan converter 36, and adds or switches between the two images. It outputs only the monochrome tomographic image data, only the color elastic image data, or both image data added and combined. Further, as described, for example, in Japanese Patent Application Laid-Open No. 2004-135929 filed earlier by the present applicant, a color elastic image may be displayed semi-transparently over the monochrome tomographic image. The black-and-white tomographic image is not limited to a general B-mode image; a tissue harmonic tomographic image obtained by imaging the harmonic component of the received signal may also be used, or another type of tissue image may be displayed instead of the black-and-white tomographic image.

  The image display 26 consists of a D/A converter that converts the image data output from the monochrome scan converter 22 or the color scan converter 36 via the switching addition unit 24 into an analog signal, and a color television monitor that receives the analog video signal from the D/A converter and displays it as an image.

  Incidentally, the elasticity image of the ultrasonic diagnostic apparatus 100 is expected to be applied to the diagnosis of diffuse diseases. In a diffuse disease, local hard tissues such as nodules are scattered within the surrounding soft tissue, and the elastic image reflects the resulting mottled pattern of hard tissue. For example, as the disease progresses from hepatitis to cirrhosis and fibrosis advances, nodules spread through the liver parenchyma and the mottled pattern of sclerotic tissue in the elastic image becomes more complicated. The examiner therefore observes the elastic image generated by the ultrasonic diagnostic apparatus 100 and stages it based on, for example, the state of the mottled pattern of hardened tissue in the elastic image.

  However, when the examiner visually observes the elasticity image and classifies it, the evaluation results vary among examiners. There is therefore a demand for accurate and stable staging results.

  Hereinafter, the characteristic configuration of the ultrasonic diagnostic apparatus 100 of the present embodiment that addresses this point will be described. FIG. 2 is a block diagram of the characteristic configuration of the ultrasonic diagnostic apparatus according to the first embodiment. As shown in FIG. 2, the elastic information calculation unit 32 includes an elastic frame statistical processing unit 60 that calculates frame evaluation data, a statistical value such as the average or median of the elasticity information at each measurement point of the elastic frame data.

  The cine memory 48 includes an elastic frame data memory 62 and a frame evaluation data memory 64 that store the elastic frame data and the frame evaluation data generated by the elastic information calculation unit 32 in time series.

  The frame selection unit 50 analyzes the frame evaluation data stored in the frame evaluation data memory 64 of the cine memory 48 to determine whether or not the compression state of the tissue on the tomographic plane of the subject is appropriate, and selects a plurality of elastic frame data in an appropriate compression state as analysis elastic frame data.

  FIG. 3 is a diagram explaining the method of selecting the analysis elastic frame data by the frame selection unit 50. The upper graph in FIG. 3 is a strain graph in which the horizontal axis represents time and the vertical axis represents the average strain over a plurality of measurement points of the elastic frame data. As shown in FIG. 3, among the plurality of elastic frame data arranged in time series, the elastic frame data at which the average strain reaches a negative peak are selected as analysis elastic frame data A through analysis elastic frame data E.

  Alternatively, elastic frame data in which the average strain over a plurality of measurement points reaches a positive peak can be used as the analysis elastic frame data, as can elastic frame data in which the absolute value of the average strain over a plurality of measurement points is larger than a preset threshold value. The median strain over a plurality of measurement points of the elastic frame data can also be used. Instead of the average or median strain, the average or median elastic modulus at a plurality of measurement points of the elastic frame data, or the average or median displacement at a plurality of measurement points of the displacement frame data, can also be used. In short, any data that serves as an index for determining whether or not the compression state of the tomographic plane of the subject is appropriate may be used.
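  Purely as an illustration of one of the selection criteria above (negative peaks of the frame-average strain), assuming the elastic frames are available as 2-D strain arrays ordered in time (names are hypothetical):

```python
import numpy as np

def select_analysis_frames(strain_frames):
    """Pick indices of frames where the mean strain has a negative (local minimum) peak."""
    means = np.array([f.mean() for f in strain_frames])  # frame evaluation data
    selected = []
    for k in range(1, len(means) - 1):
        # A negative peak: smaller than both temporal neighbours.
        if means[k] < means[k - 1] and means[k] < means[k + 1]:
            selected.append(k)
    return selected  # indices of the analysis elastic frame data
```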

  To determine whether or not the compression state of the tissue on the tomographic plane of the subject is appropriate, two tracking points may, for example, be set via the interface unit 58 on the tissue to be staged in the tomographic image or the elasticity image, and the distance between the two points measured while their positions are updated according to the displacement of the tissue on which they are set. When the rate of change of the distance between the two points is greater than a preset threshold, the compression state of the tissue on the tomographic plane of the subject can be judged to be appropriate.

  As shown in FIG. 3, the evaluation data generation unit 52 generates evaluation data A through evaluation data E, used for staging the elastic image, from the analysis elastic frame data A through analysis elastic frame data E transferred from the frame selection unit 50. More specifically, as shown in FIG. 4, the evaluation data generation unit 52 includes a histogram calculation unit 70, a statistical processing unit 72, and a drawing area evaluation unit 74. Each of the evaluation data A through E includes at least one of the calculation results of the histogram calculation unit 70, the statistical processing unit 72, and the drawing area evaluation unit 74.

  The histogram calculation unit 70 creates a histogram by counting the occurrences of strain or elastic modulus values at the plurality of measurement points of the analysis elastic frame data output from the frame selection unit 50, and calculates the skewness and kurtosis of this histogram as histogram data. The skewness and kurtosis of the histogram are obtained by the following Equations 1 and 2.
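  The published document shows Equations 1 and 2 only as images; the standard definitions consistent with the symbols explained below would be (some conventions subtract 3 from the kurtosis):

```latex
\text{Skewness} = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{x_i-\bar{x}}{\sigma}\right)^{3} \qquad (1)

\text{Kurtosis} = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{x_i-\bar{x}}{\sigma}\right)^{4} \qquad (2)
```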

In Equations 1 and 2, n is the number of samples, x̄ (x with an overline) is the mean, and σ is the standard deviation.

  The statistical processing unit 72 calculates, as statistical processing data, the average value and standard deviation of the strain or elastic modulus at the plurality of measurement points of the analysis elastic frame data output from the frame selection unit 50, thereby quantifying the distribution of strain or elastic modulus. In addition to the average and standard deviation, feature values based on a co-occurrence matrix, a common method for statistically characterizing texture, such as homogeneity, heterogeneity, contrast, angular second moment, and entropy, can be used as statistical processing data.
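  A minimal sketch of co-occurrence-matrix texture features such as those listed above, assuming the elasticity information is available as a 2-D array (the quantization level count and offset are arbitrary choices, not values from the patent):

```python
import numpy as np

def cooccurrence_features(strain, levels=16, dy=0, dx=1):
    """Texture features from a gray-level co-occurrence matrix (GLCM).

    `strain` is quantized into `levels` gray levels and co-occurrences are
    counted for the (dy, dx) offset (non-negative offsets only in this sketch).
    """
    edges = np.linspace(strain.min(), strain.max(), levels + 1)[1:-1]
    q = np.digitize(strain, edges)          # integer levels 0 .. levels-1
    h, w = q.shape
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:h - dy, :w - dx].ravel(), q[dy:, dx:].ravel()), 1)
    p = glcm / glcm.sum()                   # joint probabilities
    i, j = np.indices(p.shape)
    return {
        "homogeneity": float(np.sum(p / (1.0 + (i - j) ** 2))),
        "heterogeneity": float(np.sum(p * np.abs(i - j))),   # dissimilarity
        "contrast": float(np.sum(p * (i - j) ** 2)),
        "angular_second_moment": float(np.sum(p ** 2)),
        "entropy": float(-np.sum(p[p > 0] * np.log(p[p > 0]))),
    }
```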

The drawing region evaluation unit 74 creates detection frame data by binarizing the elasticity information at each measurement point of the elastic frame data according to a binarization threshold Eth that is set in advance in the ultrasonic diagnostic apparatus 100 or can be set arbitrarily by the examiner via the interface unit 58 and the control unit 56. Then, for the regions where the elasticity information of the detection frame data is smaller than Eth, that is, the regions of hardened tissue, the number of regions, their area, and their complexity are calculated as drawing area evaluation data. The complexity is obtained from Complexity = (perimeter)^2 / area. In this way, the extent and shape of tissue regions having the target strain or elastic modulus are quantified. Eth can be set arbitrarily by the examiner, but it is also possible to calculate and use a threshold that maximizes the complexity so that a binarized image emphasizing the drawn regions is obtained. Instead of binarizing with Eth, at least one of the number of regions formed by elasticity information belonging to a preset threshold range, the area of those regions, and the complexity of their shape may be calculated as the drawing area evaluation data.
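  As an illustration only of the binarization and region evaluation described above (the perimeter estimate and threshold handling are simplifications, not the patent's implementation):

```python
import numpy as np
from scipy import ndimage

def drawing_area_evaluation(elastic_frame, eth):
    """Evaluate hard-tissue regions in one elastic frame.

    Pixels with elasticity information (e.g. strain) below `eth` are treated
    as hard tissue; the number of connected regions and, for each region, its
    area and complexity (perimeter^2 / area) are returned.
    """
    hard = elastic_frame < eth                 # binarized detection frame
    labels, n_regions = ndimage.label(hard)    # connected-component labelling
    results = []
    for lab in range(1, n_regions + 1):
        region = labels == lab
        area = int(region.sum())
        # Rough perimeter estimate: region pixels removed by a binary erosion.
        eroded = ndimage.binary_erosion(region)
        perimeter = int((region & ~eroded).sum())
        results.append({"area": area, "complexity": perimeter ** 2 / area})
    return n_regions, results
```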

  As shown in FIG. 3, the evaluation data generation unit 52 generates classification evaluation data (a classification calculation result) by averaging the calculation results of evaluation data A through evaluation data E, or by selecting their median, and sends it to the image classification unit 54. The image classification unit 54 stages the elastic image based on this classification evaluation data. For example, when evaluation data A through E are the skewness of the histogram of elasticity information at the plurality of measurement points of the analysis elastic frame data output from the frame selection unit 50, the classification evaluation data is the average or median of the five skewness values.

  In the present embodiment, the average or median of the calculation results of evaluation data A through E is used as the classification evaluation data. However, the present invention is not limited to this; evaluation data A through E may instead be sent directly to the image classification unit 54, which then classifies the elasticity image for each of evaluation data A through E.

  Further, instead of using the average or median of the calculation results of evaluation data A through E as the classification evaluation data, the frame selection unit 50 may generate analysis elastic frame data by averaging the elasticity information at corresponding measurement points of the plurality of analysis elastic frame data, or by selecting the median of that elasticity information, and send the generated analysis elastic frame data to the evaluation data generation unit 52. In this case, the evaluation data generation unit 52 generates the evaluation data using the analysis elastic frame data thus sent.

  Next, the configuration of the image classification unit 54 will be described. FIG. 4 is a block diagram showing the configuration of the image classification unit. As shown in FIG. 4, the image classification unit 54 receives the histogram data, the statistical processing data, and the drawing area evaluation data. The image classification unit 54 includes an evaluation data selection unit 80 that selects at least one kind of evaluation data from the histogram data, statistical processing data, and drawing area evaluation data, a memory 82, and a multivariate analysis unit 84. In addition to the histogram data, statistical processing data, and drawing area evaluation data, for example, the result of a blood test of the subject can also be input to the image classification unit 54 and added to the evaluation indices.

  Evaluation data can be selected in the evaluation data selection unit 80 either by the examiner via the interface unit 58 and the control unit 56, or by selecting the evaluation data that correlates most strongly with the classification of the elastic images of each group stored in the memory 82. The selected evaluation data, at least one kind, are input to the multivariate analysis unit 84, and the classification result is output to and displayed on the image display 26.

  The multivariate analysis unit 84 stages the elastic image using multivariate analysis, such as multiple regression analysis, discriminant analysis, principal component analysis, quantification methods, factor analysis, cluster analysis, multidimensional scaling, or a neural network. Here, a classification method using a neural-network perceptron will be described.

  FIG. 5 is a diagram showing the structure of a simple perceptron as an example of a neural network. As shown in FIG. 5, the simple perceptron consists of an input layer and an output layer composed of a single unit (neuron). The evaluation data of the parameters selected by the evaluation data selection unit 80 are input to the input layer. For example, if four parameters are selected, such as the skewness of the histogram, the standard deviation of the strain, the area of the region where the strain is below the threshold in the binarized image, and the complexity of the shape of that region, they are input to the input layer as x1, x2, x3, and x4.

  In the output layer, the weighted sum u of the input values xi with coupling weights ωi is obtained by the following Equation 3, converted by a predetermined function f(u), and the resulting value is output.
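  Equation 3 appears only as an image in the published document; the standard form consistent with the description (a weighted sum of the inputs) would be:

```latex
u = \sum_{i} \omega_i \, x_i \qquad (3), \qquad z = f(u)
```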

A threshold function or a linear function is used as the function f(u) in the output layer. For a threshold function, f(u) = 1 when u is larger than the threshold h and f(u) = 0 when u is smaller, and the output value is z = f(u). When the output is a linear function that increases or decreases linearly with the input, the input value u becomes the output value as it is. A result of f(u) = 1 means that the elasticity image from which the evaluation data was created belongs to a specific stage (group) of the staging classification, and a result of f(u) = 0 means that it does not belong to that stage (group). When a simple perceptron with a threshold function is used as the output-layer conversion function, a plurality of simple perceptrons with different output-layer thresholds are arranged in parallel to classify into a plurality of groups. When the input value u is used as the output value, thresholds are set in multiple steps according to the number of groups, and the output values are classified into a plurality of groups accordingly.

  For example, if the target organ for staging is the liver, classification into the stages of no fibrosis (stage 0), portal fibrosis (stage 1), fibrous bridging (stage 2), fibrous bridging with lobular distortion (stage 3), and cirrhosis (stage 4) can be performed.

  The perceptron is characterized by learning: the output is compared with a teacher signal (the correct answer), and when they differ, the coupling weights ωi and the threshold h are changed. Specifically, when the difference between the teacher signal z* and the output value z is δ, the coupling weights ωi are corrected so as to minimize δ², as shown in the following Equation 4.
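  Equation 4 also appears only as an image in the published document; the standard delta-rule form consistent with the description would be:

```latex
\omega_i \leftarrow \omega_i + \varepsilon \,(z^{*} - z)\, x_i \qquad (4)
```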

In Equation 4, ε is a learning coefficient. As the teacher signal, for example, the result of a pathological diagnosis of the evaluation target (one of the first to Nth groups) is used. Such perceptron learning can be performed using a plurality of elastic image data whose diagnoses have been confirmed in advance, together with their evaluation data. It can also be performed each time a correct answer (a confirmed diagnosis) is obtained for a new classification result, thereby improving the accuracy of the results. When no teacher signal is input, the classification result is output using the latest coupling weights ωi and threshold h.
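  A minimal sketch of such a single-unit thresholded perceptron with delta-rule learning, assuming the forms given above (all names and values are hypothetical):

```python
import numpy as np

class SimplePerceptron:
    """Single-unit perceptron: u = sum(w_i * x_i), z = 1 if u > h else 0."""

    def __init__(self, n_inputs, h=0.0, eps=0.1):
        self.w = np.zeros(n_inputs)   # coupling weights ω_i
        self.h = h                    # output-layer threshold
        self.eps = eps                # learning coefficient ε

    def output(self, x):
        u = float(np.dot(self.w, x))
        return 1 if u > self.h else 0

    def learn(self, x, z_star):
        """Update the weights when the output differs from the teacher signal z*."""
        z = self.output(x)
        delta = z_star - z
        self.w += self.eps * delta * np.asarray(x, dtype=float)
        self.h -= self.eps * delta    # threshold moves opposite to the bias update
        return z

# Hypothetical usage: x = [skewness, strain std, hard-region area, complexity]
# p = SimplePerceptron(4); p.learn([0.8, 0.12, 340.0, 25.0], z_star=1)
```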

  The classification result from the multivariate analysis unit 84 is sent to and displayed on the image display 26. Any display method can be employed, such as displaying the name of the classified stage (for example, stage 0, stage 1, stage 2, stage 3, or stage 4) or plotting the output value (u) on a graph whose horizontal axis is the stage name and whose vertical axis is the output value.

  According to the ultrasonic diagnostic apparatus of the present embodiment, the frame selection unit 50 selects elastic frame data whose compression state against the tissue on the tomographic plane of the subject is appropriate as the analysis elastic frame data, so elastic frame data whose compression state is not appropriate and which are therefore unsuitable for staging are not used for staging. The accuracy of the staging classification can thereby be improved. Furthermore, because the evaluation data generation unit 52 and the image classification unit 54 classify the elastic image based on the plurality of analysis elastic frame data selected by the frame selection unit 50, the staging results can be stabilized compared with staging based on a single elastic frame data.

  Although the above embodiment has mainly described the ultrasonic diagnostic apparatus and the elastic image classification method, the present invention is not limited to these. For example, an elastic image classification program that can be installed and executed on a computer, such as an ultrasonic diagnostic apparatus or a PC, can also be used.

  The elastic image classification program comprises: a step of determining whether or not the compression state of the tissue on the tomographic plane of the subject is appropriate, based on at least one of the displacement frame data of the tissue on the tomographic plane of the subject generated from a pair of RF signal frame data in different compression states obtained from reflected echo signals measured in advance by the ultrasonic probe 12, the elastic frame data representing the hardness or softness of the tissue on the tomographic plane generated from that displacement frame data, and the electrocardiographic waveform of the subject, and selecting a plurality of elastic frame data in an appropriate compression state as analysis elastic frame data; a step of classifying the elasticity image generated from the elastic frame data, based on the plurality of analysis elastic frame data, into one of a plurality of preset stages set according to the degree of disease progression of the tissue on the tomographic plane of the subject; and a step of displaying the classification result together with the elasticity image.

  When staging a plurality of elastic images, a more stable result can be obtained by selecting elastic images captured under conditions that are as similar as possible. Therefore, when acquiring elastic images using the heartbeat, for example in the liver, the coincidence rate between strain waveforms such as the one shown in the upper graph of FIG. 3 can be evaluated with an autocorrelation function or the like and displayed on the image display 26. The examiner can then recognize that the elastic images are being acquired stably if the coincidence rate is high. It may also be determined whether or not image acquisition is stable by comparing the coincidence rate with a preset threshold, and the determination result displayed on the image display 26 as feedback to the examiner.
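  As an illustration only (not the patent's implementation), a coincidence rate between two strain waveforms of equal length could be computed as a normalized correlation coefficient; the threshold in the comment is an assumption:

```python
import numpy as np

def coincidence_rate(waveform_a, waveform_b):
    """Normalized correlation between two mean-strain waveforms of equal length
    (1.0 means identical shape)."""
    a = np.asarray(waveform_a, dtype=float)
    b = np.asarray(waveform_b, dtype=float)
    a = (a - a.mean()) / (a.std() + 1e-12)   # zero-mean, unit-variance
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.dot(a, b) / len(a))      # value in [-1, 1]

# Hypothetical use: warn the examiner when acquisition looks unstable.
# stable = coincidence_rate(prev_cycle, curr_cycle) > 0.8
```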

  In addition, when an elastic image is acquired using the heartbeat, for example in the liver, the tissue on the ultrasonic tomographic plane may not undergo sufficient displacement or strain depending on how the ultrasonic probe 12 is applied to the subject, so elastic frame data suitable for staging the elastic image may not be acquired. Therefore, the frame selection unit 50 can, for example, determine whether or not the compression state of the tissue on the tomographic plane of the subject is appropriate and display the determination result on the image display 26. In this way, if the way the ultrasonic probe 12 is applied to the subject is not appropriate, it is determined that the compression state of the tissue on the tomographic plane is not appropriate, and this is displayed. By looking at the display, the examiner can recognize that the way the ultrasonic probe 12 is being applied to the subject is not appropriate.

(Second embodiment)
Next, a second embodiment of an ultrasonic diagnostic apparatus to which the present invention is applied will be described. The second embodiment differs from the first in that an electrocardiographic waveform of the subject is measured with an electrocardiographic waveform acquisition device, and whether or not the compression state of the tissue on the tomographic plane of the subject is appropriate is determined based on the measured electrocardiographic waveform. The description of the parts common to the first embodiment is therefore omitted.

  FIG. 6 is a block diagram illustrating the characteristic configuration of the ultrasonic diagnostic apparatus according to the second embodiment. As shown in FIG. 6, in the second embodiment, instead of the elastic frame statistical processing unit 60 and the frame evaluation data memory 64 used in the first embodiment, the electrocardiographic waveform 90 (ECG) measured by the electrocardiographic waveform acquisition device is input to the frame selection unit 50.

  FIG. 7 is a diagram explaining the method of selecting the analysis elastic frame data by the frame selection unit 50. The upper graph in FIG. 7 is a graph of the electrocardiographic waveform, in which the horizontal axis represents time and the vertical axis represents the signal intensity measured by the electrocardiographic waveform acquisition device. As shown in FIG. 7, the frame selection unit 50 selects, as analysis elastic frame data A through analysis elastic frame data E, the elastic frame data obtained a preset time after a preset T wave among the plurality of characteristic waveforms (P wave, Q wave, R wave, S wave, T wave) appearing in the electrocardiographic waveform.

  In this embodiment, because it has been confirmed in advance that the compression state of the tissue on the tomographic plane of the subject is appropriate in the time phase immediately after the T wave, the elastic frame data of the time phase immediately after the T wave are selected as the analysis elastic frame data. The selection is not limited to the T wave: for any of the characteristic waveforms appearing in the electrocardiographic waveform, it can be examined in advance whether the compression state of the tissue on the tomographic plane of the subject is appropriate in synchronization with that waveform, and the elastic frame data synchronized with that characteristic waveform can then be selected as elastic frame data with an appropriate compression state.
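  A rough sketch of selecting frames a fixed delay after each detected characteristic wave, as described above; the delay, tolerance, and function names are assumptions, not values from the patent:

```python
def select_frames_after_wave(frame_times, wave_times, delay, tolerance=0.02):
    """Select indices of elastic frames acquired `delay` seconds after each
    detected characteristic wave (e.g. the T wave) of the ECG.

    frame_times : acquisition time (s) of each elastic frame
    wave_times  : times (s) at which the chosen characteristic wave appears
    """
    selected = []
    for t_wave in wave_times:
        target = t_wave + delay
        # Frame closest to the target time, if it is close enough.
        best = min(range(len(frame_times)),
                   key=lambda i: abs(frame_times[i] - target))
        if abs(frame_times[best] - target) <= tolerance:
            selected.append(best)
    return selected

# Hypothetical use:
# analysis_idx = select_frames_after_wave(times, t_wave_times, delay=0.05)
```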

  Also in the ultrasonic diagnostic apparatus of the present embodiment, as in the first embodiment, the frame selection unit 50 selects elastic frame data with an appropriate compression state of the tissue on the tomographic plane of the subject as the analysis elastic frame data, so elastic frame data whose compression state is not appropriate and which are unsuitable for staging are not used for staging. The accuracy of the staging classification can thereby be improved. Furthermore, because the evaluation data generation unit 52 and the image classification unit 54 classify the elastic image based on the plurality of analysis elastic frame data selected by the frame selection unit 50, the staging results can be stabilized compared with staging based on a single elastic frame data.

  In each of the above embodiments, an example was described in which the plurality of stages, set according to the degree of disease progression, into which the image classification unit 54 classifies were defined in advance. However, the plurality of stages may be changed, for example when the standard for the degree of disease progression is revised, or by feeding back and learning information on individual differences among subjects. This makes it possible to accommodate settings that are changed later.

  12 Ultrasonic probe, 18 Phasing addition unit, 26 Image display, 28 RF signal frame data selection unit, 30 Displacement measurement unit, 32 Elasticity information calculation unit, 34 Elastic image construction unit, 50 Frame selection unit, 52 Evaluation data generation unit, 54 Image classification unit, 70 Histogram calculation unit, 72 Statistical processing unit, 74 Drawing area evaluation unit, 80 Evaluation data selection unit, 84 Multivariate analysis unit, 90 Electrocardiographic waveform (ECG), 100 Ultrasonic diagnostic apparatus

Claims (6)

  1. An ultrasonic diagnostic apparatus comprising:
    an ultrasonic probe;
    phasing addition means for generating RF signal frame data of a tomographic plane of a subject based on reflected echo signals measured by the ultrasonic probe;
    displacement measuring means for generating displacement frame data by measuring the displacement of the tissue on the tomographic plane based on a pair of RF signal frame data in different compression states of the tissue on the tomographic plane of the subject;
    elasticity information calculation means for generating elastic frame data by calculating elasticity information representing the hardness or softness of the tissue on the tomographic plane based on the generated displacement frame data;
    elastic image construction means for generating an elastic image based on the elastic frame data;
    image classification means for classifying, based on the elastic frame data, the elastic image into one of a plurality of stages set according to the degree of disease progression of the tissue on the tomographic plane of the subject; and
    an image display that displays the elastic image and the classification result of the elastic image,
    the ultrasonic diagnostic apparatus further comprising elastic frame selection means for determining, based on the electrocardiographic waveform of the subject, a plurality of elastic frame data obtained a preset time after the appearance of a preset characteristic waveform, among the plurality of characteristic waveforms that repeatedly appear in the electrocardiographic waveform of the subject, to be a plurality of analysis elastic frame data whose compression state of the tissue on the tomographic plane of the subject is appropriate, and for selecting the plurality of analysis elastic frame data,
    wherein the image classification means performs staging of the elastic image based on the plurality of analysis elastic frame data, and
    the image display further displays the determination result.
  2. The ultrasonic diagnostic apparatus according to claim 1, wherein the characteristic waveform is a characteristic waveform of the electrocardiographic waveform other than the R wave.
  3. The ultrasonic diagnostic apparatus according to claim 1, wherein the image classification means comprises an evaluation data generation unit including at least one of:
    a histogram calculation unit that calculates, as histogram data, at least one of the skewness and the kurtosis of the histogram of the elasticity information at a plurality of measurement points of the analysis elastic frame data;
    a statistical calculation unit that calculates, as statistical processing data, at least one of the average value and the standard deviation of the elasticity information at the plurality of measurement points of the analysis elastic frame data; and
    a drawing region calculation unit that calculates, as drawing region evaluation data, at least one of the number of regions formed by elasticity information belonging to a preset threshold range among the elasticity information at the plurality of measurement points of the analysis elastic frame data, the area of those regions, and the complexity of the shape of those regions,
    and wherein the elasticity image is classified based on the classification calculation result obtained by averaging at least one calculation result of the histogram calculation unit, the statistical calculation unit, and the drawing region calculation unit for each of the plurality of analysis elastic frame data, or by selecting the median of the calculation results.
  4. The ultrasonic diagnostic apparatus according to claim 1, wherein the elastic frame selection means calculates the average value of the elasticity information at corresponding measurement points of the analysis elastic frame data, or selects the median of the elasticity information at corresponding measurement points of the analysis elastic frame data, and the image classification means classifies the elastic image based on the average value or median thus obtained.
  5. The ultrasonic diagnostic apparatus according to any one of claims 1 to 4, wherein the elastic frame selection means evaluates the compression state as appropriate when the absolute value of at least one of the average displacement at a plurality of measurement points of the displacement frame data, the median displacement at the plurality of measurement points of the displacement frame data, the average elasticity information at a plurality of measurement points of the elastic frame data, and the median elasticity information at the plurality of measurement points of the elastic frame data is larger than a preset threshold value.
  6. An elastic image classification program comprising: a step of generating displacement frame data of the tissue on the tomographic plane of the subject from a pair of RF signal frame data in different compression states obtained from reflected echo signals measured in advance by an ultrasonic probe; a step of generating elastic frame data by calculating, based on the generated displacement frame data, elasticity information representing the hardness or softness of the tissue on the tomographic plane; a step of determining, based on the electrocardiographic waveform of the subject, a plurality of elastic frame data obtained a preset time after the appearance of a preset characteristic waveform, among the plurality of characteristic waveforms that repeatedly appear in the electrocardiographic waveform of the subject, to be a plurality of analysis elastic frame data, and selecting the plurality of analysis elastic frame data; a step of classifying, based on the plurality of analysis elastic frame data, the elasticity image generated from the elastic frame data into one of a plurality of stages set according to the degree of disease progression of the tissue on the tomographic plane of the subject; and a step of displaying the classification and determination result together with the elasticity image.
JP2011531908A 2009-09-16 2010-09-10 Ultrasonic diagnostic apparatus and elasticity image classification program Active JP5726081B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2009214115 2009-09-16
PCT/JP2010/065623 WO2011034005A1 (en) 2009-09-16 2010-09-10 Ultrasonograph, elastic image classification method, and elastic image classification program
JP2011531908A JP5726081B2 (en) 2009-09-16 2010-09-10 Ultrasonic diagnostic apparatus and elasticity image classification program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011531908A JP5726081B2 (en) 2009-09-16 2010-09-10 Ultrasonic diagnostic apparatus and elasticity image classification program

Publications (2)

Publication Number Publication Date
JPWO2011034005A1 JPWO2011034005A1 (en) 2013-02-14
JP5726081B2 (en) 2015-05-27

Family

ID=43758610

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011531908A Active JP5726081B2 (en) 2009-09-16 2010-09-10 Ultrasonic diagnostic apparatus and elasticity image classification program

Country Status (2)

Country Link
JP (1) JP5726081B2 (en)
WO (1) WO2011034005A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013049116A2 (en) * 2011-09-26 2013-04-04 Acuitas Medical Limited Method for detection of characteristics of organ fibrosis
JP6214974B2 (en) * 2012-09-10 2017-10-18 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus, image processing apparatus, and image processing method
JP6133038B2 (en) * 2012-10-23 2017-05-24 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Image processing method, apparatus, and program
CN104321019B (en) * 2013-01-18 2016-08-17 奥林巴斯株式会社 Ultrasound observation system
KR101512291B1 (en) 2013-05-06 2015-04-15 삼성메디슨 주식회사 Medical imaging apparatus and method of providing medical images
JP5639321B1 (en) 2013-06-26 2014-12-10 オリンパスメディカルシステムズ株式会社 Ultrasonic observation system and method of operating the ultrasonic observation system
JP2017006213A (en) * 2015-06-17 2017-01-12 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and control program

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004105615A1 (en) * 2003-05-30 2004-12-09 Hitachi Medical Corporation Ultrasonic probe and ultrasonic elasticity imaging device
JP2005118152A (en) * 2003-10-14 2005-05-12 Hitachi Medical Corp Ultrasonic diagnostic apparatus
JP2005152405A (en) * 2003-11-27 2005-06-16 Hitachi Medical Corp Ultrasonic diagnostic apparatus
JP2005334196A (en) * 2004-05-26 2005-12-08 Hitachi Medical Corp Ultrasonic diagnostic equipment
WO2006106852A1 (en) * 2005-03-30 2006-10-12 Hitachi Medical Corporation Ultrasonograph
JP2007282932A (en) * 2006-04-19 2007-11-01 Hitachi Medical Corp Method of generating elastic image and ultrasonograph
JP2007301086A (en) * 2006-05-10 2007-11-22 Hitachi Medical Corp Ultrasonic diagnostic equipment
WO2008010500A1 (en) * 2006-07-18 2008-01-24 Hitachi Medical Corporation Untrasonic diagnosis device
JP2008212522A (en) * 2007-03-07 2008-09-18 Hitachi Medical Corp Ultrasonic diagnostic apparatus
JP2009240464A (en) * 2008-03-31 2009-10-22 Hitachi Medical Corp Ultrasonic diagnostic apparatus

Also Published As

Publication number Publication date
WO2011034005A1 (en) 2011-03-24
JPWO2011034005A1 (en) 2013-02-14

Legal Events

Date Code Title Description
2013-08-29 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2013-08-29 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2014-06-09 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2014-07-28 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2015-01-13 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2015-02-03 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2015-03-17 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2015-03-31 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 5726081; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)
S533 Written request for registration of change of name (JAPANESE INTERMEDIATE CODE: R313533)
S111 Request for change of ownership or part of ownership (JAPANESE INTERMEDIATE CODE: R313111)
R350 Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)