CN106725564B - Image processing apparatus and image processing method - Google Patents


Info

Publication number: CN106725564B
Authority: CN (China)
Prior art keywords: specific, image, difference, specific part, registration
Legal status: Active (granted)
Application number: CN201510833447.1A
Other languages: Chinese (zh)
Other versions: CN106725564A
Inventors: 唐喆, 李广
Current assignee: Canon Medical Systems Corp
Original assignee: Toshiba Medical Systems Corp
Application filed by Toshiba Medical Systems Corp
Priority: CN201510833447.1A (published as CN106725564A, granted as CN106725564B); JP2016137987A (granted as JP6915969B2)

Abstract

The present invention provides an image processing apparatus and an image processing method capable of automatically registering a specific part on an image with high accuracy. The image processing apparatus includes: an image acquisition unit that acquires a first image and a second image that are acquired at different timings in time series and that each include a specific part of a subject; a recognition unit that recognizes the specific part in each of the images acquired by the image acquisition unit, identifying a plurality of first specific part candidates on the first image and a plurality of second specific part candidates on the second image; and a registration unit that determines the correspondence relationship between a first specific part candidate and a second specific part candidate representing the same specific part, based on the differences in feature amounts between each of the plurality of first specific part candidates and each of the plurality of second specific part candidates.

Description

Image processing apparatus and image processing method
Technical Field
The present invention relates to an image processing apparatus and an image processing method for processing an image.
Background
Currently, medical devices such as X-ray CT (computed tomography) devices and ultrasonic scanning devices are widely used.
In practice, to assist physicians in interpreting images, it is sometimes necessary to compare images acquired by these medical devices at different times, such as before and after treatment. For example, after a specific part such as a blood vessel or a tumor site is treated, the specific part often changes morphologically; in such a case, it is difficult to register the specific part between images acquired at different times. Here, "registration" means matching the spatial coordinates of different images so that the same regions of the images correspond to each other.
In particular, when the image acquisition environment changes greatly, when the interval between acquisitions is long, or when images acquired by different apparatuses are compared, the difference between the images before and after treatment is large even for the same subject, as shown in Fig. 10. On the post-treatment image in particular, it is difficult to find the part corresponding to a specific part of the pre-treatment image by gray-level pixel comparison, so registration is difficult.
To register such largely differing images with the conventional method, an operator generally must manually input the location of a specific part or a common region of interest on the comparison image, such as the post-treatment image, and registration is then performed based on that input.
Patent document 1 (US8831708B2), for example, discloses a method of converting different coordinate systems into the same coordinate system using a connection device. However, this method still cannot solve the problem that registration fails when the positional deviation in the images is large.
Patent document 2 (US2015/0209015A1) discloses a method of registering images acquired by a plurality of medical devices (MR, CT, and the like) using preset reference feature points. However, this method also cannot avoid the registration errors that occur when the image position is wrong.
In addition, patent document 3 (US8731264B2) discloses a method of performing registration by selecting a point cloud on an ultrasonic image. However, this method still requires positioning with a preset point cloud and cannot determine whether that data is correct.
Therefore, the prior art requires a reference region to be set in advance, and it is difficult to accurately register images with large deformation (due to treatment, respiratory motion, etc.) or with small overlapping portions.
Disclosure of Invention
The present invention has been made in view of the above problems, and an object thereof is to provide an image processing apparatus and an image processing method capable of automatically registering a specific part on an image with high accuracy.
An aspect of the present invention is an image processing apparatus including: an image acquisition unit that acquires a first image and a second image that are acquired at different timings in time series and that each include a specific part of a subject; a recognition unit that recognizes the specific part in each of the images acquired by the image acquisition unit, identifying a plurality of first specific part candidates on the first image and a plurality of second specific part candidates on the second image; and a registration unit that determines the correspondence relationship between a first specific part candidate and a second specific part candidate representing the same specific part, based on the differences in feature amounts between each of the plurality of first specific part candidates and each of the plurality of second specific part candidates.
Another aspect of the present invention is an image processing method including: an image acquisition step of acquiring a first image and a second image that are acquired at different timings in time series and that each include a specific part of a subject; a recognition step of recognizing the specific part in each of the images acquired in the image acquisition step, identifying a plurality of first specific part candidates on the first image and a plurality of second specific part candidates on the second image; and a registration step of determining the correspondence relationship between a first specific part candidate and a second specific part candidate representing the same specific part, based on the differences in feature amounts between each of the plurality of first specific part candidates and each of the plurality of second specific part candidates.
By comprehensively identifying specific part candidates in each image and then screening them using a composite index of a plurality of feature amounts, a specific part on the images can be registered automatically and accurately. Even when the difference between the images is large, no reference point or region of interest needs to be specified, and registration can be performed with high accuracy.
Due to the above features and effects, the present invention can also be used for comparing images from different image processing apparatuses.
Drawings
Fig. 1 is a block diagram showing a configuration of an image processing apparatus according to a first embodiment of the present invention.
Fig. 2 is a block diagram showing a configuration of a registration unit according to a first embodiment of the present invention.
Fig. 3 is a flowchart showing an image registration process according to the first embodiment of the present invention.
Fig. 4 is a flowchart showing an example of the difference calculation of a plurality of feature amounts according to the first embodiment of the present invention.
Fig. 5 is a schematic diagram showing a configuration in which the image processing apparatus is applied to a system in which a plurality of medical devices are linked.
Fig. 6A and 6B are explanatory views illustrating a specific example of determining a specific site to which the present invention is applied.
Fig. 7 is a block diagram showing a configuration of an image processing apparatus according to a second embodiment of the present invention.
Fig. 8 is a flowchart showing an image registration process according to the second embodiment of the present invention.
Figs. 9A to 9C are comparative examples of the present invention and the prior art.
Fig. 10 is a schematic diagram illustrating image variations at different times.
Detailed Description
The present invention relates to an image processing apparatus that processes images. The image processing apparatus can be realized by a device having a CPU (central processing unit), such as an independent computer connected to an image acquisition apparatus, executing software that implements each function of the image processing apparatus, or realized in hardware as circuits capable of executing each function. The image processing apparatus of the present invention may also be installed in advance as part of a medical image acquisition apparatus such as a CT (computed tomography) apparatus or an ultrasound apparatus.
Preferred embodiments of the present invention will be described below with reference to the accompanying drawings. In the different embodiments, the same reference numerals are used for the same components, and the overlapping description is appropriately omitted.
(first embodiment)
Fig. 1 is a block diagram showing a configuration of an image processing apparatus according to a first embodiment of the present invention. As shown in fig. 1, the image processing apparatus 100 includes an image acquisition unit 10, a recognition unit 20, and a registration unit 30.
The image acquisition unit 10 is configured to acquire a plurality of images including a specific portion in a subject acquired at timings different from each other in time series. For example, the image acquiring unit 10 acquires and stores a plurality of ultrasound medical images acquired by an ultrasound apparatus at different times from the ultrasound apparatus connected to the image processing apparatus 100. The image acquisition unit 10 may be a circuit or a software module that can realize the above functions.
The recognition unit 20 recognizes a specific portion in each image acquired by the image acquisition unit 10, and sets a plurality of recognized specific portions as specific portion candidates for each image. The identification section 20 may be a circuit or a software module capable of realizing the above functions.
The specific region is a region having similar characteristics on an image, and represents a specific region on a medical image of the subject, for example, a tumor site, a blood vessel site, a specific tissue set, or a site of a medical intervention element.
A specific part can be identified using an existing method for extracting characteristic regions of an image, for example MSER (Maximally Stable Extremal Regions). Specifically, the pixels constituting the image are analyzed using one or more criteria, such as the amount of change of the gray value, the maximum area of a region, the maximum change rate, or the minimum variation of a stable region, and a region that meets the selected criteria is extracted as a specific part.
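As a rough illustration of this candidate-extraction step, the sketch below collects bright connected components above a single gray-level threshold. It is a greatly simplified stand-in for MSER, which tests region stability across many thresholds (e.g. via OpenCV's `cv2.MSER_create`); the function name, threshold, and minimum-area values are illustrative assumptions, not values from the patent.

```python
import numpy as np
from collections import deque

def extract_candidates(gray, threshold=128, min_area=4):
    """Toy candidate extractor: bright 4-connected components above a
    gray-level threshold (a simplified stand-in for MSER)."""
    mask = gray >= threshold
    seen = np.zeros(mask.shape, dtype=bool)
    regions = []
    for seed in zip(*np.nonzero(mask)):
        if seen[seed]:
            continue
        q, pixels = deque([seed]), []
        seen[seed] = True
        while q:
            y, x = q.popleft()
            pixels.append((y, x))
            # Visit the 4-connected neighbours still inside the mask.
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    q.append((ny, nx))
        if len(pixels) >= min_area:
            regions.append(np.array(pixels))
    return regions
```

Each returned region is an (N, 2) array of (y, x) pixel coordinates, which is the form the later feature calculations assume.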
In the present invention, since the operator is not required to input or confirm a specific portion, a plurality of specific portions can be generally recognized by comprehensively recognizing the image. These specific parts that meet the recognition criteria are stored as specific part candidates.
The registration unit 30 registers a plurality of images acquired at different times and determines the correspondence relationship of a specific part between two images. For example, suppose two images before and after treatment are to be registered, and the recognition unit 20 has recognized specific part candidates R1 to Rn on the first image and specific part candidates F1 to Fm on the second image (n and m being natural numbers). The registration unit 30 pairs each candidate R from R1 to Rn with each candidate F from F1 to Fm, calculates a feature amount difference between R and F for every pair (R, F) so formed, and compares these differences to determine the pair that represents the same specific part on the two images. Based on the pair obtained by the registration unit 30, the image processing apparatus 100 can align the images and subsequently superimpose or display them. The registration unit 30 may be a circuit or a software module capable of realizing the above functions.
The selected feature amounts can be set in advance, and are, for example, at least one of the variance, average luminance, entropy, Euclidean distance, volume ratio, modified Jaccard similarity coefficient, and the like of the pixels constituting the specific part.
The specific method for determining the pair of specific parts will be described below. For example, for a certain feature amount, the registration unit 30 calculates the difference in that feature amount within each pair as an index, compares the indexes of the respective pairs, and determines, as the pair representing the same specific part, either a pair whose index exceeds a predetermined threshold or the top-ranked pair when the pairs are sorted by the index.
In addition, the invention also provides a method for forming a comprehensive index by using a plurality of characteristic quantities for evaluation. The registration unit 30 may calculate a composite index for each specific part pair using a difference between a plurality of feature amounts that can be acquired on the image, and may set the specific part pair having the smallest composite index as a specific part pair representing the same specific part.
Fig. 2 is a block diagram showing a configuration of the registration unit 30 that executes the multi-feature comprehensive evaluation method. As shown in fig. 2, the registration unit 30 includes a feature difference calculation unit 31 and an evaluation unit 32.
The feature amount difference calculation unit 31 calculates, for each specific part pair, the difference between R and F for each of a plurality of feature amounts.
The selected feature amounts may be set in advance and may be, for example, some of the following, computed over the pixels constituting a specific part: the variance (SD), average luminance (mean intensity), entropy, Euclidean distance, volume ratio, and modified Jaccard similarity coefficient. These feature amounts can be calculated by existing methods; for example, the following standard formulas are commonly used (I_i denotes the gray value of the i-th of N pixels in the part, and μ their mean):

SD = sqrt( (1/N) · Σ_i (I_i − μ)² )

Mean intensity = μ = (1/N) · Σ_i I_i

Entropy = − Σ_k p_k · log₂(p_k), where p_k is the probability of gray level k within the part

Euclidean Distance = ‖ baryCenterR − transformedBaryCenterF ‖, the distance between the barycenter of R and the transformed barycenter of F

Volume ratio = V_R / V_F, the ratio of the volumes of the two parts

Modified Jaccard similarity = |R ∩ F| / |R ∪ F|, the overlap of the two parts after alignment

Details of the calculation of the individual feature amounts can be found in the related art and are omitted here.
For each designated feature amount, the feature amount difference calculation unit 31 calculates the feature amount of the specific part R and that of the specific part F in each pair, and then takes the difference between them for the same feature amount. For example, when VR is the value of a certain feature amount of the specific part R and VF is the value of the same feature amount of the specific part F, the difference for that feature amount is (VR − VF).
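A minimal sketch of this per-pair difference computation, using three of the listed feature amounts (mean intensity, SD, entropy) with their standard definitions. The function names and the 16-bin histogram used for the entropy are illustrative choices, not taken from the patent.

```python
import numpy as np

def region_features(gray, pixels):
    """Feature vector [mean intensity, SD, entropy] for one candidate.
    `pixels` is an (N, 2) array of (y, x) coordinates; the 16-bin
    gray-level histogram for the entropy is an assumed choice."""
    vals = gray[pixels[:, 0], pixels[:, 1]].astype(float)
    mean = vals.mean()
    sd = vals.std()
    hist, _ = np.histogram(vals, bins=16, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -(p * np.log2(p)).sum()
    return np.array([mean, sd, entropy])

def feature_difference(gray_r, pix_r, gray_f, pix_f):
    """Element-wise difference VR - VF between the two candidates."""
    return region_features(gray_r, pix_r) - region_features(gray_f, pix_f)
```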
The evaluation unit 32 calculates a composite index for each of the specific portion pairs based on the difference between the plurality of feature values obtained by the feature value difference calculation unit 31, and sets the specific portion pair having the smallest composite index as a specific portion pair representing the same specific portion.
The evaluation unit 32 can obtain the comprehensive index S (R, F) by using the following formula 1.
Formula 1:

S(R, F) = sqrt( Σ_{i=1}^{N} w_i · (VR_i − VF_i)² )

In formula 1, N is the number of types of feature amounts, VR_i is the value of the i-th feature amount of the specific part R, VF_i is the value of the same feature amount of the specific part F, and w_i is the weight given to the i-th feature amount.
Here, the comprehensive index uses a square root calculation method as an example, but the method of obtaining S (R, F) is not limited to this, and other calculation methods may be used.
One way to determine the weights is to use the ratio of the sum of the differences of the feature amounts to the difference of each feature amount. For example, when two feature amounts A and B are selected, the difference of A is 90, and the difference of B is 10, then weight(A) = 100/90 and weight(B) = 100/10, where 100 = 90 + 10 is the sum of the differences.
Alternatively, a normalized weight coefficient may be given to each feature amount. For example, when four feature amounts A, B, C, and D are selected and their differences are 90, 10, 50, and 60 respectively, the formula w_i = (difference of the i-th feature amount − minimum difference) / (maximum difference − minimum difference) normalizes the four values into the interval (0, 1). In this example, weight(A) = 1, weight(B) = 0.00001 (a small value used in place of 0 so that the weight stays within the open interval), weight(C) = 0.5, and weight(D) = 0.625.
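The min-max normalization of the weights can be sketched as follows (the function name is hypothetical). Note it maps the smallest difference to exactly 0, for which the example substitutes a small positive value.

```python
def normalized_weights(diffs):
    """w_i = (d_i - min) / (max - min), mapping per-feature differences
    onto [0, 1] as in the four-feature example (A=90, B=10, C=50, D=60)."""
    lo, hi = min(diffs), max(diffs)
    return [(d - lo) / (hi - lo) for d in diffs]
```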
The weight value may be set by other methods, for example, by directly receiving an input of the weight value from an operator or other devices through an interface.
In the first embodiment, the image acquisition unit 10 corresponds to an "image acquisition unit", the recognition unit 20 corresponds to an "recognition unit", and the registration unit 30 corresponds to a "registration unit". The flow of the image registration processing performed by the image processing apparatus 100 is described below.
Fig. 3 is a flowchart showing an image registration process according to the first embodiment of the present invention. As shown in fig. 3, when the registration process is started, first, the image acquiring unit 10 acquires a plurality of images including a specific portion in the subject, which are acquired at timings different from each other in time series, from a medical device such as an ultrasound apparatus (step S301). For example, the images shown in fig. 6A and 6B are taken. Fig. 6A and 6B are explanatory views illustrating a specific example of determining a specific site to which the present invention is applied.
Next, in step S302, the recognition unit 20 recognizes a specific portion in each image acquired by the image acquisition unit 10, and sets a plurality of recognized specific portions as specific portion candidates for each image. For example, the recognition unit 20 recognizes the specific portions R1, R2, R3 on the image of fig. 6A, and recognizes the specific portions F1, F2 on the image of fig. 6B.
Next, the process proceeds to step S303, and the registration unit 30 calculates a difference between the feature values of the specific portion pair between the images to check the specific portion pair representing the same specific portion.
As described above, a specific calculation method is, for example, registration of a specific portion using a plurality of feature quantities. Fig. 4 is a flowchart showing an example of the difference calculation of a plurality of feature amounts according to the first embodiment of the present invention. In step S401, the feature difference calculation unit 31 of the registration unit 30 calculates differences between a plurality of features in each specific part pair for each specific part pair.
Next, the process proceeds to step S402, and the evaluation unit 32 of the registration unit 30 calculates a composite index for each specific part pair based on the difference between the plurality of feature values obtained by the feature value difference calculation unit 31, and sets the specific part pair having the smallest composite index as a specific part pair representing the same specific part.
In the example shown in fig. 6, the table shown in table 1 is obtained by calculating the composite index S (R, F) for all combinations of the specific sites R1, R2, and R3 and the specific sites F1 and F2, respectively.
Table 1:
        R1    R2    R3
F1      5%    4%    15%
F2      3%    5%    86%
The percentages in Table 1 represent the composite index S(R, F) of the corresponding specific part pair (R, F). As can be seen from Table 1, the index of the pair composed of F2 and R3 is 86%, much higher than that of any other combination, so F2 is confirmed to correspond to R3, and (R3, F2) represents the same specific part between the images of Figs. 6A and 6B. At this point the registration processing ends, and the image processing apparatus can perform registration or superimposed display of the two images based on this specific part pair, to assist a physician in reading the images.
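The selection step of Table 1 can be sketched as below. Here the tabulated percentage is treated as a score in which the stand-out largest value identifies the matching pair, as in the example; the dictionary layout and the 0.5 threshold are illustrative assumptions.

```python
def best_pair(index_table, threshold=0.5):
    """Pick the (R, F) pair whose score stands out above the threshold,
    mirroring Table 1 where (R3, F2) = 86% far exceeds the rest.
    `index_table` maps (R, F) name pairs to scores; returns None when
    no pair clears the threshold."""
    best = max(index_table, key=index_table.get)
    return best if index_table[best] > threshold else None

# Scores from Table 1, written as fractions.
table = {("R1", "F1"): 0.05, ("R2", "F1"): 0.04, ("R3", "F1"): 0.15,
         ("R1", "F2"): 0.03, ("R2", "F2"): 0.05, ("R3", "F2"): 0.86}
```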
Fig. 6 shows an example, but the correspondence relationship between the specific portions is not limited to this, and for example, the obtained integrated index may be compared with a predetermined threshold value, and a pair of specific portions exceeding the threshold value may be confirmed as representing the same specific portion.
Through the above registration processing, even if the morphology and position differ greatly between images acquired at different times, a plurality of specific parts can be extracted from each image as specific part candidates and comprehensively evaluated using a plurality of feature amounts, so that the correspondence of the specific parts is found automatically and the images are registered with high accuracy.
(modification example)
In the first embodiment, registration is performed for medical images of the same category, but the present invention can also be used to register medical images of different categories from different medical devices.
Fig. 5 is a schematic diagram showing a configuration in which the image processing apparatus is applied to a system in which a plurality of medical devices are linked.
In the system shown in fig. 5, the apparatus to which the image processing apparatus 100 is attached is connected to the CT apparatus 200 and the ultrasound apparatus 300, respectively, so that the X-ray image can be acquired from the CT apparatus 200, the ultrasound image can be acquired from the ultrasound apparatus 300, and the image processing apparatus 100 registers a specific portion between the X-ray image and the ultrasound image.
Since the difference between medical images acquired by different types of medical devices is large, generally, it is easy to designate an incorrect region of interest as a specific part using a conventional method, thereby affecting the effect of registration.
By automatically identifying a plurality of specific parts with the image processing apparatus 100 of the present invention and registering the specific part pairs using the composite index of a plurality of feature amounts, registration between images with large differences becomes practical and registration accuracy is improved.
(second embodiment)
The second embodiment is different from the first embodiment in that the second embodiment adds further fine registration to the registration processing and transforms the coordinate system of the image.
Hereinafter, differences between the second embodiment and the first embodiment will be mainly described, and redundant description will be omitted as appropriate.
Fig. 7 is a block diagram showing a configuration of an image processing apparatus 100A according to a second embodiment of the present invention. As shown in fig. 7, the image processing apparatus 100A includes an image acquisition unit 10, a recognition unit 20, a coordinate transformation unit 40, a registration unit 30, and a registration modification unit 50.
The image acquisition unit 10 is configured to acquire a plurality of images including a specific portion in a subject acquired at timings different from each other in time series. The image acquisition unit 10 may be a circuit or a software module that can realize the above functions.
The recognition unit 20 recognizes a specific portion in each image acquired by the image acquisition unit 10, and sets a plurality of recognized specific portions as specific portion candidates for each image. The identification section 20 may be a circuit or a software module capable of realizing the above functions.
The coordinate conversion unit 40 changes the coordinates of the images to unify the coordinate systems of different images. The coordinate transformation unit 40 may be a circuit or a software module capable of realizing the above functions.
For example, when a pair of images consisting of the reference image and the comparison image is aligned, the coordinate conversion unit 40 may perform coordinate conversion by moving or rotating the comparison image so that the coordinate system of the comparison image is matched with the coordinate system of the reference image. Of course, it is also possible to transform the two images requiring registration together into the same coordinate system.
The specific coordinate transformation method may use an existing method. For example, when registering an ultrasound image, a matrix transformation is performed using magnetic field coordinate information of the ultrasound image, and a displacement amount is given to a pixel matrix to perform a coordinate change.
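A minimal sketch of such a coordinate change, assuming a rigid 2-D transform (rotation followed by translation) applied to point coordinates; a real system would build the transform from the magnetic-field coordinate information mentioned above.

```python
import numpy as np

def rigid_transform(points, angle_rad, shift):
    """Rotate 2-D points by `angle_rad` about the origin, then translate
    by `shift` -- a minimal stand-in for the matrix transformation that
    unifies two coordinate systems."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, -s],
                    [s, c]])
    return points @ rot.T + np.asarray(shift)
```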
The registration unit 30 registers the images after the coordinate system is unified by the coordinate transformation unit 40 based on the specific portion recognized by the recognition unit 20. The registration unit 30 determines a specific part pair representing the same specific part based on the difference between the plurality of feature values of each specific part pair composed of specific part candidates. The registration section 30 may be a circuit or a software module capable of realizing the above functions.
The specific configuration of the registration unit 30 and the method of calculating the feature amount may refer to the method of calculating the comprehensive index described in the first embodiment.
In the second embodiment, by calculating the comprehensive index using a plurality of characteristic parameters, it is possible to cope with a situation in which the magnetic field coordinate information used in the coordinate conversion unit 40 is not accurate. This is because the influence of a certain error parameter can be reduced by using a plurality of characteristic parameters in the calculation of the composite index.
The registration modification unit 50 re-registers the image using the gradation parameters based on the correspondence relationship of the specific part pair determined by the registration unit 30. The registration modifier 50 may be a circuit or a software module capable of implementing the above functions.
That is, the registration modification unit 50 uses the correspondence of the same specific part between the two images, as determined by the registration unit 30, in place of the manually designated feature points or feature regions of the existing methods, and then applies an existing gray-scale registration method to further refine the registration of the images.
The correspondence relationship on which the registration modification unit 50 relies is more accurate than one accepted by conventional means such as manual input by an operator, so the gray-scale-based registration method can further optimize the registration result of the image processing apparatus 100A.
Existing gray-scale registration methods include the normalized cross-correlation (NCC) method, the Nelder-Mead downhill simplex method, and the like; a detailed description thereof is omitted here.
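A minimal sketch of the NCC similarity used by such gray-scale methods, under its standard definition; a full registration would optimize a transform (e.g. with Nelder-Mead) to maximize this score between the two images.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient between two equally
    sized image patches; values near 1 indicate a good gray-level match,
    values near -1 an inverted one."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0
```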
The registration modification unit 50 performs a further optimization of the registration result of the image processing apparatus 100A, and may therefore be omitted.
In the second embodiment, the image acquisition unit 10 corresponds to an "image acquisition unit", the recognition unit 20 corresponds to an "recognition unit", the registration unit 30 corresponds to a "registration unit", the coordinate transformation unit 40 corresponds to a "coordinate transformation unit", and the registration modification unit 50 corresponds to a "registration modification unit". The flow of image registration processing performed by the image processing apparatus 100A is described below.
Fig. 8 is a flowchart showing an image registration process according to the second embodiment of the present invention.
As shown in fig. 8, when the registration process is started, first, the image acquiring unit 10 acquires a plurality of images including a specific portion in the subject, which are acquired at timings different from each other in time series, from a medical device such as an ultrasound apparatus (step S801).
Next, in step S802, the recognition unit 20 recognizes a specific portion in each image acquired by the image acquisition unit 10, and sets a plurality of recognized specific portions as specific portion candidates for each image.
Next, the process proceeds to step S803, and the coordinate transformation unit 40 transforms the coordinates of the comparative image, and shifts or rotates the pixels by applying a displacement amount to the pixels, thereby unifying the coordinate systems of the different images.
Thereafter, in step S804, the registration unit 30 calculates differences in the feature amounts of specific part pairs between the images to determine the specific part pairs representing the same specific part. For the specific determination process, refer to fig. 4.
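One possible reading of the step S804 matching, together with the weighting recited in the claims (each feature-amount difference receives a coefficient inversely proportional to that difference), can be sketched as follows. The weight normalization, the epsilon guard, and all names are illustrative assumptions rather than the patent's prescribed formula:

```python
import numpy as np

def composite_index(feature_diffs, eps: float = 1e-6) -> float:
    """Composite index of one specific-part pair from its per-feature differences.

    Each difference is weighted by a coefficient inversely proportional to its
    own magnitude, so no single outlier feature dominates the score.
    """
    d = np.asarray(feature_diffs, dtype=float)
    w = 1.0 / (d + eps)      # inversely proportional weighting
    w = w / w.sum()          # normalize the weights to sum to 1
    return float((w * d).sum())

def best_pair(pair_diffs: dict) -> str:
    """Return the key of the candidate pair with the smallest composite index."""
    return min(pair_diffs, key=lambda k: composite_index(pair_diffs[k]))
```

For example, with pre-normalized differences such as {"A1-B1": [0.1, 0.2], "A1-B2": [0.5, 0.6]}, the pair with the smaller differences would be selected as representing the same specific part.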
Finally, in step S805, the registration modification unit 50 performs fine registration by carrying out gray-scale registration on the images again using gray-scale parameters, taking as input the correspondence relationship of the specific part pairs determined by the registration unit 30.
FIGS. 9A-9C show comparative examples of the present invention and the prior art. The left side of fig. 9A shows a reference image acquired before treatment, in which the center of the image is a lesion region. The right side of fig. 9A shows a comparison image acquired after treatment.
As can be seen from fig. 9A, the reference image and the comparison image differ greatly from each other as a whole due to the therapeutic effect, movement of the subject, and the like.
Fig. 9B shows the result of registration by a conventional gray-scale registration method, with an operator inputting a region of interest. In the overlay display after alignment, a large error occurs when the portion in the black frame is associated with the lesion region in the reference image.
On the other hand, fig. 9C shows the result of registration obtained by the image processing apparatus of the present invention. By the coordinate system transformation involving movement or rotation, the recognition of the specific portions, and the comprehensive evaluation of the plurality of feature amounts, the portion surrounded by the black frame in fig. 9C can be correctly associated with the lesion region in the reference image, and the accuracy of registration is improved as compared with the conventional technique.
(modification example)
In the second embodiment, the coordinate system transformation by the coordinate transformation unit 40 is performed after the recognition unit 20 recognizes the specific portion; however, the present invention is not limited to this, and the coordinate system transformation by the coordinate transformation unit 40 may be performed before the recognition unit 20 recognizes the specific portion.
The image processing apparatus of the present invention may be incorporated in a medical device as a circuit capable of realizing the functions described in the respective embodiments, or may be distributed as a program executable by a computer and stored in a storage medium such as a magnetic disk (floppy (registered trademark), hard disk, or the like), an optical disk (CD-ROM, DVD, or the like), a magneto-optical disk (MO), a semiconductor memory, or the like.
Further, MW (middleware) or the like, which runs on a computer based on instructions from a program installed in the computer from a storage medium, such as an OS (operating system), database management software, network software, or the like, may execute a part of the processing for realizing each of the above embodiments.
While the embodiments of the present invention have been described above, these embodiments are presented as examples and are not intended to limit the scope of the invention. These new embodiments may be implemented in various other ways, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are also included in the invention described in the claims and the equivalent scope thereof.

Claims (12)

1. An image processing apparatus, comprising:
an image acquisition unit that acquires a first image and a second image that are acquired at timings different from each other in time series and that include a specific portion in a subject, the specific portion being a region having similar characteristics on the images;
a recognition unit configured to recognize the specific portion in each of the images acquired by the image acquisition unit, and recognize a plurality of first specific portion candidates on the first image and a plurality of second specific portion candidates on the second image; and
a registration unit that determines a correspondence relationship between a first specific region candidate and a second specific region candidate representing the same specific region, based on a difference in feature amount between each of the plurality of first specific region candidates and each of the plurality of second specific region candidates,
the above registration unit includes:
a feature amount difference calculation unit that calculates, for each specific part pair consisting of a first specific part candidate and a second specific part candidate, differences between a plurality of feature amounts in the specific part pair, the differences being obtained by calculating, for each of the plurality of feature amounts, a difference between the first specific part candidate and the second specific part candidate in the specific part pair; and
an evaluation unit that calculates a composite index for each of the specific part pairs based on the differences between the plurality of feature amounts obtained by the feature amount difference calculation unit, and takes the specific part pair having the smallest composite index as the first specific part candidate and the second specific part candidate representing the same specific part,
wherein the evaluation unit obtains the composite index by applying a weighting coefficient inversely proportional to the difference between the respective feature amounts to the differences between the plurality of feature amounts obtained by the feature amount difference calculation unit.
2. The image processing apparatus according to claim 1,
the evaluation unit obtains the composite index by normalizing the differences between the plurality of feature amounts obtained by the feature amount difference calculation unit.
3. The image processing apparatus according to claim 1,
the feature amount is one of a variance value, average brightness, entropy, Euclidean distance, volume ratio, and optimized Jaccard similarity coefficient.
4. The image processing apparatus according to claim 1,
further comprising a coordinate transformation unit that unifies the coordinate systems of the first image and the second image acquired by the image acquisition unit by coordinate transformation of the images,
wherein the registration unit obtains a difference in feature amount between each of the plurality of first specific part candidates and each of the plurality of second specific part candidates based on the transformed coordinate system.
5. The image processing apparatus according to claim 1,
further comprising a registration modification unit that re-registers the first specific part candidate and the second specific part candidate representing the same specific part by using a gray scale parameter, based on the correspondence relationship between the first specific part candidate and the second specific part candidate representing the same specific part determined by the registration unit.
6. The image processing apparatus according to claim 1,
the image acquisition unit acquires an ultrasound image including the specific portion as a first image, and acquires an X-ray image including the specific portion as a second image at a timing different from that of the first image.
7. An image processing method, comprising:
an image acquisition step of acquiring a first image and a second image which are acquired at timings different from each other in time series and which include a specific portion in a subject, the specific portion being a region having similar characteristics on the images;
a recognition step of recognizing the specific region as a non-tumor region in each of the images acquired in the image acquisition step, and recognizing a plurality of first specific region candidates on the first image and a plurality of second specific region candidates on the second image; and
a registration step of determining a correspondence relationship between a first specific region candidate and a second specific region candidate representing the same specific region based on a difference in feature amount between each of the plurality of first specific region candidates and each of the plurality of second specific region candidates,
the registration step includes:
a feature amount difference calculation step of calculating, for each specific part pair consisting of a first specific part candidate and a second specific part candidate, differences between a plurality of feature amounts in the specific part pair, the differences being obtained by calculating, for each of the plurality of feature amounts, a difference between the first specific part candidate and the second specific part candidate in the specific part pair; and
an evaluation step of calculating a composite index for each of the specific part pairs based on the differences between the plurality of feature amounts obtained in the feature amount difference calculation step, and setting the specific part pair having the smallest composite index as the first specific part candidate and the second specific part candidate representing the same specific part,
wherein, in the evaluation step, the composite index is obtained by applying a weighting coefficient inversely proportional to the difference between the respective feature amounts to the differences between the plurality of feature amounts obtained in the feature amount difference calculation step.
8. The image processing method according to claim 7,
in the evaluation step, the composite index is obtained by normalizing the differences between the plurality of feature amounts obtained in the feature amount difference calculation step.
9. The image processing method according to claim 7,
the feature amount is one of a variance value, average brightness, entropy, Euclidean distance, volume ratio, and optimized Jaccard similarity coefficient.
10. The image processing method according to claim 7,
further comprising a coordinate transformation step of unifying the coordinate systems of the first image and the second image acquired in the image acquisition step by coordinate transformation of the images,
in the registration step, a difference in feature amount between each of the plurality of first specific part candidates and each of the plurality of second specific part candidates is obtained based on the transformed coordinate system.
11. The image processing method according to claim 7,
the method further includes a registration modification step of re-registering the correspondence between the first specific part candidate and the second specific part candidate representing the same specific part by using the gray scale parameter based on the correspondence between the first specific part candidate and the second specific part candidate representing the same specific part determined in the registration step.
12. The image processing method according to claim 7,
the image acquisition step acquires an ultrasound image including the specific portion as a first image, and acquires an X-ray image including the specific portion as a second image at a timing different from that of the first image.
CN201510833447.1A 2015-11-25 2015-11-25 Image processing apparatus and image processing method Active CN106725564B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510833447.1A CN106725564B (en) 2015-11-25 2015-11-25 Image processing apparatus and image processing method
JP2016137987A JP6915969B2 (en) 2015-11-25 2016-07-12 Medical image processing equipment, medical image processing method and ultrasonic diagnostic equipment

Publications (2)

Publication Number Publication Date
CN106725564A CN106725564A (en) 2017-05-31
CN106725564B true CN106725564B (en) 2021-08-13

Family

ID=58820208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510833447.1A Active CN106725564B (en) 2015-11-25 2015-11-25 Image processing apparatus and image processing method

Country Status (2)

Country Link
JP (1) JP6915969B2 (en)
CN (1) CN106725564B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110772280B (en) * 2018-07-31 2023-05-23 佳能医疗系统株式会社 Ultrasonic diagnostic apparatus and method, and image processing apparatus and method
CN111045671A (en) * 2018-10-12 2020-04-21 中国移动通信集团有限公司 Image contrast method and device
CN111493931A (en) * 2019-08-01 2020-08-07 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and device and computer readable storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101926676A (en) * 2009-06-25 2010-12-29 谢耀钦 Medical images-tracking method capable of automatically identifying characteristic points and device thereof
CN201719372U (en) * 2009-06-25 2011-01-26 谢耀钦 Medical image tracking device capable of automatically recognizing characteristic point
CN103310453A (en) * 2013-06-17 2013-09-18 北京理工大学 Rapid image registration method based on sub-image corner features

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP2002324238A (en) * 2001-04-26 2002-11-08 Fuji Photo Film Co Ltd Method and device for positioning image
CN1920882A (en) * 2005-08-24 2007-02-28 西门子共同研究公司 System and method for salient region feature based 3d multi modality registration of medical images
CN101390127A (en) * 2005-12-30 2009-03-18 卡尔斯特里姆保健公司 Cross-time inspection method for medical diagnosis
JP4518092B2 (en) * 2006-03-31 2010-08-04 ソニー株式会社 Object recognition device, object recognition method, object recognition program, feature amount registration device, feature amount registration method, and feature amount registration program
WO2008022210A2 (en) * 2006-08-15 2008-02-21 The Board Of Regents Of The University Of Texas System Methods, compositions and systems for analyzing imaging data
EP2095332B1 (en) * 2006-11-16 2010-08-11 Visiopharm A/s Feature-based registration of sectional images
CN101057790A (en) * 2007-06-22 2007-10-24 北京长江源科技有限公司 Treating tumor positioning method and device used for high strength focus ultrasonic knife
JP5835680B2 (en) * 2007-11-05 2015-12-24 株式会社東芝 Image alignment device
US8452126B2 (en) * 2011-06-17 2013-05-28 General Electric Company Method for automatic mismatch correction of image volumes
JP2013174823A (en) * 2012-02-27 2013-09-05 Olympus Corp Image processing device, microscope system and image processing method
US9311570B2 (en) * 2013-12-06 2016-04-12 Kabushiki Kaisha Toshiba Method of, and apparatus for, segmentation of structures in medical images


Also Published As

Publication number Publication date
JP2017097836A (en) 2017-06-01
JP6915969B2 (en) 2021-08-11
CN106725564A (en) 2017-05-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant