CN102814006A - Image contrast device, patient positioning device and image contrast method - Google Patents


Info

Publication number
CN102814006A
Authority
CN
China
Prior art keywords
image
dimension
zone
benchmark
images
Prior art date
Legal status
Granted
Application number
CN2012100221452A
Other languages
Chinese (zh)
Other versions
CN102814006B (en)
Inventor
平泽宏祐
Current Assignee
Hitachi Ltd
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN102814006A publication Critical patent/CN102814006A/en
Application granted granted Critical
Publication of CN102814006B publication Critical patent/CN102814006B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Radiation-Therapy Devices (AREA)

Abstract

The present invention relates to an image matching device, a patient positioning device, and an image matching method. The image matching device comprises a matching processing unit that matches a three-dimensional reference image against a three-dimensional current image and calculates a position correction so that the position and posture of the patient's affected part in the current image coincide with those in the reference image. The matching processing unit comprises: a first matching unit, which performs a first matching based on the reference image; and a second matching unit, which performs a second matching of a predetermined search target region against a predetermined template region, wherein the template region is generated from one of the reference image and the current image based on the result of the first matching, and the search target region is generated, based on the result of the first matching, from whichever of the reference image and the current image was not used to generate the template region.

Description

Image matching device, patient positioning device, and image matching method
Technical field
The present invention relates to an image matching device and a patient positioning device for radiotherapy apparatus that performs cancer treatment by irradiating a patient's affected part with X-rays, gamma rays, particle beams, or other radiation. The image matching device uses CT image data and the like, and the patient positioning device uses the image matching device to position the patient at the irradiation position where the ionizing radiation is delivered.
Background technology
In recent years, cancer treatment apparatus using particle beams such as protons or heavy ions (known in particular as particle beam therapy apparatus) have been developed and constructed as radiotherapy apparatus for cancer treatment. It is well known that, compared with conventional radiotherapy using X-rays or gamma rays, particle beam therapy can concentrate the irradiation on the cancerous part; that is, the particle beam can be delivered accurately according to the shape of the affected part, so that treatment is possible without affecting normal cells.
In particle beam therapy, it is essential that the particle beam accurately irradiate the affected part, such as a cancer. Therefore, when particle beam therapy is performed, the patient is fixed with a restraint or the like so as not to shift relative to the treatment table of the treatment room (irradiation room). To position the affected part accurately within the irradiation range, the patient is first roughly set up and fixed using laser pointers or the like, and the affected part is then positioned precisely using X-ray images or the like.
Patent Document 1 proposes a bed positioning device and positioning method in which a plurality of identical markers (monuments) are designated at the same positions in both a reference image, which is an X-ray fluoroscopic image, and a current image captured with an X-ray receptor, two-stage pattern matching is performed, and information for driving the treatment table for positioning is generated. In the first pattern matching, a second region is set in the two-dimensional current image; this second region is approximately the same size as a first region, which is set in the two-dimensional reference image so as to contain the isocenter (beam irradiation center). The second region is moved step by step over the two-dimensional current image; at each position, the two-dimensional reference image in the first region is compared with the two-dimensional current image in the second region, and the second region whose current image is most similar to the two-dimensional reference image of the first region is extracted. In the second pattern matching, the two-dimensional current image in the second region extracted in the first pattern matching is compared with the two-dimensional reference image in the first region, and pattern matching is performed so that the two images coincide.
The prior art document
Patent documentation
Patent Document 1: Japanese Patent No. 3748433 (paragraphs 0007 to 0009 and 0049, Fig. 8, Fig. 9)
Technical problem to be solved by the invention
Since the shape of an affected part is a three-dimensional solid, positioning the affected part at its treatment-plan position is more accurate with three-dimensional images than with two-dimensional images. In general, when treatment plan data are created, an X-ray CT (Computed Tomography) image is used to determine the three-dimensional shape of the affected part. In recent years there has been a demand to install an X-ray CT apparatus in the treatment room and to perform positioning using the current X-ray CT image captured by that apparatus at treatment time together with the X-ray CT image from treatment planning. Because an affected part that is soft tissue cannot be rendered well in X-ray fluoroscopic images, positioning with such images is basically matching of bone structure; X-ray CT images, in contrast, allow positioning by matching the affected part itself, which is depicted in the CT images.
It is therefore natural to consider extending the existing two-stage pattern matching to the case where the reference image and current image are three-dimensional. A three-dimensional reference image and a three-dimensional current image each consist of a plurality of tomographic images (slice images) captured with an X-ray CT apparatus. Since the number of slices of the three-dimensional current image is expected to be small from the viewpoint of limiting the patient's X-ray exposure, a three-dimensional reference image with dense image information must be matched against a three-dimensional current image whose image information is sparser than that of the reference image. The existing two-stage pattern matching can compare a two-dimensional reference image and a two-dimensional current image that each have image information of equal density; however, when a three-dimensional reference image and a three-dimensional current image of different information density are to be matched, two-stage pattern matching cannot be realized merely by raising the image dimension of the prior art from two to three. That is, unlike the prior art, one cannot simply perform the first pattern matching of the three-dimensional reference image of the first region against the three-dimensional current image in the second region, and then simply compare the three-dimensional current image of the extracted second region with the three-dimensional reference image of the first region, to achieve the pattern matching that makes the two three-dimensional images most consistent.
Summary of the invention
An object of the present invention is to realize high-precision two-stage pattern matching (two-stage matching) when positioning a patient for radiotherapy, even when the number of tomographic images of the three-dimensional current image is smaller than that of the three-dimensional reference image.
Technical scheme for solving the problem
An image matching device according to the present invention comprises: a three-dimensional image input unit that reads the three-dimensional reference image captured at treatment planning of the radiotherapy and the three-dimensional current image captured at treatment time; and a matching processing unit that matches the three-dimensional reference image against the three-dimensional current image and calculates a position correction so that the position and posture of the affected part in the three-dimensional current image coincide with those in the three-dimensional reference image. The matching processing unit has: a first matching unit that performs a first pattern matching of the three-dimensional current image based on the three-dimensional reference image; and a second matching unit that performs a second pattern matching of a predetermined search target region against a predetermined template region, wherein the template region is generated from one of the three-dimensional reference image and the three-dimensional current image based on the result of the first pattern matching, and the search target region is generated, based on the result of the first pattern matching, from whichever of the three-dimensional reference image and the three-dimensional current image was not used to generate the template region.
Effect of the invention
The image matching device according to the present invention performs the first pattern matching of the three-dimensional current image based on the three-dimensional reference image; then, based on the result of the first pattern matching, it generates the predetermined template region and the predetermined search target region and performs the second pattern matching of the search target region against the template region. Therefore, even when the number of tomographic images of the three-dimensional current image is smaller than that of the three-dimensional reference image, high-precision two-stage pattern matching can be realized.
Description of drawings
Fig. 1 shows the structure of the image matching device and patient positioning device according to Embodiment 1 of the present invention.
Fig. 2 shows the overall equipment configuration related to the image matching device and patient positioning device of the present invention.
Fig. 3 shows the three-dimensional reference image and the reference image template region according to Embodiment 1.
Fig. 4 shows the three-dimensional current image according to Embodiment 1.
Fig. 5 illustrates the first pattern matching method according to Embodiment 1.
Fig. 6 illustrates the relation between the reference image template region and the slice images in the first pattern matching method of Fig. 5.
Fig. 7 shows the first extraction regions of the slice images extracted by the first pattern matching method according to Embodiment 1.
Fig. 8 illustrates the second pattern matching method according to Embodiment 1.
Fig. 9 illustrates the relation between the reference image template region and the slice images in the second pattern matching method of Fig. 8.
Figure 10 illustrates the first pattern matching method according to Embodiment 2.
Figure 11 illustrates the relation between the reference image template region and the slice images in the first pattern matching method of Figure 10.
Figure 12 shows the three-dimensional reference image after pose transformation according to Embodiment 2.
Figure 13 illustrates the second pattern matching method according to Embodiment 2.
Figure 14 shows the structure of the image matching device and patient positioning device according to Embodiment 3.
Specific embodiments
Embodiment 1
Fig. 1 shows the structure of the image matching device and patient positioning device according to Embodiment 1 of the present invention, and Fig. 2 shows the overall equipment configuration related to them. In Fig. 2, reference numeral 1 denotes a CT simulator room used for the treatment planning performed before radiotherapy; it contains a CT gantry 2 and a tabletop 3 of a CT imaging couch. The patient 4 lies on the tabletop 3, and CT image data for treatment planning are captured so as to include the affected part 5. On the other hand, reference numeral 6 denotes the treatment room in which the radiotherapy is performed; it contains a CT gantry 7 and a rotating treatment table 8 with a tabletop 9 on top. The patient 10 lies on the tabletop 9, and CT image data for positioning are captured so as to include the affected part 11 at treatment time.
Here, positioning means the following: the positions of the patient 10 and the affected part 11 at treatment time are calculated from the treatment-plan CT image data and the positioning CT image data, and a position correction is calculated that brings them into agreement with the treatment plan, so that the affected part 11 at treatment time is aligned with the beam irradiation center 12 of the radiotherapy. The alignment is realized by drive-controlling the rotating treatment table 8 so as to move the tabletop 9, with the patient 10 lying on it, to the corrected position. The rotating treatment table 8 can apply drive corrections in six degrees of freedom (translation and rotation), and by rotating its tabletop 9 by 180 degrees the patient can be moved from the CT imaging position (solid line in Fig. 2) to the treatment position of the irradiation couch 13 where the radiation is delivered (broken line in Fig. 2). Although the CT imaging position and the treatment position shown in Fig. 2 are opposed by 180 degrees, the arrangement is not limited to this; the two positions may also form another angle, such as 90 degrees.
The treatment-plan CT image data and the positioning CT image data are transferred to a positioning computer 14. The treatment-plan CT image data become the three-dimensional reference image, and the positioning CT image data become the three-dimensional current image. The image matching device 29 and the patient positioning device 30 of the present invention are both realized as computer software residing in this positioning computer 14. The image matching device 29 calculates the above position correction (translation amounts and rotation amounts); the patient positioning device 30 includes the image matching device 29 and additionally has the function of calculating, from this position correction, the parameters for controlling each drive axis of the rotating treatment table 8 (simply called treatment table 8 where appropriate). By controlling the treatment table 8 according to the matching result obtained by the image matching device 29, the patient positioning device 30 guides the target affected part of the particle beam therapy to the beam irradiation center 12 of the therapy apparatus.
In conventional radiotherapy positioning, a DRR (Digitally Reconstructed Radiograph) generated from the treatment-plan CT image data is compared with an X-ray fluoroscopic image captured in the treatment room at treatment time, and the positional offset is calculated. Because an affected part that is soft tissue cannot be rendered well in fluoroscopic images, this is basically matching of bone structure. The CT-based positioning explained in this embodiment has the following characteristics: a CT gantry 7 is installed in the treatment room 6, and since the CT image data captured immediately before treatment are matched against the treatment-plan CT image data, the affected part is directly depicted and matching of the affected part itself is possible.
Next, the procedure by which the image matching device 29 and patient positioning device 30 of this embodiment calculate the position correction is described. Fig. 1 shows the relations among the data processing units that constitute the image matching device and the patient positioning device. The image matching device 29 comprises: a three-dimensional image input unit 21 that reads CT image data; a matching processing unit 22; a matching result display unit 23; and a matching result output unit 24. The patient positioning device 30 is the image matching device 29 with a treatment table control parameter calculation unit 26 added.
As stated above, the three-dimensional reference image is data captured for treatment planning, and is characterized in that affected part information (the affected part shape, etc.) representing the affected part targeted by the particle beam therapy is input manually. The three-dimensional current image is data captured for patient positioning at treatment time, and is characterized in that the number of tomographic images (also called slice images) is small, from the viewpoint of limiting the patient's X-ray exposure.
The present invention adopts a two-stage pattern matching structure: a first pattern matching of the three-dimensional current image is performed based on the three-dimensional reference image; then, based on the result of the first pattern matching, a predetermined template region and a predetermined search target region are generated, and a second pattern matching is performed using this template region in the same or the reverse direction. In the two-stage pattern matching, fast and high-precision processing is achieved by making the matching parameters of the second pattern matching different from those of the first. For example, the first pattern matching can be performed at low resolution over a wide search range, and the second pattern matching at high resolution, restricted to the template region or search target region found in the first stage.
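As an illustration of this coarse-to-fine parameterization, the following Python sketch (a hypothetical two-dimensional analogue with our own function names, not the patented three-dimensional implementation) performs a first pass at half resolution over the whole image and a second pass at full resolution confined to the neighbourhood found by the first:

```python
import numpy as np

def match_translation(template, image):
    """Exhaustive translation-only search; returns (score, y, x)."""
    th, tw = template.shape
    best = (-np.inf, 0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = float(np.sum(template * image[y:y + th, x:x + tw]))
            if score > best[0]:
                best = (score, y, x)
    return best

def two_stage_match(template, current):
    # Pass 1: low resolution (every 2nd pixel), whole image, no rotation.
    _, cy, cx = match_translation(template[::2, ::2], current[::2, ::2])
    cy, cx = cy * 2, cx * 2
    # Pass 2: full resolution, restricted to a small window around the
    # pass-1 hit (a rotation parameter would be added here as well).
    margin = 2
    y0, x0 = max(cy - margin, 0), max(cx - margin, 0)
    th, tw = template.shape
    sub = current[y0:y0 + th + 2 * margin, x0:x0 + tw + 2 * margin]
    _, fy, fx = match_translation(template, sub)
    return y0 + fy, x0 + fx
```

The second pass examines only a handful of candidate positions, which is what makes the two differing parameter sets worthwhile.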
The three-dimensional image input unit 21 is now described. The three-dimensional image input unit 21 reads a group of images captured by an X-ray CT apparatus and consisting of a plurality of tomographic images, namely image data (a group of slice images) in DICOM (Digital Imaging and Communications in Medicine) format, as three-dimensional volume data. The treatment-plan CT image data are the three-dimensional volume data at treatment planning, i.e. the three-dimensional reference image; the positioning CT image data are the three-dimensional volume data at treatment time, i.e. the three-dimensional current image. The CT image data are not limited to the DICOM format; data in other formats may also be used.
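A minimal sketch of how such a slice group might be assembled into volume data, assuming slice objects that expose the DICOM attributes `ImagePositionPatient` and `pixel_array` as the datasets returned by `pydicom.dcmread` do (the function name `stack_slices` is ours, not the patent's):

```python
import numpy as np

def stack_slices(slices):
    """Order CT slice objects by their z position and stack their
    pixel arrays into one 3-D volume with axes (z, y, x).  In a real
    loader each element of `slices` would come from reading one file
    of the DICOM series."""
    ordered = sorted(slices, key=lambda s: float(s.ImagePositionPatient[2]))
    return np.stack([s.pixel_array for s in ordered], axis=0)
```

Sorting by the z component of `ImagePositionPatient` rather than by file name is the usual safeguard, since file order in a series is not guaranteed to follow slice position.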
The matching processing unit 22 matches the three-dimensional reference image against the three-dimensional current image (pattern matching) and calculates the position correction so that the position and posture of the affected part in the three-dimensional current image coincide with those in the three-dimensional reference image. The matching result display unit 23 displays the result of the matching by the matching processing unit 22 on the screen of the positioning computer 14 (the position correction, an image in which the three-dimensional current image moved by the position correction is superimposed on the three-dimensional reference image, and the like). The matching result output unit 24 outputs the position correction (translation amounts and rotation amounts) calculated by the matching processing unit 22 when it matches the three-dimensional reference image and the three-dimensional current image. The treatment table control parameter calculation unit 26 converts the output values of the matching result output unit 24 (three translations [ΔX, ΔY, ΔZ] and three rotations [ΔA, ΔB, ΔC], six degrees of freedom in total) into the parameters for controlling the treatment table 8, i.e. it calculates those parameters. The treatment table 8 drives each of its drive mechanisms based on the treatment table control parameters calculated by the treatment table control parameter calculation unit 26. In this way, the position correction can be calculated so as to agree with the treatment plan, and the affected part 11 at treatment time can be aligned with the beam irradiation center 12 of the radiotherapy.
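The six output values can be assembled into a single rigid-body transform; the sketch below assumes an X-Y-Z rotation order and angles in radians, neither of which is fixed by the patent:

```python
import numpy as np

def correction_matrix(dx, dy, dz, da, db, dc):
    """4x4 homogeneous transform for a 6-DOF couch correction:
    translations [dx, dy, dz] and rotations [da, db, dc] about the
    x, y, z axes (X-Y-Z composition order is our assumption)."""
    ca, sa = np.cos(da), np.sin(da)
    cb, sb = np.cos(db), np.sin(db)
    cc, sc = np.cos(dc), np.sin(dc)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cc, -sc, 0], [sc, cc, 0], [0, 0, 1]])
    m = np.eye(4)
    m[:3, :3] = rz @ ry @ rx   # rotation part
    m[:3, 3] = [dx, dy, dz]    # translation part
    return m
```

A real table controller would further map this transform onto its particular axis kinematics; the matrix form is just the common intermediate representation.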
The matching processing unit 22 has: a pose transformation unit 25; a first matching unit 16; a second matching unit 17; and a reference template region generation unit 18. The pose transformation unit 25 transforms the pose of the target data when the first or the second pattern matching is performed. The first matching unit 16 performs the first pattern matching of the three-dimensional current image based on the three-dimensional reference image. The second matching unit 17 performs the second pattern matching of the predetermined search target region against the predetermined template region, where the template region is generated from one of the three-dimensional reference image and the three-dimensional current image based on the result of the first pattern matching, and the search target region is generated, based on the result of the first pattern matching, from whichever of the two images was not used to generate the template region.
The matching processing unit 22 is explained in detail with reference to Figs. 3 to 9. Fig. 3 shows the three-dimensional reference image and the reference image template region according to Embodiment 1; Fig. 4 the three-dimensional current image; Fig. 5 the first pattern matching method; Fig. 6 the relation between the reference image template region and the slice images in the first pattern matching of Fig. 5; Fig. 7 the first extraction regions of the slice images extracted by the first pattern matching; Fig. 8 the second pattern matching method; and Fig. 9 the relation between the reference image template region and the slice images in the second pattern matching of Fig. 8.
The reference template region generation unit 18 of the matching processing unit 22 uses the affected part shape (affected part information) input at treatment planning to generate a reference image template region 33 from the three-dimensional reference image 31. The three-dimensional reference image 31 consists of a plurality of slice images 32; Fig. 3 shows, for convenience, an example consisting of five slice images 32a, 32b, 32c, 32d, 32e. The affected part shape is input as an ROI (Region of Interest) 35, a closed contour surrounding the affected part in each slice image. The region containing this closed contour can be taken, for example, as a circumscribed quadrangle 34, and the rectangular cuboid region containing every circumscribed quadrangle 34 is taken as the template region, namely the reference image template region 33. The first matching unit 16 of the matching processing unit 22 performs the first pattern matching so as to fit the reference image template region 33 to the three-dimensional current image 36. The three-dimensional current image 36 shown in Fig. 4 is an example consisting of three slice images 37a, 37b, 37c. The current image region 38 shown in Fig. 5 is the cuboid containing the three slice images 37a, 37b, 37c. As shown in Fig. 5, the reference image template region 33 (33a, 33b, 33c) is moved in raster-scan fashion, and the correlation value with the three-dimensional current image 36 is calculated. As the correlation value, any correlation measure used in image matching, such as the normalized cross-correlation value, may be used.
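The normalized cross-correlation mentioned here, combined with a raster scan, can be sketched as follows (a two-dimensional illustration with our own function names; the patent applies the scan to template regions within CT slices):

```python
import numpy as np

def ncc(template, window):
    """Zero-mean normalized cross-correlation between two equally
    sized patches; returns a value in [-1, 1]."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0

def raster_scan(template, image):
    """Slide the template over the image (the raster scan of Fig. 5)
    and return the (y, x) offset with the highest correlation."""
    th, tw = template.shape
    scores = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -2.0)
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            scores[y, x] = ncc(template, image[y:y + th, x:x + tw])
    return np.unravel_index(np.argmax(scores), scores.shape)
```

Because the correlation is computed on zero-mean, normalized patches, the score is insensitive to the global brightness and contrast differences that typically exist between planning and positioning scans.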
The reference image template region 33a is moved in raster-scan fashion in the slice image 37a along the scan path 39a. Likewise, the reference image template region 33b is moved in the slice image 37b along the scan path 39b, and the reference image template region 33c in the slice image 37c along the scan path 39c. To keep the drawing simple, the scan paths 39b and 39c are drawn abbreviated.
When carrying out 1 pattern match, as shown in Figure 6, to constituting each sectioning image 53 in benchmark image template zone 33, carry out the image contrast with the sectioning image 37 that constitutes present image zone 38.Sectioning image 53 is the images that in the sectioning image 32 of 3 dimension benchmark images 31, are divided into by benchmark image template zone 33.Benchmark image template zone 33 is made up of 5 sectioning image 32a, 32b, 32c, 32d, 32e corresponding 5 sectioning image 53a, 53b, 53c, 53d, the 53e with 3 dimension benchmark images.Thereby, when carrying out 1 pattern match, utilize 5 sectioning image 53a, 53b, 53c, 53d, 53e in the benchmark image template zone 33 respectively, the sectioning image 37a of 3 dimension present images 36 is carried out the image contrast.To sectioning image 37b, the 37c of 3 dimension present images 36, carry out the image contrast equally.
The first matching unit 16 extracts, from each slice image 37 of the three-dimensional current image 36, a first extraction region 43 that contains the region of highest correlation between the current image region 38 and the reference image template region 33. As shown in Fig. 7, a first extraction region 43a is extracted from the slice image 37a of the three-dimensional current image 36; likewise, first extraction regions 43b and 43c are extracted from the slice images 37b and 37c. A first extracted current image region 42 is then generated so as to contain the first extraction regions 43a, 43b, 43c. In this way the first matching unit 16 generates the first extracted current image region 42, which serves as the search target region for the second pattern matching.
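Generating a region that contains all per-slice extraction regions amounts to a bounding-box union; a sketch, assuming each per-slice match is reported as a rectangle (y, x, height, width), all names being ours:

```python
def union_box(rects):
    """Merge per-slice best-match rectangles (y, x, h, w) into one
    enclosing rectangle, i.e. the cross-section of the first extracted
    region that the second pass will search."""
    y0 = min(r[0] for r in rects)
    x0 = min(r[1] for r in rects)
    y1 = max(r[0] + r[2] for r in rects)
    x1 = max(r[1] + r[3] for r in rects)
    return y0, x0, y1 - y0, x1 - x0
```

Applied across the three slices of the current image, this yields the cuboid footprint that bounds all three first extraction regions.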
Here, in the state before positioning, the poses (three rotations) of the three-dimensional reference image 31 and the three-dimensional current image 36 do not coincide. Hence, with the simple raster scan of Fig. 5, a high-precision match that accounts for the angular deviation cannot be detected when the number of slices of the three-dimensional current image 36 is small; this is, however, no problem for extracting the first extraction regions 43 used in the second pattern matching. Therefore the first pattern matching calculates correlation values without detecting the angular deviation, and the subsequent second pattern matching detects the match with high precision, including the angular deviation.
The second pattern matching is now described. In the second pattern matching, the pose transformation unit 25 of the matching processing unit 22 generates a pose-transformed template region 40 by transforming the pose of the reference image template region 33 generated from the three-dimensional reference image 31. As shown in Figs. 8 and 9, the pose change amounts (three rotations) of the reference image template region 33 are added as matching parameters. The second matching unit 17 performs high-precision matching, including the angular deviation, between the pose-transformed template region 40 produced by the pose transformation unit 25 and the first extracted current image region 42 of the three-dimensional current image 36, whose number of slices is small. In this way, high-precision two-stage pattern matching that includes the angular deviation is realized. By taking as the search range of the second pattern matching the narrow range containing the region obtained by the first pattern matching, the first pattern matching can search a wide range at low resolution to find the first extracted current image region 42 containing the first extraction regions 43, the second pattern matching can then be performed at high resolution, and the time required for pattern matching can be shortened.
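Adding the pose change amounts as matching parameters simply extends the search loop with a rotation variable. In the sketch below (our own simplification), only 90-degree in-plane rotations are tried so that no resampling library is needed; the patent's pose transform covers arbitrary small rotations about all three axes:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two same-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def search_with_rotation(template, region):
    """Raster scan extended with a rotation parameter; returns the
    best (y, x, angle_in_degrees).  Only np.rot90 steps are used here
    to keep the sketch dependency-free."""
    best_score, best_pose = -2.0, None
    for k in range(4):                       # candidate rotation angles
        tpl = np.rot90(template, k)
        th, tw = tpl.shape
        for y in range(region.shape[0] - th + 1):
            for x in range(region.shape[1] - tw + 1):
                s = ncc(tpl, region[y:y + th, x:x + tw])
                if s > best_score:
                    best_score, best_pose = s, (y, x, 90 * k)
    return best_pose
```

Because this rotational search multiplies the cost of every scan position, restricting it to the narrow first-pass region, as the text describes, is what keeps the second stage tractable.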
The first extracted current image region 42 shown in Fig. 8 is the cuboid containing the three first extraction regions 43a, 43b, 43c. Within the first extraction region 43a of the slice image 37a, the pose-transformed template region 40a, the reference image template region after pose transformation, is moved in raster-scan fashion along the scan path 39a. Likewise, the pose-transformed template region 40b is moved within the first extraction region 43b of the slice image 37b along the scan path 39b, and the pose-transformed template region 40c within the first extraction region 43c of the slice image 37c along the scan path 39c. To keep the drawing simple, the scan paths 39b and 39c are drawn abbreviated.
When carrying out 2 pattern match, as shown in Figure 9, utilize comparing part 17 2 times, between 1 extraction zone 43 of the sectioning image 37 that the section 41 and the formation in posture conversion template zone 40 are extracted present image zone 42 for 1 time, carry out image and contrast.In addition, also can between sectioning image 55 and section 41, carry out the image contrast, this sectioning image 55 is in the sectioning image 37 of 3 dimension present images 36, to extract the image that present image zone 42 is divided into by 1 time.Generate the section 41 in posture conversion template zone 40 from a plurality of sectioning images 32 of 3 dimension benchmark images 31.For example, the data of section 41 are from constituting a plurality of sectioning images 32 interceptings of 3 dimension benchmark images 31.Usually, it is inequality that regional 43 packing density is extracted in the packing density of the section 41 in posture conversion template zone 40 and 3 dimension present images 36 1 time, but the correlation that calculates each pixel of section 41 gets final product.In addition, the section 41 in posture conversion template zone 40 also can comprise and carried out completion and make packing density and 3 dimension present images 36 1 time of section 41 extract regional 43 the identical data of packing density.
The two-stage pattern matching method of Embodiment 1 is summarized here. First, the reference template region generation unit 18 of the contrast processing unit 22 generates the reference-image template region 33 from the three-dimensional reference image 31 (reference-image template region generation step). The primary contrast unit 16 performs the primary pattern matching of the three-dimensional current image 36 against the reference-image template region 33 (primary pattern matching step). In the primary pattern matching, image contrast is performed between each slice image 53 constituting the reference-image template region 33 and the slice images 37 constituting the current-image region 38. Each time the reference-image template region 33 is scanned to a new position, the primary contrast unit 16 computes the correlation value between the current-image region 38 and the reference-image template region 33 (correlation value calculation step); through the primary pattern matching, it extracts a primary extraction region 43 so as to contain the region where the correlation between the current-image region 38 and the reference-image template region 33 is highest (primary extraction region extraction step). The primary contrast unit 16 then generates the primary-extraction current-image region 42 — the retrieval target region for the secondary pattern matching — so as to contain the primary extraction region 43 of each slice image 37 constituting the current-image region 38 (retrieval target generation step). The two-stage pattern matching method of Embodiment 1 comprises: the reference-image template region generation step; the primary pattern matching step; and the secondary pattern matching step described below. The primary pattern matching step comprises: the correlation value calculation step; the primary extraction region extraction step; and the retrieval target generation step.
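A minimal sketch of the primary-matching stage just summarized — scan a 2D template over one slice, keep the best-scoring position, and widen it into an extraction region for the secondary stage. Negative sum of squared differences stands in for the patent's correlation value, and the margin of two pixels is an arbitrary assumption:

```python
import numpy as np

def primary_match(template, slice_img, margin=2):
    """Scan the 2D template over one slice image, keep the best-scoring
    position (negative SSD here stands in for the patent's correlation
    value), and return the enlarged primary extraction region around it."""
    th, tw = template.shape
    sh, sw = slice_img.shape
    best_r, best_c, best_s = 0, 0, -np.inf
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            s = -float(((template - slice_img[r:r + th, c:c + tw]) ** 2).sum())
            if s > best_s:
                best_r, best_c, best_s = r, c, s
    # widen by `margin` pixels on every side, clipped to the slice bounds
    return (max(best_r - margin, 0), min(best_r + th + margin, sh),
            max(best_c - margin, 0), min(best_c + tw + margin, sw))

slice_img = np.zeros((12, 12))
slice_img[4:8, 5:9] = 1.0                    # a toy "affected area"
region = primary_match(np.ones((4, 4)), slice_img)
```

Running this per slice and stacking the resulting rectangles yields the narrowed retrieval target that the secondary stage searches.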
Next, the secondary contrast unit 17 of the contrast processing unit 22 performs the secondary pattern matching of the primary-extraction current-image region 42 of the three-dimensional current image 36 against the posture-transformed template region 40, obtained by having the posture transformation unit 25 transform the posture of the reference-image template region 33 (secondary pattern matching step). In the secondary pattern matching, a plurality of cross-sections 41 of the posture-transformed template region 40 are generated at a prescribed transformed posture (cross-section generation step); for each cross-section 41, image contrast is performed between that cross-section 41 and the primary extraction region 43 (or slice image 55) of the slice image 37 constituting the primary-extraction current-image region 42. Each time the posture-transformed template region 40 is scanned to a new position, the secondary contrast unit 17 computes the correlation value between the primary-extraction current-image region 42 and the plurality of cross-sections 41 of the posture-transformed template region 40 (correlation value calculation step). Further, the posture transformation unit 25 transforms the posture into one different from the previous posture (posture transformation step); the secondary contrast unit 17 generates the plurality of cross-sections 41 of the posture-transformed template region 40 at this new posture (cross-section generation step) and, at each scanning position of the posture-transformed template region 40, computes the correlation value between the primary-extraction current-image region 42 and the plurality of cross-sections 41 of the posture-transformed template region 40 (correlation value calculation step). The secondary contrast unit 17 of the contrast processing unit 22 selects, as the optimum solution, the positional relation (posture information) between the three-dimensional reference image and the three-dimensional current image that yields the highest of the computed correlation values (optimum solution selection step). Pattern matching is thereby realized so that the two three-dimensional images — the three-dimensional reference image and the three-dimensional current image — agree most closely. The secondary pattern matching step comprises: the cross-section generation step; the correlation value calculation step; the posture transformation step; and the optimum solution selection step.
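The posture loop of the secondary stage — transform the template, scan it, keep the best (posture, position) pair — can be sketched as follows. To keep the resampling exact, the illustrative "postures" are 90-degree rotation steps rather than the arbitrary rotations the patent allows; the scoring and all names are the author's assumptions:

```python
import numpy as np

def score(a, b):
    # negative sum of squared differences: higher is better
    return -float(((a - b) ** 2).sum())

def secondary_match(template, region, angles=(0, 1, 2, 3)):
    """For each candidate posture (here: 90-degree rotation steps, so the
    resampling is exact), scan the transformed template over the search
    region and keep the posture/position with the best score."""
    best = None
    for k in angles:                       # posture transformation step
        t = np.rot90(template, k)          # cross-section at this posture
        th, tw = t.shape
        for r in range(region.shape[0] - th + 1):
            for c in range(region.shape[1] - tw + 1):
                s = score(t, region[r:r + th, c:c + tw])
                if best is None or s > best[0]:
                    best = (s, k, r, c)    # optimum solution selection
    return best

# the region contains the template rotated by 180 degrees, planted at (1, 2)
tmpl = np.array([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])
region = np.zeros((6, 8))
region[1:3, 2:5] = np.rot90(tmpl, 2)
s, k, r, c = secondary_match(tmpl, region)
```

The returned tuple is the analogue of the patent's optimum solution: the posture information plus the in-plane offset at which the two images agree most closely.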
After the pattern matching ends, the contrast processing unit 22 calculates, from the posture of the posture-transformed template region 40 whose correlation value is the highest among the computed correlation values, the positional correction (translation amount, rotation amount) for bringing the three-dimensional reference image 31 and the three-dimensional current image 36 into agreement (positional correction calculation step). The contrast-result display unit 23 displays, on the display screen of the computer 14, the positional correction, an image in which the three-dimensional current image moved by this positional correction is superimposed on the three-dimensional reference image, and the like. The contrast-result output unit 24 outputs the positional correction (translation amount, rotation amount) obtained when the contrast processing unit 22 contrasts the three-dimensional reference image 31 with the three-dimensional current image 36 (positional correction output step). The treatment-table control-parameter calculation unit 26 converts the output values of the contrast-result output unit 24 (three translation axes [ΔX, ΔY, ΔZ] and three rotation axes [ΔA, ΔB, ΔC], six degrees of freedom in total) into the control parameters of each axis of the treatment table 8 (treatment-table control-parameter calculation step). The treatment table 8 drives each of its drive units based on the treatment-table control parameters calculated by the treatment-table control-parameter calculation unit 26 (treatment-table drive step).
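The patent does not disclose the table's kinematics, so the conversion performed by unit 26 can only be illustrated under assumptions. The sketch below maps the 6-DOF correction to per-axis couch commands with an identity mapping plus clipping to assumed travel limits; the axis names and limits are invented for illustration:

```python
import numpy as np

AXIS_NAMES = ("x", "y", "z", "roll", "pitch", "yaw")   # illustrative names
AXIS_LIMITS = {"x": 50.0, "y": 50.0, "z": 30.0,        # assumed travel limits
               "roll": 3.0, "pitch": 3.0, "yaw": 3.0}  # (mm and degrees)

def couch_parameters(correction):
    """Map a 6-DOF positional correction [dX, dY, dZ, dA, dB, dC] to
    per-axis couch commands, clipping each to an assumed travel limit.
    A real unit 26 would apply the specific kinematics of treatment table 8."""
    params = {}
    for name, value in zip(AXIS_NAMES, correction):
        lim = AXIS_LIMITS[name]
        params[name] = float(np.clip(value, -lim, lim))
    return params

cmd = couch_parameters([12.5, -4.0, 60.0, 1.2, -0.4, 5.0])
```

Here the out-of-range ΔZ and ΔC requests are saturated at the assumed limits rather than passed through to the drive units.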
Because the image contrast device 29 according to Embodiment 1 performs the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31 and then, based on the result of the primary pattern matching, generates from the three-dimensional reference image 31 the posture-transformed template region 40 as the prescribed template region for the secondary pattern matching, and generates from the three-dimensional current image 36 the primary-extraction current-image region 42 as the prescribed retrieval target region for the secondary pattern matching so that it contains the primary extraction regions 43, high-precision two-stage pattern matching can be realized even when the number of tomographic images (slice images) of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31.
Because the image contrast device 29 according to Embodiment 1 can realize high-precision two-stage pattern matching even when the number of tomographic images (slice images) of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31, the number of tomographic images acquired by the X-ray CT apparatus for the three-dimensional current image 36 at positioning can be reduced, and the patient's radiation exposure from the X-ray CT apparatus at positioning can be reduced accordingly.
The image contrast device 29 according to Embodiment 1 generates the primary-extraction current-image region 42 based on the result of performing the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31. By taking as the retrieval target the primary-extraction current-image region 42, which is narrower than the current-image region 38, high-resolution secondary pattern matching can be performed using the primary-extraction current-image region 42 containing the primary extraction regions 43, and the time required for pattern matching can be shortened; the primary extraction regions 43 are found by the primary pattern matching performed over a wide region at low resolution.
The patient positioning device 30 according to Embodiment 1 can, based on the positional correction calculated by the image contrast device 29, bring the patient's position and posture into agreement with those at the time of treatment planning. Because the position and posture at the time of treatment planning can be reproduced, positioning can be performed so that the affected area 11 at the time of treatment reaches the beam irradiation center 12 of the radiotherapy apparatus.
The patient positioning device 30 according to Embodiment 1 can use the posture transformation unit 25 to generate, from the reference-image template region 33 obtained from the three-dimensional reference image 31, the posture-transformed template region 40 suited to matching against the three-dimensional current image 36, whose number of tomographic images (slice images) is smaller than that of the three-dimensional reference image 31, and can thereby realize high-precision two-stage pattern matching that also covers angular deviation.
The image contrast device 29 according to Embodiment 1 comprises: a three-dimensional image input unit 21 that reads in, respectively, the three-dimensional reference image 31 captured at the time of treatment planning for radiotherapy and the three-dimensional current image 36 captured at the time of treatment; and a contrast processing unit 22 that contrasts the three-dimensional reference image 31 with the three-dimensional current image 36 and calculates the positional correction so that the position and posture of the affected area in the three-dimensional current image 36 agree with those in the three-dimensional reference image 31. The contrast processing unit 22 has: a primary contrast unit 16 that performs the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31; and a secondary contrast unit 17 that performs the secondary pattern matching of a prescribed retrieval target region 42 against a prescribed template region (posture-transformed template region 40), where the prescribed template region is generated from one of the three-dimensional reference image 31 and the three-dimensional current image 36 based on the result of the primary pattern matching, and the prescribed retrieval target region 42 is generated, based on the result of the primary pattern matching, from whichever of the three-dimensional reference image 31 and the three-dimensional current image 36 is not the basis of the prescribed template region (posture-transformed template region 40). Therefore, high-precision two-stage pattern matching can be realized even when the number of tomographic images of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31.
The patient positioning device 30 according to Embodiment 1 comprises: the image contrast device 29; and a treatment-table control-parameter calculation unit 26 that controls each axis of the treatment table 8 based on the positional correction calculated by the image contrast device 29. The image contrast device 29 comprises: a three-dimensional image input unit 21 that reads in, respectively, the three-dimensional reference image 31 captured at the time of treatment planning for radiotherapy and the three-dimensional current image 36 captured at the time of treatment; and a contrast processing unit 22 that contrasts the three-dimensional reference image 31 with the three-dimensional current image 36 and calculates the positional correction so that the position and posture of the affected area in the three-dimensional current image 36 agree with those in the three-dimensional reference image 31. The contrast processing unit 22 has: a primary contrast unit 16 that performs the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31; and a secondary contrast unit 17 that performs the secondary pattern matching of a prescribed retrieval target region 42 against a prescribed template region (posture-transformed template region 40), where the prescribed template region is generated from one of the three-dimensional reference image 31 and the three-dimensional current image 36 based on the result of the primary pattern matching, and the prescribed retrieval target region 42 is generated, based on the result of the primary pattern matching, from whichever of the three-dimensional reference image 31 and the three-dimensional current image 36 is not the basis of the prescribed template region. Therefore, high-precision positioning can be performed even when the number of tomographic images of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31.
Embodiment 1 also relates to an image contrast method that contrasts the three-dimensional reference image 31 captured at the time of treatment planning for radiotherapy with the three-dimensional current image 36 captured at the time of treatment. The image contrast method comprises: a primary pattern matching step of performing the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31; and a secondary pattern matching step of performing the secondary pattern matching of a prescribed retrieval target region 42 against a prescribed template region (posture-transformed template region 40), where the prescribed template region is generated from one of the three-dimensional reference image 31 and the three-dimensional current image 36 based on the result of the primary pattern matching, and the prescribed retrieval target region 42 is generated, based on the result of the primary pattern matching, from whichever of the three-dimensional reference image 31 and the three-dimensional current image 36 is not the basis of the prescribed template region. Therefore, high-precision two-stage pattern matching can be realized even when the number of tomographic images of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31.
Embodiment 2
In the two-stage pattern matching of Embodiment 2, the primary pattern matching of the three-dimensional current image 36 against the three-dimensional reference image 31 is performed first. Then, based on the result of the primary pattern matching, a current-image template region 44 is generated from the three-dimensional current image 36 as the prescribed template region for the secondary pattern matching; a posture-transformed reference-image region 47, obtained by transforming the posture of the three-dimensional reference image 31, is taken as the retrieval target; and the secondary pattern matching of the posture-transformed reference-image region 47 against the current-image template region 44 is performed. The secondary pattern matching is thus performed in the direction opposite to the primary pattern matching.
Fig. 10 is a diagram explaining the primary pattern matching method according to Embodiment 2 of the present invention, and Fig. 11 is a diagram explaining the relation between the reference-image template region and the slice images in the primary pattern matching method of Fig. 10. In Embodiment 2, through the primary pattern matching, the primary contrast unit 16 performs a search that also covers the three rotation axes and obtains the posture change amount.
The current-image region 38 shown in Fig. 10 is represented as a rectangular parallelepiped containing three slice images 37a, 37b, 37c. The posture-transformed template regions 40a, 40b, 40c, which serve as the reference-image template region of Embodiment 2, are regions whose posture has been transformed by the posture transformation unit 25. The initial position and posture are the default state; for example, the parameters of the three rotation axes are 0. The posture-transformed template region 40a — the reference-image template region after its posture has been transformed — is moved in raster-scan fashion within slice image 37a along scanning path 39a. Likewise, the posture-transformed template region 40b is moved in raster-scan fashion within slice image 37b along scanning path 39b, and the posture-transformed template region 40c is moved in raster-scan fashion within slice image 37c along scanning path 39c. For simplicity of the drawing, scanning paths 39b and 39c are shown only schematically.
While the posture is being transformed, the correlation between the slice images 37a, 37b, 37c of the three-dimensional current image 36 and the posture-transformed template region 40 is computed. For example, each of the three rotation axes is changed by a prescribed change amount or change rate and the correlation value is computed; the region is then moved to the next scanning position and the correlation value is computed again. As shown in Fig. 11, the primary contrast unit 16 performs image contrast between the cross-sections 41 of the posture-transformed template region 40 and the slice images 37 constituting the current-image region 38. A cross-section 41 of the posture-transformed template region 40 is a plane obtained by cutting the posture-transformed template region 40 with a plane parallel to the slice images 32 of the three-dimensional reference image 31 in its initial position and posture, and is generated from the plurality of slice images 32 of the three-dimensional reference image 31 (cross-section generation step). For example, the method described in Embodiment 1 can be used; that is, the data of the cross-section 41 can be cut out from the plurality of slice images 32 constituting the three-dimensional reference image 31. Alternatively, the cross-section 41 of the posture-transformed template region 40 may contain interpolated data so that the data density of the cross-section 41 matches that of the three-dimensional current image 36.
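Cutting a cross-section out of a rotated template region amounts to sampling a fixed plane of the rotated volume: map each pixel of the plane back through the inverse rotation into the stack of reference slices and read off the nearest voxel. A minimal sketch under that assumption (nearest-neighbour sampling; the patent cuts or interpolates data from slice images 32, without specifying the scheme):

```python
import numpy as np

def rotation_z(deg):
    """Rotation matrix about the z (slice-stacking) axis."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def cross_section(volume, rot, z, center):
    """Sample the plane at height `z` of the rotated volume by mapping each
    plane pixel back through the inverse rotation into the stack of
    reference slices (nearest-neighbour sampling)."""
    nz, ny, nx = volume.shape
    inv = rot.T                                  # inverse of a rotation matrix
    out = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            p = np.array([x, y, z], float) - center
            q = inv @ p + center                 # source coordinate
            xi, yi, zi = (int(round(v)) for v in q)
            if 0 <= xi < nx and 0 <= yi < ny and 0 <= zi < nz:
                out[y, x] = volume[zi, yi, xi]
    return out

vol = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)   # (z, y, x)
center = np.array([1.5, 1.5, 1.5])
sec = cross_section(vol, rotation_z(0.0), z=2, center=center)
sec90 = cross_section(vol, rotation_z(90.0), z=2, center=center)
```

With a zero rotation the cross-section reproduces slice `vol[2]` exactly; with a 90-degree rotation it returns the same slice with its axes exchanged, which is what the correlation computation then compares against the current slice images.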
Next, the primary contrast unit 16 generates the current-image template region 44, which is used for the secondary matching. For example, from the search results covering the three rotation axes in each of the slice images 37a, 37b, 37c, the primary contrast unit 16 obtains the cross-section 41 of the posture-transformed template region 40 with the highest correlation value, the posture change amount of the posture-transformed template region 40 at that time, and the extraction region of the slice image 37 corresponding to that cross-section 41. From the extraction regions obtained for the individual slice images, the primary contrast unit 16 generates the current-image template region 44 so as to contain the extraction region of the three-dimensional current image with the highest correlation value. The current-image template region 44 is a two-dimensional image.
Then, as shown in Fig. 12, the posture transformation unit 25 of the contrast processing unit 22 transforms the posture of the whole three-dimensional reference image 31 by the posture change amount obtained when the current-image template region 44 was generated, producing the posture-transformed three-dimensional reference image 45, i.e. the posture-transformed reference-image region 47. Fig. 12 is a diagram showing the posture-transformed three-dimensional reference image according to Embodiment 2 of the present invention. Slice images 46a, 46b, 46c, 46d, 46e are the slice images obtained by changing the posture of slice images 32a, 32b, 32c, 32d, 32e, respectively, by the above posture change amount.
Then, as shown in Fig. 13, the secondary contrast unit 17 matches the current-image template region 44 in raster-scan fashion along scanning path 49 against the posture-transformed reference-image region 47, i.e. the posture-transformed three-dimensional reference image 45, so that only the translational offset needs to be detected, which can be done at high speed. Fig. 13 is a diagram explaining the secondary pattern matching method according to Embodiment 2 of the present invention. The posture-transformed reference-image region 47 is represented as a rectangular parallelepiped containing five slice images 46a, 46b, 46c, 46d, 46e. A contrast execution plane 48 is an image plane corresponding to the posture for which the correlation with a slice image 37 of the three-dimensional current image 36 was highest in the primary pattern matching; that is, it is a plane in the posture-transformed reference-image region 47 having the same posture as the corresponding slice image 37 of the three-dimensional current image 36. The secondary contrast unit 17 generates the prescribed contrast execution planes 48 from the posture-transformed reference-image region 47, i.e. from the plurality of slice images 46 of the posture-transformed three-dimensional reference image 45 (contrast execution plane generation step). For example, the method described in Embodiment 1 can be used; that is, the data of a contrast execution plane 48 can be cut out from the plurality of slice images constituting the posture-transformed three-dimensional reference image 45. Alternatively, the contrast execution plane 48 may contain interpolated data so that the data density of the contrast execution plane 48 matches that of the current-image template region 44.
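The translation-only search just described can be sketched as sliding the 2D current-image template over every plane of the posture-transformed reference volume, with no rotation inside the loop. Names and the SSD score are the author's assumptions:

```python
import numpy as np

def match_translation_only(template, volume):
    """Slide the 2D current-image template region over every plane of the
    posture-transformed reference volume; only translation (z, y, x) is
    searched, no rotation, which is what makes the secondary matching fast."""
    th, tw = template.shape
    best = (np.inf, None)                    # (SSD, (z, y, x))
    for z in range(volume.shape[0]):         # candidate contrast execution planes
        plane = volume[z]
        for y in range(plane.shape[0] - th + 1):
            for x in range(plane.shape[1] - tw + 1):
                ssd = ((template - plane[y:y + th, x:x + tw]) ** 2).sum()
                if ssd < best[0]:
                    best = (ssd, (z, y, x))
    return best[1]

rng = np.random.default_rng(1)
vol = rng.random((5, 9, 9))
tmpl = vol[3, 2:6, 4:8].copy()               # plant the template at z=3
offset = match_translation_only(tmpl, vol)
```

Because the rotation has already been absorbed into the reference volume, each candidate position costs only one 2D comparison — the source of the speed-up claimed for Embodiment 2.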
The two-stage pattern matching method of Embodiment 2 is summarized here. First, the contrast processing unit 22 uses the posture transformation unit 25 to generate the posture-transformed template region 40 from the three-dimensional reference image 31 (posture-transformed template region generation step). The primary contrast unit 16 of the contrast processing unit 22 performs the primary pattern matching of the three-dimensional current image 36 against the posture-transformed template region 40 (primary pattern matching step). Each time the posture of the posture-transformed template region 40 is changed (each time the posture transformation step is executed), the primary pattern matching generates, for each slice image 37 constituting the current-image region 38, a cross-section 41 of the posture-transformed template region 40 (cross-section generation step) and performs image contrast between that cross-section 41 of the posture-transformed template region 40 and the slice image 37 constituting the current-image region 38.
Each time the posture of the posture-transformed template region 40 is changed, the primary contrast unit 16 computes the correlation value between the current-image region 38 and the posture-transformed template region 40 (correlation value calculation step). Further, at each scanning position of the posture-transformed template region 40, the primary contrast unit 16 computes the correlation value between the current-image region 38 and the posture-transformed template region 40 and, through the primary pattern matching, generates the current-image template region 44 so as to contain the extraction region for which the correlation between the current-image region 38 and the posture-transformed template region 40 is highest (current-image template region generation step).
Then, the contrast processing unit 22 uses the posture transformation unit 25 to transform the posture of the whole three-dimensional reference image 31 by the posture change amount obtained when the current-image template region 44 was generated, and generates the posture-transformed three-dimensional reference image 45, i.e. the posture-transformed reference-image region 47 (posture-transformed reference-image region generation step). The secondary contrast unit 17 performs the secondary pattern matching of the posture-transformed reference-image region 47 against the current-image template region 44 (secondary pattern matching step). In the secondary pattern matching, the contrast execution planes 48 are generated through the contrast execution plane generation step, and image contrast is performed between the generated contrast execution planes 48 and the current-image template region 44. During this image contrast, the current-image template region 44 is translated without being rotated, and the correlation value between the contrast execution plane 48 and the current-image template region 44 is computed (correlation value calculation step).
In the secondary pattern matching, the secondary contrast unit 17 of the contrast processing unit 22 selects, as the optimum solution, the positional relation (posture information) between the posture-transformed three-dimensional reference image 45 and the current-image template region 44 that yields the highest of the computed correlation values (optimum solution selection step). Two-stage matching is thereby realized so that the two three-dimensional images — the three-dimensional reference image 31 and the three-dimensional current image 36 — agree most closely. The two-stage pattern matching method of Embodiment 2 comprises: the posture-transformed template region generation step; the primary pattern matching step; the posture-transformed reference-image region generation step; and the secondary pattern matching step. The primary pattern matching step comprises: the cross-section generation step; the correlation value calculation step; the posture transformation step; and the current-image template region generation step. The secondary pattern matching step comprises: the contrast execution plane generation step; the correlation value calculation step; and the optimum solution selection step.
After the pattern matching ends, the contrast processing unit 22 calculates, from the posture of the region in the posture-transformed three-dimensional reference image 45 whose correlation value is the highest among the computed correlation values, the positional correction (translation amount, rotation amount) for bringing the three-dimensional reference image 31 and the three-dimensional current image 36 into agreement (positional correction calculation step). The contrast-result display unit 23 displays, on the display screen of the computer 14, the positional correction, an image in which the three-dimensional current image moved by this positional correction is superimposed on the three-dimensional reference image, and the like. The contrast-result output unit 24 outputs the positional correction (translation amount, rotation amount) obtained when the contrast processing unit 22 contrasts the three-dimensional reference image 31 with the three-dimensional current image 36 (positional correction output step). The treatment-table control-parameter calculation unit 26 converts the output values of the contrast-result output unit 24 (three translation axes [ΔX, ΔY, ΔZ] and three rotation axes [ΔA, ΔB, ΔC], six degrees of freedom in total) into the control parameters of each axis of the treatment table 8 (treatment-table control-parameter calculation step). The treatment table 8 drives each of its drive units based on the treatment-table control parameters calculated by the treatment-table control-parameter calculation unit 26 (treatment-table drive step).
The image contrast device 29 according to Embodiment 2 performs, as the primary pattern matching, an image contrast of the three-dimensional current image 36 against the posture-transformed template region 40 of the three-dimensional reference image 31 that also covers the three rotation axes, and then, based on the result of the primary pattern matching, generates from the three-dimensional current image 36 the current-image template region 44 as the template region for the secondary pattern matching. Therefore, high-precision two-stage pattern matching can be realized even when the number of tomographic images (slice images) of the three-dimensional current image 36 is smaller than that of the three-dimensional reference image 31.
By generating from the three-dimensional reference image 31 the posture-transformed three-dimensional reference image 45 — the three-dimensional reference image after posture transformation — i.e. by generating the posture-transformed reference-image region 47, the image contrast device 29 according to Embodiment 2 can realize direct pattern matching by moving the two-dimensional current-image template region 44 over the posture-transformed reference-image region 47 by translation alone, without rotation. Because the secondary pattern matching computes the correlation value only for each translational move, it is faster than computing the correlation value for each combined rotational and translational move.
Embodiment 3
Embodiment 3 differs from Embodiments 1 and 2 in that a human-body database (atlas model) is used to generate the reference-image template region 33 used in the primary pattern matching of Embodiment 1, or the reference-image template region 33 on which the posture-transformed template region 40 of Embodiment 2 is based. Fig. 14 is a diagram showing the structure of the image contrast device and the patient positioning device according to Embodiment 3 of the present invention. The image contrast device 29 according to Embodiment 3 differs from those of Embodiments 1 and 2 in that it has: a human-body database input unit 50; and an average template region generation unit 51. The patient positioning device 30 according to Embodiment 3 has the image contrast device 29 and the treatment-table control-parameter calculation unit 26.
The human-body database input unit 50 obtains the human-body database (atlas model) from a storage device such as a database device. The average template region generation unit 51 cuts out an average template region 54 from the organ part of the human-body database corresponding to the affected area 5, 11 of the patient 4, 10. The reference template region generation unit 18 of the contrast processing unit 22 automatically generates the reference-image template region 33 by pattern-matching this average template region 54 against the three-dimensional reference image 31 (reference-image template region generation step).
Using this reference-image template region 33, the two-stage pattern matching of Embodiment 1 or of Embodiment 2 is performed. In this way, two-stage pattern matching can be realized even when no information representing the affected area (affected-area shape and the like) has been prepared in advance on the three-dimensional reference image.
It is also conceivable that the average template region generation unit 51 cuts out two-dimensional average template regions from the organ part of the human-body database corresponding to the affected area 5, 11 of the patient 4, 10. In the case of two-dimensional average template regions 54, a plurality of two-dimensional average template regions are cut out, compiled, and output to the contrast processing unit 22. The reference template region generation unit 18 of the contrast processing unit 22 automatically generates the reference-image template region 33 by pattern-matching the plurality of two-dimensional average template regions against the three-dimensional reference image 31.
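The atlas-based template generation can be sketched as: match an average organ template cut from the atlas against the reference image, then cut the reference-image template region out of the reference at the best-matching position — so no hand-marked affected-area information is needed on the reference image. A 2D toy example under those assumptions:

```python
import numpy as np

def generate_reference_template(atlas_organ, reference):
    """Match the average template region (from the atlas) against one slice of
    the reference image by exhaustive SSD search, and cut the reference-image
    template region out of the reference at the best-matching position."""
    th, tw = atlas_organ.shape
    best = (np.inf, 0, 0)
    for r in range(reference.shape[0] - th + 1):
        for c in range(reference.shape[1] - tw + 1):
            ssd = ((atlas_organ - reference[r:r + th, c:c + tw]) ** 2).sum()
            if ssd < best[0]:
                best = (ssd, r, c)
    _, r, c = best
    return (r, c), reference[r:r + th, c:c + tw].copy()

# a cross-shaped "average organ" and a reference slice containing a
# slightly fainter instance of it at (5, 6)
atlas_organ = np.array([[0.0, 1.0, 0.0],
                        [1.0, 2.0, 1.0],
                        [0.0, 1.0, 0.0]])
reference = np.zeros((10, 10))
reference[5:8, 6:9] = 0.9 * atlas_organ
pos, template_region = generate_reference_template(atlas_organ, reference)
```

The returned `template_region` plays the role of reference-image template region 33 in the subsequent two-stage matching; the patent leaves the matching measure and the handling of multiple 2D templates unspecified.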
Description of Reference Numerals
16 ... primary contrast unit; 17 ... secondary contrast unit; 18 ... reference template region generation unit; 21 ... three-dimensional image input unit; 22 ... contrast processing unit; 25 ... posture transformation unit; 26 ... treatment-table control-parameter calculation unit; 29 ... image contrast device; 30 ... patient positioning device; 31 ... three-dimensional reference image; 33 ... reference-image template region; 36 ... three-dimensional current image; 40, 40a, 40b, 40c ... posture-transformed template region; 41 ... cross-section; 42 ... primary-extraction current-image region; 44 ... current-image template region; 45 ... posture-transformed three-dimensional reference image; 48 ... contrast execution plane; 50 ... human-body database input unit; 51 ... average template region generation unit.

Claims (16)

1. An image contrast device, characterized by comprising:
a three-dimensional image input unit that reads in, respectively, a three-dimensional reference image captured at the time of treatment planning for radiotherapy and a three-dimensional current image captured at the time of treatment; and
a contrast processing unit that contrasts said three-dimensional reference image with said three-dimensional current image and calculates a positional correction so that the position and posture of an affected area in said three-dimensional current image agree with the position and posture of the affected area in said three-dimensional reference image,
said contrast processing unit having:
a primary contrast unit that performs primary pattern matching of said three-dimensional current image against said three-dimensional reference image; and
a secondary contrast unit that performs secondary pattern matching of a prescribed retrieval target region against a prescribed template region, wherein the prescribed template region is generated from one of said three-dimensional reference image and said three-dimensional current image based on the result of said primary pattern matching, and the prescribed retrieval target region is generated, based on the result of said primary pattern matching, from whichever of said three-dimensional reference image and said three-dimensional current image is not the basis of the prescribed template region.
2. The image contrast device according to claim 1, characterized in that
the contrast processing unit includes a reference template region generation unit that generates a reference image template region, which is a 3D region, from the 3D reference image based on affected-part information prepared for the 3D reference image.
3. The image contrast device according to claim 1, characterized by comprising:
a body database input unit that obtains a body database from a database device; and
an average template region generation unit that generates an average template region from the organ portion in the body database corresponding to the patient's affected part,
wherein the contrast processing unit has a reference template region generation unit that performs pattern matching on the 3D reference image based on the average template region and generates a reference image template region, which is a 3D region, from the 3D reference image based on the result of the pattern matching.
4. The image contrast device according to claim 2 or 3, characterized in that
in the primary pattern matching, the primary contrast unit performs pattern matching on the 3D current image based on the reference image template region.
5. The image contrast device according to claim 4, characterized in that
the primary contrast unit generates, from the 3D current image, a primary-extraction current image region as the search target region, such that it contains the region having the highest correlation with the reference image template region.
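Claim 5 keeps, as the search target region, a block around the position of highest correlation. A minimal numpy sketch, assuming zero-mean normalized cross-correlation as the correlation measure and a fixed margin around the best match (names and margin are hypothetical):

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalised cross-correlation of two equal-sized arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def extract_search_region(current, template, margin=2):
    """Scan `current` with `template`, locate the position of highest
    correlation, and return that position together with a search target
    region containing the best-matching block plus a margin on every side."""
    tz, ty, tx = template.shape
    best, best_pos = -2.0, (0, 0, 0)
    for z in range(current.shape[0] - tz + 1):
        for y in range(current.shape[1] - ty + 1):
            for x in range(current.shape[2] - tx + 1):
                c = ncc(current[z:z + tz, y:y + ty, x:x + tx], template)
                if c > best:
                    best, best_pos = c, (z, y, x)
    z, y, x = best_pos
    lo = [max(0, v - margin) for v in (z, y, x)]
    hi = [min(s, v + t + margin)
          for v, t, s in zip((z, y, x), (tz, ty, tx), current.shape)]
    return best_pos, current[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
```

The margin is what makes the returned block usable as a search target region for the secondary matching: it leaves room for the small residual displacement that the secondary pass is meant to resolve.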
6. The image contrast device according to claim 5, characterized in that
the contrast processing unit includes a posture transformation unit that transforms the position and posture of a 3D image,
the posture transformation unit generates a posture-transformed template region by transforming the posture of the reference image template region into a predetermined posture, and
in the secondary pattern matching, the secondary contrast unit performs pattern matching on the primary-extraction current image region based on the posture-transformed template region as the predetermined template region.
7. The image contrast device according to claim 6, characterized in that
the secondary contrast unit generates cross-sections of the posture-transformed template region and performs pattern matching between the primary-extraction current image region and the cross-sections.
8. The image contrast device according to claim 2 or 3, characterized in that
the contrast processing unit includes a posture transformation unit that transforms the position and posture of a 3D image,
the posture transformation unit generates a posture-transformed template region by transforming the posture of the reference image template region into a predetermined posture, and
in the primary pattern matching, the primary contrast unit performs pattern matching on the 3D current image based on the posture-transformed template region.
9. The image contrast device according to claim 8, characterized in that
the primary contrast unit generates cross-sections of the posture-transformed template region, performs pattern matching between the 3D current image and the cross-sections, and performs the following operations: determining, from among the plurality of cross-sections, the highly correlated cross-section as the cross-section with the highest correlation; computing the posture change amount of the posture-transformed template region; and extracting the extraction region of the 3D current image corresponding to the highly correlated cross-section.
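One hypothetical way to realize the "highly correlated cross-section" step of claim 9: sample the central plane of the 3D template at several candidate in-plane rotation angles (nearest-neighbour resampling), correlate each candidate against the observed slice, and report the winning angle as the posture change amount. The restriction to a single rotation axis and all names below are illustrative assumptions:

```python
import numpy as np

def central_slice(volume, angle_deg):
    """Central axial cross-section of `volume` after an in-plane rotation
    by `angle_deg`, sampled with nearest-neighbour interpolation."""
    nz, ny, nx = volume.shape
    mid = nz // 2
    th = np.deg2rad(angle_deg)
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    ys, xs = np.mgrid[0:ny, 0:nx]
    # Rotate the sampling grid back into the unrotated volume.
    y0 = cy + (ys - cy) * np.cos(th) - (xs - cx) * np.sin(th)
    x0 = cx + (ys - cy) * np.sin(th) + (xs - cx) * np.cos(th)
    yi = np.clip(np.round(y0).astype(int), 0, ny - 1)
    xi = np.clip(np.round(x0).astype(int), 0, nx - 1)
    return volume[mid, yi, xi]

def best_posture_angle(template_volume, observed_slice, candidates):
    """Among the candidate rotation angles, return the one whose
    cross-section of the template correlates best with the observation
    (the 'highly correlated cross-section'); the winning angle plays
    the role of the posture change amount."""
    def score(angle):
        s = central_slice(template_volume, angle)
        a = s - s.mean()
        b = observed_slice - observed_slice.mean()
        d = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / d if d > 0 else 0.0
    return max(candidates, key=score)
```

In a full implementation the candidate set would cover all three rotation axes and multiple slice positions; the one-axis, central-slice version above only illustrates the selection logic.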
10. The image contrast device according to claim 9, characterized in that
the primary contrast unit generates a current image template region as the predetermined template region so that it contains the extraction region extracted during the primary pattern matching,
the posture transformation unit generates a 3D posture-transformed reference image region, which is formed by applying to the 3D reference image a posture transformation corresponding to the posture change amount of the extraction region and has a size corresponding to the current image template region, and
in the secondary pattern matching, the secondary contrast unit performs pattern matching on the 3D posture-transformed reference image region, as the search target region, based on the current image template region.
11. The image contrast device according to claim 10, characterized in that
the secondary contrast unit generates contrast execution planes as cross-sections of the posture-transformed template region and performs pattern matching between the current image template region and the contrast execution planes.
12. A patient positioning device, characterized by comprising:
the image contrast device according to any one of claims 1 to 3, 5 to 7, and 9 to 11; and
a treatment table control parameter calculation unit that calculates each parameter for controlling the treatment table based on the position correction amount calculated by the image contrast device.
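The treatment table control parameter calculation is not specified further in the claims. A common convention, shown here as an assumption rather than the patented method, is to decompose the rigid position correction (rotation matrix plus translation) into six couch parameters using a Z-Y-X Euler convention:

```python
import numpy as np

def table_parameters(R, t):
    """Decompose a rigid position correction (rotation matrix `R`,
    translation `t`) into six couch parameters: three translations and
    Z-Y-X Euler angles (yaw, pitch, roll) in degrees.
    Assumes R = Rz(yaw) @ Ry(pitch) @ Rx(roll) and |pitch| < 90 deg."""
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return (*t, *np.rad2deg([yaw, pitch, roll]))
```

The Euler convention and axis ordering must match the couch hardware; a real system would follow the machine's documented coordinate convention (for instance an IEC-style table coordinate system) rather than the arbitrary choice made here.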
13. An image contrast method that contrasts a 3D reference image captured at the time of treatment planning for radiation therapy with a 3D current image captured at the time of treatment, the image contrast method characterized by comprising:
a primary pattern matching step of performing primary pattern matching on the 3D current image based on the 3D reference image; and
a secondary pattern matching step of performing secondary pattern matching on a predetermined search target region based on a predetermined template region, wherein the predetermined template region is generated from one of the 3D reference image and the 3D current image based on the result of the primary pattern matching, and the predetermined search target region is generated, based on the result of the primary pattern matching, from whichever of the 3D reference image and the 3D current image the predetermined template region is not based on.
14. The image contrast method according to claim 13, characterized by
comprising a reference image template region generation step of generating a reference image template region, which is a 3D region, from the 3D reference image,
wherein the primary pattern matching step performs primary pattern matching on the 3D current image based on the reference image template region.
15. A patient positioning device, characterized by comprising:
the image contrast device according to claim 4; and
a treatment table control parameter calculation unit that calculates each parameter for controlling the treatment table based on the position correction amount calculated by the image contrast device.
16. A patient positioning device, characterized by comprising:
the image contrast device according to claim 8; and
a treatment table control parameter calculation unit that calculates each parameter for controlling the treatment table based on the position correction amount calculated by the image contrast device.
CN201210022145.2A 2011-06-10 2012-01-13 Image contrast device, patient positioning device and image contrast method Active CN102814006B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-130074 2011-06-10
JP2011130074A JP5693388B2 (en) 2011-06-10 2011-06-10 Image collation device, patient positioning device, and image collation method

Publications (2)

Publication Number Publication Date
CN102814006A true CN102814006A (en) 2012-12-12
CN102814006B CN102814006B (en) 2015-05-06

Family

ID=47298678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210022145.2A Active CN102814006B (en) 2011-06-10 2012-01-13 Image contrast device, patient positioning device and image contrast method

Country Status (3)

Country Link
JP (1) JP5693388B2 (en)
CN (1) CN102814006B (en)
TW (1) TWI425963B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104135609A (en) * 2014-06-27 2014-11-05 小米科技有限责任公司 A method and a device for assisting in photographing, and a terminal
CN105228527A (en) * 2013-03-15 2016-01-06 Varian Medical Systems, Inc. Perspective evaluation of tumor visibility for IGRT using templates generated from planning CT and contours
CN109859213A (en) * 2019-01-28 2019-06-07 艾瑞迈迪科技石家庄有限公司 Bone critical point detection method and device in joint replacement surgery

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI573565B (en) * 2013-01-04 2017-03-11 shu-long Wang Cone-type beam tomography equipment and its positioning method
JP6192107B2 (en) * 2013-12-10 2017-09-06 Kddi株式会社 Video instruction method, system, terminal, and program capable of superimposing instruction image on photographing moving image
JP6338965B2 (en) * 2014-08-08 2018-06-06 キヤノンメディカルシステムズ株式会社 Medical apparatus and ultrasonic diagnostic apparatus
JP6452987B2 (en) * 2014-08-13 2019-01-16 キヤノンメディカルシステムズ株式会社 Radiation therapy system
US9878177B2 (en) * 2015-01-28 2018-01-30 Elekta Ab (Publ) Three dimensional localization and tracking for adaptive radiation therapy
JP6164662B2 (en) * 2015-11-18 2017-07-19 みずほ情報総研株式会社 Treatment support system, operation method of treatment support system, and treatment support program
JP2018042831A (en) 2016-09-15 2018-03-22 株式会社東芝 Medical image processor, care system and medical image processing program
JP6869086B2 (en) * 2017-04-20 2021-05-12 富士フイルム株式会社 Alignment device, alignment method and alignment program
JP7513980B2 (en) 2020-08-04 2024-07-10 東芝エネルギーシステムズ株式会社 Medical image processing device, treatment system, medical image processing method, and program
CN114073827B (en) * 2020-08-15 2023-08-04 中硼(厦门)医疗器械有限公司 Radiation irradiation system and control method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1235684A (en) * 1996-10-29 1999-11-17 University of Pittsburgh of the Commonwealth System of Higher Education Device for matching X-ray images with reference images
US20040184583A1 (en) * 2003-03-05 2004-09-23 Yoshihiko Nagamine Patient positioning device and patient positioning method
CN101032650A (en) * 2006-03-10 2007-09-12 三菱重工业株式会社 Radiotherapy device control apparatus and radiation irradiation method
JP2009189461A (en) * 2008-02-13 2009-08-27 Mitsubishi Electric Corp Patient positioning apparatus and its method
CN101708126A (en) * 2008-09-19 2010-05-19 株式会社东芝 Image processing apparatus and x-ray computer tomography apparatus
WO2010133982A2 (en) * 2009-05-18 2010-11-25 Koninklijke Philips Electronics, N.V. Marker-free tracking registration and calibration for em-tracked endoscopic system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007014435A (en) * 2005-07-06 2007-01-25 Fujifilm Holdings Corp Image processing device, method and program
JP4425879B2 (en) * 2006-05-01 2010-03-03 株式会社日立製作所 Bed positioning apparatus, positioning method therefor, and particle beam therapy apparatus
JP5233374B2 (en) * 2008-04-04 2013-07-10 大日本印刷株式会社 Medical image processing system
TWI381828B (en) * 2009-09-01 2013-01-11 Univ Chang Gung Method of making artificial implants


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105228527A (en) * 2013-03-15 2016-01-06 Varian Medical Systems, Inc. Perspective evaluation of tumor visibility for IGRT using templates generated from planning CT and contours
CN104135609A (en) * 2014-06-27 2014-11-05 小米科技有限责任公司 A method and a device for assisting in photographing, and a terminal
CN104135609B (en) * 2014-06-27 2018-02-23 小米科技有限责任公司 Auxiliary photo-taking method, apparatus and terminal
CN109859213A (en) * 2019-01-28 2019-06-07 艾瑞迈迪科技石家庄有限公司 Bone critical point detection method and device in joint replacement surgery

Also Published As

Publication number Publication date
JP5693388B2 (en) 2015-04-01
TWI425963B (en) 2014-02-11
TW201249496A (en) 2012-12-16
JP2012254243A (en) 2012-12-27
CN102814006B (en) 2015-05-06

Similar Documents

Publication Publication Date Title
CN102814006A (en) Image contrast device, patient positioning device and image contrast method
US20220219017A1 (en) Imaging based calibration systems, devices, and methods
US10300305B2 (en) Image guidance for radiation therapy
CA2339497C (en) Delivery modification system for radiation therapy
US9956427B2 (en) Moving-body tracking device for radiation therapy, irradiation region determining device for radiation therapy, and radiation therapy device
CN101553281B Target tracking using direct target registration
CN101478918B (en) Parallel stereovision geometry in image-guided radiosurgery
JPH0332649A (en) Radiation therapy system
CN102596315A (en) Device and method for displaying a geometric figure on the surface of a body of a patient
Kim et al. 3D reconstruction of leg bones from X-ray images using CNN-based feature analysis
JP2012045163A (en) Device for controlling radiation therapy apparatus and method for controlling radiation therapy apparatus
CN111386555A (en) Image guidance method and device, medical equipment and computer readable storage medium
CN102049106A Precise image positioning system and method for an interfractionated radiotherapy system
CN108852400B (en) Method and device for realizing position verification of treatment center
CN102414759A (en) Particle beam radiation device
JP6800462B2 (en) Patient positioning support device
Kallis et al. Introduction of a hybrid treatment delivery system used for quality assurance in multi-catheter interstitial brachytherapy
Kim et al. Quantifying the accuracy and precision of a novel real-time 6 degree-of-freedom kilovoltage intrafraction monitoring (KIM) target tracking system
JP2018042831A (en) Medical image processor, care system and medical image processing program
KR102619994B1 (en) Biomedical image processing devices, storage media, biomedical devices, and treatment systems
US20210052920A1 (en) Radiotherapy process and system
JP2011104183A (en) Radiotherapy apparatus control method and radiotherapy apparatus controller
Nguyen et al. An interdimensional correlation framework for real-time estimation of six degree of freedom target motion using a single x-ray imager during radiotherapy
Talbot et al. A method for patient set-up guidance in radiotherapy using augmented reality
WO2004030761A1 (en) Improvements in or relating to radiation treatment planning

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20190123

Address after: Tokyo, Japan

Patentee after: Hitachi Ltd.

Address before: Tokyo, Japan

Patentee before: Mitsubishi Electric Corp.