CN103826539B - Image processing system, radiographic apparatus and image processing method - Google Patents
- Publication number
- CN103826539B CN103826539B CN201380001786.2A CN201380001786A CN103826539B CN 103826539 B CN103826539 B CN 103826539B CN 201380001786 A CN201380001786 A CN 201380001786A CN 103826539 B CN103826539 B CN 103826539B
- Authority
- CN
- China
- Prior art keywords
- mentioned
- image data
- data
- dimensional
- position alignment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/02—Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computerised tomographs
- A61B6/032—Transmission computed tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/12—Devices for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4417—Constructional features of apparatus for radiation diagnosis related to combined acquisition of different diagnostic modalities
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/46—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/46—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/481—Diagnostic techniques involving the use of contrast agents
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/50—Clinical applications
- A61B6/503—Clinical applications involving diagnosis of heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
- A61B6/5241—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0883—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4416—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5246—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4429—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
- A61B6/4435—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
- A61B6/4441—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
- G06T2207/10121—Fluoroscopy
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
- G06T2207/10136—3D ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
Abstract
The image processing system of an embodiment includes a 1st position alignment unit, an output unit, a 2nd position alignment unit, and a display unit. The 1st position alignment unit performs position alignment between 1st three-dimensional medical image data and 2nd three-dimensional medical image data. The output unit outputs, as output data, either the 1st and 2nd three-dimensional medical image data to which position alignment information has been appended, or synthesized data obtained by aligning and synthesizing the 1st and 2nd three-dimensional medical image data. The 2nd position alignment unit receives the output data and performs position alignment between the 2nd three-dimensional medical image data and one or more pieces of X-ray image data. The display unit displays, based on the position alignment results, image data in which the 1st three-dimensional medical image data is aligned with the X-ray image data.
Description
Technical Field
Embodiments of the present invention relate to an image processing system, a radiographic apparatus, and an image processing method.
Background Art
Cardiac resynchronization therapy (CRT: Cardiac Resynchronization Therapy) is conventionally known as one method of treating heart failure. In CRT, an electrode (pacing lead) of a cardiac pacemaker is placed indwelling at a site in the heart where propagation of electrical stimulation is delayed (hereinafter, "delay site"), thereby improving the asynchrony of cardiac motion and restoring the pumping function of the heart toward a normal state. In CRT, a physician refers to X-ray images obtained by fluoroscopic imaging with a radiographic apparatus and places the electrode indwelling in the vein closest to the delay site.
The delay site was conventionally diagnosed from electrophysiology (EP: Electrophysiology) information, and in recent years by EP mapping. More recently, it has become known that the delay site may also be diagnosable by non-invasive analysis using an ultrasound diagnostic apparatus. That is, methods of quantitatively analyzing heart-wall motion by echocardiography have come into practical use in recent years. In such an analysis method, it is possible to display an analysis image in which an index of local heart-wall motion (for example, strain) is mapped, in tones corresponding to its value, onto the endocardium of an ultrasound image or between the endocardium and the epicardium. Because the heart is tissue whose myocardium moves by mechanical vibration based on electrical stimulation, the delay site appears in the analysis image as a site where heart-wall motion is asynchronous (an asynchronous site). However, CRT treatment is performed under X-ray fluoroscopy, and the analysis image has been conveyed only as prior information given to the physician at the time of treatment planning; in practice, it has not been possible to indicate to the physician, under the X-ray fluoroscopy in which CRT treatment is carried out, the position at which the pacing lead should be placed. On the other hand, although techniques for superimposing other images on fluoroscopic images have been achieved, the endocardial and epicardial surfaces of the heart wall are difficult to discriminate in X-ray images, so it is difficult to align an X-ray image with the analysis image, that is, to align an X-ray image with an ultrasound image.
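The asynchrony analysis sketched above can be illustrated with a toy time-to-peak computation: for each wall segment, the time at which its motion index (e.g. strain) peaks is found, and the segment peaking latest is flagged as a candidate delay site. This is a hedged sketch with illustrative curves and segment names, not the actual speckle-tracking analysis the patent relies on.

```python
def time_to_peak(strain_curve, frame_interval_ms):
    """Return the time (ms) at which a segment's strain curve peaks."""
    peak_frame = max(range(len(strain_curve)), key=lambda i: strain_curve[i])
    return peak_frame * frame_interval_ms

def find_delayed_segment(segment_curves, frame_interval_ms=25.0):
    """The segment whose strain peaks latest is a dyssynchrony candidate."""
    ttp = {name: time_to_peak(curve, frame_interval_ms)
           for name, curve in segment_curves.items()}
    return max(ttp, key=ttp.get), ttp

# Hypothetical per-segment strain samples over part of a cardiac cycle.
curves = {
    "septal":  [0.0, 0.10, 0.20, 0.15, 0.05],
    "lateral": [0.0, 0.05, 0.10, 0.18, 0.22],  # peaks last -> delayed
}
delayed, ttp = find_delayed_segment(curves)
```

In a real analysis the curves would come from tracking the myocardium across many frames, and the time-to-peak map would be rendered as the color-coded analysis image described above.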
Prior Art Literature
Patent Literature
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2009-039429
Summary of the Invention
The problem to be solved by the present invention is to provide an image processing system, a radiographic apparatus, and an image processing method that make it possible to discriminate, under X-ray fluoroscopy, a site determined by ultrasound diagnosis.
The image processing system of an embodiment includes a 1st position alignment unit, an output unit, a 2nd position alignment unit, and a display unit. The 1st position alignment unit performs position alignment between 1st three-dimensional medical image data and 2nd three-dimensional medical image data obtained by imaging a predetermined tissue of a subject. The output unit outputs, as output data, either the 1st and 2nd three-dimensional medical image data to which position alignment information has been appended, or synthesized data obtained by aligning and synthesizing the 1st and 2nd three-dimensional medical image data. The 2nd position alignment unit receives the output data and performs position alignment between the 2nd three-dimensional medical image data and one or more pieces of X-ray image data, corresponding to one or more imaging directions, obtained by imaging the predetermined tissue of the subject from those directions. The display unit displays, based on the position alignment results of the 1st and 2nd position alignment units, image data in which the 1st three-dimensional medical image data is aligned with the X-ray image data of the predetermined tissue. According to the image processing system configured as above, a site determined by ultrasound diagnosis can be discriminated under X-ray fluoroscopy.
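The two-stage alignment just described, first between the two three-dimensional data sets and then between the 2nd data set and the X-ray geometry, amounts to composing two spatial transforms. Below is a minimal sketch assuming each alignment result is expressed as a 4x4 homogeneous matrix; the matrices and the mapped point are made-up values for illustration, not outputs of the patented method.

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3-D point."""
    x, y, z = p
    v = [x, y, z, 1.0]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

def translation(dx, dy, dz):
    return [[1, 0, 0, dx], [0, 1, 0, dy], [0, 0, 1, dz], [0, 0, 0, 1]]

T_us2ct = translation(10.0, 0.0, 0.0)    # assumed 1st alignment result
T_ct2xray = translation(0.0, -5.0, 0.0)  # assumed 2nd alignment result

# Composing the two results maps an ultrasound-derived point
# (e.g. the delay site) into X-ray coordinates.
T_us2xray = matmul4(T_ct2xray, T_us2ct)
p_xray = apply(T_us2xray, (1.0, 2.0, 3.0))
```

The design point is that the 1st alignment can be computed once, offline, and carried along as "position alignment information", so that only the CT-to-X-ray alignment needs to run in the fluoroscopy room.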
Brief Description of the Drawings
Fig. 1 is a diagram showing a configuration example of the image processing system according to the 1st embodiment.
Fig. 2 is a block diagram showing a configuration example of the ultrasound diagnostic apparatus according to the 1st embodiment.
Fig. 3 is a diagram (1) for explaining the analysis unit according to the 1st embodiment.
Fig. 4 is a diagram (2) for explaining the analysis unit according to the 1st embodiment.
Fig. 5 is a diagram (3) for explaining the analysis unit according to the 1st embodiment.
Fig. 6 is a diagram (4) for explaining the analysis unit according to the 1st embodiment.
Fig. 7 is a diagram for explaining the position alignment unit according to the 1st embodiment.
Fig. 8 is a block diagram showing a configuration example of the radiographic apparatus according to the 1st embodiment.
Fig. 9 is a diagram showing the processing units that execute the image processing method performed by the image processing system according to the 1st embodiment.
Fig. 10 is a diagram (1) for explaining an example of processing performed by the ultrasound diagnostic apparatus according to the 1st embodiment.
Fig. 11 is a diagram (2) for explaining an example of processing performed by the ultrasound diagnostic apparatus according to the 1st embodiment.
Fig. 12 is a diagram (1) for explaining an example of processing performed by the radiographic apparatus according to the 1st embodiment.
Fig. 13 is a diagram (2) for explaining an example of processing performed by the radiographic apparatus according to the 1st embodiment.
Fig. 14 is a diagram showing an example of the image data displayed in the 1st embodiment.
Fig. 15 is a flowchart illustrating an example of processing performed by the ultrasound diagnostic apparatus according to the 1st embodiment.
Fig. 16 is a flowchart illustrating an example of processing performed by the radiographic apparatus according to the 1st embodiment.
Fig. 17 is a diagram for explaining the 2nd embodiment.
Fig. 18 is a flowchart illustrating an example of processing performed by the radiographic apparatus according to the 2nd embodiment.
Detailed Description
Hereinafter, embodiments of the image processing system will be described in detail with reference to the drawings.
(1st Embodiment)
First, a configuration example of the image processing system according to the 1st embodiment will be described. Fig. 1 is a diagram showing a configuration example of the image processing system according to the 1st embodiment. As shown in Fig. 1, the image processing system 1 according to the 1st embodiment has an ultrasound diagnostic apparatus 100, a radiographic apparatus 200, an X-ray CT (Computed Tomography) apparatus 300, an image archive apparatus 400, and an image processing apparatus 500. The apparatuses illustrated in Fig. 1 are in a state in which they can communicate with one another, directly or indirectly, via, for example, an in-hospital LAN (Local Area Network) 600 installed in the hospital. For example, when a PACS (Picture Archiving and Communication System) has been introduced into the medical image diagnostic system, the apparatuses mutually send and receive medical images and the like in accordance with the DICOM (Digital Imaging and Communications in Medicine) standard.
Because each apparatus illustrated in Fig. 1 can send and receive data conforming to the DICOM standard, each apparatus can read or display data received from the other apparatuses. Note that, as long as an apparatus can process the data received from the other apparatuses, the present embodiment may send and receive data conforming to any standard.
The ultrasound diagnostic apparatus 100 generates ultrasound image data of an arbitrary cross section as the operator adjusts the position of an ultrasound probe that performs two-dimensional ultrasound scanning. In addition, by using a mechanical 4D probe or a 2D array probe, the ultrasound diagnostic apparatus 100 performs three-dimensional ultrasound scanning and generates three-dimensional ultrasound image data. The radiographic apparatus 200 generates two-dimensional X-ray image data by performing imaging with the position of the C-arm, which supports an X-ray tube and an X-ray detector, fixed. The ultrasound diagnostic apparatus 100 and the radiographic apparatus 200 according to the 1st embodiment will be described in detail later.
The X-ray CT apparatus 300 has a rotating frame that supports, at opposing positions, an X-ray tube that irradiates X-rays and an X-ray detector that detects X-rays that have passed through the subject, and that can rotate. The X-ray CT apparatus 300 rotates the rotating frame while irradiating X-rays from the X-ray tube, thereby collecting, from all directions, data of the X-rays that have been transmitted, absorbed, and attenuated, and reconstructs X-ray CT image data from the collected data. The X-ray CT image data constitutes a tomographic image in the plane of rotation (axial plane) of the X-ray tube and the X-ray detector. Here, in the X-ray detector, rows of detecting elements, each row arranged in the channel direction, are arranged in multiple rows along the body-axis direction of the subject. For example, an X-ray CT apparatus 300 having an X-ray detector with 16 rows of detecting elements reconstructs multiple (for example, 16) X-ray CT images along the body axis of the subject from the projection data collected in one rotation of the rotating frame.
In addition, by a helical scan in which the rotating frame is rotated while the tabletop on which the subject is placed is moved, the X-ray CT apparatus 300 can, for example, reconstruct 500 X-ray CT images covering the entire heart as three-dimensional X-ray CT image data. Alternatively, for example, an X-ray CT apparatus 300 having an X-ray detector with 320 rows of detecting elements can reconstruct three-dimensional X-ray CT image data covering the entire heart merely by a conventional scan in which the rotating frame is rotated once. Furthermore, by performing the helical scan or the conventional scan continuously, the X-ray CT apparatus 300 can image three-dimensional X-ray CT image data along a time series.
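As a rough plausibility check of the detector-row figures above: the z-axis coverage of one rotation is the number of detector rows times the row width. The 0.5 mm row width and 140 mm heart extent used below are assumed typical values, not figures taken from the text.

```python
def z_coverage_mm(detector_rows, row_width_mm=0.5):
    """z-axis coverage of a single rotation, in millimetres."""
    return detector_rows * row_width_mm

heart_extent_mm = 140.0  # rough cranio-caudal extent of an adult heart

covers_heart_320 = z_coverage_mm(320) >= heart_extent_mm  # 160 mm: enough
covers_heart_16 = z_coverage_mm(16) >= heart_extent_mm    # 8 mm: far short
```

This is why a 320-row detector can image the whole heart in one conventional scan, while a 16-row detector needs a helical scan to accumulate coverage along the body axis.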
In the 1st embodiment, three-dimensional X-ray CT image data is used to perform position alignment between the ultrasound image data obtained by imaging with the ultrasound diagnostic apparatus 100 and the X-ray image data obtained by imaging with the radiographic apparatus 200. This will be described in detail after the overall configurations of the ultrasound diagnostic apparatus 100 and the radiographic apparatus 200 according to the 1st embodiment have been explained.
The image archive apparatus 400 is a database that keeps medical image data. Specifically, the image archive apparatus 400 stores the medical image data sent from the ultrasound diagnostic apparatus 100, the radiographic apparatus 200, or the X-ray CT apparatus 300 in a storage unit of the apparatus, and keeps it. The medical image data kept by the image archive apparatus 400 is kept in association with incidental information such as the patient ID, examination ID, apparatus ID, and series ID.
The image processing apparatus 500 is, for example, a workstation or a PC (Personal Computer) used for interpretation of medical images by doctors or laboratory technicians working in the hospital. The operator of the image processing apparatus 500 can obtain required medical image data from the image archive apparatus 400 by performing a search using the patient ID, examination ID, apparatus ID, series ID, and the like. Alternatively, the image processing apparatus 500 may receive image data directly from the ultrasound diagnostic apparatus 100, the radiographic apparatus 200, or the X-ray CT apparatus 300. In addition to displaying medical images for interpretation, the image processing apparatus 500 can also perform various image processing on the medical image data.
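The ID-based retrieval just described can be sketched as a simple query over records tagged with incidental information. The record layout, field names, and IDs below are hypothetical; a real archive would serve such queries through the DICOM query/retrieve services rather than an in-memory list.

```python
# Toy archive: each record carries the incidental information mentioned
# in the text (patient ID, examination ID, apparatus ID, series ID).
ARCHIVE = [
    {"patient_id": "P001", "exam_id": "E10", "apparatus_id": "US100", "series_id": "S1"},
    {"patient_id": "P001", "exam_id": "E11", "apparatus_id": "XR200", "series_id": "S1"},
    {"patient_id": "P002", "exam_id": "E12", "apparatus_id": "CT300", "series_id": "S2"},
]

def search(**criteria):
    """Return all records matching every given key/value pair."""
    return [r for r in ARCHIVE
            if all(r.get(k) == v for k, v in criteria.items())]

hits = search(patient_id="P001")                          # both P001 records
xr = search(patient_id="P001", apparatus_id="XR200")      # narrowed further
```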
Hereinafter, a case in which the ultrasound diagnostic apparatus 100 and the radiographic apparatus 200 cooperate to execute the image processing method according to the present embodiment will be described. However, part or all of the various processes performed by the ultrasound diagnostic apparatus 100 and the radiographic apparatus 200 described later may be executed by the X-ray CT apparatus 300 or the image processing apparatus 500.
Note that the image processing system 1 is not limited to the case in which a PACS has been introduced. For example, the image processing system 1 is equally applicable to a case in which an electronic medical record system that manages electronic medical records with medical image data attached has been introduced. In that case, the image archive apparatus 400 is a database that keeps the electronic medical records. Also, the image processing system 1 is equally applicable, for example, to cases in which an HIS (Hospital Information System) or an RIS (Radiology Information System) has been introduced.
Next, a configuration example of the ultrasound diagnostic apparatus 100 shown in Fig. 1 will be described using Fig. 2. Fig. 2 is a block diagram showing a configuration example of the ultrasound diagnostic apparatus according to the 1st embodiment. As illustrated, the ultrasound diagnostic apparatus 100 according to the 1st embodiment has an ultrasound probe 110, a display 120, an input unit 130, an electrocardiograph 140, an apparatus main body 150, a position sensor 160, and an emitter 161.
The ultrasound probe 110 transmits and receives ultrasound waves. For example, the ultrasound probe 110 has multiple piezoelectric transducers, and these transducers generate ultrasound waves in accordance with a drive signal supplied from a transmission and reception unit 151 of the apparatus main body 150 described later. The ultrasound probe 110 also receives reflected waves from the subject P and converts them into electrical signals. Furthermore, the ultrasound probe 110 has a matching layer provided on the piezoelectric transducers, a backing material that prevents ultrasound waves from propagating backward from the piezoelectric transducers, and the like. The ultrasound probe 110 is detachably connected to the apparatus main body 150.
When ultrasound waves are transmitted from the ultrasound probe 110 to the subject P, the transmitted ultrasound waves are successively reflected at discontinuity surfaces of acoustic impedance in the body tissue of the subject P, and are received as reflected-wave signals by the multiple piezoelectric transducers of the ultrasound probe 110. The amplitude of the received reflected-wave signal depends on the difference in acoustic impedance at the discontinuity surface where the ultrasound wave is reflected. When the transmitted ultrasound pulse is reflected at the surface of a moving body such as flowing blood or the heart wall, the reflected-wave signal undergoes a frequency shift due to the Doppler effect, depending on the velocity component of the moving body with respect to the ultrasound transmission direction.
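The frequency shift mentioned above relates the reflector's velocity to the measured shift through the standard Doppler equation f_d = 2 * f0 * v * cos(theta) / c; solving for v gives the velocity estimate. The carrier frequency, shift, and beam angle below are illustrative values, not parameters from the text.

```python
import math

SOUND_SPEED = 1540.0  # m/s, typical speed of sound in soft tissue

def doppler_velocity(f_shift_hz, f0_hz, angle_deg):
    """Velocity of the reflector along the beam, from the Doppler shift."""
    theta = math.radians(angle_deg)
    return f_shift_hz * SOUND_SPEED / (2.0 * f0_hz * math.cos(theta))

# e.g. a 1 kHz shift on a 3 MHz carrier with the beam parallel to flow
v = doppler_velocity(f_shift_hz=1000.0, f0_hz=3.0e6, angle_deg=0.0)
```

Note the cos(theta) term: as the beam approaches perpendicular to the motion, the measurable shift vanishes, which is why Doppler ultrasound is angle-dependent.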
Here, the ultrasound probe 110 according to the 1st embodiment is an ultrasound probe that can scan the subject P two-dimensionally with ultrasound waves and can also scan the subject P three-dimensionally. Specifically, the ultrasound probe 110 according to the 1st embodiment is a mechanical 4D probe that scans the subject P two-dimensionally using multiple piezoelectric transducers arranged in a row, and scans the subject P three-dimensionally by swinging the multiple piezoelectric transducers at a predetermined angle (swing angle). Alternatively, the ultrasound probe 110 according to the 1st embodiment is a 2D array probe that can scan the subject P three-dimensionally using multiple piezoelectric transducers arranged in a matrix. The 2D array probe can also scan the subject P two-dimensionally by focusing and transmitting ultrasound waves.
The input unit 130 has a mouse, a keyboard, buttons, panel switches, a touch command screen, a foot switch, a trackball, a joystick, and the like; it accepts various setting requests from the operator of the ultrasound diagnostic apparatus 100 and transfers the accepted setting requests to the apparatus main body 150.
The display 120 displays a GUI (Graphical User Interface) through which the operator of the ultrasound diagnostic apparatus 100 inputs various setting requests using the input unit 130, and displays the ultrasound image data and the like generated in the apparatus main body 150.
As a biological signal of the subject P, the electrocardiograph 140 acquires an electrocardiogram (ECG) of the subject P, and transmits the acquired electrocardiogram to the apparatus main body 150.
The position sensor 160 and the transmitter 161 are devices for acquiring position information of the ultrasound probe 110. For example, the position sensor 160 is a magnetic sensor attached to the ultrasound probe 110, and the transmitter 161 is a device that is placed at an arbitrary position and forms a magnetic field spreading outward from itself as the centre.
The position sensor 160 detects the three-dimensional magnetic field formed by the transmitter 161. From the information of the detected magnetic field, the position sensor 160 calculates its own position (coordinates and angle) in a space whose origin is the transmitter 161, and sends the calculated position to the apparatus main body 150. Here, the position sensor 160 transmits the three-dimensional coordinates and angle at which it is located to the apparatus main body 150 as the three-dimensional position information of the ultrasound probe 110.
The present embodiment is also applicable to the case where the position information of the ultrasound probe 110 is acquired by a system other than the position detection system using the position sensor 160 and the transmitter 161. For example, the present embodiment may acquire the position information of the ultrasound probe 110 using a gyro sensor, an acceleration sensor, or the like.
The apparatus main body 150 is a device that generates ultrasound image data from the reflected-wave signals received by the ultrasound probe 110. The apparatus main body 150 shown in Fig. 1 can generate two-dimensional ultrasound image data from two-dimensional reflected-wave data received by the ultrasound probe 110, and can generate three-dimensional ultrasound image data from three-dimensional reflected-wave data received by the ultrasound probe 110.
As shown in Fig. 1, the apparatus main body 150 includes a transmission/reception unit 151, a B-mode processing unit 152, a Doppler processing unit 153, an image generating unit 154, an image memory 155, an image processing unit 156, a control unit 157, an internal storage unit 158, and an interface unit 159.
The transmission/reception unit 151 includes a rate pulse generator, a transmission delay unit, a pulser, and the like, and supplies a drive signal to the ultrasound probe 110. The rate pulse generator repeatedly generates, at a predetermined rate frequency, rate pulses for forming transmitted ultrasound waves. The transmission delay unit gives each rate pulse generated by the rate pulse generator the delay time, for each piezoelectric transducer, needed to focus the ultrasound waves produced by the ultrasound probe 110 into a beam and to determine the transmission directivity. The pulser applies a drive signal (drive pulse) to the ultrasound probe 110 at a timing based on the rate pulse. That is, by varying the delay time given to each rate pulse, the transmission delay unit arbitrarily adjusts the transmission direction of the ultrasound waves emitted from the transducer surface.
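As a concrete illustration of the transmission delay unit described above, the per-transducer delays that focus the transmitted beam at a given depth can be sketched as follows. This is a minimal sketch under assumed values (a 64-element linear array, 0.3 mm pitch, 1540 m/s speed of sound in tissue); the function name and parameters are illustrative and not part of the apparatus description.

```python
import numpy as np

def transmit_delays(n_elements, pitch, focus_depth, c=1540.0):
    """Per-element transmit delays (seconds) that focus the beam at
    `focus_depth` on the array axis.  `pitch` is the element spacing in
    metres; `c` is the assumed speed of sound in tissue (m/s)."""
    # Element x-positions, centred on the array axis.
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch
    # Path length from each element to the focal point.
    path = np.sqrt(x**2 + focus_depth**2)
    # Outer elements are farther from the focus, so they must fire first:
    # delay each element by its path difference to the *longest* path.
    delays = (path.max() - path) / c
    return delays

delays = transmit_delays(n_elements=64, pitch=0.3e-3, focus_depth=30e-3)
```

The centre elements receive the largest delay and the outermost elements fire first, so all wavefronts arrive at the focal point simultaneously.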
The transmission/reception unit 151 also has a function of instantaneously changing the transmission frequency, the transmission drive voltage, and the like in order to execute a predetermined scan sequence according to an instruction from the control unit 157 described later. In particular, the change of the transmission drive voltage is realized by a linear-amplifier-type transmission circuit capable of instantaneously switching its value, or by a mechanism that electrically switches among a plurality of power supply units.
The transmission/reception unit 151 further includes a preamplifier, an A/D (Analog/Digital) converter, a reception delay unit, an adder, and the like, and generates reflected-wave data by performing various kinds of processing on the reflected-wave signals received by the ultrasound probe 110. The preamplifier amplifies the reflected-wave signal for each channel. The A/D converter A/D-converts the amplified reflected-wave signals. The reception delay unit gives the delay times needed to determine the reception directivity. The adder performs addition processing on the reflected-wave signals processed by the reception delay unit to generate reflected-wave data. Through the addition processing of the adder, the reflected components from the direction corresponding to the reception directivity of the reflected-wave signals are emphasized, and a comprehensive transmit/receive ultrasound beam is formed according to the reception directivity and the transmission directivity.
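The combination of the reception delay unit and the adder described above is, in essence, delay-and-sum beamforming: each channel is delayed so that echoes from the steered direction align, then the channels are summed. A minimal sketch, assuming integer sample delays for simplicity:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Delay-and-sum receive beamforming.

    channel_data   : (n_channels, n_samples) echo signals, one row per element
    delays_samples : integer receive delay (in samples) applied per channel
    Returns the beamformed RF line.  Echoes arriving from the steered
    direction add coherently; off-axis echoes tend to cancel.
    """
    n_ch, n_s = channel_data.shape
    out = np.zeros(n_s)
    for ch in range(n_ch):
        d = delays_samples[ch]
        # Shift each channel right by its delay, then accumulate.
        out[d:] += channel_data[ch, : n_s - d]
    return out
```

For example, a pulse that reaches one element two samples earlier than another is re-aligned by the delays, so the summed line shows a single reinforced peak.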
When the subject P is scanned two-dimensionally, the transmission/reception unit 151 transmits a two-dimensional ultrasound beam from the ultrasound probe 110, and generates two-dimensional reflected-wave data from the two-dimensional reflected-wave signals received by the ultrasound probe 110. When the subject P is scanned three-dimensionally, the transmission/reception unit 151 transmits a three-dimensional ultrasound beam from the ultrasound probe 110, and generates three-dimensional reflected-wave data from the three-dimensional reflected-wave signals received by the ultrasound probe 110.
The form of the output signal from the transmission/reception unit 151 can be selected from various forms, such as a signal containing phase information known as an RF (Radio Frequency) signal, or amplitude information after envelope detection processing.
The B-mode processing unit 152 receives reflected-wave data from the transmission/reception unit 151 and performs logarithmic amplification, envelope detection processing, and the like to generate data in which the signal intensity is expressed by brightness (B-mode data).
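The envelope detection and logarithmic amplification performed by the B-mode processing unit can be illustrated as follows. This is a sketch under assumptions the text does not fix: envelope detection via the analytic signal (Hilbert transform) and a 60 dB display dynamic range.

```python
import numpy as np

def bmode_line(rf, dynamic_range_db=60.0):
    """Convert one RF echo line to B-mode brightness values in [0, 1]:
    envelope detection via the analytic signal, then log compression."""
    n = rf.size
    # Analytic signal: zero out negative frequencies (Hilbert transform).
    spec = np.fft.fft(rf)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1 : n // 2] = 2.0
    else:
        h[1 : (n + 1) // 2] = 2.0
    envelope = np.abs(np.fft.ifft(spec * h))
    # Logarithmic amplification, mapped into the display range [0, 1].
    env = envelope / (envelope.max() + 1e-12)
    db = 20.0 * np.log10(env + 1e-12)
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```

Applied to a Gaussian-modulated RF pulse, the output brightness peaks at the pulse centre regardless of the carrier phase, which is the point of envelope detection.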
The Doppler processing unit 153 performs frequency analysis on velocity information from the reflected-wave data received from the transmission/reception unit 151, extracts blood-flow, tissue, and contrast-agent echo components based on the Doppler effect, and generates data (Doppler data) in which moving-body information such as velocity, variance, and power is extracted at multiple points.
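One common realization of the frequency analysis described above is the lag-one autocorrelation (Kasai) estimator; the text does not name a specific method, so the following single-sample-volume sketch is an assumed illustration, with the sign convention and parameter names chosen here for clarity.

```python
import numpy as np

def kasai_velocity(iq_ensemble, prf, f0, c=1540.0):
    """Mean axial velocity (m/s) from an ensemble of complex (IQ) samples
    taken at the same depth over successive pulses, via the lag-1
    autocorrelation.  prf: pulse repetition frequency (Hz), f0: transmit
    centre frequency (Hz), c: assumed speed of sound (m/s)."""
    r1 = np.sum(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]))
    phase = np.angle(r1)                      # mean Doppler phase per pulse
    f_doppler = phase * prf / (2.0 * np.pi)   # Doppler frequency (Hz)
    return c * f_doppler / (2.0 * f0)         # axial velocity estimate
```

Velocities whose Doppler shift exceeds prf/2 alias, which is the familiar Nyquist limit of colour Doppler.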
The B-mode processing unit 152 and the Doppler processing unit 153 according to the first embodiment can process both two-dimensional reflected-wave data and three-dimensional reflected-wave data. That is, the B-mode processing unit 152 generates two-dimensional B-mode data from two-dimensional reflected-wave data, and generates three-dimensional B-mode data from three-dimensional reflected-wave data. Likewise, the Doppler processing unit 153 generates two-dimensional Doppler data from two-dimensional reflected-wave data, and generates three-dimensional Doppler data from three-dimensional reflected-wave data.
The image generating unit 154 generates ultrasound image data from the data generated by the B-mode processing unit 152 and the Doppler processing unit 153. That is, the image generating unit 154 generates, from the two-dimensional B-mode data generated by the B-mode processing unit 152, two-dimensional B-mode image data in which the intensity of the reflected wave is expressed by brightness. The image generating unit 154 also generates, from the two-dimensional Doppler data generated by the Doppler processing unit 153, two-dimensional Doppler image data representing moving-body information. The two-dimensional Doppler image data is a velocity image, a variance image, a power image, or an image combining these.
Here, the image generating unit 154 generally converts (scan-converts) the scan-line signal sequence of the ultrasound scan into a scan-line signal sequence of a typical video format such as television, and generates ultrasound image data for display. Specifically, the image generating unit 154 generates the ultrasound image data for display by performing coordinate conversion according to the ultrasound scanning form of the ultrasound probe 110. Besides the scan conversion, the image generating unit 154 performs various kinds of image processing: for example, image processing that regenerates a brightness-averaged image using a plurality of image frames after scan conversion (smoothing processing), or image processing using a differential filter within an image (edge enhancement processing). The image generating unit 154 also synthesizes character information of various parameters, scales, body marks, and the like with the ultrasound image data.
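The scan conversion described above, i.e. the coordinate conversion from the probe's scanning geometry to a raster image, can be sketched for a sector scan as follows. Nearest-neighbour lookup is used here for brevity; real systems typically interpolate, so this is a simplified, assumed illustration.

```python
import numpy as np

def scan_convert(polar, angles, radii, nx=200, nz=200):
    """Nearest-neighbour scan conversion of sector-scan data.

    polar  : (n_angles, n_radii) envelope samples along each scan line
    angles : beam steering angles (rad), increasing; radii : depths (m)
    Returns a (nz, nx) Cartesian image; pixels outside the sector are 0."""
    x = np.linspace(radii[-1] * np.sin(angles[0]),
                    radii[-1] * np.sin(angles[-1]), nx)
    z = np.linspace(0.0, radii[-1], nz)
    xx, zz = np.meshgrid(x, z)
    r = np.hypot(xx, zz)
    th = np.arctan2(xx, zz)  # angle of each pixel from the probe axis
    # Nearest scan-line and depth-sample index for every pixel.
    ia = np.round(np.interp(th, angles, np.arange(angles.size))).astype(int)
    ir = np.round(np.interp(r, radii, np.arange(radii.size))).astype(int)
    img = polar[ia, ir]
    # Blank pixels outside the scanned sector.
    img[(th < angles[0]) | (th > angles[-1]) | (r > radii[-1])] = 0.0
    return img
```

A uniform sector of ones maps to a fan-shaped region of ones in the raster image, with zeros outside the fan.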
That is, B-mode data and Doppler data are ultrasound image data before scan conversion processing, and the data generated by the image generating unit 154 are ultrasound image data for display after scan conversion processing. The B-mode data and the Doppler data are also called raw data.
Further, the image generating unit 154 generates three-dimensional B-mode image data by coordinate-converting the three-dimensional B-mode data generated by the B-mode processing unit 152, and generates three-dimensional Doppler image data by coordinate-converting the three-dimensional Doppler data generated by the Doppler processing unit 153. That is, the image generating unit 154 generates "three-dimensional B-mode image data or three-dimensional Doppler image data" as "three-dimensional ultrasound image data".
In addition, in order to generate various kinds of two-dimensional image data for displaying the three-dimensional ultrasound image data (volume data) on the display 120, the image generating unit 154 performs rendering processing on the volume data. The rendering processing performed by the image generating unit 154 includes processing that generates MPR image data from the volume data by the multi-planar reconstruction method (MPR: Multi Planar Reconstruction). It also includes "Curved MPR" processing on the volume data and "Maximum Intensity Projection" processing on the volume data. The rendering processing performed by the image generating unit 154 further includes volume rendering (VR: Volume Rendering) processing that generates two-dimensional image data reflecting three-dimensional information.
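In the axis-aligned case, the MPR and Maximum Intensity Projection processing mentioned above reduce to simple array operations on the volume data. A minimal illustration (the function names are not from the apparatus description; curved MPR and full volume rendering would require resampling along arbitrary surfaces and rays):

```python
import numpy as np

def mpr_slice(volume, axis, index):
    """Multi-Planar Reconstruction: extract one axis-aligned cut plane
    from the volume as a 2-D image."""
    return np.take(volume, index, axis=axis)

def mip(volume, axis=0):
    """Maximum Intensity Projection: collapse the volume to a 2-D image
    by keeping the brightest voxel along each ray (here, along one axis)."""
    return volume.max(axis=axis)
```

A bright voxel anywhere along a projection ray therefore survives into the MIP image, which is why MIP is favoured for vessels and contrast agents.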
The image memory 155 is a memory that stores the image data for display generated by the image generating unit 154. The image memory 155 can also store the data generated by the B-mode processing unit 152 or the Doppler processing unit 153. The B-mode data and Doppler data stored in the image memory 155 can, for example, be recalled by the operator after diagnosis, and become ultrasound image data for display via the image generating unit 154. The image generating unit 154 stores ultrasound image data in the image memory 155 in association with the time of the ultrasound scan performed to generate that ultrasound image data and with the electrocardiogram transmitted from the electrocardiograph 140. The analysis unit 156a and the control unit 157 described later can obtain the cardiac phase at the time of the ultrasound scan performed to generate the ultrasound image data by referring to the data stored in the image memory 155.
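Looking up the cardiac phase of a stored frame from the associated electrocardiogram can be sketched as follows, assuming the R-wave times and the frame's scan time are stored as plain timestamps; the text does not specify this data layout, so the layout and function name are hypothetical.

```python
import bisect

def heart_phase(frame_time, r_wave_times):
    """Cardiac phase in [0, 1): the elapsed fraction of the R-R interval
    that contains the frame's scan time.  `r_wave_times` is a sorted list
    of R-wave timestamps bracketing `frame_time` (hypothetical storage)."""
    i = bisect.bisect_right(r_wave_times, frame_time) - 1
    return (frame_time - r_wave_times[i]) / (r_wave_times[i + 1] - r_wave_times[i])
```

For instance, a frame scanned midway between two R waves has phase 0.5.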
The internal storage unit 158 stores control programs for performing ultrasound transmission/reception, image processing, and display processing, and various data such as diagnostic information (for example, patient IDs and doctors' findings), diagnostic protocols, and various body marks. The internal storage unit 158 is also used, as needed, for keeping image data stored in the image memory 155, and so on. The data stored in the internal storage unit 158 can be transferred to external devices via the interface unit 159 described later, and data stored in an external device can likewise be transferred to the internal storage unit 158 via the interface unit 159. The external devices are, for example, the X-ray diagnostic apparatus 200, the X-ray CT apparatus 300, the image archive apparatus 400, and the image processing apparatus 500 shown in Fig. 1.
The image processing unit 156 is installed in the apparatus main body 150 in order to perform computer-aided diagnosis (CAD). The image processing unit 156 acquires the ultrasound image data stored in the image memory 155 and performs image analysis processing on it, and stores the analysis result in the image memory 155 or the internal storage unit 158. As shown in Fig. 1, the image processing unit 156 includes an analysis unit 156a and a position alignment unit 156b.
The analysis unit 156a analyzes a time series of three-dimensional ultrasound image data (a three-dimensional ultrasound image data group) generated by three-dimensionally ultrasound-scanning the subject P, and generates three-dimensional analysis image data related to local motion in a given tissue.
Here, the given tissue is the heart, and the analysis unit 156a generates information related to the motion of each region of the cardiac wall. The analysis unit 156a then generates analysis image data in which the cardiac wall motion information is mapped onto the myocardium of the ultrasound image data, or between the inner and outer membranes of the myocardium. The analysis unit 156a according to the first embodiment uses the three-dimensional ultrasound image data group to generate time-series data of three-dimensional cardiac wall motion information.
Hereinafter, the analysis processing performed by the analysis unit 156a according to the first embodiment will be explained using Fig. 3 to Fig. 6. Fig. 3 to Fig. 6 are diagrams for explaining the analysis unit according to the first embodiment.
First, using the ultrasound probe 110 capable of three-dimensional scanning, the operator three-dimensionally scans the left-heart system of the heart of the subject P by, for example, an apical approach over a period of one heartbeat or more. Thereby, the image generating unit 154 generates a time series of multiple three-dimensional ultrasound image data covering the period of one heartbeat or more, and stores them in the image memory 155. The multiple three-dimensional ultrasound image data stored in the image memory 155 constitute a three-dimensional ultrasound image data group generated by ultrasound-scanning a heart including at least the left ventricle over a period of one heartbeat or more. The three-dimensional ultrasound image data group here is a group of three-dimensional B-mode image data.
Then, as illustrated in Fig. 3, the analysis unit 156a acquires the time series of multiple three-dimensional ultrasound image data covering one heartbeat or more. Each three-dimensional ultrasound image data includes the left ventricle of the subject P.
The analysis unit 156a then calculates time-series data of cardiac wall motion information in the left ventricle from the three-dimensional ultrasound image data group. Specifically, the analysis unit 156a performs the calculation processing of the cardiac wall motion information using the result of tracking the tracking points by processing that includes pattern matching between image data. More specifically, the analysis unit 156a calculates the cardiac wall motion information using the result of three-dimensional speckle tracking (3D Speckle Tracking, hereinafter "3DT") performed on three-dimensional moving image data obtained by three-dimensional echocardiography. The speckle tracking method is a method of estimating motion with high precision by, for example, combining pattern matching processing with optical flow methods or various spatio-temporal interpolation processing. The speckle tracking method also includes methods that estimate motion without performing pattern matching processing.
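The pattern-matching variant of the speckle tracking described above can be sketched as block matching: a cube of voxels around each tracking point serves as the template, and the best-matching region in the next frame gives the point's new position. A minimal sum-of-absolute-differences version follows; the window and search sizes are illustrative, and no bounds checking is done near the volume edges.

```python
import numpy as np

def track_point(frame0, frame1, point, half=2, search=3):
    """Follow one tracking point from frame0 to frame1 by block matching.
    The template is a (2*half+1)^3 voxel cube centred on `point`; the best
    match in frame1 is the offset with minimum sum of absolute differences."""
    z, y, x = point
    tmpl = frame0[z-half:z+half+1, y-half:y+half+1, x-half:x+half+1]
    best, best_off = np.inf, (0, 0, 0)
    for dz in range(-search, search + 1):
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = frame1[z+dz-half:z+dz+half+1,
                              y+dy-half:y+dy+half+1,
                              x+dx-half:x+dx+half+1]
                sad = np.abs(cand - tmpl).sum()
                if sad < best:
                    best, best_off = sad, (dz, dy, dx)
    return (z + best_off[0], y + best_off[1], x + best_off[2])
```

Applied to a synthetic volume shifted by a known offset, the tracker recovers that offset exactly, which is the basic behaviour 3DT relies on.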
For example, the input unit 130 accepts from the operator a display request for the first frame (first volume) of the three-dimensional ultrasound image data group. The control unit 157, to which the display request has been transferred, reads the three-dimensional ultrasound image data of the first frame from the image memory 155 and displays it on the display 120. For example, the control unit 157 causes the image generating unit 154 to generate multiple MPR image data obtained by cutting the three-dimensional ultrasound image data of the first frame with cut planes in multiple directions, and displays them on the display 120. For example, as shown in Fig. 4, the display 120 displays the multiple MPR image data.
In the example shown in Fig. 4, the display 120 displays the MPR image data of the A plane in region A, and the MPR image data of the B plane in region B. The display 120 displays in region C3 the MPR image data of the C plane at level C3 near the apex, and in region C7 the MPR image data of the C plane at level C7 near the base of the heart. The display 120 further displays in region C5 the MPR image data of the C plane at level C5, midway between the apex and the base. In the example shown in Fig. 4, region C3, region C5, and region C7 are arranged in order from the top of the display area on the left side; region A is arranged to the right of regions C3 and C5, and region B to the right of region A.
In the example shown in Fig. 4, the display 120 also displays, in the lower-right area, a volume-rendered image of the three-dimensional ultrasound image data of the first frame together with the electrocardiogram.
Then, referring to the multiple MPR image data displayed on the display 120, the operator sets multiple tracking points for performing 3DT. To give one example, the operator traces the positions of the left-ventricular endocardium and the epicardium in each MPR image data, and specifies the endocardial contour and the epicardial contour. The analysis unit 156a constructs a three-dimensional endocardial contour and a three-dimensional epicardial contour from the specified endocardial and epicardial contours. Then, as illustrated in Fig. 5, the analysis unit 156a sets each point constituting the three-dimensional endocardial contour of the first frame as a tracking point. Although not illustrated, the analysis unit 156a likewise sets each point constituting the three-dimensional epicardial contour of the first frame as a tracking point. The analysis unit 156a then sets template data for each of the multiple tracking points set in the first frame. The template data consists of multiple voxels centred on a tracking point.
The analysis unit 156a then tracks to which position the template data has moved in the next frame by searching, between the two frames, for the region whose speckle pattern best matches the template data. In this way, as shown in Fig. 5, the analysis unit 156a tracks to which position each tracking point of the first frame has moved by the n-th frame. The mesh for setting the tracking points may also be set by the analysis unit 156a detecting the endocardial surface or the epicardial surface of the left ventricle included in the first frame.
The analysis unit 156a performs 3DT on the three-dimensional ultrasound image data group, taking the whole left ventricle (for example, the endocardium and epicardium of the left ventricle) as the object. Then, from the result of the 3DT on the three-dimensional ultrasound image data group, the analysis unit 156a generates time-series data of cardiac wall motion information at each tracking point. For example, the analysis unit 156a calculates strain as the cardiac wall motion information from the result of the 3DT of the endocardium and the epicardium. The analysis unit 156a calculates the strain in the longitudinal direction (LS), the strain in the circumferential direction (CS), or the strain in the radial (wall-thickness) direction (RS).
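In the Lagrangian convention, the strain values described above are relative length changes of the tracked tissue versus a reference frame such as end-diastole. A minimal sketch for one pair of tracking points (which direction the pair spans determines whether the result is LS, CS, or RS; the function name is illustrative):

```python
import numpy as np

def lagrangian_strain(p_ref, p_cur):
    """Strain between two tracked points: the relative change of the
    distance separating them versus the reference (e.g. R-wave) frame.
    Negative values mean shortening, as for longitudinal strain in systole."""
    l0 = np.linalg.norm(np.asarray(p_ref[1], float) - np.asarray(p_ref[0], float))
    l = np.linalg.norm(np.asarray(p_cur[1], float) - np.asarray(p_cur[0], float))
    return (l - l0) / l0
```

For example, two points 10 mm apart at the reference phase and 8 mm apart in systole yield a strain of -0.2, i.e. 20 % shortening.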
Alternatively, for example, the analysis unit 156a calculates, from the result of the 3DT of the endocardium, the area change ratio (AC) of the left-ventricular endocardial surface as the cardiac wall motion information. Or, for example, the analysis unit 156a may calculate displacement from the result of the 3DT of the endocardium or the epicardium. When displacement is used as the cardiac wall motion information, the analysis unit 156a can calculate the displacement in the longitudinal direction (LD) or the displacement in the radial direction (RD). Alternatively, the analysis unit 156a may calculate the displacement (Absolute Displacement: AD) of a tracking point at phases other than a reference phase (for example, the R wave), relative to the position of that tracking point at the reference phase. Further, in order to capture dyssynchrony of cardiac motion, the analysis unit 156a may calculate an analysis result mapping the time at which the strain value becomes equal to or greater than a certain value, or an analysis result mapping the time at which the strain value reaches its maximum.
Here, the analysis unit 156a may generate the time-series data of cardiac wall motion information for each tracking point, or may generate the time-series data of cardiac wall motion information for each local region. For example, the analysis unit 156a calculates local cardiac wall motion information using the 16 or 17 segments recommended by the American Society of Echocardiography or the American Heart Association. Examples of the segments recommended by the American Society of Echocardiography and the like include the anteroseptal (ant-sept.), anterior (ant.), lateral (lat.), posterior (post.), inferior (inf.), and septal (sept.) segments.
Then, for example, as shown in Fig. 6, the analysis unit 156a generates three-dimensional analysis image data in which the value of the cardiac wall motion information obtained at each tracking point is converted into a colour and mapped onto a surface-rendered image of the three-dimensional endocardial contour. By moving the viewpoint position, the operator can observe the three-dimensional analysis image data illustrated in Fig. 6 from all directions through the display 120. Alternatively, for example, the analysis unit 156a generates three-dimensional analysis image data in which the value of the cardiac wall motion information obtained at each tracking point is converted into a colour and mapped onto a 16-segment polar map.
Returning to Fig. 2, the position alignment unit 156b performs position alignment processing between ultrasound image data and three-dimensional medical image data of another kind. The three-dimensional medical image data of another kind is, for example, three-dimensional X-ray CT image data received from the X-ray CT apparatus 300, or three-dimensional MRI image data received from a magnetic resonance imaging (MRI) apparatus not shown in Fig. 1. The ultrasound diagnostic apparatus 100 according to the first embodiment can, through the position sensor 160 and the processing of the position alignment unit 156b, cause the image generating unit 154 to generate medical image data of substantially the same cross-section as the cross-section of the two-dimensional ultrasound scan performed to generate the two-dimensional ultrasound image data, and display it on the display 120.
For example, before performing an echocardiographic examination of the subject P using the ultrasound probe 110, the operator requests the transfer of three-dimensional X-ray CT image data obtained by imaging the heart of the subject P. The operator then adjusts, via the input unit 130, the position of the cut plane for MPR processing so that two-dimensional X-ray CT image data depicting the examined region of the subject P is displayed on the display 120.
Then, under the control of the position alignment unit 156b, the image generating unit 154 generates two-dimensional X-ray CT image data in which the three-dimensional X-ray CT image data is cut by the cut plane specified by the operator (hereinafter, "initial cross-section"), and the display 120 displays the two-dimensional X-ray CT image data generated by the image generating unit 154. The operator operates the ultrasound probe 110 to perform an ultrasound scan of the same cross-section as the X-ray CT image data displayed on the display 120. Then, when judging that the two-dimensional X-ray CT image data displayed on the display 120 and the two-dimensional ultrasound image data show substantially the same cross-section, the operator specifies, for example, three corresponding points in both image data. Alternatively, the operator specifies, for example, one or more corresponding points and an axis (line) in both image data. The operator then presses the confirm button of the input unit 130. The position alignment unit 156b sets the three-dimensional position information of the ultrasound probe 110, acquired from the position sensor 160 at the moment the confirm button is pressed, as the initial position information. Alternatively, the position alignment unit 156b performs position alignment between the coordinate system of the two-dimensional ultrasound image data and the coordinate system of the three-dimensional X-ray CT image data using the points or lines brought into correspondence. Fig. 7 is a diagram for explaining the position alignment unit according to the first embodiment.
Thereafter, the position alignment unit 156b acquires, from the position detection system composed of the position sensor 160 and the transmitter 161, the three-dimensional position information of the ultrasound probe 110 at the time the two-dimensional ultrasound image data B shown in Fig. 7 was generated. The position alignment unit 156b then obtains movement information from the acquired three-dimensional position information and the initial position information, changes the position of the initial cross-section according to the acquired movement information, and resets the cut plane for MPR. Then, under the control of the position alignment unit 156b, the image generating unit 154 generates two-dimensional X-ray CT image data C from the three-dimensional X-ray CT image data A shown in Fig. 7, using the cut plane reset by the position alignment unit 156b. Further, under the control of the position alignment unit 156b, the display 120 displays the two-dimensional X-ray CT image data C and the two-dimensional ultrasound image data B side by side, as shown in Fig. 7. In the above, the case where position alignment is performed using the position sensor 160 has been explained. However, the position alignment between three-dimensional ultrasound image data and three-dimensional X-ray CT image data (or three-dimensional MRI image data) can equally be performed without using the position sensor 160 if, after the three-dimensional ultrasound image data has been collected, three or more common feature points are set in both image data. For example, if the MPR image data of both sides are displayed, common feature points are set independently, and the images are synchronized once three or more points have been set, then a display similar to that using the position sensor 160 can be achieved through an interface such as a mouse.
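The position alignment from three or more common feature points described above amounts to estimating a rigid transform between the two coordinate systems. A standard least-squares solution (Kabsch/Procrustes) is sketched below; the patent does not prescribe this particular algorithm, so it is an assumed illustration.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping point set `src` onto
    `dst`, i.e. dst ~ R @ src + t.  Three or more non-collinear
    corresponding points determine the transform."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets.
    h = (src - cs).T @ (dst - cd)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cd - r @ cs
    return r, t
```

Given feature points picked in the ultrasound volume and the corresponding points in the CT volume, the recovered (R, t) maps every ultrasound voxel coordinate into the CT coordinate system, which is the voxel-wise alignment described above.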
By this display function, the operator can, for example, simultaneously observe an ultrasound image and an X-ray CT image whose cross-section is substantially the same as that of the ultrasound image. In addition, by performing a two-dimensional scan using the ultrasound probe 110 capable of three-dimensional scanning and acquiring in advance the initial position information and the position information of the points or lines brought into correspondence, the position alignment unit 156b can identify, in the three-dimensional X-ray CT image data, a three-dimensional region substantially identical to the three-dimensional region in which the ultrasound scan is performed three-dimensionally. Furthermore, the position alignment unit 156b can perform position alignment between each voxel constituting the three-dimensional ultrasound image data and each voxel constituting the three-dimensional X-ray CT image data.
That is, the position alignment unit 156b can perform position alignment between three-dimensional ultrasound image data and three-dimensional X-ray CT image data, or between three-dimensional ultrasound image data and three-dimensional MRI image data. Further, using the position alignment information between the three-dimensional ultrasound image data and the three-dimensional X-ray CT image data, the position alignment unit 156b can perform position alignment between the three-dimensional analysis image data and the three-dimensional X-ray CT image data; similarly, using the position alignment information between the three-dimensional ultrasound image data and the three-dimensional MRI image data, it can perform position alignment between the three-dimensional analysis image data and the three-dimensional MRI image data. When the three-dimensional X-ray CT image data or the three-dimensional MRI image data is three-dimensional image data obtained by contrast-enhanced imaging, position alignment between three-dimensional contrast-region data segmented from the three-dimensional image data and the three-dimensional analysis image data can also be performed.
Returning to Fig. 2, the control unit 157 controls the overall processing of the ultrasound diagnostic apparatus 100. Specifically, based on the various setting requests input by the operator via the input unit 130 and the various control programs and various data read from the internal storage unit 158, the control unit 157 controls the processing of the transmission/reception unit 151, the B-mode processing unit 152, the Doppler processing unit 153, the image generating unit 154, and the analysis unit 156a. The control unit 157 also performs control so that the display 120 displays the ultrasound image data for display stored in the image memory 155 or the internal storage unit 158, and so that the display 120 displays the processing results of the analysis unit 156a.
The control unit 157 also outputs the processing results of the analysis unit 156a and the like to external devices via the interface unit 159 described later. The external devices are, for example, the X-ray diagnostic apparatus 200, the X-ray CT apparatus 300, the image archive apparatus 400, and the image processing apparatus 500 shown in Fig. 1. The control unit 157 according to the first embodiment has an output unit 157a, shown in Fig. 1, as a processing unit that performs the output processing for outputting data and, at the same time, controls the data format of the output data. The processing performed by the output unit 157a will be described in detail later.
The interface unit 159 is an interface to the input unit 130, the in-hospital LAN 600, the X-ray diagnostic apparatus 200, the X-ray CT apparatus 300, the image archive apparatus 400, and the image processing apparatus 500. For example, the various setting information and various instructions accepted from the operator by the input unit 130 are transferred to the control unit 157 through the interface unit 159. Also, for example, output data output by the output unit 157a is transmitted through the interface unit 159 to the X-ray diagnostic apparatus 200 via the in-hospital LAN 600. Further, data such as the three-dimensional medical image data transmitted by the X-ray CT apparatus 300 or the image archive apparatus 400 is stored in the internal storage unit 158 via the interface unit 159.
Next, a configuration example of the X-ray diagnostic apparatus 200 shown in Fig. 1 will be explained using Fig. 8. Fig. 8 is a block diagram showing a configuration example of the X-ray diagnostic apparatus according to the first embodiment. As illustrated in Fig. 8, the X-ray diagnostic apparatus 200 according to the first embodiment includes an X-ray high-voltage device 211, an X-ray tube 212, an X-ray diaphragm device 213, a tabletop 214, a C-arm 215, and an X-ray detector 216. The X-ray diagnostic apparatus 200 according to the first embodiment further includes a C-arm rotation/movement mechanism 217, a tabletop movement mechanism 218, a C-arm/tabletop mechanism control unit 219, a diaphragm control unit 220, a system control unit 221, an input unit 222, and a display unit 223. The X-ray diagnostic apparatus 200 according to the first embodiment further includes an image data generating unit 224, an image data storage unit 225, and an image processing unit 226.
X-ray high voltage device 211 generates a high voltage under the control of systems control division 221 and supplies the generated high voltage to X-ray tube 212. X-ray tube 212 generates X-rays using the high voltage supplied from X-ray high voltage device 211.
X-ray diaphragm device 213 narrows the X-rays generated by X-ray tube 212 under the control of diaphragm control portion 220 so that a region of interest of the subject P is selectively irradiated. For example, X-ray diaphragm device 213 has four slidable diaphragm blades. By sliding these diaphragm blades under the control of diaphragm control portion 220, X-ray diaphragm device 213 narrows the X-rays generated by X-ray tube 212 and irradiates the subject P with them. Top board 214 is a bed on which the subject P is placed, and is arranged on an examination table, not shown.
X-ray detector 216 detects the X-rays that have passed through the subject P. For example, X-ray detector 216 has detection elements arranged in a matrix. Each detection element converts the X-rays that have passed through the subject P into an electric signal, accumulates it, and sends the accumulated electric signal to image data generating section 224.
C-arm 215 holds X-ray tube 212, X-ray diaphragm device 213 and X-ray detector 216. X-ray tube 212 and X-ray diaphragm device 213 are arranged by C-arm 215 so as to face X-ray detector 216 across the subject P.
C-arm rotation/movement mechanism 217 is a mechanism for rotating and moving C-arm 215, and top board movement mechanism 218 is a mechanism for moving top board 214. C-arm/top board mechanism control portion 219 adjusts the rotation and movement of C-arm 215 and the movement of top board 214 by controlling C-arm rotation/movement mechanism 217 and top board movement mechanism 218 under the control of systems control division 221. Diaphragm control portion 220 controls the irradiation range of the X-rays irradiated onto the subject P by adjusting the opening degree of the diaphragm blades of X-ray diaphragm device 213 under the control of systems control division 221.
Image data generating section 224 generates X-ray image data using the electric signals converted from the X-rays by X-ray detector 216, and stores the generated X-ray image data in image data storing section 225. For example, image data generating section 224 performs current/voltage conversion, A (Analog)/D (Digital) conversion and parallel/serial conversion on the electric signals received from X-ray detector 216 to generate X-ray image data.
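The conversion chain just described can be sketched numerically. The following is only an illustration under simplifying assumptions (unit current-to-voltage gain, a 12-bit converter, a made-up full-scale value); the actual front-end of image data generating section 224 is analog hardware whose parameters the patent does not specify.

```python
def digitize_frame(currents, full_scale=1.0, bits=12):
    """Hypothetical sketch of the read-out chain: current/voltage
    conversion, A(Analog)/D(Digital) conversion and parallel/serial
    conversion of the per-element values into one stream."""
    levels = (1 << bits) - 1
    serial = []
    for row in currents:                      # rows arrive in parallel ...
        for current in row:
            # current -> voltage (assumed unit gain), clamped to full scale
            voltage = min(max(current, 0.0), full_scale)
            # A/D conversion: quantize to 2**bits discrete levels
            serial.append(round(voltage / full_scale * levels))
    return serial                             # ... and leave as one serial stream

frame = [[0.0, 0.5], [1.0, 2.0]]              # accumulated element currents
print(digitize_frame(frame))                  # -> [0, 2048, 4095, 4095]
```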
Image data storing section 225 stores the image data generated by image data generating section 224. Image processing part 226 performs various image processing on the image data stored in image data storing section 225. The image processing performed by image processing part 226 is described in detail later.
Input unit 222 accepts various instructions from operators, such as doctors or technicians, who operate radiographic apparatus 200. For example, input unit 222 has a mouse, a keyboard, buttons, a trackball, a joystick and the like. Input unit 222 transfers the instructions received from the operator to systems control division 221.
Display part 223 displays a GUI (Graphical User Interface) for accepting the operator's instructions, the display-use image data stored in image data storing section 225, and the like. For example, display part 223 has a display. Furthermore, display part 223 may have multiple displays.
Systems control division 221 controls the operation of the entire radiographic apparatus 200. For example, systems control division 221 controls X-ray high voltage device 211 according to the operator's instructions transferred from input unit 222 and adjusts the voltage supplied to X-ray tube 212, thereby controlling the amount of X-rays irradiated onto the subject P and the ON/OFF (on or off) of X-ray irradiation.
Furthermore, for example, systems control division 221 controls C-arm/top board mechanism control portion 219 according to the operator's instructions, and adjusts the rotation and movement of C-arm 215 and the movement of top board 214. Furthermore, for example, systems control division 221 controls diaphragm control portion 220 according to the operator's instructions and adjusts the opening degree of the diaphragm blades of X-ray diaphragm device 213, thereby controlling the irradiation range of the X-rays irradiated onto the subject P.
Furthermore, systems control division 221 controls the image data generation processing of image data generating section 224, the image processing of image processing part 226 and the like according to the operator's instructions. Furthermore, systems control division 221 performs control so that the display of display part 223 displays the GUI for accepting the operator's instructions, the image data stored in image data storing section 225, and the like.
Here, systems control division 221 has an obtaining section 221a, as shown in Fig. 3, in order to perform various processing using the output data received from diagnostic ultrasound equipment 100. Obtaining section 221a is a processing portion that performs the position alignment processing described later, and the like. That is, if the above-mentioned position alignment portion 156b of image processing system 1 is regarded as a 1st position alignment portion, image processing system 1 has obtaining section 221a as a 2nd position alignment portion. The processing performed by obtaining section 221a is described in detail later.
Interface portion 227 is an interface connecting radiographic apparatus 200 to the in-hospital LAN 600, X-ray CT device 300, image archive apparatus 400 and image processing apparatus 500. For example, interface portion 227 according to the present embodiment receives the output data output by diagnostic ultrasound equipment 100, and transfers the received output data to obtaining section 221a of systems control division 221.
The overall structure of the image processing system 1 according to the 1st embodiment has been explained above. With this structure, in the image processing system 1 according to the 1st embodiment, a position requiring treatment is determined by ultrasonic examination using diagnostic ultrasound equipment 100. Specifically, in cardiac resynchronization therapy (CRT: Cardiac Resynchronization Therapy), the asynchronous position at which an electrode of a cardiac pacemaker is to be placed is determined from the three-dimensional analysis image data generated by analysis portion 156a. Here, in CRT, the doctor places the electrode in the vein nearest to the asynchronous position while referring to X-ray images obtained by fluoroscopic photography with radiographic apparatus 200. However, under X-ray fluoroscopy it is difficult to discriminate the inner and outer surfaces of the heart wall; accordingly, it is difficult to perform position alignment of the X-ray image data and the analysis image data, that is, position alignment of the X-ray image data and the ultrasonic image data.
Therefore, in the 1st embodiment, in order to make the position determined by ultrasonic diagnosis discriminable under X-ray fluoroscopy, each portion shown in Fig. 9 performs the following processing. Fig. 9 is a figure representing the processing portions that execute the image processing method performed by the image processing system according to the 1st embodiment.
In the 1st embodiment, first, position alignment portion 156b, which diagnostic ultrasound equipment 100 has as a 1st position alignment portion, performs position alignment of 1st three-dimensional medical image data and 2nd three-dimensional medical image data obtained by photographing a prescribed tissue of the subject P. Here, the above-mentioned 1st three-dimensional medical image data are three-dimensional medical image data with which the motion of the prescribed tissue can be analyzed. Specifically, the 1st three-dimensional medical image data are three-dimensional ultrasonic image data. Furthermore, the above-mentioned 2nd three-dimensional medical image data are three-dimensional medical image data in which a specific tissue that can be discriminated from X-ray image data is visualized. That is, position alignment portion 156b as the 1st position alignment portion performs position alignment of the three-dimensional ultrasonic image data obtained by photographing the prescribed tissue of the subject P and the 2nd three-dimensional medical image data, different from the three-dimensional ultrasonic image data, which are obtained by photographing the prescribed tissue of the subject P and in which a specific tissue that can be discriminated from X-ray image data is visualized. Here, the prescribed tissue is the heart. Furthermore, specifically, the above-mentioned 2nd three-dimensional medical image data are three-dimensional X-ray CT image data or three-dimensional MRI image data. As an example, the above-mentioned 2nd three-dimensional medical image data are three-dimensional contrast image data: three-dimensional X-ray CT image data obtained by contrast photography of a coronary artery (coronary artery) or coronary vein (coronary vein), or three-dimensional MRI image data obtained by contrast photography of a coronary artery or coronary vein. The above-mentioned specific tissue is a tissue that can be identified from X-ray image data. Specifically, the above-mentioned specific tissue is a tissue that can be identified from X-ray contrast image data obtained by contrast photography of the heart as the prescribed tissue. For example, the above-mentioned specific tissue is a coronary artery or coronary vein. Furthermore, the 2nd three-dimensional medical image data in which the specific tissue is visualized may also be, besides three-dimensional contrast image data, for example, three-dimensional MRI image data obtained by non-contrast photography with blood-flow labeling. Hereinafter, the case where the 2nd three-dimensional medical image data are three-dimensional contrast image data is explained.
Here, since the degree of contrast enhancement of a coronary artery is higher than that of a coronary vein, it is preferable to use, as the three-dimensional contrast image data, three-dimensional X-ray CT image data obtained by coronary arteriography or three-dimensional MRI image data obtained by coronary arteriography. Hereinafter, the case where three-dimensional X-ray CT image data obtained by contrast photography of a coronary artery as the specific tissue are used as the three-dimensional contrast image data serving as the 2nd three-dimensional medical image data is explained.
Then, output unit 157a of diagnostic ultrasound equipment 100 outputs, as output data, data in which position alignment information is added to the 1st three-dimensional medical image data (three-dimensional ultrasonic image data) and the 2nd three-dimensional medical image data (three-dimensional contrast image data). Alternatively, output unit 157a outputs, as output data, synthesized data obtained by position-aligning and synthesizing the 1st three-dimensional medical image data (three-dimensional ultrasonic image data) and the 2nd three-dimensional medical image data (three-dimensional contrast image data).
Then, obtaining section 221a, which radiographic apparatus 200 has as a 2nd position alignment portion, receives the output data and performs position alignment of the 2nd three-dimensional medical image data and multiple X-ray image data, corresponding to multiple photography directions, obtained by photographing the prescribed tissue of the subject P from the multiple photography directions. Alternatively, obtaining section 221a as the 2nd position alignment portion performs position alignment of the 2nd three-dimensional medical image data and one X-ray image data, corresponding to one photography direction, obtained by photographing the prescribed tissue of the subject P from the one photography direction. Then, display part 223 of radiographic apparatus 200 displays image data in which the 1st three-dimensional medical image data are position-aligned to the X-ray image data of the prescribed tissue, according to the position alignment result of position alignment portion 156b as the 1st position alignment portion and the position alignment result of obtaining section 221a as the 2nd position alignment portion.
Specifically, obtaining section 221a obtains three-dimensional position information of the specific tissue in the three-dimensional photography space of the X-ray image data according to the position alignment result of the 2nd three-dimensional medical image data and the multiple X-ray image data. Alternatively, obtaining section 221a obtains three-dimensional position information of the specific tissue in the three-dimensional photography space of the X-ray image data according to the position alignment result of the 2nd three-dimensional medical image data and the one X-ray image data.
More specifically, under the control of obtaining section 221a, when the 2nd three-dimensional medical image data are arranged in the three-dimensional photography space of radiographic apparatus 200, display part 223 displays projection images in which the specific tissue is projected onto the multiple X-ray image data. Then, obtaining section 221a obtains the three-dimensional position information according to an operation in which an operator referring to display part 223 associates the positions corresponding to the specific tissue in the above-mentioned multiple X-ray image data with the positions of the projection images. Alternatively, under the control of obtaining section 221a, when the 2nd three-dimensional medical image data are arranged in the three-dimensional photography space of radiographic apparatus 200, display part 223 displays a projection image in which the specific tissue is projected onto the one X-ray image data. Then, obtaining section 221a obtains the three-dimensional position information according to an operation in which an operator referring to display part 223 associates the positions corresponding to the specific tissue in the above-mentioned one X-ray image data with the positions of the projection image. That is, the position alignment processing performed by obtaining section 221a is performed by establishing three or more correspondences between "the two-dimensional specific tissue depicted in the two-dimensional X-ray image data" and "the two-dimensional specific tissue obtained by projecting the three-dimensional specific tissue depicted in the 2nd three-dimensional medical image data in the photography direction of the X-ray image data". Accordingly, obtaining section 221a as the 2nd position alignment portion can perform the position alignment processing using one X-ray image data obtained by photography from one photography direction.
Then, display part 223 displays image data in which the 1st three-dimensional medical image data, or analysis image data generated by analyzing the 1st three-dimensional medical image data, are position-aligned to the X-ray image data of the prescribed tissue, according to the three-dimensional position information of the specific tissue and the relative position relation between the 1st three-dimensional medical image data and the 2nd three-dimensional medical image data.
Here, the "one or multiple X-ray image data" on which obtaining section 221a as the 2nd position alignment portion performs the position alignment processing are "one or multiple X-ray contrast image data obtained by contrast photography of the specific tissue". Alternatively, the "one or multiple X-ray image data" on which obtaining section 221a as the 2nd position alignment portion performs the position alignment processing are "one or multiple X-ray image data obtained by photographing the specific tissue into which an instrument is inserted". The above-mentioned instrument is, for example, a guide wire (guide wire) inserted into a coronary artery or coronary vein. Since a guide wire is radiopaque, in X-ray image data obtained by photography while the guide wire is inserted, the region corresponding to the coronary artery or coronary vein is clearly depicted without injecting a contrast agent.
Hereinafter, the case where obtaining section 221a performs the position alignment processing using, as the "multiple X-ray image data", "multiple X-ray contrast image data obtained by contrast photography from multiple photography directions" is explained. However, the contents described below are also applicable to the case where the "multiple X-ray image data" are "multiple X-ray image data obtained by photography from multiple photography directions while a guide wire is inserted". Furthermore, the contents described below are also applicable to the case where "one X-ray image data obtained by contrast photography from one photography direction" or "one X-ray image data obtained by photography from one photography direction while a guide wire is inserted" is used as the "one X-ray image data".
For example, obtaining section 221a receives the output data, performs position alignment of the three-dimensional contrast image data with each of the multiple X-ray contrast image data obtained by photographing the heart of the subject P from multiple directions, and obtains three-dimensional position information of the specific tissue in the three-dimensional photography space of the X-ray contrast image data.
Then, for example, display part 223 displays image data in which the above-mentioned analysis image data (three-dimensional analysis image data) are position-aligned to the X-ray image data of the prescribed tissue, according to the three-dimensional position information of the specific tissue and the relative position relation between the three-dimensional ultrasonic image data and the three-dimensional contrast image data.
Hereinafter, an example of the processing performed by each portion shown in Fig. 9 is explained. Fig. 10 and Fig. 11 are figures for explaining an example of the processing performed by the diagnostic ultrasound equipment according to the 1st embodiment.
As described above, position alignment portion 156b can perform position alignment of the three-dimensional contrast region data segmented from the three-dimensional contrast image data and the three-dimensional analysis image data using the position detecting system constituted by position sensor 160 and transmitter 161. In the present embodiment, as an example, position alignment portion 156b takes, as the object of position alignment, the three-dimensional analysis image data of the end-diastole (ED) phase (see Fig. 6) in the three-dimensional analysis image data group generated from the three-dimensional ultrasonic image data group.
Furthermore, position alignment portion 156b takes, as the object of position alignment, the three-dimensional contrast region data (see the right figure of Fig. 10) in which the coronary artery is extracted from the three-dimensional X-ray CT image data (see the left figure of Fig. 10) using threshold processing on voxel values or a region growing method. Here, when a three-dimensional X-ray CT image data group along a time series is obtained, position alignment portion 156b takes, as the object of position alignment, the data obtained by extracting the coronary artery from the three-dimensional X-ray CT image data of the end-diastole phase, that is, the three-dimensional contrast region data. Furthermore, in the present embodiment, for example, three-dimensional contrast region data obtained by segmentation processing performed by X-ray CT device 300 or image processing apparatus 500 may also be used.
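As an illustration of the voxel-value threshold processing and region growing mentioned above, a minimal sketch follows. It is not the patent's implementation: the volume representation (a coordinate-to-value mapping), the 6-neighbourhood connectivity and the threshold value are all assumptions chosen for brevity.

```python
from collections import deque

def grow_region(volume, seed, threshold):
    """Keep voxels at or above `threshold` that are 6-connected to a
    seed placed inside the vessel. `volume` maps (x, y, z) to a voxel
    value; absent coordinates count as background."""
    if volume.get(seed, 0) < threshold:
        return set()
    region, queue = {seed}, deque([seed])
    while queue:
        x, y, z = queue.popleft()
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            neighbour = (x + dx, y + dy, z + dz)
            if neighbour not in region and volume.get(neighbour, 0) >= threshold:
                region.add(neighbour)
                queue.append(neighbour)
    return region

# A tiny "vessel": three bright voxels in a row, plus one bright but
# disconnected voxel and one dim voxel that the threshold rejects.
vol = {(0, 0, 0): 900, (1, 0, 0): 850, (2, 0, 0): 880,
       (3, 0, 0): 100, (5, 5, 5): 870}
print(sorted(grow_region(vol, (0, 0, 0), 800)))
# -> [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
```

Seeding inside the contrast-enhanced artery keeps the connected bright voxels and discards both dim background and bright structures that are not connected to the vessel.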
Here, in the 1st embodiment, position alignment portion 156b may also perform the position alignment processing without using the position detecting system. For example, position alignment portion 156b adjusts the position in three-dimensional space and the angles about three axes so that the projection images obtained by projecting the three-dimensional ultrasonic image data of a prescribed phase from multiple viewpoint directions overlap the projection images obtained by projecting the three-dimensional X-ray CT image data of the same prescribed phase from the same multiple viewpoint directions. Thereby, position alignment portion 156b performs position alignment of the three-dimensional ultrasonic image data and the three-dimensional X-ray CT image data of the same phase. By this processing, position alignment portion 156b can perform position alignment of the three-dimensional analysis data and the three-dimensional contrast region data of the same phase.
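The sensor-less alignment described above — adjusting pose until projections from multiple viewpoint directions overlap — can be sketched as a search over candidate poses scored by projection overlap. The sketch below is deliberately reduced: it represents each data set as a small set of integer points, searches integer translations only (the patent also adjusts rotation angles about three axes) and uses two orthographic viewpoint directions.

```python
from itertools import product

def project(points, axis):
    """Orthographic projection of 3-D points: drop one coordinate."""
    return {tuple(c for i, c in enumerate(p) if i != axis) for p in points}

def overlap_score(a, b):
    """Coincidence of the projections of two point sets, summed over
    two viewpoint directions (here: along the x and z axes)."""
    return sum(len(project(a, axis) & project(b, axis)) for axis in (0, 2))

def register_translation(moving, fixed, search=2):
    """Try every integer shift in a small window and keep the one whose
    projections best overlap those of the fixed data."""
    def shifted(t):
        return {(x + t[0], y + t[1], z + t[2]) for x, y, z in moving}
    return max(product(range(-search, search + 1), repeat=3),
               key=lambda t: overlap_score(shifted(t), fixed))

fixed = {(1, 1, 1), (2, 1, 1), (3, 2, 1)}
moving = {(0, 0, 0), (1, 0, 0), (2, 1, 0)}   # fixed, shifted by (-1, -1, -1)
print(register_translation(moving, fixed))    # -> (1, 1, 1)
```

Because each view constrains only the two coordinates in its projection plane, two non-parallel views together pin down all three translation components.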
Then, output unit 157a outputs, as output data, data in which "position alignment information" is added to "the analysis image data, which are the analysis result of the three-dimensional ultrasonic image data" and "the 2nd three-dimensional medical image data". The above-mentioned 2nd three-dimensional medical image data may also be "three-dimensional visualized-region image data" obtained by extracting "the visualized specific tissue" from the 2nd three-dimensional medical image data. Specifically, output unit 157a sends, as output data, the three-dimensional analysis data and the three-dimensional contrast region data of the same phase together with the position alignment information to obtaining section 221a. Using the position alignment information, obtaining section 221a can arrange the three-dimensional analysis data and the three-dimensional contrast region data in three-dimensional space in a position-aligned state, as shown in Fig. 11.
Alternatively, output unit 157a outputs, as output data, synthesized data obtained by position-aligning and synthesizing the analysis image data and the 2nd three-dimensional medical image data. The above-mentioned 2nd three-dimensional medical image data may also be three-dimensional visualized-region image data obtained by extracting "the visualized specific tissue" from the 2nd three-dimensional medical image data. Specifically, output unit 157a outputs, as output data, synthesized data obtained by position-aligning and synthesizing the three-dimensional analysis data and the three-dimensional contrast region data of the same phase. These synthesized data become data such as those shown in Fig. 11. Furthermore, when the synthesized data are used as the output data, output unit 157a is configured to generate data having specific information that makes "the three-dimensional ultrasonic image data (three-dimensional analysis data) as the 1st three-dimensional medical image data" and "the three-dimensional contrast image data (three-dimensional contrast region data) as the 2nd three-dimensional medical image data" switchable between display and non-display, and separable.
That is, the three-dimensional contrast image data (three-dimensional contrast region data) of the synthesized data are used for the processing of obtaining section 221a, while the three-dimensional ultrasonic image data (three-dimensional analysis data) are finally displayed by display part 223. It is therefore desirable that these two data sets be switchable between display and non-display, and separable. For example, output unit 157a uses brightness values as the specific information, and generates the synthesized data with the three-dimensional analysis image data as data constituted by the brightness values of 511 gray levels out of 512 gray levels, and the three-dimensional contrast region data as data constituted by the brightness value of the remaining 1 gray level out of the 512 gray levels.
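The 512-gray-level packing described in this paragraph can be sketched per voxel as follows. The function names and the choice of reserving the topmost level, 511, as the contrast-region marker are illustrative assumptions consistent with "511 gray levels for the analysis data, 1 gray level for the contrast region data".

```python
def synthesize_voxel(analysis_value, is_contrast_region):
    """Pack one voxel of the synthesized data: the topmost of 512
    gray levels (511) is reserved as the contrast-region marker, and
    analysis values are confined to the remaining levels 0..510."""
    if is_contrast_region:
        return 511
    return min(analysis_value, 510)

def separate_voxel(value):
    """Recover which data set a synthesized voxel belongs to, so the
    two data sets can be displayed, hidden or processed separately."""
    return ('contrast', None) if value == 511 else ('analysis', value)

print(separate_voxel(synthesize_voxel(0, True)))     # -> ('contrast', None)
print(separate_voxel(synthesize_voxel(300, False)))  # -> ('analysis', 300)
```

Reserving one brightness value as a marker is what lets a single synthesized volume be split back into its two constituent data sets without any side information.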
Furthermore, in the present embodiment, the three-dimensional contrast image data may also be used as the output data, and image processing part 226 of radiographic apparatus 200 may extract the three-dimensional contrast region data from the three-dimensional contrast image data.
Then, obtaining section 221a of radiographic apparatus 200 receives the output data. Then, obtaining section 221a performs position alignment of the X-ray contrast image data and the ultrasonic image data using the output data. Fig. 12 and Fig. 13 are figures for explaining an example of the processing performed by the radiographic apparatus according to the 1st embodiment.
First, under the control of obtaining section 221a, radiographic apparatus 200 performs contrast photography of the heart of the subject P from multiple directions and generates multiple X-ray contrast image data. For example, under the control of obtaining section 221a, X-ray tube 212 irradiates the subject P with X-rays from a 1st direction, as shown in Fig. 12, and X-ray detector 216 detects the X-rays that have passed through the subject P at a 1st angle. Thereby, image data generating section 224 generates the X-ray contrast image data of the 1st direction. Furthermore, for example, under the control of obtaining section 221a, X-ray tube 212 irradiates the subject P with X-rays from a 2nd direction, as shown in Fig. 12, and X-ray detector 216 detects the X-rays that have passed through the subject P at a 2nd angle. Thereby, image data generating section 224 generates the X-ray contrast image data of the 2nd direction.
Then, obtaining section 221a obtains the three-dimensional position information of the specific tissue using the X-ray contrast image data of the 1st direction, the X-ray contrast image data of the 2nd direction and the output data. Here, since the specific tissue is a coronary artery, the X-ray contrast image data of the 1st direction and the X-ray contrast image data of the 2nd direction are X-ray contrast image data of the arterial phase. However, when the specific tissue is a coronary vein, the X-ray image data of the 1st direction and the X-ray image data of the 2nd direction are X-ray contrast image data of the venous phase.
As shown in Fig. 13, obtaining section 221a obtains the three-dimensional position information of the coronary artery in the three-dimensional contrast region data by associating the coronary artery extracted in the three-dimensional contrast region data with the coronary arteries depicted respectively in the X-ray contrast image data of the two directions. This association is performed along the running path of the coronary artery, and becomes an association of three or more points.
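Why two views of three or more corresponding points along the vessel path determine a 3-D position can be illustrated with a toy calculation. The sketch assumes two idealized orthographic views (one along the z axis giving x and y, one along the x axis giving y and z) and a pure translation; real C-arm geometry is perspective, so this is only an illustration of the principle, not the apparatus's computation.

```python
def estimate_translation(model_pts, view_xy, view_yz):
    """Average the 2-D offsets between model points and their observed
    projections in two views to recover a 3-D offset. The y coordinate
    is seen in both views and is averaged."""
    n = len(model_pts)
    dx = sum(u - p[0] for (u, v), p in zip(view_xy, model_pts)) / n
    dy_xy = sum(v - p[1] for (u, v), p in zip(view_xy, model_pts)) / n
    dy_yz = sum(u - p[1] for (u, v), p in zip(view_yz, model_pts)) / n
    dz = sum(v - p[2] for (u, v), p in zip(view_yz, model_pts)) / n
    return (dx, (dy_xy + dy_yz) / 2, dz)

# Three points along a vessel path, observed shifted by (2, -1, 3).
model = [(0, 0, 0), (1, 2, 0), (2, 3, 1)]
view_xy = [(2, -1), (3, 1), (4, 2)]   # along z: (x + 2, y - 1)
view_yz = [(-1, 3), (1, 3), (2, 4)]   # along x: (y - 1, z + 3)
print(estimate_translation(model, view_xy, view_yz))  # -> (2.0, -1.0, 3.0)
```

With three or more correspondences the estimate is over-determined, which is what also makes the single-view variant of the alignment possible when rotation is included.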
First, obtaining section 221a arranges the three-dimensional contrast region data in the three-dimensional photography space of radiographic apparatus 200. Here, the position at which the three-dimensional contrast region data are arranged is, for example, set by the operator. Alternatively, the arrangement position is, for example, a position set in advance. Then, obtaining section 221a causes image processing part 226 to generate projection images in which the specific tissue (coronary artery) of the three-dimensional contrast region data is projected onto each of the multiple X-ray image data. For example, under the control of obtaining section 221a, image processing part 226 generates projection images in which the three-dimensional contrast region data arranged in the three-dimensional photography space are projected in the 1st direction and the 2nd direction, respectively.
Then, under the control of obtaining section 221a, display part 223 displays the projection images in which the three-dimensional contrast region data are projected onto each of the multiple X-ray contrast image data. Then, obtaining section 221a obtains the three-dimensional position information according to an operation in which an operator referring to display part 223 associates the positions corresponding to the specific tissue in each of the multiple X-ray contrast image data with the positions of the projection images. For example, the operator performs a movement operation (association operation) so that the projection image of the coronary artery overlaps the coronary artery identified in each X-ray contrast image data.
That is, the operator performs the movement operation so that the projection image substantially overlaps the coronary artery depicted in the X-ray image data. According to the movement amount and movement direction of the projection image, obtaining section 221a performs parallel translation or rotational movement of the three-dimensional contrast region data arranged in the three-dimensional photography space, and obtains the position of the three-dimensional contrast region data after these processes as the three-dimensional position information. According to the three-dimensional position information and the relative position relation between the three-dimensional contrast region data and the three-dimensional analysis image data, the three-dimensional analysis image data are rearranged in the three-dimensional photography space by obtaining section 221a. Furthermore, the projection image is sometimes enlarged, reduced or deformed by the movement operation. In this case, the three-dimensional contrast region data are enlarged, reduced or deformed in the three-dimensional photography space. In this case, the three-dimensional analysis image data are enlarged, reduced or deformed by obtaining section 221a after being rearranged in the three-dimensional photography space.
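Carrying the interactively found pose over from the contrast region data to the analysis image data relies on their known relative position relation, which amounts to composing two transforms. A minimal sketch with homogeneous 4x4 matrices follows; using pure translations is an assumption for brevity (the text above also allows rotation, enlargement, reduction and deformation).

```python
def matmul(a, b):
    """4x4 homogeneous matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def apply_to_point(m, p):
    x, y, z = p
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z + m[i][3]
                 for i in range(3))

# T_reg: pose found interactively for the three-dimensional contrast
#        region data (a pure translation here, as an assumption).
# T_rel: the known relative position relation between the analysis
#        image data and the contrast region data.
T_reg = translation(5, -2, 1)
T_rel = translation(0.5, 0.0, -0.5)
T_analysis = matmul(T_reg, T_rel)   # carry the registration over

print(apply_to_point(T_analysis, (0, 0, 0)))  # -> (5.5, -2.0, 0.5)
```

Because the two volumes were already position-aligned on the ultrasound side, one composition places the analysis data in the photography space without any further interaction.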
Then, image processing part 226 projects the three-dimensional analysis image data rearranged in the three-dimensional photography space according to the three-dimensional position information, or the three-dimensional analysis image data rearranged in the three-dimensional photography space according to the three-dimensional position information and then "enlarged, reduced or deformed", onto the X-ray contrast image data of the heart of the subject P photographed in real time in a direction desired by the doctor. That is, image processing part 226 generates image data in which the projection image of the three-dimensional analysis image data position-aligned in the three-dimensional photography space is superimposed on the X-ray contrast image data of the heart. Here, the direction desired by the doctor is a direction suitable for photographing X-ray image data for placing the electrode. Furthermore, the direction desired by the doctor can be changed at any time during the operation, and image processing part 226 projects the three-dimensional analysis image data onto the X-ray contrast image data of the heart of the subject P photographed in real time in the changed direction.
Fig. 14 is a figure representing an example of the image data displayed in the 1st embodiment. While referring to the image data exemplified in Fig. 14, the doctor can confirm the asynchronous position from the projection image of the three-dimensional analysis image data and place the electrode in the vein at the position nearest to the asynchronous position. Furthermore, since the projection image of the three-dimensional analysis image data is a superimposed image, it can be switched between display and non-display according to the operator's request. Here, in the present embodiment, only the asynchronous position may be used as the projection object of the three-dimensional analysis image data superimposed on the X-ray image data. Furthermore, the opacity of the superimposed projection image of the three-dimensional analysis image data can be changed to an arbitrary value. Furthermore, the X-ray image data on which the projection image of the three-dimensional analysis image data position-aligned in the three-dimensional photography space is superimposed are not limited to X-ray contrast image data. The X-ray image data on which the projection image of the three-dimensional analysis image data position-aligned in the three-dimensional photography space is superimposed may also be X-ray image data obtained by photography in the direction desired by the doctor without injecting a contrast agent.
In the first embodiment, the three-dimensional ultrasound image data serving as the first three-dimensional medical image data may also be included in the output data. In that case, the image data superimposed on the X-ray image data captured in the direction desired by the physician is image data based on the three-dimensional ultrasound image data, for example ultrasound image data of a plurality of short-axis planes including the short-axis plane of the asynchronous site.
Next, the flow of processing in the image processing system 1 according to the first embodiment will be described with reference to Figures 15 and 16. Figure 15 is a flowchart illustrating an example of the processing performed by the ultrasound diagnostic apparatus according to the first embodiment, and Figure 16 is a flowchart illustrating an example of the processing performed by the X-ray diagnostic apparatus according to the first embodiment. Figure 15 shows an example of the processing performed after the initial position alignment between the two-dimensional ultrasound image data and the three-dimensional X-ray CT image data has been completed using the position detecting system.
As illustrated in Figure 15, the ultrasound diagnostic apparatus 100 according to the first embodiment collects a three-dimensional ultrasound image data group of the heart (step S101). The analysis unit 156a then generates a three-dimensional analysis image data group (step S102), and the position alignment unit 156b performs position alignment processing between the three-dimensional analysis image data and the three-dimensional contrast region data of the same cardiac phase (step S103). The output unit 157a then outputs, as output data, for example, synthetic data obtained by position-aligning and synthesizing the three-dimensional analysis image data and the three-dimensional contrast region data (step S104), and the processing ends.
As illustrated in Figure 16, the acquisition unit 221a of the X-ray diagnostic apparatus 200 according to the first embodiment determines whether output data has been received from the ultrasound diagnostic apparatus 100 (step S201). When no output data has been received (No at step S201), the acquisition unit 221a waits until output data is received. When output data has been received (Yes at step S201), the acquisition unit 221a controls each unit of the X-ray diagnostic apparatus 200 to generate X-ray contrast image data in a plurality of directions (step S202). Specifically, the X-ray diagnostic apparatus 200 images the heart of the subject P in the arterial phase from a plurality of directions.
Under the control of the acquisition unit 221a, the display unit 223 displays each of the plurality of X-ray contrast image data with the projected three-dimensional contrast region data superimposed (step S203). The acquisition unit 221a then determines whether a correspondence operation associating the coronary artery in the X-ray contrast image data with the projection image has been received from the operator (step S204). When no correspondence operation has been received (No at step S204), the acquisition unit 221a waits until a correspondence operation is accepted. When a correspondence operation has been received (Yes at step S204), the acquisition unit 221a acquires, according to the correspondence operation, the three-dimensional position information of the coronary artery in the three-dimensional imaging space (step S205). Under the control of the acquisition unit 221a, the display unit 223 then displays the image data in which the three-dimensional analysis image data is position-aligned to the X-ray contrast image data (step S206), and the processing ends.
As described above, in the first embodiment, the position alignment between the three-dimensional ultrasound image data and the two-dimensional X-ray image data is performed by way of the three-dimensional X-ray CT image data (or three-dimensional MRI image data). That is, in the first embodiment, by using the position detecting system with the position sensor 160, the region corresponding to the scanning region of the three-dimensional ultrasound image data can be identified in the three-dimensional X-ray CT image data, and the position alignment between the volume data can be performed at the voxel level based on the tissue information depicted in each of these two regions. Thus, in the first embodiment, the position alignment between the three-dimensional analysis image data based on the ultrasound image data and the three-dimensional contrast region data can be performed easily. Furthermore, since the coronary artery has a distinctive shape, the position alignment between the three-dimensional contrast region data of the arterial phase and the X-ray contrast image data of the arterial phase can also be performed easily. That is, in the first embodiment, the position alignment between the ultrasound image data (three-dimensional analysis image data) and the X-ray contrast image data can be performed. Accordingly, in the first embodiment, the position determined by ultrasound diagnosis can be identified under X-ray fluoroscopy. Moreover, in the first embodiment, the physician can place the electrode near the asynchronous site while referring to the projection image of the three-dimensional analysis image data that the position alignment allows to be superimposed and displayed.
(Second Embodiment)
In the first embodiment, the case where the three-dimensional position information is acquired according to an operation by the operator has been described. In the second embodiment, the case where the three-dimensional position information is acquired automatically, without an operation by the operator, will be described with reference to Figure 17. Figure 17 is a diagram for explaining the second embodiment.
The image processing system 1 according to the second embodiment is configured in the same manner as the image processing system 1 according to the first embodiment described with reference to Figure 1. However, when performing position alignment processing based on a plurality of X-ray image data captured in a plurality of imaging directions, the acquisition unit 221a serving as the second position alignment unit according to the second embodiment performs, through pattern matching between three-dimensional image data, position alignment processing between the second three-dimensional medical image data and the three-dimensional X-ray image data obtained by reconstructing the plurality of X-ray image data. For example, the acquisition unit 221a according to the second embodiment performs, through pattern matching between three-dimensional image data, position alignment processing between the three-dimensional contrast image data and the three-dimensional X-ray contrast image data obtained by reconstructing the plurality of X-ray contrast image data. Examples of the pattern matching processing include processing using cross-correlation or autocorrelation, mutual information, normalized mutual information, the correlation ratio, and the like.
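As one illustration of the similarity measures listed above, mutual information between two volumes can be estimated from a joint histogram of their voxel intensities. This is a minimal NumPy sketch; the bin count and function name are illustrative assumptions, not part of the patent:

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=32):
    """Estimate the mutual information between two equally-shaped
    volumes from the joint histogram of their voxel intensities."""
    joint, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of vol_a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of vol_b
    nz = pxy > 0                              # skip empty bins to avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# a volume shares maximal information with itself, much less with noise
rng = np.random.default_rng(0)
a = rng.random((8, 8, 8))
b = rng.random((8, 8, 8))
mi_same = mutual_information(a, a)
mi_diff = mutual_information(a, b)
```

In a registration loop, one volume would be repeatedly transformed and the transform that maximizes this score retained; normalized mutual information and the correlation ratio can be computed from the same joint histogram.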
When the data included in the output data is the three-dimensional contrast region data extracted from the three-dimensional contrast image data, the target of the pattern matching is the three-dimensional contrast region data. Furthermore, for example, under the control of the acquisition unit 221a, the image processing unit 226 reconstructs the three-dimensional X-ray contrast image data by back-projecting the X-ray contrast image data captured in a plurality of directions into the three-dimensional imaging space. For example, in the second embodiment, the three-dimensional X-ray contrast image data is reconstructed from X-ray contrast image data captured in 2 directions, 3 directions, or 50 directions.
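The back-projection step can be illustrated, under strong simplifying assumptions (parallel rays along two orthogonal axes, no filtering), as smearing each 2D projection back through the volume and summing the contributions. This is not the apparatus's actual cone-beam geometry, only a sketch of the principle:

```python
import numpy as np

def backproject_two_views(proj_x, proj_y, depth):
    """Rebuild a crude 3D volume (X, Y, Z) from two orthogonal
    parallel projections by smearing each projection along its
    ray axis and summing; values peak where the rays intersect."""
    # smear proj_x (a Y-Z image) along the X axis
    vol_from_x = np.broadcast_to(proj_x[np.newaxis, :, :],
                                 (depth,) + proj_x.shape)
    # smear proj_y (an X-Z image) along the Y axis
    vol_from_y = np.broadcast_to(proj_y[:, np.newaxis, :],
                                 (proj_y.shape[0], depth, proj_y.shape[1]))
    return vol_from_x + vol_from_y

# a single bright point appears as one bright pixel in each view
px = np.zeros((4, 4)); px[2, 3] = 1.0   # as seen along the X direction
py = np.zeros((4, 4)); py[1, 3] = 1.0   # as seen along the Y direction
vol = backproject_two_views(px, py, depth=4)
```

With only two views the reconstruction is ambiguous along each ray; adding more directions (the patent mentions up to 50) sharpens the intersection, which is why the contrast-enhanced vessel stands out in the reconstructed volume.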
Here, in order to reduce the load of the pattern matching processing, the following processing is desirable, for example. That is, the acquisition unit 221a serving as the second position alignment unit performs the position alignment processing between a three-dimensional region of interest set in the second three-dimensional medical image data and a three-dimensional region of interest set in the three-dimensional X-ray image data. For example, the acquisition unit 221a performs the position alignment processing between a three-dimensional region of interest set in the three-dimensional contrast image data (or the three-dimensional contrast region data) and a three-dimensional region of interest set in the three-dimensional X-ray contrast image data.
For example, as illustrated in Figure 17, the operator sets a three-dimensional ROI (region of interest) in the three-dimensional contrast region data. Accordingly, the image processing unit 226 extracts "volume data E", the three-dimensional contrast region data within the three-dimensional ROI. Furthermore, as illustrated in Figure 17, the operator sets another three-dimensional ROI by setting a two-dimensional ROI in each of the two X-ray contrast image data. Accordingly, the image processing unit 226 reconstructs "volume data F", the three-dimensional X-ray contrast image data within that three-dimensional ROI. These three-dimensional ROIs may also be set automatically by the acquisition unit 221a based on brightness values.
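The automatic ROI setting based on brightness values could be realized, for example, as the bounding box of voxels whose intensity exceeds a threshold. The threshold and function name below are hypothetical, chosen only for illustration:

```python
import numpy as np

def auto_roi(volume, threshold):
    """Return the per-axis (min, max) index bounds of the voxels
    whose brightness exceeds the threshold -- a simple automatic
    3D ROI around the bright (contrast-enhanced) tissue."""
    idx = np.argwhere(volume > threshold)
    if idx.size == 0:
        return None                              # nothing bright enough
    return idx.min(axis=0), idx.max(axis=0)      # bounding-box corners

vol = np.zeros((10, 10, 10))
vol[3:6, 4:7, 5:8] = 200.0                       # bright "vessel" block
lo, hi = auto_roi(vol, threshold=100.0)
```

In practice a margin would be added around the box, and the threshold might be derived from the image histogram rather than fixed; those refinements are omitted here.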
The acquisition unit 221a then performs pattern matching between volume data E and volume data F, thereby performing the position alignment processing and acquiring the three-dimensional position information of the particular tissue (for example, the coronary artery). The above processing may also be performed on three-dimensional X-ray image data obtained by reconstructing a plurality of X-ray image data captured in a plurality of directions while a guide wire is inserted. The subsequent processing is the same as that described in the first embodiment, and a description thereof is therefore omitted.
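The pattern matching between volume data E and volume data F can be sketched as a search over candidate shifts that maximizes a similarity score, here normalized cross-correlation over a small integer translation range. This is an illustrative translation-only sketch; the actual position alignment may also involve rotation and uses whichever similarity measure was chosen above:

```python
import numpy as np

def best_shift(vol_e, vol_f, max_shift=2):
    """Find the integer (dx, dy, dz) shift of vol_f that best matches
    vol_e by exhaustive search over normalized cross-correlation."""
    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    best, best_score = (0, 0, 0), -np.inf
    r = range(-max_shift, max_shift + 1)
    for dx in r:
        for dy in r:
            for dz in r:
                shifted = np.roll(vol_f, (dx, dy, dz), axis=(0, 1, 2))
                score = ncc(vol_e, shifted)
                if score > best_score:
                    best, best_score = (dx, dy, dz), score
    return best

e = np.zeros((8, 8, 8)); e[4, 4, 4] = 1.0
f = np.roll(e, (-1, 0, 1), axis=(0, 1, 2))   # f is e displaced by (-1, 0, 1)
shift = best_shift(e, f)                      # recovers the displacement
```

Restricting both volumes to the three-dimensional ROIs, as described above, keeps this search space small, which is exactly the load reduction the embodiment aims at.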
Next, the flow of processing in the image processing system 1 according to the second embodiment will be described with reference to Figure 18. Figure 18 is a flowchart illustrating an example of the processing performed by the X-ray diagnostic apparatus according to the second embodiment. The processing performed by the ultrasound diagnostic apparatus 100 is the same as that described in the first embodiment, and a description thereof is therefore omitted.
As illustrated in Figure 18, the acquisition unit 221a of the X-ray diagnostic apparatus 200 according to the second embodiment determines whether output data has been received from the ultrasound diagnostic apparatus 100 (step S301). When no output data has been received (No at step S301), the acquisition unit 221a waits until output data is received. When output data has been received (Yes at step S301), the acquisition unit 221a controls each unit of the X-ray diagnostic apparatus 200 to generate X-ray contrast image data in a plurality of directions (step S302). Specifically, the X-ray diagnostic apparatus 200 images the heart of the subject P in the arterial phase from a plurality of directions.
The acquisition unit 221a then accepts the setting of the three-dimensional ROIs (step S303). The acquisition unit 221a extracts the three-dimensional contrast region data within the three-dimensional ROI and reconstructs, from the plurality of X-ray contrast image data, the three-dimensional X-ray contrast image data within the three-dimensional ROI (step S304). The acquisition unit 221a then performs pattern matching between the three-dimensional contrast region data within the three-dimensional ROI and the three-dimensional X-ray contrast image data within the three-dimensional ROI (step S305), and acquires the three-dimensional position information of the coronary artery in the three-dimensional imaging space (step S306). Under the control of the acquisition unit 221a, the display unit 223 then displays the image data in which the three-dimensional analysis image data is position-aligned to the X-ray contrast image data (step S307), and the processing ends.
As described above, in the second embodiment, the three-dimensional position information of the particular tissue can be acquired automatically. Accordingly, in the second embodiment, the position alignment between the ultrasound image data (three-dimensional analysis image data) and the X-ray contrast image data can be performed even more easily.
The processing of each unit described in the first and second embodiments above may also be performed by the X-ray CT apparatus 300 or the image processing apparatus 500. For example, some or all of the generation processing of the analysis image data, the position alignment processing between the ultrasound image data and the X-ray CT image data, the output processing of the output data, and the acquisition processing of the three-dimensional position information of the particular tissue may be performed by the X-ray CT apparatus 300 or the image processing apparatus 500. The superimposed image of the position-aligned analysis image data and the X-ray image data may also be generated by the X-ray CT apparatus 300 or the image processing apparatus 500. That is, in the first and second embodiments described above, the specific form of distribution and integration of the processing units is not limited to the illustrated form; all or part of the processing units may be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.
In the first and second embodiments above, the case where the first three-dimensional medical image data is three-dimensional ultrasound image data and the second three-dimensional medical image data is three-dimensional X-ray CT image data or three-dimensional MRI image data in which the particular tissue has been imaged has been described. However, the contents described in the first and second embodiments are applicable as long as the first three-dimensional medical image data is three-dimensional medical image data in which the motion of the prescribed tissue can be analyzed and the second three-dimensional medical image data is three-dimensional medical image data obtained by imaging the particular tissue. For example, the contents described in the first and second embodiments are also applicable to the case where the first three-dimensional medical image data is three-dimensional MRI image data of a phase in which the myocardium is contrast-enhanced and the second three-dimensional medical image data is three-dimensional X-ray CT image data of a phase in which the coronary artery or the coronary vein is contrast-enhanced. Alternatively, for example, the contents described in the first and second embodiments are also applicable to the case where the first three-dimensional medical image data is three-dimensional X-ray CT image data of a phase in which the myocardium is contrast-enhanced and the second three-dimensional medical image data is three-dimensional X-ray CT image data of a phase in which the coronary vein is contrast-enhanced.
The image processing methods described in the first and second embodiments above can be realized by executing a prepared image processing program on a computer such as a personal computer or a workstation. This image processing program can be distributed via a network such as the Internet. The program can also be recorded on a computer-readable recording medium such as a hard disk, a flexible disk (FD), a CD-ROM, an MO, or a DVD, and executed by being read from the recording medium by the computer.
As described above, according to the first and second embodiments, the position determined by ultrasound diagnosis can be identified under X-ray fluoroscopy.
Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are likewise included in the inventions recited in the claims and their equivalents.
Claims (9)
1. An image processing system comprising:
a first position alignment unit that performs position alignment between first three-dimensional medical image data and second three-dimensional medical image data obtained by imaging a prescribed tissue of a subject;
an output unit that outputs, as output data, data in which position alignment information has been added to the first three-dimensional medical image data and the second three-dimensional medical image data, or synthetic data obtained by position-aligning and synthesizing the first three-dimensional medical image data and the second three-dimensional medical image data;
a second position alignment unit that receives the output data and performs position alignment between the second three-dimensional medical image data and one or a plurality of X-ray image data, corresponding to one or a plurality of imaging directions, obtained by imaging the prescribed tissue of the subject from the one or plurality of imaging directions; and
a display unit that displays, based on the position alignment results of the first position alignment unit and the second position alignment unit, image data obtained by position-aligning the first three-dimensional medical image data to the X-ray image data of the prescribed tissue,
wherein the first three-dimensional medical image data is three-dimensional medical image data in which a motion of the prescribed tissue can be analyzed, and the second three-dimensional medical image data is three-dimensional medical image data in which a particular tissue identifiable from X-ray image data has been imaged,
the one or plurality of X-ray image data on which the second position alignment unit performs the position alignment processing are one or a plurality of X-ray image data obtained by contrast imaging of the prescribed tissue, or one or a plurality of X-ray image data obtained by imaging the particular tissue into which an instrument has been inserted,
the second position alignment unit acquires, based on the position alignment result between the second three-dimensional medical image data and the one or plurality of X-ray image data, three-dimensional position information of the particular tissue in a three-dimensional imaging space of the X-ray image data, and
the display unit displays, based on the three-dimensional position information of the particular tissue and a relative positional relationship between the first three-dimensional medical image data and the second three-dimensional medical image data, image data obtained by position-aligning, to the X-ray image data of the prescribed tissue, the first three-dimensional medical image data or analysis image data generated by analyzing the first three-dimensional medical image data.
2. The image processing system according to claim 1, wherein,
with the second three-dimensional medical image data arranged in the three-dimensional imaging space, the display unit displays a projection image obtained by projecting the particular tissue onto the one or plurality of X-ray image data, and
the second position alignment unit acquires the three-dimensional position information based on an operation in which an operator, referring to the display unit, associates a position corresponding to the particular tissue in the one or plurality of X-ray image data with a position of the projection image.
3. The image processing system according to claim 1, wherein,
when performing the position alignment processing based on a plurality of X-ray image data, the second position alignment unit performs, through pattern matching between three-dimensional image data, position alignment processing between the second three-dimensional medical image data and three-dimensional X-ray image data obtained by reconstructing the plurality of X-ray image data.
4. The image processing system according to claim 3, wherein
the second position alignment unit performs the position alignment processing between a three-dimensional region of interest set in the second three-dimensional medical image data and a three-dimensional region of interest set in the three-dimensional X-ray image data.
5. The image processing system according to claim 1, wherein,
when the synthetic data is used as the output data, the output unit is configured to output the synthetic data as data having specific information that enables the first three-dimensional medical image data and the second three-dimensional medical image data to be switched between display and non-display and to be separated from each other.
6. The image processing system according to claim 1, wherein
the output unit outputs, as the output data, data in which position alignment information has been added to analysis image data, being an analysis result of the first three-dimensional medical image data, and the second three-dimensional medical image data, or synthetic data obtained by position-aligning and synthesizing the analysis image data and the second three-dimensional medical image data.
7. The image processing system according to claim 1, wherein
the first three-dimensional medical image data is three-dimensional ultrasound image data, and the second three-dimensional medical image data is three-dimensional X-ray CT image data or three-dimensional MRI image data.
8. An X-ray diagnostic apparatus comprising:
a second position alignment unit that receives, as output data based on a position alignment result of a first position alignment unit, data in which position alignment information has been added to first three-dimensional medical image data and second three-dimensional medical image data obtained by imaging a prescribed tissue of a subject, or synthetic data obtained by position-aligning and synthesizing the first three-dimensional medical image data and the second three-dimensional medical image data, and that performs position alignment between the second three-dimensional medical image data and one or a plurality of X-ray image data, corresponding to one or a plurality of imaging directions, obtained by imaging the prescribed tissue of the subject from the one or plurality of imaging directions; and
a display unit that displays, based on the position alignment results of the first position alignment unit and the second position alignment unit, image data obtained by position-aligning the first three-dimensional medical image data to the X-ray image data of the prescribed tissue,
wherein the first three-dimensional medical image data is three-dimensional medical image data in which a motion of the prescribed tissue can be analyzed, and the second three-dimensional medical image data is three-dimensional medical image data in which a particular tissue identifiable from X-ray image data has been imaged,
the one or plurality of X-ray image data on which the second position alignment unit performs the position alignment processing are one or a plurality of X-ray image data obtained by contrast imaging of the prescribed tissue, or one or a plurality of X-ray image data obtained by imaging the particular tissue into which an instrument has been inserted,
the second position alignment unit acquires, based on the position alignment result between the second three-dimensional medical image data and the one or plurality of X-ray image data, three-dimensional position information of the particular tissue in a three-dimensional imaging space of the X-ray image data, and
the display unit displays, based on the three-dimensional position information of the particular tissue and a relative positional relationship between the first three-dimensional medical image data and the second three-dimensional medical image data, image data obtained by position-aligning, to the X-ray image data of the prescribed tissue, the first three-dimensional medical image data or analysis image data generated by analyzing the first three-dimensional medical image data.
9. An image processing method comprising:
performing, by a first position alignment unit, position alignment between first three-dimensional medical image data and second three-dimensional medical image data obtained by imaging a prescribed tissue of a subject, wherein the first three-dimensional medical image data is three-dimensional medical image data in which a motion of the prescribed tissue can be analyzed, and the second three-dimensional medical image data is three-dimensional medical image data in which a particular tissue identifiable from X-ray image data has been imaged;
outputting, by an output unit, as output data, data in which position alignment information has been added to the first three-dimensional medical image data and the second three-dimensional medical image data, or synthetic data obtained by position-aligning and synthesizing the first three-dimensional medical image data and the second three-dimensional medical image data;
receiving, by a second position alignment unit, the output data, performing position alignment between the second three-dimensional medical image data and one or a plurality of X-ray image data, corresponding to one or a plurality of imaging directions, obtained by imaging the prescribed tissue of the subject from the one or plurality of imaging directions, and acquiring, based on the position alignment result between the second three-dimensional medical image data and the one or plurality of X-ray image data, three-dimensional position information of the particular tissue in a three-dimensional imaging space of the X-ray image data; and
displaying, by a display unit, based on the three-dimensional position information of the particular tissue and a relative positional relationship between the first three-dimensional medical image data and the second three-dimensional medical image data, image data obtained by position-aligning, to the X-ray image data of the prescribed tissue, the first three-dimensional medical image data or analysis image data generated by analyzing the first three-dimensional medical image data, thereby displaying, based on the position alignment results of the first position alignment unit and the second position alignment unit, the image data obtained by position-aligning the first three-dimensional medical image data to the X-ray image data of the prescribed tissue,
wherein the one or plurality of X-ray image data on which the second position alignment unit performs the position alignment processing are one or a plurality of X-ray image data obtained by contrast imaging of the prescribed tissue, or one or a plurality of X-ray image data obtained by imaging the particular tissue into which an instrument has been inserted.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012207468 | 2012-09-20 | ||
JP2012-207468 | 2012-09-20 | ||
JP2013196006A JP6202963B2 (en) | 2012-09-20 | 2013-09-20 | Image processing system, X-ray diagnostic apparatus, and image processing method |
PCT/JP2013/075584 WO2014046267A1 (en) | 2012-09-20 | 2013-09-20 | Image processing system, x-ray diagnostic device, and image processing method |
JP2013-196006 | 2013-09-20 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103826539A CN103826539A (en) | 2014-05-28 |
CN103826539B true CN103826539B (en) | 2016-11-02 |
Family
ID=50341566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380001786.2A Expired - Fee Related CN103826539B (en) | 2012-09-20 | 2013-09-20 | Image processing system, radiographic apparatus and image processing method |
Country Status (4)
Country | Link |
---|---|
US (1) | US9747689B2 (en) |
JP (1) | JP6202963B2 (en) |
CN (1) | CN103826539B (en) |
WO (1) | WO2014046267A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106030657B (en) * | 2014-02-19 | 2019-06-28 | 皇家飞利浦有限公司 | Motion Adaptive visualization in medicine 4D imaging |
JP6437286B2 (en) * | 2014-11-26 | 2018-12-12 | 株式会社東芝 | Image processing apparatus, image processing program, image processing method, and treatment system |
US20160354049A1 (en) * | 2015-06-04 | 2016-12-08 | Biosense Webster (Israel) Ltd. | Registration of coronary sinus catheter image |
US10002423B2 (en) * | 2015-09-04 | 2018-06-19 | Canon Kabushiki Kaisha | Medical image processing apparatus, medical image processing method, and medical image processing system |
JP6615603B2 (en) * | 2015-12-24 | 2019-12-04 | キヤノンメディカルシステムズ株式会社 | Medical image diagnostic apparatus and medical image diagnostic program |
USD843382S1 (en) * | 2016-09-21 | 2019-03-19 | Analytics For Life | Display with graphical user interface |
USD858532S1 (en) * | 2016-09-21 | 2019-09-03 | Analytics For Life Inc. | Display screen with transitional graphical user interface |
JP6849420B2 (en) * | 2016-12-12 | 2021-03-24 | キヤノンメディカルシステムズ株式会社 | Ultrasound diagnostic equipment and medical image processing equipment |
WO2019075544A1 (en) * | 2017-10-19 | 2019-04-25 | Ventripoint Diagnostics Ltd | Positioning device and method |
TWI640931B (en) * | 2017-11-23 | 2018-11-11 | 財團法人資訊工業策進會 | Image object tracking method and apparatus |
JP7271126B2 (en) * | 2018-10-15 | 2023-05-11 | キヤノンメディカルシステムズ株式会社 | Ultrasound diagnostic equipment and medical image processing equipment |
KR102272741B1 (en) * | 2019-07-11 | 2021-07-02 | 가톨릭대학교 산학협력단 | Medical Imaging Method and System for Simultaneous Implementation of 3D Subtraction MR Arteriography, 3D Subtraction MR Venography and Color-coded 4D MR Angiography by the Post-processing of 4D MR Angiography |
CN110464460B (en) * | 2019-07-16 | 2020-11-17 | 江苏霆升科技有限公司 | Method and system for cardiac intervention operation |
CN111938699B (en) * | 2020-08-21 | 2022-04-01 | 电子科技大学 | System and method for guiding use of ultrasonic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1862596A (en) * | 2005-04-19 | 2006-11-15 | Siemens Corporate Research, Inc. | System and method for fused PET-CT visualization for heart unfolding |
CN101410060A (en) * | 2006-04-03 | 2009-04-15 | Koninklijke Philips Electronics N.V. | Determining tissue surrounding an object being inserted into a patient |
CN101903909A (en) * | 2007-12-18 | 2010-12-01 | Koninklijke Philips Electronics N.V. | System for multimodality fusion of imaging data based on statistical models of anatomy |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7813535B2 (en) | 2005-04-19 | 2010-10-12 | Siemens Medical Solutions Usa, Inc. | System and method for fused PET-CT visualization for heart unfolding |
JP4818846B2 (en) * | 2006-08-16 | 2011-11-16 | 富士フイルム株式会社 | Medical image processing apparatus and medical image processing program |
JP5060117B2 (en) * | 2006-12-18 | 2012-10-31 | 株式会社東芝 | 3D image processing apparatus, 3D image processing method, storage medium, and program |
JP5238201B2 (en) | 2007-08-10 | 2013-07-17 | 株式会社東芝 | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing program |
BR112012024494A2 (en) * | 2010-03-31 | 2017-12-05 | Koninl Philips Electronics Nv | device for positioning an automatic absorption means for x-ray imaging, medical imaging system, method for positioning an automatic absorption medium for x-ray imaging, computer program element for control a computer-readable device and media |
US8805043B1 (en) * | 2010-04-02 | 2014-08-12 | Jasjit S. Suri | System and method for creating and using intelligent databases for assisting in intima-media thickness (IMT) |
JP5232286B2 (en) * | 2011-10-24 | 2013-07-10 | 株式会社東芝 | 3D image processing apparatus, 3D image processing method, storage medium, and program |
2013
- 2013-09-20 CN CN201380001786.2A patent/CN103826539B/en not_active Expired - Fee Related
- 2013-09-20 JP JP2013196006A patent/JP6202963B2/en not_active Expired - Fee Related
- 2013-09-20 WO PCT/JP2013/075584 patent/WO2014046267A1/en active Application Filing

2015
- 2015-03-19 US US14/663,272 patent/US9747689B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JP6202963B2 (en) | 2017-09-27 |
US20150193932A1 (en) | 2015-07-09 |
JP2014076331A (en) | 2014-05-01 |
CN103826539A (en) | 2014-05-28 |
WO2014046267A1 (en) | 2014-03-27 |
US9747689B2 (en) | 2017-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103826539B (en) | Image processing system, radiographic apparatus and image processing method | |
US9717474B2 (en) | Image processing apparatus, ultrasound diagnosis apparatus, and image processing method | |
CN103857344B (en) | Image processing system, radiographic apparatus and image processing method | |
CN103889337B (en) | Diagnostic ultrasound equipment and ultrasonic diagnosis apparatus control method | |
JP5164309B2 (en) | Catheter device | |
JP5868067B2 (en) | Medical image diagnostic apparatus, image processing apparatus and method | |
CN103429164B (en) | Ultrasonic diagnostic device, image processing device, and image processing method | |
JP4763883B2 (en) | Ultrasonic diagnostic equipment | |
CN104661596B (en) | Image processing apparatus, radiographic apparatus and display methods | |
US20180008232A1 (en) | Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus | |
CN104994792B (en) | Ultrasonic diagnostic device and medical image processing device | |
EP1909649A1 (en) | Physiology workstation with real-time fluoroscopy or ultrasound imaging | |
US9888905B2 (en) | Medical diagnosis apparatus, image processing apparatus, and method for image processing | |
JP6956483B2 (en) | Ultrasonic diagnostic equipment and scanning support program | |
CN104602611B (en) | Diagnostic ultrasound equipment, medical image-processing apparatus and image processing method | |
JP2010094181A (en) | Ultrasonic diagnostic apparatus and data processing program of the same | |
JP5134897B2 (en) | Breast examination system | |
JP6863774B2 (en) | Ultrasound diagnostic equipment, image processing equipment and image processing programs | |
CN104619255B (en) | Radiographic apparatus and arm control method | |
US20090076386A1 (en) | Method and system for acquiring volume of interest based on positional information | |
JP4764521B2 (en) | Ultrasonic diagnostic equipment | |
US20230380812A1 (en) | Medical imaging method, apparatus, and system | |
JP5921610B2 (en) | Ultrasonic diagnostic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C41 | Transfer of patent application or patent right or utility model | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20160712 Address after: Otawara, Tochigi, Japan Applicant after: Toshiba Medical System Co., Ltd. Address before: Tokyo, Japan Applicant before: Toshiba Corp Applicant before: Toshiba Medical System Co., Ltd. |
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20161102; Termination date: 20180920 |