WO2010113479A1 - Image processing apparatus, method, and program - Google Patents
- Publication number: WO2010113479A1 (PCT/JP2010/002306)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- menu
- image
- dimensional image
- information
- analysis
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
- A61B6/037—Emission tomography
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/465—Displaying means of special interest adapted to display user selection data, e.g. graphical user interface, icons or menus
- A61B6/56—Details of data transmission or power supply, e.g. use of slip rings
- A61B6/563—Details of data transmission or power supply involving image data transmission via a network
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/008—Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
Definitions
- the present invention relates to an image processing apparatus and method for processing a three-dimensional image consisting of a plurality of tomographic images acquired by tomographic imaging of a subject, and to a program for causing a computer to execute the image processing method.
- in the medical field, modalities using various technologies are in practical use, such as X-ray CT (Computed Tomography) apparatuses, ultrasound (US) diagnostic apparatuses, MRI (Magnetic Resonance Imaging) apparatuses, PET (Positron Emission Tomography) apparatuses, and SPET (Single-Photon Emission Tomography) apparatuses.
- a structure of interest is recognized in the three-dimensional image, and an image for display is generated from the tomographic images including the structure of interest using a method such as maximum intensity projection (MIP) or minimum intensity projection (MinIP).
- dedicated analysis applications exist for performing such analyses. In addition, there is a wide variety of color templates, image processing applications, and display applications (hereinafter referred to as "analysis applications etc.") for changing shading patterns when VR display of 3D images is performed.
- as a result, the number of menus for selecting an analysis application etc. increases, making menu selection very time-consuming for users such as interpreting physicians and technologists.
- for example, a method has been proposed in which an index value representing an anatomical feature is calculated from the medical image, it is determined based on the calculated index value whether the photographed subject is an adult, an infant, a toddler, or a child, an optimal imaging menu for the subject is selected from the imaging menu items, and only the selected imaging menu is displayed (see Patent Document 1). According to the method described in Patent Document 1, an imaging menu unnecessary for the photographed subject is not displayed when a medical image is displayed, so the burden on the user when selecting an imaging menu can be reduced.
- Japanese Patent Application Laid-Open No. 2007-185429
- the present invention has been made in view of the above circumstances, and its purpose is to reduce the burden on the user when selecting a menu used in processing a three-dimensional image, particularly when tomographic images are acquired over a plurality of regions.
- An image processing apparatus according to the present invention comprises: an image acquisition unit that acquires a three-dimensional image composed of a plurality of tomographic images acquired by performing tomographic imaging on a subject;
- a part information acquiring unit that acquires information on a part recognition result of the subject included in the three-dimensional image;
- Menu specifying means for specifying a menu according to a part from a plurality of menus used when displaying the three-dimensional image based on the information of the part recognition result;
- display control means for displaying the specified menu on a display means.
- the "site" (part) identifies a region of the human body when the subject is a human body; specific examples include the head, neck, chest, abdomen, pelvis, and legs, as well as composite parts consisting of two adjacent ones of these parts, such as the head-neck and the chest-abdomen.
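As a sketch, the basic parts and the composite parts described above can be represented as follows. This is an illustrative reconstruction only; the part names and the adjacency check are assumptions, not structures prescribed by the patent.

```python
# Basic body parts in body-axis order, as listed in the description above.
BASIC_PARTS = ["head", "neck", "chest", "abdomen", "pelvis", "legs"]

# A composite part spans two adjacent basic parts (e.g. head-neck, chest-abdomen).
# The names here are hypothetical labels for illustration.
COMPOSITE_PARTS = {
    "head-neck": ("head", "neck"),
    "chest-abdomen": ("chest", "abdomen"),
}

def is_valid_composite(name: str) -> bool:
    """A composite part is valid only if its two parts are adjacent along the body axis."""
    if name not in COMPOSITE_PARTS:
        return False
    a, b = COMPOSITE_PARTS[name]
    return abs(BASIC_PARTS.index(a) - BASIC_PARTS.index(b)) == 1
```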
- the part information acquiring unit may be configured to acquire information on the part recognition result by recognizing the part of the subject included in the three-dimensional image.
- in the image processing apparatus according to the present invention, the plurality of tomographic images may be axial images, and the recognized part may be at least one of the head, neck, chest, abdomen, pelvis, and legs, or at least one composite part consisting of at least two adjacent ones of these parts.
- the menu may be a menu for selecting an application for analyzing the three-dimensional image.
- the menu specification means may specify the menu in accordance with user information of a user who uses the three-dimensional image.
- the menu specification means may specify the menu in accordance with the type of modality that has acquired the three-dimensional image.
- the display control means may change the display mode of the specified menu in accordance with the use frequency of each of the plurality of menus.
- the display control unit is a unit capable of selectively displaying a specific part included in the three-dimensional image.
- the menu specifying means may be a means for specifying the menu in accordance with the part of the three-dimensional image being displayed.
- An image processing method according to the present invention comprises: acquiring, by a computer, a three-dimensional image composed of a plurality of tomographic images acquired by performing tomographic imaging on a subject; acquiring information on a result of part recognition of the subject included in the three-dimensional image; specifying, based on the information on the part recognition result, a menu corresponding to the part from a plurality of menus used when displaying the three-dimensional image; and displaying the specified menu on display means.
- the image processing method according to the present invention may be provided as a program for causing a computer to execute the method.
- according to the present invention, based on the result of part recognition of the subject included in a three-dimensional image consisting of a plurality of tomographic images acquired by tomographic imaging of the subject, a menu corresponding to the part is specified from a plurality of menus used when displaying the three-dimensional image, and the specified menu is displayed. Therefore, when display of the three-dimensional image is instructed, only the menus corresponding to the parts included in the three-dimensional image can be displayed. Users such as interpreting physicians and technologists thus need not sift through menus that are unnecessary for parts not included in the three-dimensional image, which reduces the burden on the user when selecting a menu.
- furthermore, a specific part included in the three-dimensional image can be selectively displayed, and specifying the menu according to the part of the three-dimensional image being displayed further reduces the burden on the user when selecting a menu.
- Brief description of the drawings: a diagram showing the schematic configuration of a medical information system to which an image processing apparatus according to an embodiment of the present invention is applied; a diagram showing an example of the table used in the first embodiment; a flowchart showing the processing performed in the first embodiment; and a diagram showing the displayed examination list
- a diagram showing the analysis application menu displayed as a pop-up, a diagram showing the analysis application menu displayed as a list, and diagrams showing the analysis application menu listed for each part (parts 1 and 2)
- a diagram showing an example of the table used in the second embodiment, and a diagram showing the analysis application menu displayed as a list in the second embodiment
- a diagram showing an example of the table used in the third embodiment, and a diagram showing the analysis application menu displayed as a list in the third embodiment
- diagrams showing the analysis application menu displayed as a list in the fourth embodiment (parts 1 to 3), and a diagram showing the analysis application menu displayed as a pop-up in the fourth embodiment
- a diagram showing an example of the table used in the fifth embodiment, a flowchart showing the processing performed in the fifth embodiment, and diagrams showing the display screen in the fifth embodiment (parts 1 and 2)
- FIG. 1 is a view showing a schematic configuration of a medical information system to which an image processing apparatus according to a first embodiment of the present invention is applied.
- the medical information system according to the first embodiment includes a medical imaging apparatus (modality) 1, an image interpretation workstation (WS) 2, an image server 3, an image database 4, and a part recognition terminal 5, which are connected so as to be able to communicate with one another via a network 10.
- Each device in the present embodiment is controlled by a program installed from a recording medium such as a CD-ROM. Also, the program may be installed after being downloaded from a server connected via a network such as the Internet.
- the modality 1 includes a device that images an examination target region of a subject to generate image data representing that region, appends incidental information such as examination information and patient information to the image data, and outputs the result.
- the incidental information is in a format conforming to a standardized standard such as the DICOM standard, or to a maker's proprietary standard such as that of the modality's manufacturer.
- Specific examples of the modality 1 include a CT apparatus, an MRI apparatus, a PET apparatus, an SPET apparatus, an ultrasonic imaging apparatus, and the like.
- in the present embodiment, the image data is image data of a three-dimensional image representing an examination target region of the subject acquired by a CT apparatus, and is configured as a set of axial slice images (tomographic images with a predetermined slice interval and slice thickness); however, the image data is not limited thereto.
- a plurality of types of modalities 1 may be connected to the network shown in FIG. 1.
- the image interpretation workstation 2 is a device used by users such as interpreting physicians and technologists to interpret images and create interpretation reports, and includes a processing device, one or two high-definition displays, and input devices such as a keyboard and mouse or a touch panel.
- the image interpretation workstation 2 requests image browsing from the image server 3, performs various kinds of image processing on images received from the image server 3, performs various analysis processes including automatic detection and highlighting of structures and lesions in the images, displays images, supports creation of interpretation reports, requests registration and browsing of interpretation reports from an interpretation report server (not shown), and displays interpretation reports received from the interpretation report server.
- an analysis application for performing various analysis processing instructed by the user is installed in the image interpretation workstation 2.
- the analysis application is prepared according to the part included in the three-dimensional image.
- the image interpretation workstation 2 stores a table in which sites included in the three-dimensional image are associated with types of analysis applications used to analyze the sites.
- FIG. 2 is a diagram showing an example of a table used in the first embodiment. As shown in FIG. 2, in the table T1, parts and types of analysis applications are associated.
- specifically, the head-neck is associated with the brain blood vessel extraction application; the chest with the lung analysis, coronary artery analysis, cardiac function analysis, and calcification score applications; the chest-abdomen with the lung analysis and liver analysis applications; the abdomen with the liver analysis and large intestine analysis applications; and the pelvis with the large intestine analysis application.
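The association held in table T1 can be sketched as a simple lookup, with menu specification being a dictionary query per recognized part. This is an illustrative reconstruction under the association described above; the patent does not prescribe any particular data structure, and the names are stand-ins.

```python
# Hypothetical sketch of table T1: each recognized part maps to the analysis
# applications used for it, following the association described above.
TABLE_T1 = {
    "head-neck": ["brain blood vessel extraction"],
    "chest": ["lung analysis", "coronary artery analysis",
              "cardiac function analysis", "calcification score"],
    "chest-abdomen": ["lung analysis", "liver analysis"],
    "abdomen": ["liver analysis", "large intestine analysis"],
    "pelvis": ["large intestine analysis"],
}

def specify_menus(recognized_parts):
    """Return menu entries for every part found in the 3-D image
    (duplicates removed, first-seen order preserved)."""
    menus = []
    for part in recognized_parts:
        for app in TABLE_T1.get(part, []):
            if app not in menus:
                menus.append(app)
    return menus
```

For an image recognized as containing the chest and the abdomen, this yields the combined menu of both parts with the shared entries listed once.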
- when the user interprets an image, the image interpretation workstation 2 first acquires, for each examination, the incidental information and the information on the part recognition result described later, in accordance with an instruction from the user. Then, referring to the table T1 based on the acquired part recognition result, it specifies the analysis applications corresponding to the parts included in the three-dimensional image and displays them on the display. The display of the specified analysis applications will be described later.
- a part may be regarded as included in the three-dimensional image only when its extent in the body axis direction is a predetermined distance (number of slices × slice interval) or more. For example, it may be determined that the three-dimensional image includes the neck only when the neck extends over 10 cm or more; in that case, if only 5 cm of the neck is included, the neck analysis applications will not be specified.
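The extent check just described can be sketched as follows: count the slices recognized as each part, multiply by the slice interval, and keep only parts meeting the threshold. The function name and the per-slice label input are assumptions for illustration.

```python
# Hypothetical sketch of the body-axis extent check described above: a part is
# treated as "included" only when (number of slices recognized as that part x
# slice interval) reaches a threshold, e.g. 10 cm for the neck.
def included_parts(slice_labels, slice_interval_cm, threshold_cm=10.0):
    """slice_labels: one part label per axial slice, from the recognition result."""
    counts = {}
    for label in slice_labels:
        counts[label] = counts.get(label, 0) + 1
    return {part for part, n in counts.items()
            if n * slice_interval_cm >= threshold_cm}
```

With a 0.5 cm slice interval, 5 neck slices span only 2.5 cm and are excluded, while 30 chest slices span 15 cm and are kept, matching the 10 cm example above.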
- the image server 3 is a relatively high-performance general-purpose computer on which a software program providing the functions of a database management system (DBMS) is installed. The image server 3 is provided with a large-capacity storage in which the image database 4 is configured. This storage may be a large-capacity hard disk drive connected to the image server 3 by a data bus, or a disk device of a NAS (Network Attached Storage) or SAN (Storage Area Network) connected to the network 10.
- the image server 3 also has a communication interface that communicates with the modality 1 and the image interpretation workstation 2 and the like via the network 10.
- when receiving an image registration request from the modality 1, the image server 3 arranges the image into a database format and registers it in the image database 4. The information on the part recognition result obtained by the part recognition terminal 5, described later, is also registered in the image database 4.
- the incidental information includes, for example: an image ID for identifying an individual image; a patient ID for identifying the subject; an examination ID for identifying the examination; a unique ID (UID) assigned to each image; the examination date and time when the image was generated; the type of modality used in the examination to acquire the image; patient information such as the patient's name, age, and gender; the examination site (imaged part); imaging information (imaging protocol, imaging sequence, imaging method, imaging conditions, use of a contrast agent, etc.); and information such as a series number or acquisition number when a plurality of images are acquired in one examination.
- when the image server 3 receives a browsing request from the image interpretation workstation 2 via the network 10, it searches the images registered in the image database 4 and sends the images extracted by the search to the requesting image interpretation workstation 2.
- the image interpretation workstation 2 transmits a browsing request to the image server 3 and acquires the images necessary for interpretation. Then, analysis processing such as automatic lesion detection is performed on the images in accordance with a request from the user.
- the part recognition terminal 5 performs part recognition processing, which recognizes the parts of the subject included in a three-dimensional image composed of a plurality of tomographic images acquired by the modality 1, before or after the three-dimensional image is registered in the image database 4.
- the part recognition terminal 5 also has a communication interface that communicates with the modality 1, the image interpretation workstation 2, and the like via the network 10. The part recognition terminal 5 starts the recognition processing upon receiving from the image server 3 a notification that registration of the three-dimensional image in the image database 4 has been completed.
- the recognition processing may be started by an instruction from the image interpretation workstation 2.
- the part recognition process performed by the part recognition terminal 5 will be described.
- the method described in Japanese Patent Application Laid-Open No. 2008-259682 can be used as the part recognition process.
- the method described in Japanese Patent Laid-Open No. 2008-259682 normalizes a plurality of input tomographic images and calculates a large number of feature amounts from each normalized tomographic image. The feature amounts calculated for each normalized tomographic image are input to discriminators obtained by the AdaBoost method to calculate, for each part, a score indicating the likelihood of that part, and the part shown in each tomographic image is then determined by using the calculated part scores as input to a dynamic programming method so that the order of the body parts of the human body is maintained.
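The dynamic-programming step just described can be sketched as follows. This is an illustrative reconstruction, not the referenced patent's implementation: the per-slice score dictionaries stand in for AdaBoost classifier outputs, and the function simply finds the highest-scoring labeling in which labels never move back up the body order.

```python
# Hypothetical sketch: given per-slice scores for each candidate part, choose one
# label per axial slice (top of body first) so that labels only advance down the
# body order, maximizing the total score — a dynamic-programming smoothing of
# noisy per-slice classifier outputs.
PART_ORDER = ["head", "neck", "chest", "abdomen", "pelvis", "legs"]

def recognize_parts(slice_scores):
    """slice_scores: list of {part: score} dicts, one per slice."""
    n, m = len(slice_scores), len(PART_ORDER)
    NEG = float("-inf")
    # best[i][j]: best total score for slices 0..i with slice i labeled PART_ORDER[j]
    best = [[NEG] * m for _ in range(n)]
    back = [[0] * m for _ in range(n)]
    for j in range(m):
        best[0][j] = slice_scores[0].get(PART_ORDER[j], NEG)
    for i in range(1, n):
        for j in range(m):
            s = slice_scores[i].get(PART_ORDER[j], NEG)
            if s == NEG:
                continue
            # the previous slice's label may be the same part or any part above it
            k_best = max(range(j + 1), key=lambda k: best[i - 1][k])
            best[i][j] = best[i - 1][k_best] + s
            back[i][j] = k_best
    # trace back from the best final label
    j = max(range(m), key=lambda jj: best[n - 1][jj])
    labels = [PART_ORDER[j]]
    for i in range(n - 1, 0, -1):
        j = back[i][j]
        labels.append(PART_ORDER[j])
    return labels[::-1]
```

Note how a single noisy "abdomen" score in the middle of a chest run is overridden, because no monotone labeling can return to the chest after entering the abdomen.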
- in addition, a method based on color template matching (see, for example, JP-A-2002-253539) or a method using eigenimages of each part (see, for example, JP-A-2003-10166) may also be used for part recognition.
- the part recognition terminal 5 transmits information on the part recognition result to the image server 3.
- the image server 3 registers the received part recognition result information in the image database 4 and, in response to an instruction from the image interpretation workstation 2, transmits the part recognition result information together with the three-dimensional image to the image interpretation workstation 2.
- FIG. 3 is a flowchart showing the process performed in the first embodiment.
- here, the process from the point at which the part recognition result information has been registered in the image database 4 and a user such as an interpreting physician has instructed the image interpretation workstation 2 to interpret an image, until the menu of the specified analysis applications is displayed, will be described.
- the image server 3 reads out the registered three-dimensional image from the image database 4 together with the incidental information and the part recognition result information, and sends them to the image interpretation workstation 2.
- the image interpretation workstation 2 receives the three-dimensional image, the incidental information, and the part recognition result information (information reception; step ST1). Then, based on the received part recognition result information and referring to the table T1 described above, it specifies the analysis applications corresponding to the parts included in the three-dimensional image (step ST2) and displays the examination list on the display (step ST3).
- FIG. 4 is a diagram showing the displayed examination list.
- the examination list includes, for each acquired three-dimensional image, information on a patient name, a patient ID, and examination date and time based on the associated incidental information.
- the image interpretation workstation 2 then starts monitoring whether the user has selected an image from the examination list by right-clicking (step ST4); when step ST4 is affirmed, the menu of the analysis applications specified for the selected image is displayed as a pop-up (step ST5), and the process ends.
- FIG. 5 is a view showing a state where the analysis application menu is popped up.
- here, the specified analysis applications are lung analysis, coronary artery analysis, cardiac function analysis, and calcification score, so a pop-up menu of the lung analysis, coronary artery analysis, cardiac function analysis, and calcification score analysis applications is displayed.
- the user can cause the image interpretation workstation 2 to execute an analysis application corresponding to the selected menu by selecting the desired menu.
- FIG. 6 is a diagram showing a list of analysis application menus in the first embodiment.
- here, the specified analysis applications are lung analysis, coronary artery analysis, cardiac function analysis, calcification score, liver analysis, and large intestine analysis, so menus of these analysis applications are listed on the selection screen.
- the user can cause the image interpretation workstation 2 to execute an analysis application corresponding to the selected menu by selecting the desired menu.
- the analysis applications may also be displayed in a list for each part. For example, if the part recognition result for the selected image is the chest and abdomen, the chest and abdomen may be made selectable using tabs, as shown in FIG. 7, and only the analysis application menu corresponding to the selected part may be displayed. That is, when the chest tab is selected as shown in FIG. 7, a menu of lung analysis, coronary artery analysis, cardiac function analysis, and calcification score is displayed, and when the abdomen tab is selected, a menu of liver analysis and large intestine analysis may be displayed. It is preferable to display first the part with the longest extent in the body axis direction in the three-dimensional image.
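The "longest part first" tab ordering suggested above can be sketched by counting how many axial slices each part occupies, which is proportional to its body-axis extent. The function name and input shape are assumptions for illustration.

```python
# Hypothetical sketch: order part tabs so that the part occupying the longest
# body-axis extent in the image (i.e. the most axial slices) comes first.
def tab_order(slice_labels):
    counts = {}
    for label in slice_labels:
        counts[label] = counts.get(label, 0) + 1
    # sort descending by slice count; sorted() is stable, so ties keep
    # first-seen order
    return sorted(counts, key=lambda part: -counts[part])
```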
- as described above, in the first embodiment the analysis applications corresponding to the parts included in the three-dimensional image are specified, based on the part recognition result, from the plurality of analysis applications used when displaying the three-dimensional image. Therefore, when display of an image is instructed, only the menus of the specified analysis applications are displayed, so the user can select from only the menus corresponding to the parts included in the three-dimensional image. A user such as an interpreting physician thus does not have to treat menus unnecessary for parts not included in the three-dimensional image as selection candidates, which reduces the burden on the user when selecting a menu.
- although the part recognition processing is performed in the part recognition terminal 5 in the above embodiment, it may instead be performed in the image interpretation workstation 2.
- the medical information system to which the image processing apparatus according to the second and subsequent embodiments is applied has the same configuration as that of the first embodiment and differs only in the processing performed, so detailed description of the configuration is omitted below.
- the users of the medical information system are interpreting physicians and technologists, and the analysis applications used are largely determined by the clinical department to which each user belongs.
- the analysis application is specified based on the user information.
- the user information is not limited to information specifying a clinical department such as the cardiovascular department, respiratory department, digestive department, or brain surgery department; any information that can identify the user may be used, such as a user ID identifying an interpreting physician or technologist, or the ID of the image interpretation workstation 2.
- FIG. 9 is a diagram showing an example of a table used in the second embodiment.
- in the table T2 shown in FIG. 9, the part and the type of analysis application are associated for each clinical department. Specifically, the cardiovascular and respiratory departments are associated with the chest: the coronary artery analysis, cardiac function analysis, and calcification score applications for the cardiovascular department, and the lung analysis application for the respiratory department.
- the type of analysis application to be used may be edited according to user information.
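The association in table T2 can be sketched as a lookup keyed by both the recognized part and the clinical department in the user information. As before, this is an illustrative reconstruction with invented names, not a prescribed structure.

```python
# Hypothetical sketch of table T2: menus are narrowed both by recognized part
# and by the clinical department in the user information.
TABLE_T2 = {
    ("chest", "cardiovascular"): ["coronary artery analysis",
                                  "cardiac function analysis",
                                  "calcification score"],
    ("chest", "respiratory"): ["lung analysis"],
}

def specify_menus_for_user(recognized_parts, department):
    """Return the menus for the given parts, filtered by the user's department."""
    menus = []
    for part in recognized_parts:
        for app in TABLE_T2.get((part, department), []):
            if app not in menus:
                menus.append(app)
    return menus
```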
- FIG. 10 is a diagram showing a list of menus of analysis applications in the second embodiment.
- here, the part recognition result for the selected image is the chest, and the specified analysis applications are lung analysis, coronary artery analysis, cardiac function analysis, and calcification score. When the user information indicates the cardiovascular department, menus of the coronary artery analysis, cardiac function analysis, and calcification score analysis applications are listed on the selection screen, as shown on the screen 20 of FIG. 10.
- when the user information indicates the respiratory department, a menu of the lung analysis application is listed on the selection screen, as shown on the screen 21 of FIG. 10.
- the user can cause the image interpretation workstation 2 to execute an analysis application corresponding to the selected menu by selecting the desired menu.
- as described above, in the second embodiment the analysis applications are specified based on the user information in addition to the part recognition result. Therefore, when display of an image is instructed, the menus of the analysis applications that the user frequently uses are displayed, further reducing the burden on the user when selecting a menu.
- modality 1 includes various devices such as a CT apparatus, an MRI apparatus, a PET apparatus, an SPET apparatus, and an ultrasonic imaging apparatus, and the analysis application that can be executed differs depending on the type of modality 1 .
- the third embodiment differs from the first embodiment in that the analysis applications are specified also in accordance with the type of the modality 1 included in the incidental information.
- specifically, the image interpretation workstation 2 stores a table in which the parts included in the three-dimensional image and the types of analysis applications used to analyze those parts are associated for each type of the modality 1 that acquired the three-dimensional image.
- FIG. 11 is a diagram showing an example of a table used in the third embodiment.
- in the table T3 shown in FIG. 11, the part and the type of analysis application are associated for each type of modality 1. Specifically, the CT apparatus and the MR apparatus are associated with the chest: the coronary artery analysis, cardiac function analysis, and calcification score applications for the CT apparatus, and the cardiac function analysis and delayed contrast analysis applications for the MR apparatus.
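Analogously to table T2, table T3 can be sketched as a lookup keyed by part and modality type. The modality string would in practice come from the DICOM incidental information; all names here are illustrative assumptions.

```python
# Hypothetical sketch of table T3: menus are narrowed by recognized part and by
# the type of modality that acquired the image (as read from the incidental
# information).
TABLE_T3 = {
    ("chest", "CT"): ["coronary artery analysis", "cardiac function analysis",
                      "calcification score"],
    ("chest", "MR"): ["cardiac function analysis", "delayed contrast analysis"],
}

def specify_menus_by_modality(recognized_parts, modality):
    menus = []
    for part in recognized_parts:
        for app in TABLE_T3.get((part, modality), []):
            if app not in menus:
                menus.append(app)
    return menus
```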
- FIG. 12 is a diagram showing a list of menus of analysis applications in the third embodiment.
- here, the part recognition result for the image selected by the user in the examination list shown in FIG. 4 is the chest, and the modality 1 that acquired the three-dimensional image is the CT apparatus, so the specified analysis applications are coronary artery analysis, cardiac function analysis, and calcification score; menus of these analysis applications are listed on the selection screen, as shown on the screen 22 of FIG. 12.
- When the modality 1 from which the three-dimensional image was acquired is an MR device, the menus of the cardiac function analysis and delayed contrast analysis applications are listed and displayed on the selection screen.
- the user can cause the image interpretation workstation 2 to execute an analysis application corresponding to the selected menu by selecting the desired menu.
- As described above, in the third embodiment, the analysis application is specified based on the type of modality 1 that acquired the three-dimensional image, in addition to the result of site recognition. Therefore, when display of an image is instructed, menus of analysis applications that are highly likely to be used on the displayed three-dimensional image are displayed, and as a result, the burden on the user when selecting a menu can be further reduced.
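As a rough illustration, the modality-dependent lookup described above can be sketched as follows. This is not the patent's implementation; the table name, function name and key format are assumptions, and the application names follow the table of FIG. 11.

```python
# Hypothetical sketch of the modality-dependent lookup (third embodiment):
# analysis applications are keyed by (recognized part, modality type).
# ANALYSIS_APP_TABLE and specify_analysis_menus are illustrative names.

ANALYSIS_APP_TABLE = {
    ("chest", "CT"): ["coronary artery analysis",
                      "cardiac function analysis",
                      "calcification score"],
    ("chest", "MR"): ["cardiac function analysis",
                      "delayed contrast analysis"],
}

def specify_analysis_menus(part: str, modality: str) -> list[str]:
    """Return the analysis-application menus for a part/modality pair."""
    return ANALYSIS_APP_TABLE.get((part, modality), [])
```

For a chest image acquired by a CT device this lists the three CT-associated applications, in the spirit of the selection screen of FIG. 12.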
- In the fourth embodiment, the image interpretation workstation 2 records a log of the number of times each analysis application is used, and when the user instructs image interpretation at the image interpretation workstation 2 and the three-dimensional image is displayed, the display mode of the menu is changed so that frequently used analysis applications can be easily selected. This differs from the first embodiment.
- FIG. 13 is a diagram showing a list of menus of analysis applications in the fourth embodiment.
- Assume that the coronary artery analysis, cardiac function analysis and calcification score analysis applications are specified based on the information of the site recognition result, and that the frequency of use of the analysis applications is, in descending order, coronary artery analysis, cardiac function analysis and calcification score. In this case, the menus of the coronary artery analysis, cardiac function analysis and calcification score analysis applications are listed from the top in this order.
- When the analysis application menu is popped up, the menus of the coronary artery analysis, cardiac function analysis and calcification score analysis applications are popped up in this order from the top, as shown in FIG.
- Alternatively, the menu of a more frequently used analysis application may be displayed at a larger size.
- Alternatively, the icon of coronary artery analysis, which has the highest frequency of use, may be selected in advance. Note that, in FIG. 16, the selected state is indicated by hatching the menu. As a result, the user can execute the most frequently used analysis application simply by pressing the execution key of an input device, such as the keyboard, of the image interpretation workstation 2.
- As described above, in the fourth embodiment, the display mode of the analysis application menu is changed so that frequently used applications among the specified analysis applications can be easily selected, and as a result, the burden of menu selection can be further reduced.
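The frequency-ordered, pre-selected menu described above can be sketched as follows, under the assumption of a simple usage counter; all names and counts here are illustrative, not taken from the patent.

```python
# Illustrative sketch of the fourth embodiment's menu ordering: menus are
# sorted by a usage log and the most frequently used one is pre-selected.
from collections import Counter

usage_log = Counter({"coronary artery analysis": 12,   # assumed counts
                     "cardiac function analysis": 7,
                     "calcification score": 3})

def order_menus(specified_apps, log):
    """Sort menus so the most frequently used application comes first,
    and mark the top menu as selected in advance."""
    ordered = sorted(specified_apps, key=lambda app: log[app], reverse=True)
    return [{"app": app, "selected": i == 0} for i, app in enumerate(ordered)]
```

With the counts above, a single press of the execution key would immediately run coronary artery analysis, since its menu starts out selected.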
- For example, assume that a three-dimensional image acquired by a CT apparatus and a three-dimensional image acquired by an SPET apparatus are transmitted to the image interpretation workstation 2 as three-dimensional images of the same patient, that the region recognition result is the chest, and that the three-dimensional image acquired by the CT apparatus is selected as the image to be displayed. In this case, it is highly likely that the three-dimensional image is intended not for lung diagnosis but for cardiac function diagnosis. Therefore, in such a case, the display mode of the analysis application menu may be changed so that the cardiac function analysis application, which is highly likely to be used, can be easily selected.
- The fifth embodiment differs from the first embodiment in that a color template corresponding to the result of site recognition is specified from among a plurality of color templates for changing the shading pattern of the three-dimensional image, and in that menus for selecting only the specified color templates are listed and displayed.
- For this purpose, the image interpretation workstation 2 stores a table in which the sites included in the three-dimensional image are associated with the types of color templates used when those sites are displayed three-dimensionally.
- FIG. 17 is a diagram showing an example of a table used in the fifth embodiment. As shown in FIG. 17, in the table T4, the sites and the types of color templates are associated with each other. Specifically, color templates P1 to P5 are associated with the head, P6 to P10 with the neck, P11 to P20 with the chest, P21 to P25 with the chest-abdomen, P26 to P30 with the abdomen, and P31 to P35 with the pelvis.
- In the fifth embodiment, when the user interprets an image, the image interpretation workstation 2 first acquires the incidental information and the information of the part recognition result for each examination according to an instruction from the user. Then, referring to the table T4 based on the information of the part recognition result, it specifies the color templates corresponding to the part included in the three-dimensional image and displays their menus on the display.
- FIG. 18 is a flowchart showing the process performed in the fifth embodiment.
- Here, the process from when the information of the part recognition result has been registered in the image database 4 and a user such as an image reading doctor instructs the image interpretation workstation 2 to read an image, until the specified color templates are displayed, will be described.
- First, the image server 3 reads out the registered three-dimensional image from the image database 4 together with the incidental information and the information of the part recognition result, and sends them to the image interpretation workstation 2. The image interpretation workstation 2 receives the three-dimensional image, the incidental information, and the information of the part recognition result (information reception, step ST11). Then, based on the received information of the part recognition result, it specifies the color templates corresponding to the part included in the three-dimensional image with reference to the above-mentioned table T4 (step ST12). Next, the menus of the specified color templates are displayed together with the three-dimensional image (step ST13), and the process ends.
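Step ST12 can be sketched as below. The part-to-template ranges follow the description of FIG. 17; the abdomen and pelvis assignments are inferred from that enumeration, and the table and function names are illustrative only.

```python
# Minimal sketch of step ST12 with a table in the spirit of table T4.
COLOR_TEMPLATE_TABLE = {
    "head":          [f"P{i}" for i in range(1, 6)],    # P1-P5
    "neck":          [f"P{i}" for i in range(6, 11)],   # P6-P10
    "chest":         [f"P{i}" for i in range(11, 21)],  # P11-P20
    "chest-abdomen": [f"P{i}" for i in range(21, 26)],  # P21-P25
    "abdomen":       [f"P{i}" for i in range(26, 31)],  # P26-P30 (inferred)
    "pelvis":        [f"P{i}" for i in range(31, 36)],  # P31-P35 (inferred)
}

def specify_color_templates(part_recognition_result: str) -> list[str]:
    """ST12: specify the color template menus for the recognized part."""
    return COLOR_TEMPLATE_TABLE.get(part_recognition_result, [])
```

For a chest image this yields the menus P11 to P20, matching the color template display area described for FIG. 19.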
- FIG. 19 is a view showing a display screen in the fifth embodiment.
- As shown in FIG. 19, the VR image 31 and the color template display area 32 are displayed on the screen 30, and the menus of the plurality of specified color templates are displayed in the color template display area 32. Since the VR image 31 is an image of the chest, the specified part is the chest, and the menus of the color templates P11 to P20 are accordingly displayed in the color template display area 32 by referring to the table T4.
- Although the menus of the color templates are shown as rectangular icons in FIG. 19, each menu is displayed in such a manner that the user can recognize the specified part and the shaded state of that part. The user can give a desired shading to the VR image 31 by selecting the desired color template on the screen 30.
- As described above, in the fifth embodiment, when the display of an image is instructed, only the color templates to be used for the displayed three-dimensional image are presented, and as a result, the burden on the user when selecting a color template can be further reduced.
- Alternatively, all usable color templates may be displayed while the density of the menus of the color templates other than the specified ones is lowered, so that only the specified color templates are easy to select.
- Further, a log of the number of times each color template is used may be recorded in the image interpretation workstation 2, and the display mode of the color template menus may be changed so that frequently used color templates are easy to select when a three-dimensional image is displayed. For example, only the menus of the color templates whose usage frequency is equal to or higher than a predetermined threshold may be displayed, the menus may be rearranged and displayed in order of usage frequency, or the menus of more frequently used color templates may be enlarged.
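One of the frequency-based variants above (the threshold filter combined with frequency ordering) can be sketched as follows; the function name and threshold value are assumptions for the sketch, not from the patent.

```python
# Hedged sketch of the frequency-based variants: keep only the color
# templates used at least `threshold` times, ordered from most to least used.

def frequent_templates(usage_counts: dict, threshold: int = 5) -> list:
    eligible = [(t, n) for t, n in usage_counts.items() if n >= threshold]
    # Sort by descending usage count, breaking ties by template name.
    return [t for t, _ in sorted(eligible, key=lambda tn: (-tn[1], tn[0]))]
```

Templates below the threshold simply drop out of the menu, and the rest appear in order of use.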
- The menu displayed according to the site in the image interpretation workstation 2 is not limited to the types of analysis applications and color templates; menus for running an image processing application, a display application, or the like installed in the image interpretation workstation 2 may also be used. For example, when the region of the three-dimensional image to be displayed is the chest, image processing applications such as lung field extraction, bronchial extraction and heart extraction are often executed. Therefore, as shown on the screen 40 of FIG. 20, for example, when the axial image 41 of the chest is displayed, only the menus of the lung field extraction, bronchial extraction and heart extraction image processing applications may be displayed in the menu display area 42.
- In each of the above embodiments, the menu is specified according to the result of the part recognition; however, the part to be focused on in the displayed image may instead be recognized, and the menu may be specified according to the result of that recognition.
- For example, by operating the input device of the image interpretation workstation 2, the display mode of the image can be changed so as to open an incision in the chest, making it easy to focus on the lung field and the heart that exist between the skin surface and the back. Specifically, when the input device is a touch panel, the image can be displayed as if the incision were opened by touching the touch panel and performing an operation of spreading the fingers, and the display mode can be changed so that attention can be paid to the lung field and the heart located behind the skin.
- In such a case, the site to be focused on may be recognized by the image interpretation workstation 2, a menu may be specified according to the result of that site recognition, and the specified menu may be displayed.
- Also, the display mode of the menus of these display applications may be changed according to the result of site recognition. That is, since the display application to be used differs depending on the part, the display mode of the menus may be changed so that the menu of a display application that is highly likely to be used for the displayed part can be easily selected.
- For example, since MinIP display is often used to observe the lung field, when chest and chest-abdomen images are displayed, the menu for performing MinIP display may be displayed so as to be easy to select, according to the result of site recognition.
- Further, since MIP display is highly likely to be used when bone removal processing has been performed on a chest image, the display mode of the display application menus may be changed so that the appropriate menu is easy to select in consideration of the image processing that has been applied to the image.
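The two heuristics above can be encoded roughly as follows; this is an illustrative rule, not the patent's code, and the function name and return values are assumptions.

```python
# Illustrative encoding of the display-application heuristics: emphasize
# MinIP for chest or chest-abdomen images, but MIP when bone removal
# processing has been applied to a chest image.

def emphasized_display_menu(part: str, bone_removed: bool = False):
    if part == "chest" and bone_removed:
        return "MIP"
    if part in ("chest", "chest-abdomen"):
        return "MinIP"
    return None  # no particular display application emphasized
```

The applied-processing flag takes precedence over the part-only rule, mirroring the order in which the two cases are described.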
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Pulmonology (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Medical Treatment And Welfare Office Work (AREA)
Abstract
Description
Tomography) apparatuses, ultrasonic (US) diagnostic apparatuses, MRI (Magnetic Resonance Imaging) apparatuses, PET (Positron Emission Tomography) apparatuses, and SPET (Single-Photon Emission Tomography) apparatuses, which employ a variety of technologies, are in use as modalities. With the increasing performance of such modalities, such as higher speed and multi-slice support, it has become possible to image a plurality of parts of a subject in one imaging series and to acquire several hundred to several thousand high-definition tomographic images. However, it takes time to observe all the tomographic images one by one, and understanding the three-dimensional shape of the structures to be observed (the body surface, bones, and organs and tissues such as the heart, lung fields and liver) from the tomographic images alone requires the skill of a physician.
part information acquisition means for acquiring information of a part recognition result of the subject included in the three-dimensional image;
menu specifying means for specifying, based on the information of the part recognition result, a menu corresponding to the part from among a plurality of menus used when displaying the three-dimensional image; and
display control means for displaying the specified menu on display means.
The menu specifying means may be means for specifying the menu also in accordance with the part of the three-dimensional image being displayed.
acquiring information of a part recognition result of the subject included in the three-dimensional image,
specifying, based on the information of the part recognition result, a menu corresponding to the part from among a plurality of menus used when displaying the three-dimensional image, and
displaying the specified menu on display means.
Claims (10)
- Image acquisition means for acquiring a three-dimensional image composed of a plurality of tomographic images acquired by tomographically imaging a subject;
part information acquisition means for acquiring information of a part recognition result of the subject included in the three-dimensional image;
menu specifying means for specifying, based on the information of the part recognition result, a menu corresponding to the part from among a plurality of menus used when displaying the three-dimensional image; and
display control means for displaying the specified menu on display means. - The image processing apparatus according to claim 1, wherein the part information acquisition means is means for acquiring the information of the part recognition result by recognizing the part of the subject included in the three-dimensional image.
- The image processing apparatus according to claim 1 or 2, wherein the plurality of tomographic images are axial images, and the plurality of tomographic images include at least one of a head, a neck, a chest, an abdomen, a pelvis, legs, and a composite part composed of at least two adjacent parts among these parts.
- The image processing apparatus according to any one of claims 1 to 3, wherein the menu is a menu for selecting an application for analyzing the three-dimensional image.
- The image processing apparatus according to any one of claims 1 to 4, wherein the menu specifying means is means for specifying the menu also in accordance with user information of a user who uses the three-dimensional image.
- The image processing apparatus according to any one of claims 1 to 5, wherein the menu specifying means is means for specifying the menu also in accordance with the type of modality that acquired the three-dimensional image.
- The image processing apparatus according to any one of claims 1 to 6, wherein the display control means is means for changing a display mode of the specified menus so that a more frequently used menu among the specified menus is more easily selected.
- The image processing apparatus according to any one of claims 1 to 7, wherein the display control means is means capable of selectively displaying a specific part included in the three-dimensional image, and
the menu specifying means is means for specifying the menu also in accordance with the part of the three-dimensional image being displayed. - An image processing method in which a computer acquires a three-dimensional image composed of a plurality of tomographic images acquired by tomographically imaging a subject,
acquires information of a part recognition result of the subject included in the three-dimensional image,
specifies, based on the information of the part recognition result, a menu corresponding to the part from among a plurality of menus used when displaying the three-dimensional image, and
displays the specified menu on display means. - An image processing program causing a computer to execute: a procedure of acquiring a three-dimensional image composed of a plurality of tomographic images acquired by tomographically imaging a subject;
a procedure of acquiring information of a part recognition result of the subject included in the three-dimensional image;
a procedure of specifying, based on the information of the part recognition result, a menu corresponding to the part from among a plurality of menus used when displaying the three-dimensional image; and
a procedure of displaying the specified menu on display means.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010522886A JP4596437B2 (ja) | 2009-03-31 | 2010-03-30 | 画像処理装置および方法並びにプログラム |
CA2733519A CA2733519A1 (en) | 2009-03-31 | 2010-03-30 | Image processing device and method, as well as program |
BRPI1004218A BRPI1004218A2 (pt) | 2009-03-31 | 2010-03-30 | dispositivo e método de processamento de imagem, e programa |
AU2010231365A AU2010231365A1 (en) | 2009-03-31 | 2010-03-30 | Image processing apparatus and method and program |
CN2010800027521A CN102164543B (zh) | 2009-03-31 | 2010-03-30 | 图像处理装置和方法、以及程序 |
US12/737,702 US9144407B2 (en) | 2009-03-31 | 2010-03-30 | Image processing device and method, and program |
BR122013007310A BR122013007310A2 (pt) | 2009-03-31 | 2010-03-30 | dispositivo, método e programa de processamento de imagem |
EP10758259A EP2316341B1 (en) | 2009-03-31 | 2010-03-30 | Image processing apparatus and method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009083931 | 2009-03-31 | ||
JP2009-083931 | 2009-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010113479A1 true WO2010113479A1 (ja) | 2010-10-07 |
Family
ID=42827789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/002306 WO2010113479A1 (ja) | 2009-03-31 | 2010-03-30 | 画像処理装置および方法並びにプログラム |
Country Status (8)
Country | Link |
---|---|
US (1) | US9144407B2 (ja) |
EP (2) | EP2316341B1 (ja) |
JP (3) | JP4596437B2 (ja) |
CN (1) | CN102164543B (ja) |
AU (1) | AU2010231365A1 (ja) |
BR (2) | BR122013007310A2 (ja) |
CA (1) | CA2733519A1 (ja) |
WO (1) | WO2010113479A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2011037063A1 (ja) * | 2009-09-24 | 2013-02-21 | 株式会社日立メディコ | 投影像生成方法および磁気共鳴イメージング装置 |
CN103140160A (zh) * | 2011-03-30 | 2013-06-05 | 奥林巴斯医疗株式会社 | 图像管理装置、方法、程序以及胶囊型内窥镜系统 |
JP2013121448A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医用画像処理システム |
JP2013121449A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医療用画像処理システム |
JP2013121450A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医用画像処理ネットワークシステム |
JP2015114691A (ja) * | 2013-12-09 | 2015-06-22 | 株式会社東芝 | 医療情報システム及び医療情報提供方法 |
JP2017205217A (ja) * | 2016-05-17 | 2017-11-24 | 東芝メディカルシステムズ株式会社 | 医用画像診断装置、医用画像処理装置および画像表示プログラム |
JP2018502631A (ja) * | 2014-12-16 | 2018-02-01 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 自動放射線読み取りセッション検出 |
JP2018057695A (ja) * | 2016-10-07 | 2018-04-12 | キヤノン株式会社 | 画像表示システム、画像表示方法、及びプログラム |
WO2019138773A1 (ja) * | 2018-01-10 | 2019-07-18 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、医療画像処理方法及びプログラム |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2316341B1 (en) * | 2009-03-31 | 2013-03-06 | FUJIFILM Corporation | Image processing apparatus and method and program |
US8904517B2 (en) | 2011-06-28 | 2014-12-02 | International Business Machines Corporation | System and method for contexually interpreting image sequences |
JP5972609B2 (ja) * | 2012-03-05 | 2016-08-17 | 東芝メディカルシステムズ株式会社 | 管理オブジェクト生成装置および画像表示システム |
WO2013143087A1 (zh) * | 2012-03-28 | 2013-10-03 | 宇龙计算机通信科技(深圳)有限公司 | 操作对象的纠错方法及通信终端 |
JP6143425B2 (ja) * | 2012-06-11 | 2017-06-07 | 東芝メディカルシステムズ株式会社 | X線診断装置 |
EP2934324B1 (en) * | 2012-12-21 | 2020-05-06 | Volcano Corporation | Display control for a multi-sensor medical device |
JP2015085182A (ja) * | 2013-09-26 | 2015-05-07 | 株式会社東芝 | 医用画像診断装置、医用画像表示装置、および医用画像表示方法 |
JP6266310B2 (ja) * | 2013-11-08 | 2018-01-24 | 東芝メディカルシステムズ株式会社 | 医用情報処理装置 |
JP6289064B2 (ja) * | 2013-12-09 | 2018-03-07 | キヤノンメディカルシステムズ株式会社 | 医療情報システム及び推奨アプリケーションの検索方法 |
EP3073401A1 (en) * | 2015-03-27 | 2016-09-28 | Fujifilm Corporation | Failed image management apparatus, operation method of failed image management apparatus, and failed image management system |
JP6396597B2 (ja) * | 2015-09-09 | 2018-09-26 | 富士フイルム株式会社 | マッピング画像表示制御装置および方法並びにプログラム |
JP6378715B2 (ja) * | 2016-04-21 | 2018-08-22 | ゼネラル・エレクトリック・カンパニイ | 血管検出装置、磁気共鳴イメージング装置、およびプログラム |
CN107404577B (zh) * | 2017-07-20 | 2019-05-17 | 维沃移动通信有限公司 | 一种图像处理方法、移动终端及计算机可读存储介质 |
WO2020066132A1 (ja) | 2018-09-27 | 2020-04-02 | 富士フイルム株式会社 | 医用画像診断支援装置、方法及びプログラム |
WO2020158100A1 (ja) * | 2019-01-30 | 2020-08-06 | 富士フイルム株式会社 | 医用画像解析装置、方法およびプログラム |
CN113677274A (zh) * | 2019-04-11 | 2021-11-19 | 富士胶片株式会社 | 放射线摄影系统及其工作方法以及放射线摄影系统用控制台 |
JP7272149B2 (ja) * | 2019-07-08 | 2023-05-12 | コニカミノルタ株式会社 | 選択支援システム及びプログラム |
JP7236974B2 (ja) * | 2019-10-03 | 2023-03-10 | 富士フイルム株式会社 | 診療支援装置、診療支援方法、及び診療支援プログラム |
EP4377885A1 (en) * | 2021-07-28 | 2024-06-05 | Artrya Limited | A coronary artery disease analysis system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002253539A (ja) | 2001-03-05 | 2002-09-10 | Nippon Telegr & Teleph Corp <Ntt> | 医用画像識別システム,医用画像識別処理方法,医用画像識別用プログラムおよびその記録媒体 |
JP2003010166A (ja) | 2001-04-25 | 2003-01-14 | Fuji Photo Film Co Ltd | 画像処理方法および装置並びにプログラム |
JP2003284705A (ja) * | 2002-03-27 | 2003-10-07 | Konica Corp | 医用画像処理装置、画像処理パラメータの設定方法、プログラム、記憶媒体 |
JP2005034473A (ja) * | 2003-07-17 | 2005-02-10 | Hitachi Medical Corp | 異常陰影検出装置 |
JP2006061278A (ja) * | 2004-08-25 | 2006-03-09 | Konica Minolta Medical & Graphic Inc | 医用画像表示装置 |
JP2007185429A (ja) | 2006-01-16 | 2007-07-26 | Fujifilm Corp | 画像再生装置およびそのプログラム |
JP2008253681A (ja) * | 2007-04-09 | 2008-10-23 | Toshiba Corp | 医用支援システム、及び医用支援プログラム |
JP2008259682A (ja) | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 部位認識結果修正装置、方法、およびプログラム |
JP2008259710A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 画像処理方法および装置ならびにプログラム |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0443957A (ja) * | 1990-06-11 | 1992-02-13 | Hitachi Ltd | 超音波撮像方式 |
JP4260938B2 (ja) * | 1998-10-23 | 2009-04-30 | 株式会社東芝 | 3次元超音波診断装置 |
US6608650B1 (en) * | 1998-12-01 | 2003-08-19 | Flashpoint Technology, Inc. | Interactive assistant process for aiding a user in camera setup and operation |
EP1504721B1 (en) * | 2002-09-27 | 2010-01-06 | Olympus Corporation | Ultrasonograph |
US20060008143A1 (en) * | 2002-10-16 | 2006-01-12 | Roel Truyen | Hierachical image segmentation |
US20040100505A1 (en) * | 2002-11-21 | 2004-05-27 | Cazier Robert Paul | System for and method of prioritizing menu information |
US20060064321A1 (en) * | 2004-08-25 | 2006-03-23 | Konica Minolta Medical & Graphic, Inc. | Medical image management system |
JP4709600B2 (ja) * | 2005-07-15 | 2011-06-22 | 株式会社東芝 | X線診断装置、撮影角度最適化支援装置及びプログラム |
CN2868194Y (zh) * | 2005-11-02 | 2007-02-14 | 华南理工大学 | X光图像的乳腺病症特征自动识别装置 |
CN1965752A (zh) * | 2005-11-15 | 2007-05-23 | 乐金电子(沈阳)有限公司 | 具有检测功能的图像显示装置及其控制方法 |
JP4855141B2 (ja) * | 2006-05-19 | 2012-01-18 | 富士フイルム株式会社 | 医用画像部位認識装置、及び、医用画像部位認識プログラム |
WO2008001928A1 (fr) * | 2006-06-30 | 2008-01-03 | Fujifilm Corporation | Appareil d'affichage d'image médicale et programme d'affichage d'image médicale |
CN200984178Y (zh) * | 2006-08-31 | 2007-12-05 | 深圳市国基科技有限公司 | 数字乳腺多功能影像系统 |
US8081811B2 (en) * | 2007-04-12 | 2011-12-20 | Fujifilm Corporation | Method, apparatus, and program for judging image recognition results, and computer readable medium having the program stored therein |
EP2316341B1 (en) * | 2009-03-31 | 2013-03-06 | FUJIFILM Corporation | Image processing apparatus and method and program |
-
2010
- 2010-03-30 EP EP10758259A patent/EP2316341B1/en active Active
- 2010-03-30 EP EP13151299.8A patent/EP2583626B1/en active Active
- 2010-03-30 BR BR122013007310A patent/BR122013007310A2/pt not_active IP Right Cessation
- 2010-03-30 JP JP2010522886A patent/JP4596437B2/ja active Active
- 2010-03-30 CN CN2010800027521A patent/CN102164543B/zh active Active
- 2010-03-30 AU AU2010231365A patent/AU2010231365A1/en not_active Abandoned
- 2010-03-30 CA CA2733519A patent/CA2733519A1/en not_active Abandoned
- 2010-03-30 US US12/737,702 patent/US9144407B2/en active Active
- 2010-03-30 WO PCT/JP2010/002306 patent/WO2010113479A1/ja active Application Filing
- 2010-03-30 BR BRPI1004218A patent/BRPI1004218A2/pt not_active IP Right Cessation
- 2010-09-14 JP JP2010205388A patent/JP4634539B2/ja active Active
- 2010-09-14 JP JP2010205389A patent/JP4694651B2/ja active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002253539A (ja) | 2001-03-05 | 2002-09-10 | Nippon Telegr & Teleph Corp <Ntt> | 医用画像識別システム,医用画像識別処理方法,医用画像識別用プログラムおよびその記録媒体 |
JP2003010166A (ja) | 2001-04-25 | 2003-01-14 | Fuji Photo Film Co Ltd | 画像処理方法および装置並びにプログラム |
JP2003284705A (ja) * | 2002-03-27 | 2003-10-07 | Konica Corp | 医用画像処理装置、画像処理パラメータの設定方法、プログラム、記憶媒体 |
JP2005034473A (ja) * | 2003-07-17 | 2005-02-10 | Hitachi Medical Corp | 異常陰影検出装置 |
JP2006061278A (ja) * | 2004-08-25 | 2006-03-09 | Konica Minolta Medical & Graphic Inc | 医用画像表示装置 |
JP2007185429A (ja) | 2006-01-16 | 2007-07-26 | Fujifilm Corp | 画像再生装置およびそのプログラム |
JP2008253681A (ja) * | 2007-04-09 | 2008-10-23 | Toshiba Corp | 医用支援システム、及び医用支援プログラム |
JP2008259682A (ja) | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 部位認識結果修正装置、方法、およびプログラム |
JP2008259710A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 画像処理方法および装置ならびにプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2316341A4 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2011037063A1 (ja) * | 2009-09-24 | 2013-02-21 | 株式会社日立メディコ | 投影像生成方法および磁気共鳴イメージング装置 |
JP5738193B2 (ja) * | 2009-09-24 | 2015-06-17 | 株式会社日立メディコ | 投影像生成方法および磁気共鳴イメージング装置 |
CN103140160A (zh) * | 2011-03-30 | 2013-06-05 | 奥林巴斯医疗株式会社 | 图像管理装置、方法、程序以及胶囊型内窥镜系统 |
US8918740B2 (en) | 2011-03-30 | 2014-12-23 | Olympus Medical Systems Corp. | Image management apparatus, method, and computer-readable recording medium and capsule endoscope system |
CN103140160B (zh) * | 2011-03-30 | 2015-06-17 | 奥林巴斯医疗株式会社 | 图像管理装置、图像管理装置的工作方法以及胶囊型内窥镜系统 |
JP2013121448A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医用画像処理システム |
JP2013121449A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医療用画像処理システム |
JP2013121450A (ja) * | 2011-12-12 | 2013-06-20 | Nemoto Kyorindo:Kk | 医用画像処理ネットワークシステム |
JP2015114691A (ja) * | 2013-12-09 | 2015-06-22 | 株式会社東芝 | 医療情報システム及び医療情報提供方法 |
JP2018502631A (ja) * | 2014-12-16 | 2018-02-01 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 自動放射線読み取りセッション検出 |
JP2017205217A (ja) * | 2016-05-17 | 2017-11-24 | 東芝メディカルシステムズ株式会社 | 医用画像診断装置、医用画像処理装置および画像表示プログラム |
JP2018057695A (ja) * | 2016-10-07 | 2018-04-12 | キヤノン株式会社 | 画像表示システム、画像表示方法、及びプログラム |
US10997762B2 (en) | 2016-10-07 | 2021-05-04 | Canon Kabushiki Kaisha | Image display system, image display method, and program |
WO2019138773A1 (ja) * | 2018-01-10 | 2019-07-18 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、医療画像処理方法及びプログラム |
JPWO2019138773A1 (ja) * | 2018-01-10 | 2020-12-10 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、医療画像処理方法及びプログラム |
US11526986B2 (en) | 2018-01-10 | 2022-12-13 | Fujifilm Corporation | Medical image processing device, endoscope system, medical image processing method, and program |
JP2023010809A (ja) * | 2018-01-10 | 2023-01-20 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、医療画像処理装置の作動方法及びプログラム、記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
EP2316341A1 (en) | 2011-05-04 |
EP2316341A4 (en) | 2011-09-14 |
EP2583626B1 (en) | 2015-03-18 |
JP2011025056A (ja) | 2011-02-10 |
CN102164543B (zh) | 2013-03-20 |
JP4596437B2 (ja) | 2010-12-08 |
JP4634539B2 (ja) | 2011-02-16 |
CA2733519A1 (en) | 2010-10-07 |
JP2011011069A (ja) | 2011-01-20 |
EP2583626A1 (en) | 2013-04-24 |
US20110131528A1 (en) | 2011-06-02 |
JPWO2010113479A1 (ja) | 2012-10-04 |
CN102164543A (zh) | 2011-08-24 |
BR122013007310A2 (pt) | 2016-03-22 |
JP4694651B2 (ja) | 2011-06-08 |
US9144407B2 (en) | 2015-09-29 |
BRPI1004218A2 (pt) | 2016-02-23 |
AU2010231365A1 (en) | 2010-10-07 |
EP2316341B1 (en) | 2013-03-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4634539B2 (ja) | 画像処理装置および方法並びにプログラム | |
US8837794B2 (en) | Medical image display apparatus, medical image display method, and medical image display program | |
JP6422486B2 (ja) | 高度医用画像処理ウィザード | |
JP4818846B2 (ja) | 医用画像処理装置及び医用画像処理プログラム | |
US8744149B2 (en) | Medical image processing apparatus and method and computer-readable recording medium for image data from multiple viewpoints | |
US20190051215A1 (en) | Training and testing system for advanced image processing | |
JP4786246B2 (ja) | 画像処理装置及び画像処理システム | |
US20200105070A1 (en) | Overlay and Manipulation of Medical Images in a Virtual Environment | |
US20120299818A1 (en) | Medical information display apparatus, operation method of the same and medical information display program | |
JP2019153249A (ja) | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム | |
JP6734111B2 (ja) | 所見情報作成装置及びシステム | |
JP5784082B2 (ja) | 診断支援装置及び診断支援方法 | |
US12062447B2 (en) | Medical image diagnosis support device, method, and program | |
JP2011120827A (ja) | 診断支援システム、診断支援プログラムおよび診断支援方法 | |
JP2010244224A (ja) | 画像表示装置および方法並びにプログラム | |
JP2014013590A (ja) | 診断支援装置及び診断支援方法 | |
Kownacki et al. | Advanced server-based diagnostic imaging and post-processing workflow at European Centre of Health Otwock, Poland |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 201080002752.1; Country of ref document: CN |
 | WWE | Wipo information: entry into national phase | Ref document number: 2010522886; Country of ref document: JP |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10758259; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2733519; Country of ref document: CA |
 | WWE | Wipo information: entry into national phase | Ref document number: 12737702; Country of ref document: US |
 | WWE | Wipo information: entry into national phase | Ref document number: 2010231365; Country of ref document: AU |
 | WWE | Wipo information: entry into national phase | Ref document number: 2010758259; Country of ref document: EP |
 | WWE | Wipo information: entry into national phase | Ref document number: 1067/CHENP/2011; Country of ref document: IN |
 | ENP | Entry into the national phase | Ref document number: 2010231365; Country of ref document: AU; Date of ref document: 20100330; Kind code of ref document: A |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: PI1004218; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20110217 |