WO2017072916A1 - Image processing device, image processing method, and computer program - Google Patents

Image processing device, image processing method, and computer program

Info

Publication number
WO2017072916A1
WO2017072916A1 (PCT/JP2015/080576; JP2015080576W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
dimensional
image portion
dimensional image
specifying
Prior art date
2015-10-29
Application number
PCT/JP2015/080576
Other languages
English (en)
Japanese (ja)
Inventor
宏美 武居
達也 織茂
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2015-10-29
Publication date
2017-05-04
Application filed by パイオニア株式会社
Priority to PCT/JP2015/080576
Publication of WO2017072916A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging

Definitions

  • the present invention relates to the technical field of image processing apparatuses and image processing methods that perform image processing on a two-dimensional image and a three-dimensional image obtained by imaging the same object, and to computer programs therefor.
  • Patent Document 1 discloses a surgery support apparatus that superimposes an observation image of a human body, photographed by a video camera or the like during surgery, on a three-dimensional image of the human body photographed by an MRI apparatus or the like before surgery.
  • there is also known a technique of analyzing a spectral image captured by a hyperspectral camera to specify an image portion showing a tumor (hereinafter referred to as the “tumor image portion”) in the spectral image.
  • here, the inventors of the present application have studied displaying, based on the specified tumor image portion, the image portion corresponding to the tumor in a three-dimensional image of the human body in a manner that can be distinguished from other image portions.
  • however, since the spectral image is a two-dimensional image (that is, a planar image), the tumor image portion is also a two-dimensional image. Therefore, only the outer surface of the tumor (that is, the portion of the tumor that can be observed from the outside) can be specified from the spectral image. In other words, it is difficult to specify from the spectral image how the tumor is distributed three-dimensionally. Consequently, if the tumor image portion is simply superimposed on the three-dimensional image using the surgery support apparatus described in Patent Document 1, only a three-dimensional image representing the outer surface of the tumor is displayed.
  • the above-described technical problem is not limited to the case where a tumor image is superimposed on a three-dimensional image of a human body; it can also occur whenever an arbitrary two-dimensional image is superimposed on an arbitrary three-dimensional image.
  • the present invention has been made in view of, for example, the above-described problems. It is an object of the present invention to provide an image processing apparatus, an image processing method, and a computer program that, using a two-dimensional image and a three-dimensional image obtained by photographing the same object, can specify how the object included in the two-dimensional image is distributed three-dimensionally in the three-dimensional image.
  • An image processing apparatus for solving the above problem comprises: acquiring means for acquiring a two-dimensional image and a three-dimensional image generated by photographing the same object; first specifying means for specifying a first image portion representing the object in the two-dimensional image; and second specifying means for specifying a third image portion in the three-dimensional image having the features of a second image portion in the three-dimensional image corresponding to the first image portion.
  • An image processing method for solving the above problem comprises: an acquiring step of acquiring a two-dimensional image and a three-dimensional image generated by capturing the same object; a first specifying step of specifying a first image portion representing the object in the two-dimensional image; and a second specifying step of specifying a third image portion in the three-dimensional image having the features of a second image portion in the three-dimensional image corresponding to the first image portion.
  • a computer program for solving the above problem causes a computer to execute the above-described image processing method.
  • the image processing apparatus of the present embodiment includes: an acquiring unit that acquires a two-dimensional image and a three-dimensional image generated by photographing the same object; first specifying means for specifying a first image portion representing the object in the two-dimensional image; and second specifying means for specifying a third image portion in the three-dimensional image having the features of a second image portion in the three-dimensional image corresponding to the first image portion.
  • the first specifying unit specifies the first image portion representing the object in the two-dimensional image. Since the first image part is at least a part of the two-dimensional image, the first image part represents at least a part of the object planarly (in other words, two-dimensionally). That is, the first image portion represents at least a part of the outer surface of the object.
  • the second specifying means specifies, based on the first image portion, a third image portion in the three-dimensional image having the same features as the second image portion in the three-dimensional image corresponding to the first image portion. Since the first image portion represents at least a part of the outer surface of the object, the second image portion corresponding to the first image portion likewise represents a part of the object that is visible from the outside. Furthermore, since the second image portion is at least a part of the three-dimensional image, the second image portion represents at least a part of the outer surface of the object stereoscopically (in other words, three-dimensionally).
  • the third image portion, like the second image portion, also represents at least a part of the object. However, instead of being identified directly from the first image portion, which represents only at least a part of the outer surface of the object, the third image portion is specified by searching the three-dimensional image for an image portion having the same features as the second image portion. For this reason, it can be estimated that the third image portion represents at least the inside of the object (that is, the portion of at least a part of the object that cannot be visually recognized from the outside). Furthermore, since the third image portion is at least a part of the three-dimensional image, the third image portion represents the inside of the object three-dimensionally.
  • as described above, the image processing apparatus of the present embodiment can specify not only the second image portion, which stereoscopically represents at least a part of the outer surface of the object, but also the third image portion, which stereoscopically represents the inside of the object. For this reason, the image processing apparatus of this embodiment can specify how the object included in the two-dimensional image is distributed three-dimensionally in the three-dimensional image. As a result, an arbitrary display device can support the user's recognition of the object by displaying the second and third image portions corresponding to the object in the three-dimensional image in a manner distinguishable from other image portions.
  • the third image portion represents the inside of the object.
  • the third image portion may include a second image portion representing the outer surface of the object.
  • an image portion obtained by combining an image portion representing the inside of the object and an image portion representing the outer surface of the object may be used as the third image portion.
  • in one aspect of the image processing apparatus, the second image portion represents at least a part of the outer surface of the object in the three-dimensional image, and the third image portion represents at least the inside of the object in the three-dimensional image.
  • according to this aspect, the second specifying means can specify the third image portion, which represents at least the inside of the object and has the same features as the second image portion representing at least a part of the outer surface of the object.
  • in another aspect, the image processing apparatus further includes third specifying means for specifying the second image portion, the two-dimensional image being generated by photographing a photographing target that includes the object.
  • the third specifying means (i) associates with the two-dimensional image a fourth image portion of the three-dimensional image that is present on the path of the light reaching each pixel of an image sensor included in the imaging device that generates the two-dimensional image by photographing the photographing target, and (ii) specifies the second image portion based on the result of the association between the fourth image portion and the two-dimensional image.
  • according to this aspect, the third specifying means can suitably specify the second image portion.
  • in particular, the third specifying means can specify the second image portion without specifying the position, in the two-dimensional image, of a marker that may be placed on the photographing target in order to specify the position of the photographing target or the object.
  • in another aspect, the image processing apparatus further includes control means for controlling a display device to display another three-dimensional image that represents the second image portion and the third image portion in a manner distinguishable from other image portions.
  • the display device can display the second and third image portions corresponding to the object in the three-dimensional image in a manner distinguishable from the other image portions.
  • the object includes at least one of a predetermined part of the subject, an affected area of the subject, and a tumor part of the subject.
  • according to this aspect, the image processing apparatus can specify the third image portion, which represents the inside of the predetermined part, affected part, or tumor part of the subject and has the same features as the second image portion representing the outer surface of that part.
  • in another aspect, the three-dimensional image is an MRI (Magnetic Resonance Imaging) image that three-dimensionally shows the outer surface and the inside of the subject, and the two-dimensional image is a spectral image that planarly represents the outer surface of the subject.
  • according to this aspect, based on the MRI image and the spectral image, the image processing apparatus can specify not only the second image portion representing the outer surface of the subject (including the outer surface of the object) but also the third image portion representing the inside of the subject (including the inside of the object).
  • the image processing method of the present embodiment includes: an acquisition step of acquiring a two-dimensional image and a three-dimensional image generated by imaging the same object; a first specifying step of specifying a first image portion representing the object in the two-dimensional image; and a second specifying step of specifying a third image portion in the three-dimensional image having the features of a second image portion in the three-dimensional image corresponding to the first image portion.
  • the image processing method of the present embodiment may adopt various aspects.
  • the computer program of this embodiment causes a computer to execute the image processing method of this embodiment described above.
  • the computer program of this embodiment may adopt various aspects.
  • the computer program may be recorded on a recording medium.
  • the image processing apparatus includes an acquisition unit, a first specifying unit, and a second specifying unit.
  • the image processing method of the present embodiment includes an acquisition process, a first specifying process, and a second specifying process.
  • the computer program of this embodiment causes a computer to execute the image processing method of this embodiment described above. Therefore, using a two-dimensional image and a three-dimensional image obtained by photographing the same object, it can be specified how the object included in the two-dimensional image is distributed three-dimensionally in the three-dimensional image.
  • in the following, an example is described in which the image processing apparatus, the image processing method, and the computer program use an MRI image obtained by imaging a patient with an MRI (Magnetic Resonance Imaging) apparatus (that is, a three-dimensional image that three-dimensionally represents the outer surface and internal structure of the patient) and a tumor image portion obtained by analyzing a spectral image captured by a hyperspectral camera (that is, a two-dimensional image that planarly represents the outer surface of the patient's tumor) to specify how the tumor represented by the tumor image portion is distributed three-dimensionally in the MRI image.
  • however, the image processing apparatus, the image processing method, and the computer program may be applied to any apparatus that uses a two-dimensional image and a three-dimensional image obtained by photographing the same object to specify how the object included in the two-dimensional image is distributed three-dimensionally within the three-dimensional image.
  • FIG. 1 is a block diagram showing the configuration of the surgery support system 1 of the present embodiment.
  • the surgery support system 1 includes an MRI apparatus 11, a hyperspectral camera 12, a pointer 13, a position measuring device 14, an image processing device 15, and a display device 16.
  • the MRI apparatus 11 generates an MRI image that is a three-dimensional image that three-dimensionally represents the external surface and internal structure of the patient by imaging the external surface and internal structure of the patient.
  • An MRI image is an aggregate of patient tomographic images (in other words, two-dimensional slice images).
  • the MRI image may be any image as long as the outer surface and the internal structure of the patient can be three-dimensionally represented.
  • the MRI image may be a three-dimensional model image or three-dimensional volume data instead of an aggregate of tomographic images.
  • the MRI image generated by the MRI apparatus 11 is input to the image processing apparatus 15.
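  • as a rough illustration of this data layout (a sketch with assumed sizes and spacing, not taken from the embodiment), stacking the tomographic images yields a single volume array:

```python
import numpy as np

# Hypothetical aggregate of tomographic images: 180 slices of 256x256 pixels.
slices = [np.zeros((256, 256), dtype=np.float32) for _ in range(180)]

# Stacking the slices produces three-dimensional volume data indexed as
# volume[z, y, x]; together with a voxel spacing, this defines the MRI
# coordinate system in which three-dimensional positions are later specified.
volume = np.stack(slices, axis=0)
voxel_spacing_mm = (1.0, 0.5, 0.5)  # assumed (z, y, x) spacing in millimetres

print(volume.shape)  # (180, 256, 256)
```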
  • the patient is an operation target.
  • a part of the patient's body becomes the surgical site.
  • the surgical site is a part of the patient's head.
  • the operation is performed on a tumor in the patient's head.
  • a tumor is present at the surgical site.
  • the patient and the surgical site are specific examples of “imaging target”, and the tumor is a specific example of “target”.
  • the surgery support system 1 of the present embodiment may also be used when performing an arbitrary operation different from an operation performed on a tumor of the patient's head (for example, an operation performed on an arbitrary part of the patient).
  • the patient may be a human or any living body other than a human (for example, an animal).
  • At least three markers m1 are installed around the surgical site.
  • the marker m1 is a marker (for example, Fiducial Marker) that can be imaged by the MRI apparatus 11.
  • the hyperspectral camera 12 shoots a patient to generate a spectral image of the patient (that is, a collection of images for each light spectrum (wavelength)).
  • the spectral image generated by the hyperspectral camera 12 is input to the image processing device 15.
  • the hyperspectral camera 12 is provided with at least three markers m2 (three markers m2 in the example shown in FIG. 1).
  • the marker m2 is used by the position measuring device 14 to measure the position of the hyperspectral camera 12 (particularly, its three-dimensional position and orientation in the real space coordinate system, which defines three-dimensional positions in the real space where the patient actually exists).
  • the pointer 13 is a surgical tool that points at the marker m1 installed around the surgical site. The surgeon operates the pointer 13 to point at the marker m1 with the tip of the pointer 13.
  • the pointer 13 is provided with at least three markers m3 (three markers m3 in the example shown in FIG. 1).
  • the marker m3 is used for the position measuring device 14 to measure the position of the pointer 13 (particularly, the three-dimensional position and orientation in the real space coordinate system).
  • the position measuring device 14 measures the positions of the marker m2 and the marker m3 (particularly, a three-dimensional position in the real space coordinate system).
  • the position measuring device 14 includes a stereo camera (two-lens camera) and an LED (Light Emitting Diode).
  • the LED emits measurement light (for example, infrared rays) toward each of the marker m2 and the marker m3.
  • the stereo camera detects the measurement light reflected by each of the marker m2 and the marker m3.
  • the detection result of the measurement light by the position measurement device 14 (that is, the measurement result of the measurement light) is input to the image processing device 15.
  • the image processing device 15 specifies the position of the hyperspectral camera 12 and the position of the pointer 13 (specifically, the position of the marker m1 as will be described in detail later) based on the measurement result of the position measuring device 14.
  • the image processing device 15 performs image processing on the MRI image and the spectral image.
  • the image processing apparatus 15 includes a CPU (Central Processing Unit) 151 and a memory 152.
  • the memory 152 stores a computer program for causing the image processing apparatus 15 to perform image processing.
  • a logical processing block for performing image processing is formed inside the CPU 151.
  • the computer program may not be recorded in the memory 152.
  • the CPU 151 may execute a computer program downloaded via a network.
  • as logical processing blocks formed in the CPU 151, the image processing apparatus 15 includes an MRI image acquisition unit 1511, which is a specific example of the “acquisition unit”, a marker position specifying unit 1512, a spectral image acquisition unit 1513, a tumor specifying unit 1514, a marker position specifying unit 1515, an association processing unit 1516, a three-dimensional model generation unit 1517, and a three-dimensional viewer processing unit 1518.
  • the MRI image acquisition unit 1511 acquires the MRI image generated by the MRI apparatus 11.
  • the marker position specifying unit 1512 specifies the position of the marker m1 included in the MRI image (particularly, its three-dimensional position in the MRI coordinate system, which defines three-dimensional positions in the MRI image) by analyzing the MRI image acquired by the MRI image acquisition unit 1511.
  • the spectral image acquisition unit 1513 acquires the spectral image generated by the hyperspectral camera 12.
  • the tumor specifying unit 1514 specifies the position of the tumor (particularly, its two-dimensional position in the spectral coordinate system, which defines two-dimensional positions in the spectral image) by analyzing the spectral image acquired by the spectral image acquisition unit 1513. That is, the tumor specifying unit 1514 specifies the tumor image portion, which is the image portion representing the tumor in the spectral image.
  • the tumor image portion is a specific example of “first image portion”.
  • the marker position specifying unit 1515 specifies the position of the marker m2 based on the measurement result of the position measuring device 14. In addition, the marker position specifying unit 1515 specifies the position of the marker m1 (particularly, the three-dimensional position in the real space coordinate system) based on the position of the marker m3 placed on the pointer 13 pointing to the marker m1.
  • the association processing unit 1516 mainly performs three association processes described below.
  • as the first association process, the association processing unit 1516 associates the position of the marker m1 in the MRI coordinate system specified by the marker position specifying unit 1512 with the position of the marker m1 in the real space coordinate system specified by the marker position specifying unit 1515.
  • the association processing unit 1516 performs a process of associating the MRI image and the spectral image (particularly, the tumor image portion) as the second association process. More specifically, the association processing unit 1516 identifies an image portion corresponding to the tumor image portion (that is, an image portion corresponding to the outer surface of the tumor) in the MRI image based on the tumor image portion. Note that a specific example of the process of specifying the image portion corresponding to the tumor image portion in the MRI image will be described in detail later, and thus the description thereof is omitted here.
  • since the spectral image is a two-dimensional image, the tumor image portion is also a two-dimensional image. Therefore, the tumor image portion is an image portion that two-dimensionally represents the outer surface (in other words, the outer shell) of the tumor as a plane.
  • the outer surface of a tumor means the part of the tumor that is visible from the outside. Therefore, the image portion corresponding to the tumor image portion in the MRI image is also an image portion representing the outer surface of the tumor.
  • however, the image portion corresponding to the tumor image portion in the MRI image is an image portion that three-dimensionally (in other words, stereoscopically) represents the outer surface of the tumor as a curved surface.
  • an image portion corresponding to the tumor image portion in the MRI image is referred to as an “outer surface image portion”.
  • the outer surface image portion is a specific example of “second image portion”.
  • the outer surface image portion represents at least a part of the outer surface of the patient's tumor (typically, the outer surface of the portion of the tumor that falls within the imaging region of the hyperspectral camera 12).
  • as the third association process, the association processing unit 1516 identifies an image portion corresponding to the inside of the tumor (that is, an image portion representing the structure inside the tumor) based on the outer surface image portion.
  • the inside of a tumor means the part of the tumor that cannot be visually recognized from the outside. Note that a specific example of the process of specifying the image portion corresponding to the inside of the tumor in the MRI image will be described in detail later, and the description thereof is omitted here.
  • the image portion corresponding to the inside of the tumor in the MRI image is an image portion that three-dimensionally (in other words, stereoscopically) represents the inside of the tumor as a three-dimensional structure.
  • an image portion corresponding to the inside of the tumor in the MRI image is referred to as an “internal image portion”.
  • the internal image portion is a specific example of “third image portion”.
  • the three-dimensional model generation unit 1517 generates a three-dimensional model of the patient that represents the tumor in a manner distinguishable from other parts, based on the processing result of the association processing unit 1516 (that is, the identification results for the outer surface image portion and the internal image portion) and the MRI image acquired by the MRI image acquisition unit 1511.
  • the 3D viewer processing unit 1518 generates an observation image that is observed when the surgeon observes the 3D model generated by the 3D model generation unit 1517 from a desired viewpoint.
  • the display device 16 displays the observation image generated by the three-dimensional viewer processing unit 1518. As a result, the surgeon can recognize how the tumor is three-dimensionally distributed inside the patient.
  • the display device 16 may display other arbitrary information (for example, information related to surgery support).
  • FIG. 3 is a flowchart showing an operation flow of the surgery support system 1.
  • the MRI image acquisition unit 1511 acquires an MRI image (step S11). Specifically, the marker m1 is first installed around the patient's surgical site. Thereafter, the patient lies on the bed provided in the MRI apparatus 11. Thereafter, the MRI apparatus 11 sequentially takes tomographic images (see FIG. 4) of the patient. That is, the MRI apparatus 11 generates an MRI image. As a result, the MRI image acquisition unit 1511 acquires an MRI image.
  • the patient is transferred from the bed provided in the MRI apparatus 11 to the operating table.
  • the patient is transported so that the position where the marker m1 is installed does not change (that is, the position where the marker m1 is installed is fixed).
  • the bed provided in the MRI apparatus 11 may be used as an operating table. The position where the marker m1 is installed does not change (is fixed) while the series of operations shown in FIG. 3 is performed.
  • the marker m1 may be directly driven into the bone of the patient's head before the MRI image is acquired so that the position where the marker m1 is installed does not change.
  • for example, the marker m1 may be driven into the skull exposed by opening and turning over the scalp of the patient lying on the bed.
  • the marker position specifying unit 1512 specifies the position of the marker m1 included in the MRI image by analyzing the MRI image acquired in step S11 (step S12). That is, the marker position specifying unit 1512 specifies the three-dimensional position of the marker m1 in the MRI coordinate system. For example, the marker position specifying unit 1512 specifies an image portion corresponding to the marker m1 from the MRI image using a matching process or the like. Thereafter, the marker position specifying unit 1512 specifies the position of the image portion corresponding to the marker m1.
  • the marker position specifying unit 1515 specifies the position of the marker m1 based on the measurement result of the position measuring device 14 (step S13). That is, the marker position specifying unit 1515 specifies the three-dimensional position of the marker m1 in the real space coordinate system. Specifically, the surgeon operates the pointer 13 to point the marker m1 at the tip of the pointer 13. Under the situation where the marker m1 is pointed by the tip of the pointer 13, the position measurement device 14 detects the measurement light reflected by the marker m3. The marker position specifying unit 1515 can specify the position of the marker m3 based on the measurement result of the position measuring device 14.
  • the position of the pointer 13 (however, the “position” in this case includes the direction of the pointer 13) is also specified.
  • the position of the pointer 13 substantially corresponds to the position of the marker m1.
  • the shape of the pointer 13 (that is, the positional relationship between the at least three markers m3 and the tip) is known. Therefore, the marker position specifying unit 1515 can specify the position of the marker m1 in the real space coordinate system by specifying the positions of the markers m3 (that is, by specifying the position of the pointer 13). The above operation is performed for all the markers m1.
  • the association processing unit 1516 associates the position of the marker m1 in the MRI coordinate system identified in step S12 with the position of the marker m1 in the real space coordinate system identified in step S13 (step S14).
  • as a result, the association processing unit 1516 can calculate a transformation matrix for converting a position in the MRI coordinate system into a position in the real space coordinate system, or a position in the real space coordinate system into a position in the MRI coordinate system. That is, the association processing unit 1516 can specify the position in the real space coordinate system corresponding to an arbitrary position in the MRI coordinate system.
  • the association processing unit 1516 can specify a position in the MRI coordinate system corresponding to an arbitrary position in the real space coordinate system.
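  • the embodiment does not fix the algorithm for computing this transformation; one common choice, given the at least three marker m1 correspondences, is the SVD-based least-squares rigid fit sketched below (all marker coordinates are hypothetical):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ R @ src + t.

    src, dst: (N, 3) arrays of corresponding marker positions, N >= 3.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)               # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                # proper rotation, det = +1
    t = dst_c - R @ src_c
    return R, t

# Hypothetical marker m1 positions in the MRI and real space coordinate systems.
mri_m1 = np.array([[10.0, 20.0, 30.0], [40.0, 25.0, 32.0], [22.0, 50.0, 28.0]])
real_m1 = np.array([[110.5, 18.2, 31.0], [140.1, 23.9, 33.5], [121.8, 48.7, 29.2]])

R, t = rigid_transform(mri_m1, real_m1)
# Any position in the MRI coordinate system can now be mapped into the real
# space coordinate system; the inverse mapping uses R.T and -R.T @ t.
p_real = R @ np.array([15.0, 30.0, 29.0]) + t
```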
  • the spectral image acquisition unit 1513 acquires a spectral image (step S21). Specifically, the hyperspectral camera 12 images the patient with the patient positioned on the operating table. That is, the hyperspectral camera 12 generates a spectral image. As a result, the spectral image acquisition unit 1513 acquires a spectral image.
  • the tumor identification unit 1514 identifies the position of the tumor by analyzing the spectral image acquired in step S21 (step S22). That is, as shown in FIG. 5, the tumor specifying unit 1514 specifies the two-dimensional position of the tumor in the spectral coordinate system that defines the two-dimensional position in the spectral image. That is, the tumor specifying unit 1514 specifies a tumor image portion that is an image portion representing a tumor in the spectral image (step S22).
  • a known operation may be used as the operation for specifying the position of the tumor from the spectral image.
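  • as one hypothetical example of such a known operation (the embodiment does not name one), a spectral angle mapper flags pixels whose spectra are close to a reference tumor spectrum; the reference spectrum, data, and threshold below are all made up:

```python
import numpy as np

def spectral_angle(cube, reference):
    """Per-pixel spectral angle (radians) between image spectra and a reference.

    cube: (H, W, B) spectral image with B wavelength bands.
    reference: (B,) reference spectrum, e.g. measured from known tumor tissue.
    """
    dot = cube @ reference
    norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
    return np.arccos(np.clip(dot / (norms + 1e-12), -1.0, 1.0))

# Hypothetical 128x128 spectral image with 60 bands and a reference spectrum.
rng = np.random.default_rng(0)
cube = rng.random((128, 128, 60)).astype(np.float32)
tumor_reference = rng.random(60).astype(np.float32)

# Pixels whose spectra deviate from the reference by less than an assumed
# threshold form the candidate tumor image portion (a 2D boolean mask whose
# coordinates are positions in the spectral coordinate system).
tumor_mask = spectral_angle(cube, tumor_reference) < 0.15
```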
  • thereafter, the image processing apparatus 15 associates the tumor image portion specified from the spectral image with the MRI image, thereby performing an operation of specifying the outer surface image portion corresponding to the outer surface of the tumor in the MRI image and the internal image portion corresponding to the inside of the tumor in the MRI image (step S31 to step S34).
  • the marker position specifying unit 1515 specifies the three-dimensional position of the hyperspectral camera 12 in the real space coordinate system based on the measurement result of the position measuring device 14 (step S31). Specifically, the position measuring device 14 detects the measurement light reflected by the markers m2 installed on the hyperspectral camera 12. The marker position specifying unit 1515 can specify the position of the marker m2 based on the measurement result of the position measuring device 14.
  • when the position of the marker m2 is specified, the position of the hyperspectral camera 12 (where the “position” in this case includes the orientation of the hyperspectral camera 12) is specified.
  • the position of the hyperspectral camera 12 specified in step S31 is expressed by Equation 1:

$$
R_C = \begin{bmatrix} rc_{11} & rc_{12} & rc_{13} \\ rc_{21} & rc_{22} & rc_{23} \\ rc_{31} & rc_{32} & rc_{33} \end{bmatrix}, \qquad
T_C = \begin{bmatrix} T_{cx} \\ T_{cy} \\ T_{cz} \end{bmatrix}
\tag{1}
$$

  • the 3×3 matrix composed of rc11, rc12, rc13, rc21, rc22, rc23, rc31, rc32, and rc33 in Equation 1 represents the rotation amount of the hyperspectral camera 12 in the real space coordinate system (that is, the rotation amounts in the yaw direction (in other words, the tilt amount; the same applies hereinafter), the roll direction, and the pitch direction). Tcx, Tcy, and Tcz in Equation 1 represent the translation amounts of the hyperspectral camera 12 along the X, Y, and Z axes, respectively, from the origin of the real space coordinate system.
  • the association processing unit 1516 then specifies the outer surface image portion corresponding to the tumor image portion in the MRI image (step S32). That is, based on the tumor image portion, which is a two-dimensional image, the association processing unit 1516 specifies the outer surface image portion, which three-dimensionally represents the outer surface of the tumor as a curved surface (or uneven surface) in the MRI image, which is a three-dimensional image.
  • the association processing unit 1516 may use an arbitrary operation for specifying the outer surface image portion corresponding to the tumor image portion of the MRI image (that is, the three-dimensional image) based on the tumor image portion (that is, the two-dimensional image). For example, a known operation for specifying the image portion of a three-dimensional image corresponding to a two-dimensional image based on that two-dimensional image may be used.
  • specifically, the association processing unit 1516 first identifies internal parameters indicating the state of the optical system (for example, a lens group) of the hyperspectral camera 12. One internal parameter is the focal length of the optical system: the focal length fcx with respect to the X axis of the spectral coordinate system (that is, the focal length corresponding to the photographing magnification of the subject along the X axis) and the focal length fcy with respect to the Y axis of the spectral coordinate system (that is, the focal length corresponding to the photographing magnification of the subject along the Y axis). Another internal parameter is the amount of deviation of the center of the image sensor of the hyperspectral camera 12 (that is, the center of the spectral image) from the center of the optical axis of the optical system. This deviation amount includes a deviation amount ccx along the X axis of the spectral coordinate system and a deviation amount ccy along the Y axis of the spectral coordinate system.
  • for example, the association processing unit 1516 can specify the internal parameters using a calibration result of the hyperspectral camera 12 based on imaging a checker pattern.
  • the association processing unit 1516 further specifies an external parameter indicating the installation state of the hyperspectral camera 12 in the real space coordinate system.
  • the external parameter is the position of the hyperspectral camera 12 in the real space coordinate system shown in Equation 1.
  • based on the internal parameters and the external parameters, the association processing unit 1516 can specify, in the real space coordinate system, the path of the light reaching each pixel constituting the image sensor of the hyperspectral camera 12. Furthermore, since a position in the real space coordinate system can be converted into a position in the MRI coordinate system, the association processing unit 1516 can specify the light path to each pixel in the MRI coordinate system. As a result, the association processing unit 1516 can specify the portion of the outer surface of the patient represented by the MRI image that is located on the light path reaching each pixel. For example, the association processing unit 1516 can specify, for each pixel, the position (X1(3), Y1(3), Z1(3)) in the MRI coordinate system at which the light reaching that pixel first intersects the outer surface of the patient in the MRI image.
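  • a minimal sketch of this ray-casting step, assuming the MRI image has already been reduced to a boolean patient mask in MRI (voxel) coordinates and that each pixel's ray origin and direction have already been converted into that coordinate system:

```python
import numpy as np

def first_surface_point(mask, origin, direction, step=0.5, max_t=500.0):
    """March along the light path of one pixel and return the first position
    (in MRI voxel coordinates) at which the ray enters the patient mask.

    mask: (Z, Y, X) boolean volume, True inside the patient.
    origin, direction: ray expressed in the same voxel coordinate system.
    """
    direction = direction / np.linalg.norm(direction)
    for t in np.arange(0.0, max_t, step):
        p = origin + t * direction
        idx = tuple(np.round(p).astype(int))
        inside = all(0 <= idx[i] < mask.shape[i] for i in range(3))
        if inside and mask[idx]:
            return p  # the position (X1(3), Y1(3), Z1(3)) for this pixel
    return None  # the ray misses the patient

# Hypothetical mask (a cube of "patient" voxels) and one pixel's ray.
mask = np.zeros((100, 100, 100), dtype=bool)
mask[40:60, 40:60, 40:60] = True
hit = first_surface_point(mask, origin=np.array([50.0, 50.0, 0.0]),
                          direction=np.array([0.0, 0.0, 1.0]))
```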
  • the portion of the outer surface of the patient that is located on the light path leading to each pixel is the portion of the outer surface of the patient that is captured by the hyperspectral camera 12.
  • the portion of the outer surface of the patient that is located on the light path to each pixel is a portion that is also imaged by the MRI apparatus 11.
  • the image portion of the MRI image corresponding to the outer surface of the patient located on the light path reaching each pixel therefore corresponds to the spectral image. That is, the image portion located on the light path in the MRI image corresponds to the spectral image.
  • the association processing unit 1516 associates the image portion located on the light path in the MRI image (that is, the image portion whose position (X1(3), Y1(3), Z1(3)) has been specified) with the spectral image. Specifically, as described above, the position (X1(3), Y1(3), Z1(3)) in the MRI coordinate system can be converted into the position (X1(3)', Y1(3)', Z1(3)') in the real space coordinate system. Furthermore, each pixel that the light reaches has a one-to-one correspondence with specific coordinates in the spectral coordinate system, which specifies two-dimensional positions in the spectral image.
  • therefore, based on the internal parameters and the external parameters, the association processing unit 1516 can associate the position (X1(3)', Y1(3)', Z1(3)') in the real space coordinate system with the position (X1(2), Y1(2)) in the spectral coordinate system. That is, the association processing unit 1516 can associate the position (X1(3), Y1(3), Z1(3)) in the MRI coordinate system with the position (X1(2), Y1(2)) in the spectral coordinate system. In this embodiment, the association processing unit 1516 uses Equation 2 to associate the position (X1(3), Y1(3), Z1(3)) in the MRI coordinate system with the position (X1(2), Y1(2)) in the spectral coordinate system. Note that s1 represents a predetermined coefficient or a predetermined conversion matrix.
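  • Equation 2 itself is not reproduced in this record, but the quantities it relates (the internal parameters fcx, fcy, ccx, ccy, the external parameters of Equation 1, and the coefficient s1) match the standard pinhole projection, so a sketch under that assumption is given below; R and T are taken here to map real space coordinates into the camera frame:

```python
import numpy as np

def project_to_spectral(p_real, R, T, fcx, fcy, ccx, ccy):
    """Map a real space point (X1(3)', Y1(3)', Z1(3)') to spectral image
    coordinates (X1(2), Y1(2)) with a pinhole model; the coefficient s1
    corresponds to the depth of the point in the camera frame.
    """
    K = np.array([[fcx, 0.0, ccx],
                  [0.0, fcy, ccy],
                  [0.0, 0.0, 1.0]])   # internal parameters
    p_cam = R @ p_real + T            # external parameters (world -> camera)
    uvw = K @ p_cam                   # equals s1 * (X1(2), Y1(2), 1)
    return uvw[:2] / uvw[2]

# Hypothetical parameters: camera at the real space origin, looking along +Z.
R, T = np.eye(3), np.zeros(3)
xy = project_to_spectral(np.array([10.0, -5.0, 200.0]),
                         R, T, fcx=800.0, fcy=800.0, ccx=320.0, ccy=256.0)
```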
  • in this way, the association processing unit 1516 can associate the MRI image (particularly, the image portion of the MRI image corresponding to the outer surface of the patient located on the light path reaching each pixel) with the spectral image. That is, the association processing unit 1516 can specify the image portion of the spectral image corresponding to a certain image portion constituting the MRI image. Similarly, the association processing unit 1516 can specify the image portion of the MRI image corresponding to a certain image portion constituting the spectral image. Then, since the position of the tumor image portion in the spectral image has already been specified, once the MRI image and the spectral image are associated with each other, the association processing unit 1516 can specify the outer surface image portion corresponding to the tumor image portion in the MRI image.
  • the association processing unit 1516 can associate the MRI image and the spectral image without specifying the position of the marker m1 included in the spectral image in the spectral coordinate system.
  • the reason the position of the marker m1 in the spectral coordinate system need not be specified is that the operation performed by the association processing unit 1516 to associate the MRI image with the spectral image is premised on the assumption that the image portion of the MRI image corresponding to the outer surface of the patient corresponds to the spectral image generated by photographing that outer surface. In other words, the operation is premised on the assumption that the spectral image two-dimensionally represents the outer surface of the patient as a plane, while the MRI image three-dimensionally represents the same outer surface of the same patient as a curved surface.
  • that is, using the fact that the spectral image (and hence the tumor image portion) and the outer surface image portion each represent the same outer surface of the same patient, the association processing unit 1516 specifies, based on the tumor image portion, which represents the outer surface of the patient as a two-dimensional plane, the outer surface image portion, which three-dimensionally represents the same outer surface of the same patient as a curved surface.
  • the association processing unit 1516 identifies the feature of the outer surface image portion identified in step S32 (step S33).
  • the “feature” specified in step S33 may be any feature as long as it is unique to the image or related to the image. Examples of such features include luminance, hue, saturation, lightness, and the like.
  • since the feature specified in step S33 is a feature of the outer surface image portion showing the outer surface of the tumor, it can be estimated that it is also a feature of the image portion corresponding to the tumor in the MRI image. Therefore, the association processing unit 1516 identifies an image portion of the MRI image having the same features as those identified in step S33 as the internal image portion representing the inside of the tumor (step S34). As a result, as shown in FIG. 7, not only the outer surface image portion representing the outer surface of the tumor but also the internal image portion representing the inside of the tumor is specified on the MRI image.
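  • a rough sketch of steps S33 and S34, using plain voxel intensity as the feature (the embodiment names luminance, hue, saturation, and lightness as candidates) and adding a connectivity restriction that is an assumption of this sketch, not of the embodiment:

```python
import numpy as np
from scipy import ndimage

def internal_image_portion(volume, surface_mask, tol=1.5):
    """Find voxels whose feature matches that of the outer surface image portion.

    volume: (Z, Y, X) MRI intensities; surface_mask: boolean mask of the
    outer surface image portion identified in step S32.
    """
    # Step S33: the feature of the outer surface image portion, here taken
    # as the mean and spread of its voxel intensities.
    mu = volume[surface_mask].mean()
    sigma = volume[surface_mask].std()

    # Step S34: voxels anywhere in the MRI image with the same feature...
    similar = np.abs(volume - mu) <= tol * sigma

    # ...restricted to regions connected to the outer surface image portion,
    # so that unrelated tissue of coincidentally similar intensity is excluded.
    labels, _ = ndimage.label(similar)
    surface_labels = np.unique(labels[surface_mask & similar])
    return np.isin(labels, surface_labels[surface_labels != 0])
```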
  • the three-dimensional model generation unit 1517 then generates a three-dimensional model of the patient that represents the tumor in a manner that can be distinguished from other parts (step S41). That is, the three-dimensional model generation unit 1517 generates a three-dimensional model of the patient that represents the outer surface image portion and the internal image portion in a manner that can be distinguished from other image portions (step S41). For example, the three-dimensional model generation unit 1517 may generate the three-dimensional model by modifying the MRI image so that the MRI image represents the outer surface image portion and the internal image portion in a manner that can be distinguished from other image portions.
  • the three-dimensional model generation unit 1517 may generate a three-dimensional model by newly generating an MRI image so as to represent the outer surface image portion and the inner image portion in a manner distinguishable from other image portions.
  • as an example of such a distinguishable manner, a display method that highlights the tumor can be given.
  • the three-dimensional viewer processing unit 1518 generates an observation image that is observed when the surgeon observes the three-dimensional model generated in step S41 from a desired viewpoint (step S42).
  • the three-dimensional viewer processing unit 1518 outputs the generated observation image to the display device 16.
  • the display device 16 displays the observation image generated in step S42 (step S43).
  • the surgeon can recognize how the tumor is three-dimensionally distributed inside the patient by visually recognizing the observation image.
  • the MRI image acquisition unit 1511 determines whether or not it is time to acquire the MRI image again (step S51). For example, when a predetermined time has elapsed since the last MRI image was acquired, the MRI image acquisition unit 1511 may determine that the timing for acquiring the MRI image again has arrived. For example, when the user requests acquisition of the MRI image again, the MRI image acquisition unit 1511 may determine that the timing for acquiring the MRI image again has arrived.
  • as a result of the determination in step S51, when it is determined that the timing for acquiring the MRI image again has come (step S51: Yes), the MRI image acquisition unit 1511 acquires the MRI image again (step S11). Thereafter, the operations from step S12 onward are performed again.
  • on the other hand, as a result of the determination in step S51, when it is determined that the timing for acquiring the MRI image again has not arrived (step S51: No), the three-dimensional model generation unit 1517 adds an additional model corresponding to an additional image to the three-dimensional model generated in step S41 (step S52).
  • as an example of the additional image, an image of a surgical instrument (for example, a knife) handled by the surgeon can be given.
  • the image processing device 15 performs the operation described below.
  • a marker is placed on the surgical instrument.
  • the position measurement device 14 detects measurement light reflected by a marker installed on the surgical instrument.
  • the marker position specifying unit 1515 specifies the position of the surgical instrument in the real space coordinate system based on the measurement result of the position measuring device 14.
  • the association processing unit 1516 converts the position of the surgical tool in the real space coordinate system into the position of the surgical tool in the MRI coordinate system.
  • the position of the surgical instrument in the MRI coordinate system corresponds to the position where the surgical instrument is to be placed in the three-dimensional model.
  • the three-dimensional model generation unit 1517 can then add the additional model corresponding to the surgical instrument to the three-dimensional model of the patient so that the positional relationship between the three-dimensional model of the patient and the additional model matches the positional relationship between the actual patient and the actual surgical instrument.
  • thereafter, it is determined whether or not the surgery has been completed (step S53). As a result of the determination in step S53, when it is determined that the surgery has not been completed (step S53: No), the operations from step S51 onward are repeated. On the other hand, when it is determined that the surgery has been completed (step S53: Yes), the surgery support system 1 ends the operation illustrated in FIG. 3.
  • as described above, the image processing apparatus 15 can specify not only the outer surface image portion, which three-dimensionally represents the outer surface of the tumor, but also the internal image portion, which three-dimensionally represents the inside of the tumor. That is, the image processing apparatus 15 according to the present embodiment can specify how the tumor identified from the spectral image, which is a two-dimensional image, is distributed three-dimensionally in the MRI image, which is a three-dimensional image. As a result, the display device 16 can display a three-dimensional model of the patient that represents the tumor in a manner that can be distinguished from other parts, and the surgeon can appropriately recognize the tumor.
  • the surgery support system 1 may include, in addition to or instead of the MRI apparatus 11, an arbitrary apparatus that can generate an image three-dimensionally representing the outer surface and internal structure of the patient by imaging the patient. Examples of such an apparatus include a CT (Computed Tomography) apparatus and a PET (Positron Emission Tomography) apparatus.
  • after specifying the outer surface image portion representing the outer surface of the tumor and the internal image portion representing the inside of the tumor, the image processing apparatus 15 may subject at least one of the spectral image and the MRI image (or the outer surface image portion and the internal image portion) to a non-rigid registration process.
  • as a result, even when the shape of the tumor changes, the image processing apparatus 15 can generate a three-dimensional model that represents the tumor whose shape has changed in a manner that can be distinguished from other parts.
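  • the embodiment leaves the choice of non-rigid registration algorithm open; the fragment below illustrates only the final step of such a process, warping an identified image portion by a per-voxel displacement field assumed to have been estimated elsewhere:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_image_portion(mask, displacement):
    """Apply a non-rigid displacement field to a boolean image portion.

    mask: (Z, Y, X) boolean image portion (e.g. the internal image portion);
    displacement: (3, Z, Y, X) per-voxel offsets from a non-rigid registration.
    """
    grid = np.indices(mask.shape).astype(np.float64)
    coords = grid + displacement      # where each output voxel samples from
    warped = map_coordinates(mask.astype(np.float64), coords, order=1)
    return warped > 0.5
```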
  • the inner image portion may include the outer surface image portion as a part thereof. This is because the outer shell of the inner image portion substantially corresponds to the outer surface image portion. For this reason, the internal image portion may represent the outer surface of the tumor in addition to the inside of the tumor.
  • the present invention can be appropriately modified without departing from the gist or concept of the invention that can be read from the claims and the entire specification, and an image processing apparatus, an image processing method, and a computer program involving such modifications are also included in the technical idea of the present invention.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The present invention specifies, using a two-dimensional image and a three-dimensional image obtained by capturing images of the same object, how an object included in the two-dimensional image is distributed sterically in the three-dimensional image. An image processing device (15) comprises: acquisition means for acquiring a two-dimensional image and a three-dimensional image generated by capturing images of the same object; first specifying means (1514) for specifying a first image portion representing the object in the two-dimensional image; and second specifying means (1516) for specifying a third image portion in the three-dimensional image having a feature of a second image portion in the three-dimensional image corresponding to the first image portion.
PCT/JP2015/080576 2015-10-29 2015-10-29 Dispositif de traitement d'image, procédé de traitement d'image et programme informatique WO2017072916A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/080576 WO2017072916A1 (fr) 2015-10-29 2015-10-29 Dispositif de traitement d'image, procédé de traitement d'image et programme informatique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/080576 WO2017072916A1 (fr) 2015-10-29 2015-10-29 Dispositif de traitement d'image, procédé de traitement d'image et programme informatique

Publications (1)

Publication Number Publication Date
WO2017072916A1 true WO2017072916A1 (fr) 2017-05-04

Family

ID=58631370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/080576 WO2017072916A1 (fr) 2015-10-29 2015-10-29 Dispositif de traitement d'image, procédé de traitement d'image et programme informatique

Country Status (1)

Country Link
WO (1) WO2017072916A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008520312A (ja) * 2004-11-23 2008-06-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ インターベンション手順の間の画像表示用の画像処理システム及び方法
JP2008093443A (ja) * 2006-10-05 2008-04-24 Siemens Ag インターベンショナルな処置の表示方法
JP2009039280A (ja) * 2007-08-08 2009-02-26 Arata Satori 内視鏡システム及び内視鏡システムを用いた被写体の検出方法
JP2010274044A (ja) * 2009-06-01 2010-12-09 Olympus Corp 手術支援装置、手術支援方法及び手術支援プログラム
JP2014226430A (ja) * 2013-05-24 2014-12-08 富士フイルム株式会社 画像表示装置および方法、並びにプログラム
JP5781667B1 (ja) * 2014-05-28 2015-09-24 株式会社モリタ製作所 根管治療用装置

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15907280; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15907280; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)