WO2013027773A1 - Image processing device, image processing method, and recording medium - Google Patents


Info

Publication number
WO2013027773A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
dimensional image
image
depth information
depth
Prior art date
Application number
PCT/JP2012/071213
Other languages
English (en)
Japanese (ja)
Inventor
恵 中尾
小太郎 湊
Original Assignee
国立大学法人奈良先端科学技術大学院大学
Priority date
Filing date
Publication date
Application filed by 国立大学法人奈良先端科学技術大学院大学 (Nara Institute of Science and Technology)
Publication of WO2013027773A1 (published in French)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • G06T15/20: Perspective computation
    • G06T15/205: Image-based rendering
    • G06T15/08: Volume rendering

Definitions

  • the present invention relates to an image processing apparatus that superimposes a two-dimensional image and a three-dimensional image.
  • Japanese Patent No. 4337987 (first page, FIG. 1 etc.)
  • the image processing apparatus of the first invention includes: a three-dimensional image storage unit capable of storing a three-dimensional image that is image data of one or more three-dimensional objects and is composed of a plurality of pieces of voxel information, each being point information having position information that is information indicating a position, color information that is information about a color, and depth information indicating a depth from a reference point; a two-dimensional image storage unit with depth information capable of storing a two-dimensional image with depth information that is an image obtained by photographing the one or more three-dimensional objects, is aligned with the three-dimensional image, and is composed of a plurality of pieces of pixel information, each being point information having position information, color information, and depth information; a two-dimensional image processing unit with depth information that performs first image processing on the two-dimensional image with depth information using the depth information and acquires a processed two-dimensional image with depth information; an extended image generation unit that generates an extended image by superimposing the processed two-dimensional image with depth information and either the three-dimensional image or a processed three-dimensional image obtained by performing second image processing on the three-dimensional image; and an output unit that outputs the extended image.
  • the image processing apparatus of the second invention, with respect to the first invention, further includes: a two-dimensional image storage unit capable of storing a two-dimensional image that is an image obtained by photographing the one or more three-dimensional objects, is aligned with the three-dimensional image, and is composed of a plurality of pieces of pixel information, each being point information having position information and color information; a depth information acquisition unit that acquires, for each piece of position information on the plane, the depth information of the voxel information that has non-transparent color information and whose depth information indicates the position closest to the reference position, from among the pieces of voxel information of the three-dimensional image sharing the same position information on the plane; and a two-dimensional image acquisition unit with depth information that adds the acquired depth information to the pixel information of the two-dimensional image. The two-dimensional image with depth information in the two-dimensional image storage unit with depth information is the two-dimensional image with depth information acquired by the two-dimensional image acquisition unit with depth information.
  • with this configuration, a two-dimensional image with depth information, that is, a two-dimensional image whose pixel information carries depth information, can be acquired automatically.
  • in the image processing apparatus of the third invention, with respect to the first or second invention, the two-dimensional image processing unit with depth information performs, on the plurality of pieces of pixel information of the two-dimensional image with depth information, first image processing that sets at least two different pieces of color information and/or different pieces of opacity information according to differences in the depth information of the pixel information, and acquires a processed two-dimensional image with depth information.
  • with this configuration, an appropriate image can be obtained by controlling the color and opacity of the superimposed two-dimensional image using depth information.
  • the image processing apparatus of the fourth invention, with respect to any of the first to third inventions, further includes a three-dimensional image processing unit that performs, on the plurality of pieces of voxel information of the three-dimensional image, second image processing that sets at least two different pieces of color information and/or different pieces of opacity information according to differences in the depth information of the voxel information, and acquires a processed three-dimensional image.
  • an appropriate image can be obtained by controlling the color and opacity of the three-dimensional image to be superimposed using depth information.
  • the image processing apparatus of the fifth invention, with respect to the fourth invention, further includes: a reception unit that receives an instruction for one or more points of the output two-dimensional image, two-dimensional image with depth information, or extended image; and an instruction voxel information determination unit that determines two or more consecutive pieces of voxel information using the color information of the voxel information corresponding to the one or more indicated points. The three-dimensional image processing unit performs second image processing that sets different color information and/or different opacity information for the two or more pieces of voxel information determined by the instruction voxel information determination unit, or for the voxel information other than those, and acquires a processed three-dimensional image.
  • a necessary image can be obtained by effectively using a two-dimensional image and a three-dimensional image of the same object.
  • Block diagram of the image processing apparatus
  • Flowchart explaining the operation of the image processing apparatus
  • Flowchart explaining the operation of the depth information acquisition processing
  • Flowchart explaining the operation of the extended image generation processing
  • Flowchart explaining the operation of the two-dimensional image processing with depth information
  • Flowchart explaining the operation of the three-dimensional image processing
  • Flowchart explaining the operation of the three-dimensional image processing
  • Flowchart explaining the operation of the instruction voxel information determination processing
  • Flowchart explaining the operation of the voxel information acquisition processing
  • Diagram showing the two-dimensional image
  • Diagram showing the three-dimensional image
  • Diagram showing an extended image
  • Diagram showing an extended image
  • Diagram explaining the continuous area instruction
  • in Embodiment 1, an image processing apparatus will be described that performs image processing on a two-dimensional image using depth information and superimposes the processed two-dimensional image on a three-dimensional image or the like.
  • FIG. 1 is a block diagram of the image processing apparatus 1 in the present embodiment.
  • the image processing apparatus 1 includes a two-dimensional image storage unit 101, a three-dimensional image storage unit 102, a two-dimensional image storage unit 103 with depth information, a reception unit 104, a depth information acquisition unit 105, a two-dimensional image acquisition unit 106 with depth information, a two-dimensional image accumulation unit 107 with depth information, a two-dimensional image processing unit 108 with depth information, an instruction voxel information determination unit 109, a three-dimensional image processing unit 110, an extended image generation unit 111, and an output unit 112.
  • the 2D image storage unit 101 can store a 2D image composed of a plurality of pieces of pixel information.
  • the pixel information is information on a point having position information that is information indicating a position and color information that is information about a color.
  • the position information here is position information on a plane (for example, (x, y)).
  • the pixel information may include position information, color information, and opacity information indicating opacity (also referred to as an α value or blending parameter).
  • the opacity information may be regarded as transparency information indicating transparency.
  • the two-dimensional image is an image obtained by photographing one or more three-dimensional objects, and is an image that is aligned with a three-dimensional image described later.
  • that a two-dimensional image and a three-dimensional image are aligned means that the two-dimensional image captured by a camera is aligned with the projected image generated by rendering the three-dimensional image acquired by CT or the like.
  • the two-dimensional image is an image obtained by photographing one or more three-dimensional objects.
  • the projected image is generated by rendering a three-dimensional image of the object.
  • the projected image aligned with the two-dimensional image is generated by rendering with camera parameters, such as the camera position, angle, and viewing angle, set to be the same as the camera state at the time the three-dimensional object was photographed.
  • the 2D image storage unit 101 is preferably a non-volatile recording medium, but can also be realized by a volatile recording medium.
  • the process of storing the two-dimensional image in the two-dimensional image storage unit 101 does not matter.
  • for example, a 2D image may be stored in the 2D image storage unit 101 via a recording medium, or a 2D image transmitted via a communication line or the like may be stored in the 2D image storage unit 101.
  • a two-dimensional image input via an input device may be stored in the two-dimensional image storage unit 101.
  • the 3D image storage unit 102 can store a 3D image composed of a plurality of voxel information.
  • the voxel information is point information having position information that is information indicating a position, color information that is information about a color, and depth information that indicates a depth from a reference point.
  • the three-dimensional image is image data of one or more three-dimensional objects.
  • the three-dimensional image is, for example, 3D voxel information that is a volume texture of one or more three-dimensional objects.
  • the 3D voxel information is, for example, information on a point constituted by (x, y, z, col).
  • (x, y, z) of (x, y, z, col) is coordinate information.
  • “Col” is color information.
  • the 3D voxel information may be, for example, information on a point constituted by (x, y, z, col, α value).
  • the 3D voxel information is preferably information on densely packed points with no gaps between them, but may be information on discrete points.
  • the 3D voxel information is a set of two-dimensional images acquired by a medical device such as CT, MRI, or PET.
  • the 3D voxel information is, for example, a set of two-dimensional images obtained by photographing the human brain or the inside of the body by CT or MRI.
  • “z” in (x, y, z) may be depth information.
  • in that case, the position information is (x, y).
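As an illustration only (the patent defines voxel information abstractly, so the field names and types below are assumptions, not taken from the text), a point of the form (x, y, z, col, α value) could be modeled as:

```python
from typing import NamedTuple, Optional

class Voxel(NamedTuple):
    # One point of the three-dimensional image; names are illustrative.
    x: int                   # position information on the plane
    y: int
    z: int                   # depth information from the reference point
    col: Optional[str]       # color information; None models "transparent"
    alpha: float = 1.0       # opacity information (α value)

v = Voxel(x=3, y=4, z=7, col="white")
```

Here `z` doubles as the depth information, matching the remark that "z" in (x, y, z) may be depth information.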
  • the three-dimensional image may also be a slice information group having a plurality of pieces of slice information, each configured from two-dimensional image data obtained by cutting the 3D voxel information, which is the volume texture of one or more three-dimensional objects, along a plurality of planes, and each composed of information on a plurality of points having position information, color information, and depth information.
  • the slice information group is a plurality of pieces of slice information cut out from the 3D voxel information perpendicular to the line of sight (line-of-sight vector) and at constant intervals.
  • the line of sight is, for example, a line perpendicular to the display that outputs the slice information group (the line of sight of the person who views the display).
  • the slice information is a set of information on the points constituting the plane; preferably, the points are densely packed with no gaps between them.
  • the 3D image storage unit 102 is preferably a non-volatile recording medium, but can also be realized by a volatile recording medium. The process of storing the 3D image in the 3D image storage unit 102 does not matter.
  • the two-dimensional image storage unit 103 with depth information can store a two-dimensional image with depth information.
  • the two-dimensional image with depth information is an image composed of a plurality of pieces of pixel information that is point information having position information, color information, and depth information.
  • the two-dimensional image with depth information is an image obtained by photographing one or more three-dimensional objects, and is an image that is aligned with the three-dimensional image.
  • the aligned image is an image in which the camera parameters such as the camera position, angle, and viewing angle are the same as the camera parameters of the three-dimensional image.
  • the two-dimensional image storage unit 103 with depth information is preferably a non-volatile recording medium, but can also be realized by a volatile recording medium.
  • the process in which the two-dimensional image with depth information is stored in the two-dimensional image storage unit 103 with depth information does not matter.
  • the accepting unit 104 accepts an instruction.
  • the instruction is, for example, a two-dimensional image acquisition instruction with depth information, an extended image acquisition instruction, a continuous area instruction, or the like.
  • the two-dimensional image acquisition instruction with depth information is an instruction to acquire a two-dimensional image with depth information obtained by adding depth information to pixel information of the two-dimensional image.
  • the extended image acquisition instruction is an instruction to acquire an extended image.
  • the continuous area instruction is an instruction for one or more points in the output two-dimensional image, two-dimensional image with depth information, or extended image, and is an instruction to display the object (continuous area) including the one or more points in a mode different from other objects (for example, with different color information and/or opacity information).
  • reception is a concept that includes reception of information input from an input device such as a mouse, touch panel, or keyboard, reception of information transmitted via a wired or wireless communication line, and reception of information read from a recording medium such as an optical disk, magnetic disk, or semiconductor memory.
  • the instruction input means may be anything such as a mouse or a menu screen.
  • the accepting unit 104 can be realized by a device driver for input means such as a numeric keypad and a keyboard, control software for a menu screen, and the like.
  • the depth information acquisition unit 105 acquires, for each piece of position information on the plane, the depth information of the voxel information that has non-transparent color information and whose depth information indicates the position closest to the reference position, from among the pieces of voxel information of the three-dimensional image sharing the same position information on the plane.
  • the plane is usually a plane perpendicular to the line-of-sight vector from the camera position.
  • the plurality of points having the same position information on the plane are, for example, a plurality of points having the same (x, y) and different depth information or z.
  • the reference position is usually the position of a camera or line of sight that acquires a three-dimensional image.
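The depth information acquisition above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the `volume[z][y][x]` layout, `None` standing for transparent color information, and z = 0 being closest to the reference position are all assumptions.

```python
TRANSPARENT = None  # hypothetical marker for transparent color information

def acquire_depth_map(volume):
    """For each planar position (x, y), return the depth (z) of the
    frontmost non-transparent voxel, or None if every voxel in that
    column is transparent. volume[z][y][x] holds color information."""
    depth_map = {}
    z_size = len(volume)
    y_size = len(volume[0])
    x_size = len(volume[0][0])
    for y in range(y_size):
        for x in range(x_size):
            depth_map[(x, y)] = None
            for z in range(z_size):  # scan front to back
                if volume[z][y][x] is not TRANSPARENT:
                    depth_map[(x, y)] = z  # record the nearest depth
                    break
    return depth_map
```

The inner loop mirrors the front-to-back scan of the depth information acquisition processing described later in the flowchart steps.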
  • the two-dimensional image acquisition unit 106 with depth information adds depth information to the plural pieces of pixel information of the two-dimensional image, and acquires a two-dimensional image with depth information. More specifically, the two-dimensional image acquisition unit 106 with depth information adds each piece of depth information acquired by the depth information acquisition unit 105 for each piece of position information on the plane to the pixel information having the same position information on the plane, and acquires a two-dimensional image with depth information composed of two or more pieces of pixel information having position information, color information, and depth information. That is, the two-dimensional image with depth information is an image in which depth information has been added to the plural pieces of pixel information of the two-dimensional image.
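The pairing of depth values with pixels sharing the same planar position can be sketched in one line; the (x, y, color) tuple layout for pixel information is an assumption for illustration:

```python
def attach_depth(pixels, depth_map):
    """Build the two-dimensional image with depth information: to each
    pixel (x, y, color), attach the depth acquired for the same planar
    position (None where no depth was acquired)."""
    return [(x, y, color, depth_map.get((x, y))) for (x, y, color) in pixels]
```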
  • the two-dimensional image accumulation unit 107 with depth information stores the two-dimensional image with depth information acquired by the two-dimensional image acquisition unit 106 with depth information in the two-dimensional image storage unit 103 with depth information.
  • the two-dimensional image processing unit with depth information 108 performs first image processing on the two-dimensional image with depth information, and acquires a two-dimensional image with post-processing depth information.
  • the two-dimensional image processing unit 108 with depth information normally performs the first image processing on the two-dimensional image with depth information using the depth information, and acquires a processed two-dimensional image with depth information.
  • the two-dimensional image processing unit 108 with depth information performs, for example, on the two-dimensional image with depth information, first image processing that sets at least two different pieces of color information and/or different pieces of opacity information according to differences in the depth information of the pixel information, and acquires a processed two-dimensional image with depth information.
  • the setting of at least two or more different color information may be a process of setting other color information for a part of pixel information when one color information is originally set.
  • the setting of at least two or more different opacity information may be a process of setting other opacity information for a part of pixel information when one opacity information is originally set.
  • the first image processing only needs to have two or more different color information and / or different opacity information as a result of the processing.
  • the above-mentioned "different color information and/or different opacity information" means different color information, different opacity information, or both different color information and different opacity information.
  • the first image processing may be any image processing, such as emphasizing contours according to differences in the depth information of the pixel information, changing the color information of a partial area with a color map, or changing RGBA values (color information and opacity information).
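One possible first image processing, sketched under the assumption that pixel information is an (x, y, color, depth) tuple: opacity is derived from depth so that nearer pixels are more opaque. The linear ramp and the 0.8 factor are illustrative choices, not from the patent.

```python
def depth_to_opacity(pixels):
    """Set opacity information from depth information: nearer pixels
    become more opaque; pixels with no depth (no non-transparent voxel
    at that position) become fully transparent."""
    depths = [d for (_, _, _, d) in pixels if d is not None]
    if not depths:
        return [(x, y, c, 0.0, d) for (x, y, c, d) in pixels]
    d_min, d_max = min(depths), max(depths)
    span = max(d_max - d_min, 1)
    result = []
    for (x, y, c, d) in pixels:
        alpha = 0.0 if d is None else 1.0 - 0.8 * (d - d_min) / span
        result.append((x, y, c, alpha, d))
    return result
```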
  • the instruction voxel information determination unit 109 determines two or more consecutive voxel information using the color information included in the voxel information corresponding to one or more points corresponding to the instruction.
  • the two or more consecutive pieces of voxel information are two or more pieces of voxel information whose color information is the same as, or differs by no more than a threshold value from, the color information of adjacent voxel information.
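This determination of consecutive voxel information can be read as region growing over adjacent voxels whose color lies within a threshold of the indicated voxel's color. A sketch under the assumptions of scalar color values, 6-connectivity, and a `volume[z][y][x]` layout (none of which the patent specifies):

```python
from collections import deque

def grow_region(volume, seed, tol):
    """Collect the connected set of voxels, starting from the indicated
    seed (z, y, x), whose color differs from the seed's color by at
    most `tol` (a hypothetical similarity threshold)."""
    z0, y0, x0 = seed
    seed_col = volume[z0][y0][x0]
    region = {seed}
    queue = deque([seed])
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < len(volume) and 0 <= ny < len(volume[0])
                    and 0 <= nx < len(volume[0][0])
                    and (nz, ny, nx) not in region
                    and abs(volume[nz][ny][nx] - seed_col) <= tol):
                region.add((nz, ny, nx))
                queue.append((nz, ny, nx))
    return region
```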
  • the 3D image processing unit 110 performs the second image processing on the 3D image and acquires the processed 3D image.
  • the three-dimensional image processing unit 110 performs, for example, on the three-dimensional image, second image processing that sets at least two different pieces of color information and/or different pieces of opacity information according to differences in the depth information of the voxel information of the three-dimensional image, and acquires a processed three-dimensional image.
  • the three-dimensional image processing unit 110 may also perform second image processing that sets different color information and/or different opacity information for the two or more pieces of voxel information determined by the instruction voxel information determination unit 109, or for the voxel information other than those, and acquire a processed three-dimensional image.
  • the extended image generation unit 111 generates an extended image by superimposing the processed two-dimensional image with depth information and the three-dimensional image. The extended image generation unit 111 may also generate an extended image by superimposing the processed two-dimensional image with depth information and the processed three-dimensional image obtained by performing the second image processing on the three-dimensional image. That is, the extended image generation unit 111 blends an arbitrarily selected and image-processed region of the two-dimensional image (the processed two-dimensional image with depth information) with part or all of the projected image of the three-dimensional image to generate the extended image.
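The superimposition can be sketched as per-pixel alpha blending. Grayscale intensity values and a single global blending parameter are simplifying assumptions; the patent allows per-pixel opacity information instead.

```python
def blend(two_d, projected, alpha=0.5):
    """Generate an extended image by blending the processed 2-D image
    over the projected 3-D image. Both images are dicts keyed by planar
    position (x, y) with scalar gray values; positions the 2-D image
    does not cover keep the projected image's value."""
    out = {}
    for pos, bg in projected.items():
        fg = two_d.get(pos)
        out[pos] = bg if fg is None else alpha * fg + (1 - alpha) * bg
    return out
```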
  • the output unit 112 outputs the extended image generated by the extended image generation unit 111.
  • the output unit 112 may also output the 2D image in the 2D image storage unit 101, the 3D image in the 3D image storage unit 102, the 2D image with depth information in the 2D image storage unit 103 with depth information, and the like.
  • output is a concept that includes display on a display, projection with a projector, printing on a printer, sound output, transmission to an external device, accumulation in a recording medium, and delivery of processing results to another processing apparatus or another program.
  • the output unit 112 may or may not include an output device such as a display or a speaker.
  • the output unit 112 can be realized by output device driver software, or output device driver software and an output device.
  • the depth information acquisition unit 105, the three-dimensional image processing unit 110, the extended image generation unit 111, and the like can usually be realized by an MPU, memory, and the like.
  • the processing procedure of the depth information acquisition unit 105 and the like is usually realized by software, and the software is recorded on a recording medium such as a ROM. However, it may be realized by hardware (dedicated circuit).
  • Step S201 The receiving unit 104 determines whether a two-dimensional image acquisition instruction with depth information has been received. If a two-dimensional image acquisition instruction with depth information is accepted, the process goes to step S202. If a two-dimensional image acquisition instruction with depth information is not accepted, the process goes to step S208.
  • Step S202 The depth information acquisition unit 105 substitutes 1 for the counter i.
  • Step S203 The depth information acquisition unit 105 determines whether or not the i-th point (pixel information) is present in the two-dimensional image of the two-dimensional image storage unit 101. If the i-th point exists, go to step S204, and if not, go to step S207.
  • Step S204 The depth information acquisition unit 105 acquires depth information of the i-th pixel information in the two-dimensional image. Such processing is called depth information acquisition processing. Depth information acquisition processing will be described with reference to the flowchart of FIG.
  • Step S205 The depth information-added two-dimensional image acquisition unit 106 sets the depth information acquired in step S204 as the i-th pixel information in the two-dimensional image.
  • note that the two-dimensional image acquisition unit 106 with depth information does not need to write the depth information into the two-dimensional image in the two-dimensional image storage unit 101; it is preferable to write the depth information into a copy of the two-dimensional image in the two-dimensional image storage unit 101.
  • Step S206 The depth information acquisition unit 105 increments the counter i by one. The process returns to step S203.
  • Step S207 The two-dimensional image storage unit 107 with depth information stores the acquired two-dimensional image with depth information in the two-dimensional image storage unit 103 with depth information. The process returns to step S201.
  • Step S208 The reception unit 104 determines whether an extended image acquisition instruction has been received. If an extended image acquisition instruction is accepted, the process goes to step S209. If an extended image acquisition instruction is not accepted, the process goes to step S211.
  • Step S209 The extended image generation unit 111 and the like perform extended image generation processing.
  • the extended image generation process will be described with reference to the flowchart of FIG.
  • Step S210 The output unit 112 outputs the extended image generated in Step S209. The process returns to step S201.
  • Step S211 The receiving unit 104 determines whether or not a continuous area instruction has been received. If a continuous area instruction is accepted, the process proceeds to step S212. If a continuous area instruction is not accepted, the process returns to step S201.
  • Step S212 The instruction voxel information determination unit 109 performs instruction voxel information determination processing.
  • the instruction voxel information determination process will be described with reference to the flowchart of FIG. The process then goes to step S209.
  • a two-dimensional image with depth information is generated by the processing from step S202 to step S207.
  • the processing ends on power-off or an interrupt that aborts the processing.
  • the depth information acquisition processing in step S204 will be described using the flowchart of FIG.
  • Step S301 The depth information acquisition unit 105 assigns 1 to the counter j.
  • Step S302 The depth information acquisition unit 105 determines whether or not the j-th point (voxel information) at the same position (the same position on the plane) as the i-th point exists in the three-dimensional image of the three-dimensional image storage unit 102. If the j-th point exists, the process goes to step S303; if not, the process goes to step S307. Here, the depth information acquisition unit 105 selects the j-th point from the front.
  • a point in the foreground means a point whose depth information is shallow; depending on the coordinate system, this is a point whose z value is small (the smaller the z value, the nearer the point) or a point whose z value is large (the larger the z value, the nearer the point).
  • Step S303 The depth information acquisition unit 105 acquires the color information of the j-th point.
  • Step S304 The depth information acquisition unit 105 determines whether or not the color information acquired in Step S303 is information indicating “transparent”. If it is “transparent”, the process goes to step S306, and if it is not “transparent”, the process goes to step S305.
  • Step S305 The depth information acquisition unit 105 acquires depth information included in the j-th voxel information. Return to upper process.
  • Step S306 The depth information acquisition unit 105 increments the counter j by 1. The process returns to step S302.
  • Step S307 The depth information acquisition unit 105 substitutes NULL for the depth information. Return to upper process. Note that this step may be omitted.
  • the extended image generation processing in step S209 will be described using the flowchart of FIG.
  • Step S401 The two-dimensional image processing unit with depth information 108 performs processing on the two-dimensional image with depth information. Such two-dimensional image processing with depth information will be described with reference to the flowchart of FIG.
  • Step S402 The three-dimensional image processing unit 110 determines whether or not processing on the three-dimensional image is necessary. Note that the 3D image processing unit 110 determines whether or not processing for a 3D image is necessary, for example, based on whether or not a user inputs a 3D image processing instruction. In addition, the 3D image processing unit 110 determines whether or not processing for a 3D image is necessary, for example, based on whether or not a condition corresponding to the processing of the 3D image is included in an instruction input from the user. You may do it. If processing for a three-dimensional image is necessary, the process goes to step S403, and if processing for a three-dimensional image is not required, the process goes to step S405.
  • Step S403 The three-dimensional image processing unit 110 performs three-dimensional image processing. Three-dimensional image processing will be described with reference to the flowcharts of FIGS.
  • Step S404 The extended image generation unit 111 generates an extended image by superimposing the processed two-dimensional image with depth information and the processed three-dimensional image. Return to upper process.
  • Step S405 The extended image generation unit 111 reads a 3D image from the 3D image storage unit 102.
  • Step S406 The extended image generation unit 111 superimposes the two-dimensional image with post-processing depth information and the three-dimensional image read out in step S405 to generate an extended image. Return to upper process.
  • the two-dimensional image processing with depth information in step S401 will be described with reference to the flowchart of FIG.
  • Step S501 The two-dimensional image processing unit 108 with depth information acquires one or more conditions included in the instruction.
  • This condition is information indicating image processing for a two-dimensional image with depth information.
  • the conditions include, for example, a range of depth information for which color information and/or opacity information is to be set, and the color information and/or opacity information to be set.
  • Step S502 The two-dimensional image processing unit with depth information 108 substitutes 1 for the counter i.
  • Step S503 The two-dimensional image processing unit with depth information 108 determines whether or not the i-th condition exists in the conditions acquired in Step S501. If the i-th condition exists, the process goes to step S504, and if the i-th condition does not exist, the process returns to the upper process.
  • Step S504 The two-dimensional image processing unit with depth information 108 acquires a range of depth information included in the i-th condition and color information or / and opacity information to be set.
  • Step S505 The two-dimensional image processing unit with depth information 108 substitutes 1 for the counter j.
  • Step S506 The two-dimensional image processing unit with depth information 108 determines whether the j-th pixel information exists in the two-dimensional image with depth information in the two-dimensional image storage unit 103 with depth information. If the j-th pixel information exists, the process goes to step S507; if not, the process goes to step S511.
  • Step S507 The two-dimensional image processing unit with depth information 108 acquires depth information included in the jth pixel information.
  • Step S508 The two-dimensional image processing unit with depth information 108 determines whether or not the depth information acquired in Step S507 is within the range of the depth information included in the i-th condition. If it is within the range, the process goes to step S509, and if not, the process goes to step S510.
  • Step S509 The two-dimensional image processing unit with depth information 108 sets color information and / or opacity information included in the i-th condition in the j-th pixel information.
  • Step S510 The two-dimensional image processing unit with depth information 108 increments the counter j by 1. The process returns to step S506.
  • Step S511 The two-dimensional image processing unit with depth information 108 increments the counter i by one. The process returns to step S503.
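The loop of steps S501 to S511 can be sketched as follows. The patent does not specify concrete data structures, so the `Pixel` and `Condition` types, the RGB tuples, and the alpha convention below are illustrative assumptions, not the patent's own representation.

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    x: int
    y: int
    color: tuple           # (r, g, b) — assumed encoding
    alpha: float           # opacity information, 0.0 = transparent
    depth: float           # depth information from the reference point

@dataclass
class Condition:
    depth_min: float       # range of depth information for this condition
    depth_max: float
    color: tuple = None    # color information to set (None = leave unchanged)
    alpha: float = None    # opacity information to set (None = leave unchanged)

def process_depth_image(pixels, conditions):
    """Steps S501-S511: for each condition (counter i), set color and/or
    opacity on every pixel (counter j) whose depth falls inside the range."""
    for cond in conditions:
        for px in pixels:
            if cond.depth_min <= px.depth <= cond.depth_max:
                if cond.color is not None:
                    px.color = cond.color
                if cond.alpha is not None:
                    px.alpha = cond.alpha
    return pixels
```

For example, a condition with range [0, 5] and color red would recolor only the pixels whose depth lies in that range, leaving the rest untouched.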
  • A first example of the three-dimensional image processing in step S403 will be described using the flowchart of FIG. 6. In the flowchart of FIG. 6, the description of the same steps as those in the flowchart of FIG. 5 is omitted.
  • a first example of the three-dimensional image processing is an example in a case where a range of depth information for setting predetermined color information and / or opacity information is fixed.
  • Step S601 The three-dimensional image processing unit 110 determines whether or not the j-th voxel information is present in the three-dimensional image of the three-dimensional image storage unit 102. If the j-th voxel information exists, the process goes to step S602, and if not, the process goes to step S511.
  • Step S602 The three-dimensional image processing unit 110 acquires depth information included in the j-th voxel information.
  • Step S603 The three-dimensional image processing unit 110 sets color information and / or opacity information included in the i-th condition in the j-th voxel information in the three-dimensional image.
  • A second example of the three-dimensional image processing in step S403 will be described using the flowchart of FIG. 7.
  • The second example of the three-dimensional image processing covers the case where the voxel information group for which predetermined color information and/or opacity information is set is determined in advance.
  • Step S701 The three-dimensional image processing unit 110 acquires set color information and / or opacity information.
  • Step S702 The three-dimensional image processing unit 110 substitutes 1 for the counter i.
  • Step S703 The three-dimensional image processing unit 110 determines whether the i-th voxel information for which color information and/or opacity information is to be set exists. If the i-th voxel information exists, the process goes to step S704; if not, the process returns to the upper-level process.
  • Step S704 The three-dimensional image processing unit 110 sets the color information and / or opacity information acquired in step S701 to the i-th voxel information to be set.
  • Step S705 The three-dimensional image processing unit 110 increments the counter i by one. The process returns to step S703.
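Steps S701 to S705 amount to applying one acquired color/opacity setting to a predetermined group of voxels. A minimal sketch, assuming voxels are held in a list of dicts and the target group is given as a list of indices (both representational assumptions, not the patent's structures):

```python
def set_voxel_group(voxels, target_indices, color=None, alpha=None):
    """Steps S701-S705: apply the acquired color information and/or
    opacity information to each voxel in the predetermined target group."""
    for i in target_indices:          # counter i over the target group
        if color is not None:
            voxels[i]["color"] = color
        if alpha is not None:
            voxels[i]["alpha"] = alpha
    return voxels
```

Setting `alpha=0.0` on a group, for instance, is one way to realize the "make these voxels transparent" processing used in the later examples.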
  • The instruction voxel information determination process in step S212 will be described with reference to the flowchart of FIG. 8.
  • Step S801 The instruction voxel information determination unit 109 acquires the coordinate value (usually (x, y)) of the specified point.
  • Step S802 The instruction voxel information determination unit 109 acquires, from the three-dimensional image in the 3D image storage unit 102, all the voxel information having the same plane coordinate value (for example, (x, y)) as the coordinate value acquired in Step S801.
  • Step S803 The instruction voxel information determination unit 109 acquires, from the voxel information acquired in step S802, the frontmost voxel information that is not transparent. This voxel information is referred to as reference voxel information.
  • the foremost voxel information is, for example, voxel information having the smallest z coordinate value.
  • Step S804 The instruction voxel information determination unit 109 determines the reference voxel information acquired in step S803 as instruction voxel information.
  • the determining process is, for example, a process of writing reference voxel information in a buffer that stores instruction voxel information.
  • Step S805 The instruction voxel information determination unit 109 performs a process of acquiring continuous voxel information with reference to the reference voxel information. Such processing is called continuous voxel information acquisition processing.
  • The continuous voxel information acquisition process will be described with reference to the flowchart of FIG. 9. After this process, the process returns to the upper-level process.
  • The continuous voxel information acquisition process in step S805 will be described using the flowchart of FIG. 9. In the flowchart of FIG. 9, the description of the same steps as those in the flowchart of FIG. 8 is omitted.
  • Step S901 The instruction voxel information determination unit 109 acquires color information of the reference voxel information.
  • Step S902 The instruction voxel information determination unit 109 acquires all the voxel information adjacent to the reference voxel information from the 3D image in the 3D image storage unit 102.
  • Step S903 The instruction voxel information determination unit 109 assigns 1 to the counter i.
  • Step S904 The instruction voxel information determination unit 109 determines whether the i-th voxel information exists in the voxel information acquired in step S902. If the i-th voxel information exists, the process goes to step S905; if not, the process returns to the upper-level process.
  • Step S905 The instruction voxel information determination unit 109 acquires color information included in the i-th voxel information.
  • Step S906 The instruction voxel information determination unit 109 compares the color information of the reference voxel information acquired in step S901 with the color information of the i-th voxel information acquired in step S905, and determines whether the difference between the two pieces of color information is within a threshold. If it is within the threshold, the process goes to step S907; if not, the process goes to step S909. Here, when the difference between the two pieces of color information is within the threshold, the two points are assumed to be points within the same object.
  • Step S907 The instruction voxel information determination unit 109 determines the i-th voxel information as instruction voxel information.
  • Step S908 The instruction voxel information determination unit 109 sets the i-th voxel information as reference voxel information. Go to step S805.
  • Step S909 The instruction voxel information determination unit 109 increments the counter i by one. The process returns to step S904.
  • the three-dimensional image I vol acquired by using CT or MRI for the three objects is stored in the three-dimensional image storage unit 102 (see FIG. 11).
  • the reception unit 104 receives a two-dimensional image acquisition instruction with depth information.
  • The depth information acquisition unit 105 performs the following process on each pixel information in the two-dimensional image I cam of the two-dimensional image storage unit 101. That is, for each pixel information of I cam, the depth information acquisition unit 105 selects, from the three-dimensional image I vol, the voxel information having the same plane coordinates (x, y) as the plane coordinates (x, y) included in that pixel information, in order from the voxel closest to the reference point (here, the one with the smallest z value). Then, the depth information acquisition unit 105 determines whether the color information included in the selected voxel information is information indicating "transparent".
  • The depth information acquisition unit 105 searches, for each pixel information, for voxel information whose color information is not information indicating "transparent", and determines the first such voxel information found as the target voxel information.
  • the target voxel information is voxel information closest to the reference point among the voxel information whose color information is not information indicating “transparent”.
  • The depth information acquisition unit 105 acquires, for each pixel information, the depth information of the target voxel information and adds it to that pixel information.
  • the two-dimensional image storage unit 107 with depth information stores the generated two-dimensional image with depth information in the two-dimensional image storage unit 103 with depth information.
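The depth-acquisition pass just described (for each pixel's plane coordinate, find the nearest voxel that is not transparent and take its depth) can be sketched as follows. The dict-based volume layout and the use of an alpha value to encode "transparent" are assumptions for illustration only.

```python
def acquire_depth_map(volume, plane_coords, z_values):
    """For each plane coordinate (x, y), scan voxels in order of
    increasing z (closest to the reference point first) and take the
    depth of the first voxel whose color is not 'transparent'."""
    depth_map = {}
    for (x, y) in plane_coords:
        for z in sorted(z_values):               # nearest voxel first
            voxel = volume.get((x, y, z))
            if voxel is not None and voxel["alpha"] > 0.0:
                depth_map[(x, y)] = voxel["depth"]
                break                            # target voxel found
    return depth_map
```

Adding each resulting depth to the pixel with the same (x, y) yields the two-dimensional image with depth information stored in unit 103.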
  • d is a variable representing depth information.
  • d0 is the depth information having the smallest value among the depth information of the pixel information constituting the cylinder.
  • d1 is the depth information having the largest value among the depth information of the pixel information constituting the cylinder.
  • d2 is the depth information having the smallest value among the depth information of the pixel information constituting the sphere.
  • d3 is the depth information having the largest value among the depth information of the pixel information constituting the sphere.
  • the extended image generation unit 111 reads the 3D image I vol from the 3D image storage unit 102. Then, the extended image generating unit 111 generates an extended image I aug by superimposing the two-dimensional image with post-processing depth information and the read I vol . The generated extended image I aug is shown in FIG. Then, the output unit 112 outputs the extended image I aug .
  • the real image I cam (two-dimensional image) is stored in the two-dimensional image storage unit 101, and the three-dimensional image I vol is stored in the three-dimensional image storage unit 102.
  • a two-dimensional image with depth information in which depth information is added to I cam is stored in the two-dimensional image storage unit 103 with depth information.
  • the reception unit 104 receives a second extended image acquisition instruction.
  • The second extended image acquisition instruction here is an instruction to obtain a processed two-dimensional image with depth information I ′ cam in which pixel information having depth information up to the sphere is made transparent in the two-dimensional image with depth information.
  • “d3” is the depth information having the largest value among the depth information of the pixel information constituting the sphere.
  • the extended image generation unit 111 reads the 3D image I vol from the 3D image storage unit 102.
  • the extended image generation unit 111 superimposes the processed two-dimensional image with depth information and the read I vol to generate an extended image I aug .
  • the generated extended image I aug is shown in FIG.
  • the output unit 112 outputs the extended image I aug .
  • the two-dimensional image I cam is stored in the two-dimensional image storage unit 101, and the three-dimensional image I vol is stored in the three-dimensional image storage unit 102.
  • a two-dimensional image with depth information in which depth information is added to I cam is stored in the two-dimensional image storage unit 103 with depth information.
  • the output unit 112 has output the two-dimensional image I cam in accordance with a user instruction.
  • the user brings the pointer of the input means (for example, mouse) to the location of the sphere in the output I cam and gives an instruction.
  • the reception unit 104 receives a continuous area instruction.
  • the continuous area instruction indicates the entire sphere.
  • the instruction voxel information determination unit 109 acquires the coordinate value (x1, y1) of the specified point.
  • the instruction voxel information determination unit 109 acquires all voxel information having the same plane coordinates (x1, y1) as the acquired coordinate values (x1, y1) from the three-dimensional image I vol .
  • The instruction voxel information determination unit 109 acquires, as reference voxel information, the frontmost voxel information that is not transparent among all the acquired voxel information.
  • the instruction voxel information determination unit 109 acquires color information (col1) of the reference voxel information.
  • the instruction voxel information determination unit 109 acquires all voxel information adjacent to the reference voxel information from I vol .
  • The instruction voxel information determination unit 109 compares the color information (col1) of the reference voxel information with the color information (col2) of each adjacent voxel information, and determines whether the difference between the two pieces of color information is within a threshold.
  • the above processing is recursively repeated using each instruction voxel information as reference voxel information. Then, the instruction voxel information determination unit 109 obtains all the voxel information constituting the sphere.
  • The 3D image processing unit 110 sets the α values of all the voxel information obtained by the instruction voxel information determination unit 109 to 1, and sets the α values of the other voxel information to 0. The 3D image processing unit 110 thus obtains the processed 3D image I ′ vol shown in FIG. 15. In addition, the broken line of I ′ vol in FIG. 15 is not actually visible.
  • the extended image generation unit 111 superimposes the processed 3D image I ′ vol and the 2D image with depth information or the 2D image with processed depth information.
  • the extended image I aug is generated.
  • the two-dimensional image with post-processing depth information may be subjected to any processing.
  • the two-dimensional image storage unit 101 stores a two-dimensional image (FIG. 16) obtained by photographing the human body with an endoscope.
  • a three-dimensional image (FIG. 17) aligned with the two-dimensional image is stored in the three-dimensional image storage unit 102.
  • a three-dimensional image is an image obtained by using MRI for a human.
  • A two-dimensional image with depth information, in which depth information is added to the two-dimensional image (FIG. 16), is stored in the two-dimensional image storage unit 103 with depth information.
  • the reception unit 104 receives a third extended image acquisition instruction.
  • The third extended image acquisition instruction here is an instruction to set the α value of the two-dimensional image with depth information to α1 (constant) and the α value of the three-dimensional image to α2 (constant), paste a projection image of the processed three-dimensional image I ′ vol (a three-dimensional image with α value “α2”) onto the processed two-dimensional image I ′ cam (a two-dimensional image with depth information with α value “α1”), render the result, and obtain an extended image.
  • the two-dimensional image processing unit with depth information 108 sets the ⁇ value of all the pixel information of the two-dimensional image with depth information to ⁇ 1, and obtains a processed two-dimensional image I ′ cam with depth information.
  • the 3D image processing unit 110 sets the ⁇ value of all voxel information of the 3D image to ⁇ 2, and obtains a processed 3D image I ′ vol .
  • the extended image generation unit 111 superimposes I ′ cam and I ′ vol to generate an extended image I aug .
  • the output unit 112 outputs the extended image I aug .
  • An output example of such I aug is shown in FIG.
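The patent does not fix the blending rule used when the α2-weighted projection of I ′ vol is pasted onto the α1-weighted I ′ cam; a standard "over" composite is one plausible per-pixel realization, sketched below under that assumption.

```python
def composite(pixel_rgb, alpha1, voxel_rgb, alpha2):
    """Blend a projected voxel color (opacity alpha2) over a camera
    pixel (opacity alpha1) using the standard 'over' operator — one
    way to realize the superimposition of I'_cam and I'_vol."""
    out_a = alpha2 + alpha1 * (1 - alpha2)
    if out_a == 0:
        return (0, 0, 0), 0.0          # fully transparent result
    out_rgb = tuple(
        (v * alpha2 + p * alpha1 * (1 - alpha2)) / out_a
        for v, p in zip(voxel_rgb, pixel_rgb)
    )
    return out_rgb, out_a
```

With α2 = 1 the voxel projection fully covers the camera pixel; with α2 = 0 the camera pixel shows through unchanged, and intermediate values mix the two, which matches the semi-transparent overlay visible in the example output.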
  • a necessary image can be obtained by effectively using a two-dimensional image and a three-dimensional image of the same object.
  • If the image processing apparatus 1 is used in a surgery support system, for example, it is possible to superimpose information that is not directly visible to the naked eye, such as a planned cutting position and surrounding blood vessels and nerves, on an endoscopic image, enabling effective surgical support.
  • the image processing apparatus 1 includes a two-dimensional image storage unit 101, a three-dimensional image storage unit 102, a two-dimensional image storage unit 103 with depth information, a reception unit 104, and a depth information acquisition unit. 105, depth information-added two-dimensional image acquisition unit 106, depth information-added two-dimensional image storage unit 107, depth information-added two-dimensional image processing unit 108, instruction voxel information determination unit 109, three-dimensional image processing unit 110, extension An image generation unit 111 and an output unit 112 are provided.
  • The image processing apparatus 1 may omit some of the above configurations. For example, as illustrated in FIG., the image processing apparatus 1 may include only the three-dimensional image storage unit 102, the two-dimensional image storage unit 103 with depth information, the reception unit 104, the two-dimensional image processing unit with depth information 108, the extended image generation unit 111, and the output unit 112.
  • the processing in the present embodiment may be realized by software. Then, this software may be distributed by software download or the like. Further, this software may be recorded on a recording medium such as a CD-ROM and distributed.
  • The software that implements the information processing apparatus according to the present embodiment is the following program. That is, this program causes a computer, which can access a storage area storing a three-dimensional image that is image data of one or more three-dimensional objects and is composed of a plurality of voxel information (point information having position information that is information indicating a position, color information that is information about a color, and depth information that is information indicating a depth from a reference point), and a two-dimensional image with depth information that is an image obtained by photographing the one or more three-dimensional objects, is aligned with the three-dimensional image, and is composed of a plurality of pixel information (point information having position information, color information, and depth information), to function as: a two-dimensional image processing unit with depth information that performs first image processing on the two-dimensional image with depth information using the depth information and obtains a processed two-dimensional image with depth information; an extended image generation unit that generates an extended image by superimposing the processed two-dimensional image with depth information and the three-dimensional image or a processed three-dimensional image obtained by performing second image processing on the three-dimensional image; and an output unit that outputs the extended image.
  • It is preferable that the program, where the storage area further stores a two-dimensional image that is an image obtained by photographing the one or more three-dimensional objects, is aligned with the three-dimensional image, and has position information and color information, causes the computer to further function as: a depth information acquisition unit that acquires, for each position information on a plane, the depth information included in the voxel information whose color information is not transparent and whose depth information is closest to the reference point; and a two-dimensional image acquisition unit with depth information that acquires a two-dimensional image with depth information consisting of two or more pieces of pixel information having position information, color information, and depth information, obtained by adding each depth information acquired by the depth information acquisition unit to the pixel information having the same position information on the plane among the plurality of pieces of pixel information included in the two-dimensional image.
  • It is preferable that the program causes the computer to function such that the two-dimensional image processing unit with depth information performs, on the two-dimensional image with depth information, first image processing that sets at least two or more pieces of different color information and/or different opacity information according to differences in the depth information of the pixel information, and obtains a processed two-dimensional image with depth information.
  • It is preferable that the program causes the computer to further function as a three-dimensional image processing unit that performs, on the three-dimensional image, second image processing that sets at least two or more pieces of different color information and/or different opacity information according to differences in the depth information of the voxel information, and obtains a processed three-dimensional image.
  • It is preferable that the program causes the computer to further function as: a reception unit that receives an instruction for one or more points of the output two-dimensional image, two-dimensional image with depth information, or extended image; and an instruction voxel information determination unit that determines two or more pieces of continuous voxel information corresponding to the instruction using the color information included in the voxel information, and that the three-dimensional image processing unit performs second image processing that sets different color information and/or different opacity information for the two or more pieces of voxel information determined by the instruction voxel information determination unit, or for voxel information other than those, and obtains a processed three-dimensional image.
  • FIG. 20 shows the external appearance of a computer that executes the program described in this specification to realize the image processing apparatus or the like according to the above-described embodiment.
  • the above-described embodiments can be realized by computer hardware and a computer program executed thereon.
  • FIG. 20 is an overview of the computer system 340.
  • FIG. 21 is a block diagram of the computer system 340.
  • the computer system 340 includes a computer 341 including an FD drive and a CD-ROM drive, a keyboard 342, a mouse 343, and a monitor 344.
  • The computer 341 includes an MPU 3413, a bus 3414 connected to the CD-ROM drive 3412 and the FD drive 3411, a ROM 3415 that stores a program such as a boot-up program, a RAM 3416 that is connected to the MPU 3413, temporarily stores instructions of an application program, and provides a temporary storage space, and a hard disk 3417 that stores application programs, system programs, and data.
  • the computer 341 may further include a network card that provides connection to the LAN.
  • A program for causing the computer system 340 to execute the functions of the image processing apparatus or the like of the above-described embodiment may be stored in the CD-ROM 3501 or FD 3502, inserted into the CD-ROM drive 3412 or FD drive 3411, and further transferred to the hard disk 3417.
  • the program may be transmitted to the computer 341 via a network (not shown) and stored in the hard disk 3417.
  • the program is loaded into the RAM 3416 at the time of execution.
  • the program may be loaded directly from the CD-ROM 3501, the FD 3502, or the network.
  • The program need not include an operating system (OS) or a third-party program for causing the computer 341 to execute the functions of the image processing apparatus according to the above-described embodiment.
  • the program only needs to include an instruction portion that calls an appropriate function (module) in a controlled manner and obtains a desired result. How the computer system 340 operates is well known and will not be described in detail.
  • the computer that executes the program may be singular or plural. That is, centralized processing may be performed, or distributed processing may be performed.
  • Each process may be realized by centralized processing by a single device (system), or by distributed processing by a plurality of devices.
  • As described above, the image processing apparatus according to the present invention has an effect that a necessary image can be obtained by effectively using a two-dimensional image and a three-dimensional image of the same object, and is useful as, for example, a surgery support system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

[Problem] To make it possible, unlike conventional means, to efficiently obtain a required image by using a two-dimensional image and a three-dimensional image of the same object. [Solution] A required image can be obtained with an image processing device comprising: a two-dimensional image processing unit with depth information that stores a three-dimensional image, which represents image data of one or more three-dimensional objects and is composed of a plurality of voxel information including position information, color information, and depth information, and a two-dimensional image with depth information, which is a photographed image of one or more three-dimensional objects, is positionally aligned with the three-dimensional image, and is composed of a plurality of pixel information including position information, color information, and depth information, said two-dimensional image processing unit performing first image processing on the two-dimensional image with depth information using the depth information and acquiring a processed two-dimensional image with depth information; an extended image generation unit that generates an extended image in which the processed two-dimensional image with depth information and the three-dimensional image are superimposed; and an output unit that outputs the extended image.
PCT/JP2012/071213 2011-08-24 2012-08-22 Image processing device, image processing method, and recording medium WO2013027773A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-182615 2011-08-24
JP2011182615A JP5808004B2 (ja) Image processing device, image processing method, and program

Publications (1)

Publication Number Publication Date
WO2013027773A1 true WO2013027773A1 (fr) 2013-02-28

Family

ID=47746511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/071213 WO2013027773A1 (fr) 2011-08-24 2012-08-22 Image processing device, image processing method, and recording medium

Country Status (2)

Country Link
JP (1) JP5808004B2 (fr)
WO (1) WO2013027773A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9747680B2 (en) 2013-11-27 2017-08-29 Industrial Technology Research Institute Inspection apparatus, method, and computer program product for machine vision inspection
CN107967715A (zh) * 2016-10-19 2018-04-27 富士施乐株式会社 数据处理装置、三维物体创建系统以及数据处理方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6698824B2 (ja) * 2016-04-11 2020-05-27 富士フイルム株式会社 画像表示制御装置および方法並びにプログラム
JP6461257B2 (ja) * 2017-07-24 2019-01-30 キヤノン株式会社 画像処理装置およびその方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09253038A (ja) * 1996-03-22 1997-09-30 Toshiba Corp 医用画像診断装置
JP2000051202A (ja) * 1998-08-14 2000-02-22 Ge Yokogawa Medical Systems Ltd 画像表示方法および画像表示装置
JP2002238887A (ja) * 2001-02-19 2002-08-27 National Cancer Center-Japan 仮想内視鏡
JP2002263053A (ja) * 2001-03-06 2002-09-17 Olympus Optical Co Ltd 医用画像表示装置および医用画像表示方法
JP2006061274A (ja) * 2004-08-25 2006-03-09 Konica Minolta Medical & Graphic Inc プログラム、及び内視鏡システム
JP2006320427A (ja) * 2005-05-17 2006-11-30 Hitachi Medical Corp 内視鏡手術支援システム
JP2008220802A (ja) * 2007-03-15 2008-09-25 Hitachi Medical Corp 医用画像診断装置


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9747680B2 (en) 2013-11-27 2017-08-29 Industrial Technology Research Institute Inspection apparatus, method, and computer program product for machine vision inspection
CN107967715A (zh) * 2016-10-19 2018-04-27 富士施乐株式会社 数据处理装置、三维物体创建系统以及数据处理方法
CN107967715B (zh) * 2016-10-19 2023-11-07 富士胶片商业创新有限公司 数据处理装置、三维物体创建系统以及数据处理方法

Also Published As

Publication number Publication date
JP2013045284A (ja) 2013-03-04
JP5808004B2 (ja) 2015-11-10

Similar Documents

Publication Publication Date Title
US20160232703A1 (en) System and method for image processing
KR101724360B1 (ko) 혼합현실 디스플레이 장치
JP4588736B2 (ja) 画像処理方法および装置並びにプログラム
CN106898027B (zh) 用于三维图像到二维图像的标测的方法和设备
JP2006055213A (ja) 画像処理装置、及びプログラム
JP5492024B2 (ja) 領域分割結果修正装置、方法、及びプログラム
JP6215057B2 (ja) 可視化装置、可視化プログラムおよび可視化方法
JP5793243B2 (ja) 画像処理方法および画像処理装置
JP2013531322A (ja) 多重画像の融合
JP5808004B2 (ja) 画像処理装置、画像処理方法、およびプログラム
US12073508B2 (en) System and method for image processing
CN107633478B (zh) 图像处理装置、图像处理方法以及计算机可读介质
US20110172534A1 (en) Providing at least one slice image based on at least three points in an ultrasound system
JP6739897B2 (ja) レンダリングを行う方法及びシステム
WO2009122724A1 (fr) Dispositif, procédé et programme de traitement d'image
JP2006000127A (ja) 画像処理方法および装置並びにプログラム
JP5595207B2 (ja) 医用画像表示装置
US11000252B2 (en) Device for visualizing a 3D object
EP4258221A2 (fr) Appareil de traitement d'image, procédé de traitement d'image et programme
JP2008067915A (ja) 医用画像表示装置
JP2009247502A (ja) 中間画像生成方法および装置ならびにプログラム
KR100466409B1 (ko) 가상 내시경 시스템, 가상 내시경 디스플레이 방법과 그 방법을 컴퓨터 상에서 수행하는 프로그램을 저장한 컴퓨터가 판독 가능한 기록 매체
JP5065740B2 (ja) 画像処理方法および装置並びにプログラム
US9218104B2 (en) Image processing device, image processing method, and computer program product
WO2010113690A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12826069

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12826069

Country of ref document: EP

Kind code of ref document: A1