US20050068317A1 - Program, method, and device for comparing three-dimensional images in voxel form - Google Patents


Info

Publication number
US20050068317A1
Authority
US
United States
Prior art keywords
voxel
image
images
comparison
differential image
Prior art date
Legal status
Abandoned
Application number
US10/989,464
Other languages
English (en)
Inventor
Makoto Amakai
Current Assignee
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Assigned to FUJITSU LIMITED (assignor: AMAKAI, MAKOTO)
Publication of US20050068317A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T7/0004: Industrial image inspection
    • G06T7/001: Industrial image inspection using an image reference approach
    • G06T7/0012: Biomedical image inspection
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30004: Biomedical image processing
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30164: Workpiece; Machine component
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/36: Level of detail

Definitions

  • the present invention relates to a program, method, and device for comparing three-dimensional (3D) images and displaying differences therebetween. More particularly, the present invention relates to a 3D image comparison program, method, and device that use voxel-based techniques to compare given 3D images at a high accuracy.
  • Three-dimensional imaging techniques are used in many fields today to represent an object on a computer screen.
  • Such 3D images include, for example, design images created with 3D computer aided design (CAD) tools prevalent in various manufacturing industries.
  • solid images of an affected part can be captured with a 3D ultrasonic diagnostic system.
  • an advantage of 3D solid images in the engineering field is that they make it easy for designers to review their product design with a prototype, as well as for product inspectors to check manufactured products against the intended design image.
  • solid images captured with a 3D ultrasonic diagnostic system also aid doctors in visually identifying an illness or deformation at an affected part of the patient's body.
  • Images to be compared are: a CAD design model composed of 3D free-form surfaces and a set of point data of a product or component manufactured using that model, the latter being obtained by scanning an object with a 3D geometry measurement device. Also compared is a cross section or surface that is formed from such measured point data.
  • FIG. 26 shows an example of a 3D image of an object (CAD data of 3D curved surface) and one element of a 3D point data set (measurement data).
  • z = y³ + 3x²y  (1)
  • for a given point (x0, y0, z0), its distance to the surface is defined to be the minimum value of {(x − x0)² + (y − y0)² + (z − z0)²}^(1/2).
  • This calculation is repeated for every piece of surface in the neighborhood of point (x0, y0, z0), and the smallest of the resulting values is taken as the point-to-image distance.
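The minimum-distance computation above can be sketched numerically. The following Python snippet (an illustrative sketch, not the patent's implementation) samples the surface of equation (1) on a grid and takes the minimum Euclidean distance to a query point; the helper name `surface_z` and the grid spacing are assumptions for the example.

```python
import math

def surface_z(x, y):
    # Example surface from equation (1): z = y^3 + 3*x^2*y
    return y**3 + 3 * x**2 * y

def point_to_surface_distance(p, xs, ys):
    """Approximate the point-to-image distance by sampling the surface
    on a grid and taking the minimum Euclidean distance."""
    x0, y0, z0 = p
    best = float("inf")
    for x in xs:
        for y in ys:
            z = surface_z(x, y)
            d = math.sqrt((x - x0)**2 + (y - y0)**2 + (z - z0)**2)
            best = min(best, d)
    return best

# Sample the neighborhood of the point on a coarse grid from -1.0 to 1.0
grid = [i / 10 for i in range(-10, 11)]
d = point_to_surface_distance((0.0, 0.0, 0.5), grid, grid)
```

Here the surface passes through the origin, so the sampled minimum distance from (0, 0, 0.5) is 0.5; a finer grid tightens the approximation, which mirrors why the patent repeats the calculation "extensively" over the neighborhood.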
  • Illustrated in FIG. 27 are first and second surface data Im 101 and Im 102, the former being a design model and the latter being a model built from photographs or the like.
  • the conventional method is unable to compare those two sets of surface data directly. Therefore, according to the conventional method, the surface given by the second surface data Im 102 is first converted into a set of discrete points, and this point data set PIm 102 is then compared with the first surface data Im 101 .
  • this 3D image comparison program compares first and second 3D images and displays their differences as follows.
  • the computer first produces a 3D differential image by converting given first and second 3D images into voxel form and making a comparison between the two voxel-converted images.
  • the computer produces a 3D fine differential image containing fine-voxel images that provide details of the first and second 3D images at a higher resolution than that of the 3D differential image.
  • the computer determines a representation scheme from differences between the first and second 3D images in their respective fine-voxel images.
  • the computer displays the 3D differential image including the surface voxel drawn in the determined representation scheme.
  • this comparison program compares first and second 3D images and displays their differences as follows.
  • the computer produces a 3D differential image by converting given first and second 3D images into voxel form and making a comparison between the two voxel-converted images.
  • the computer extracts a dissimilar part and counts voxel mismatches perpendicularly to each surface of a reference voxel selected from an outermost layer of matched voxels that is revealed. Based on that count, the computer determines a representation scheme for each surface of the reference voxel, and displays each surface of the reference voxel in the corresponding representation scheme.
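The perpendicular mismatch counting described above can be illustrated with a short Python sketch (hypothetical helper names; the mismatch grid is a nested boolean list indexed [x][y][z]). From a reference voxel on the outermost matched layer, we walk outward along one axis and count consecutive mismatched voxels, which gives the thickness of the dissimilar part at that face.

```python
def count_mismatches_from_face(mismatch, start, direction):
    """Count consecutive mismatched voxels walking outward from one face
    of a reference voxel; direction is a unit step along x, y, or z."""
    nx, ny, nz = len(mismatch), len(mismatch[0]), len(mismatch[0][0])
    x, y, z = start
    dx, dy, dz = direction
    count = 0
    while True:
        x, y, z = x + dx, y + dy, z + dz
        if not (0 <= x < nx and 0 <= y < ny and 0 <= z < nz):
            break          # left the voxel space
        if not mismatch[x][y][z]:
            break          # reached matched voxels again
        count += 1
    return count

# 4x4x4 grid with a 2-voxel-thick mismatch layer in the +x direction
grid = [[[False] * 4 for _ in range(4)] for _ in range(4)]
grid[2][1][1] = True
grid[3][1][1] = True
thickness = count_mismatches_from_face(grid, (1, 1, 1), (1, 0, 0))
```

The count (here 2) would then select the representation scheme for that face of the reference voxel, so thicker dissimilar parts get a visibly different color.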
  • this 3D image comparison program compares first and second 3D images and displays their differences as follows.
  • the computer first produces a 3D differential image by converting given first and second 3D images into voxel form and making a comparison between the two voxel-converted images.
  • the computer calculates a ratio of the number of surface voxels of a dissimilar part of the 3D differential image to the total number of voxels constituting the dissimilar part, and based on that ratio, it determines a representation scheme for visualizing voxel mismatches on the 3D differential image.
  • the computer displays the dissimilar part of the 3D differential image in the representation scheme determined from the ratio.
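The surface-to-total voxel ratio for a dissimilar part can be computed as in the following sketch (an assumption-laden illustration, not the patent's code): the part is a set of voxel coordinates, and a voxel counts as a surface voxel when at least one of its six face neighbors lies outside the part.

```python
def surface_voxel_ratio(part):
    """Return (#surface voxels) / (#all voxels) for a dissimilar part
    given as a set of integer voxel coordinates."""
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    surface = 0
    for (x, y, z) in part:
        # a surface voxel has at least one face neighbor outside the part
        if any((x + dx, y + dy, z + dz) not in part
               for dx, dy, dz in neighbors):
            surface += 1
    return surface / len(part)

# A 3x3x3 cube: 26 of its 27 voxels lie on the surface
cube = {(x, y, z) for x in range(3) for y in range(3) for z in range(3)}
ratio = surface_voxel_ratio(cube)
```

A ratio near 1 indicates a thin, shell-like dissimilar part, while a lower ratio indicates a bulky one, which is the kind of distinction the representation scheme can encode.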
  • FIG. 1 is a conceptual view of a first embodiment of the invention.
  • FIG. 2 is a conceptual view of a voxel-based 3D image comparison method according to the first embodiment.
  • FIG. 3 shows an example of a computer hardware structure that is suitable for execution of a 3D image comparison program.
  • FIG. 4 is a block diagram showing a functional structure of a 3D image comparison device.
  • FIG. 5 gives an overview of a comparison process of the 3D image comparison program.
  • FIG. 6 shows fine-voxel comparison of two 3D images, which is made after the initial comparison is completed.
  • FIG. 7 shows a color for common voxels, which is selected in a process of setting colors according to voxel mismatch count.
  • FIG. 8 shows a color for a voxel mismatch belonging to the reference image, which is selected in the process of setting colors according to voxel mismatch count.
  • FIG. 9 shows a color for a voxel mismatch belonging to the subject image, which is selected in the process of setting colors according to voxel mismatch count.
  • FIG. 10 shows the relationship between voxel element numbers and articulation point numbers.
  • FIG. 11 shows a typical data structure representing articulation points of a 3D image.
  • FIG. 12 shows a data structure of voxel representation using articulation points.
  • FIG. 13 is a flowchart showing the entire process flow of 3D image comparison.
  • FIG. 14 is a flowchart of a comparison data entry process as part of the 3D image comparison process of FIG. 13 .
  • FIG. 15 is a flowchart of voxel processing as part of the 3D image comparison process of FIG. 13 .
  • FIG. 16 is a flowchart of an image positioning process as part of the 3D image comparison process of FIG. 13 .
  • FIG. 17 is a flowchart of a first image comparison process as part of the 3D image comparison process of FIG. 13 .
  • FIG. 18 gives an overview of a comparison process performed with a 3D image comparison program according to a second embodiment.
  • FIG. 19 is a flowchart of a second image comparison process as part of the 3D image comparison process of FIG. 13 .
  • FIG. 20 shows a first example of a difference that is evaluated by a 3D image comparison program according to a third embodiment of the invention.
  • FIG. 21 shows a light color determined by the 3D image comparison program of the third embodiment.
  • FIG. 22 shows a second example of a difference that is evaluated by the 3D image comparison program of the third embodiment.
  • FIG. 23 shows a deep color determined by the 3D image comparison program of the third embodiment.
  • FIG. 24 shows a third example of a difference that is evaluated by the 3D image comparison program of the third embodiment.
  • FIG. 25 is a flowchart of a third image comparison process as part of the 3D image comparison process of FIG. 13 .
  • FIG. 26 shows a 3D image of an object (CAD data of 3D curved surface) and one element of a 3D point data set (measurement data).
  • FIG. 27 is a conceptual view of a conventional 3D image comparison method.
  • the present invention is directed to geometrical comparison of 3D images such as:
  • voxel mismatches are counted in the perpendicular direction (x, y, or z direction) from each surface of a voxel that is selected from among those on the outermost layer of matched voxels.
  • the resulting count values are used to determine representation schemes for the corresponding surfaces of that voxel (called a reference voxel). Comparison results are thus visualized as different surface colors of a reference voxel by color-coding the thickness of the corresponding dissimilar part.
  • a first embodiment of the invention provides a 3D image comparison program, method, and device that visualize differences between 3D images in the first way (1) mentioned above.
  • a second embodiment of the invention provides a 3D image comparison program, method, and device that visualize differences between 3D images in the second way (2).
  • a third embodiment of the invention provides a 3D image comparison program, method, and device that visualize differences between 3D images in the third way (3).
  • This section explains a first embodiment of the invention, in which 3D images are compared on an individual voxel basis and their differences are visualized in a particular representation scheme.
  • FIG. 1 is a conceptual view of the first embodiment.
  • a 3D image comparison program according to the first embodiment causes a computer to compare 3D images and visualize their differences according to the steps described below. Briefly, the computer compares a first 3D image B 1 representing a design model in voxel form with a second 3D image B 2 representing measurement results in voxel form and displays their differences in a particular representation scheme.
  • the computer converts given first and second 3D images B 1 and B 2 into voxel form.
  • the computer compares the two 3D images B 1 and B 2 in the voxel domain, thereby producing a 3D differential image VB 1 that represents their differences and similarities distinguishably.
  • the computer produces a 3D fine differential image VB 2 of the 3D images B 1 and B 2 .
  • This 3D fine differential image VB 2 contains fine-voxel images that provide local details of the original 3D images B 1 and B 2 in that surface voxel, at a higher resolution than that of the 3D differential image.
  • step S 3 the computer determines a representation scheme from differences between the first and second 3D images in their respective fine-voxel images.
  • step S 4 the computer outputs voxel comparison results on a display screen or the like by drawing the 3D differential image VB 1 , where the surface voxel is depicted in the determined representation scheme.
  • the voxel comparison results may be output, not only on a display screen, but to a printer, plotter, or other devices. In this way, the given 3D images can be compared quickly and accurately.
  • RGB (red, green, and blue)
  • CMY (subtractive mixture of cyan, magenta, and yellow)
  • This subtractive color mixture may include black as an additional element.
  • the data compared are first surface data Im 11 representing a design model and second surface data Im 12 obtained through measurement (e.g., by scanning an object). Since those two sets of surface data cannot be compared directly, the 3D image comparison method of the first embodiment begins by converting the volume defined by the first surface data Im 11 into a chunk of voxels B 11 , and that of the second surface data Im 12 into another chunk of voxels B 12 . The method then compares those two chunks of voxels B 11 and B 12 , thereby overcoming the difficulty of comparing two 3D images.
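Converting a closed surface into a "chunk of voxels" can be sketched as testing each voxel center against an inside/outside predicate. The snippet below is a minimal Python illustration under that assumption (the predicate, grid size, and voxel size are all hypothetical choices); a real device would derive the predicate from the CAD surface or measured point set.

```python
def voxelize(inside, n, size):
    """Mark a voxel ON when its center lies inside the volume enclosed
    by the surface; `inside` is a point-in-volume predicate."""
    on = set()
    for i in range(n):
        for j in range(n):
            for k in range(n):
                # test the voxel center against the enclosing surface
                cx = (i + 0.5) * size
                cy = (j + 0.5) * size
                cz = (k + 0.5) * size
                if inside(cx, cy, cz):
                    on.add((i, j, k))
    return on

def sphere(x, y, z):
    # example volume: a unit-radius sphere centered in a 2x2x2 box
    return (x - 1)**2 + (y - 1)**2 + (z - 1)**2 <= 1.0

voxels = voxelize(sphere, 8, 0.25)
```

With 0.25-unit voxels the ON-voxel count approximates the sphere's volume divided by the voxel volume, and shrinking the voxel size (the measurement resolution parameter) tightens the approximation.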
  • FIG. 3 shows an example of a computer hardware structure that is suitable for execution of a 3D image comparison program.
  • the illustrated computer 100 has a central processing unit (CPU) 101 to control the entire system, interacting with the following components via a bus 107 : a random access memory (RAM) 102 , a hard disk drive (HDD) 103 , a graphics processor 104 , an input device interface 105 , and a communication interface 106 .
  • the RAM 102 temporarily stores the whole or part of operating system (OS) programs and application programs that the CPU 101 executes, as well as other various data objects manipulated by the CPU 101 at runtime.
  • the HDD 103 stores program and data files of the operating system and various applications including a 3D image comparison program.
  • the graphics processor 104 is coupled to an external monitor unit P 111 .
  • the graphics processor 104 produces video images in accordance with drawing commands from the CPU 101 and displays them on the screen of the monitor unit P 111 .
  • the input device interface 105 is coupled to a keyboard P 112 and a mouse P 113 , so that input signals from the keyboard P 112 and mouse P 113 will be supplied to the CPU 101 via the bus 107 .
  • the communication interface 106 is linked to a network 110 , permitting the CPU 101 to exchange data with other computers.
  • the computer 100 with the above-described hardware configuration will function as a 3D image comparison device when a 3D image comparison program is running thereon.
  • the following will now explain what processing functions the computer, as a 3D image comparison device, is supposed to provide.
  • FIG. 4 is a block diagram showing a functional structure of a 3D image comparison device according to the invention. This structure applies to all of the first to third embodiments.
  • the illustrated 3D image comparison device has the following elements: a comparison data entry unit 10 for entering comparison data; a voxel processor 20 for producing voxels from given 3D images; an image positioning processor 30 for positioning voxel-converted 3D images; a voxel difference evaluator 40 for evaluating and displaying voxel mismatches; and a voxel overlaying processor 50 for overlaying those voxels for display.
  • the comparison data entry unit 10 has, among others, the following functional elements: a measurement resolution entry unit 11 , a measurement resolution memory 12 , a 3D surface entry unit 13 , an image memory 14 , a point set process switch 15 , a surface point set entry unit 16 , a surface point set memory 17 , a volume point set entry unit 18 , and a volume point set memory 19 .
  • the voxel processor 20 has, among others, a first voxel generator 21 , a second voxel generator 22 , and a voxel memory 23 .
  • the comparison data entry unit 10 manages a process of accepting entry of source image data, such as CAD data and measurement data, for 3D image comparison.
  • the comparison data entry unit 10 identifies the image type of each specified 3D image.
  • the comparison data entry unit 10 also checks whether 3D images have been entered to relevant storage spaces, i.e., the image memory 14 , surface point set memory 17 , or volume point set memory 19 , as specified by the user.
  • Source image data for comparison are supplied from a 3D CAD image file DB 11 and a 3D measured image file DB 12 .
  • the 3D CAD image file DB 11 stores CAD data of a 3D surface model.
  • the 3D measured image file DB 12 stores measurement results including surface measurement data and section measurement data.
  • the measurement resolution entry unit 11 receives a measurement resolution parameter from the user.
  • This measurement resolution parameter gives a minimum size of voxels into which 3D images are to be divided.
  • the user can specify a desired size for this purpose, and preferably, the resolution is as fine as the precision required in manufacturing a product of interest, which is, for example, 0.01 mm.
  • the measurement resolution memory 12, coupled to the measurement resolution entry unit 11, stores the measurement resolution parameter that is received.
  • the 3D surface entry unit 13 is activated when the image type of a user-specified 3D image turns out to be “3D surface model.”
  • the 3D surface entry unit 13 receives 3D surface data from the 3D CAD image file DB 11 .
  • the image memory 14, coupled to the 3D surface entry unit 13, stores this received 3D surface data in an internal storage space.
  • the point set process switch 15 is coupled to a surface point set entry unit 16 and a volume point set entry unit 18 to select either of them to handle a given point set data.
  • the point set process switch 15 activates either the surface point set entry unit 16 or volume point set entry unit 18 .
  • the surface point set entry unit 16 handles entry of surface measurement data.
  • the surface point set entry unit 16 is activated to receive surface measurement data from the 3D measured image file DB 12 .
  • the surface point set memory 17 stores the received surface measurement data in its internal storage space.
  • the volume point set entry unit 18 handles entry of section measurement data.
  • the volume point set entry unit 18 is activated to receive section measurement data from a 3D measured image file DB 12 .
  • the volume point set memory 19 stores the received section measurement data in its internal storage space.
  • the voxel processor 20 is placed between the comparison data entry unit 10 and image positioning processor 30 .
  • the voxel processor 20 produces an image in voxel form from a 3D image supplied from the comparison data entry unit 10 .
  • the voxel processor 20 selects comparison data recorded in the comparison data entry unit 10 .
  • the voxel processor 20 determines the voxel size from the measurement resolution parameter recorded in the comparison data entry unit 10 . Then, with reference to the image type of each 3D image stored in the comparison data entry unit 10 , the voxel processor 20 determines what to do at the next step.
  • the voxel processor 20 retrieves relevant 3D surface data out of the image memory 14 and supplies it to a first voxel generator 21 .
  • the first voxel generator 21 creates a voxel-converted image from given 3D surface data.
  • the voxel processor 20 retrieves relevant surface measurement data out of the surface point set memory 17 and supplies it to a second voxel generator 22. Further, if the image type indicates that the image in question is section measurement data, the voxel processor 20 retrieves relevant section measurement data out of the volume point set memory 19 and supplies it to the second voxel generator 22. The second voxel generator 22 creates a voxel-converted image from given surface measurement data or given section measurement data. The voxel processor 20 checks whether the two voxel generators 21 and 22 have successfully produced voxel-converted images.
  • the image positioning processor 30 is placed between the voxel processor 20 and voxel difference evaluator 40 to align the two voxel-converted images supplied from the voxel processor 20. More specifically, the two 3D images (i.e., one is CAD data representing a 3D surface, and the other is surface measurement data or section measurement data obtained through measurement) have a set of fiducial points that are previously specified in each of them. The image positioning processor 30 selects those fiducial points and calculates the offset of each image from the selected fiducial points. The image positioning processor 30 moves the voxel-converted 3D images, based on the calculated offsets, and then it stores the moved images in its internal storage space.
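The fiducial-point alignment just described can be sketched as follows (a simplified Python illustration with hypothetical helper names): the translation is estimated as the mean displacement between paired fiducial points, rounded to whole voxels, and then applied to the ON-voxel set.

```python
def align_offset(ref_points, subj_points):
    """Estimate the translation mapping the subject image onto the
    reference image from paired fiducial points (mean displacement),
    rounded to integer voxel offsets."""
    n = len(ref_points)
    off = [0.0, 0.0, 0.0]
    for r, s in zip(ref_points, subj_points):
        for axis in range(3):
            off[axis] += (r[axis] - s[axis]) / n
    return tuple(round(o) for o in off)

def shift_voxels(voxels, off):
    """Translate a set of ON-voxel coordinates by the offset."""
    dx, dy, dz = off
    return {(x + dx, y + dy, z + dz) for (x, y, z) in voxels}

# two fiducial points per image, displaced by (3, 1, 0)
off = align_offset([(5, 5, 5), (9, 5, 5)], [(2, 4, 5), (6, 4, 5)])
moved = shift_voxels({(2, 4, 5)}, off)
```

Translation-only alignment is an assumption of this sketch; rotation between the images would need a fuller rigid-body fit, which the patent leaves to the positioning processor.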
  • the voxel difference evaluator 40 is placed between the image positioning processor 30 and voxel overlaying processor 50 .
  • the voxel difference evaluator 40 compares and evaluates two images properly positioned by the image positioning processor 30 and displays the results by visualizing differences between the two images in a particular representation scheme, e.g., by using RGB colors with particular depths.
  • the voxel difference evaluator 40 selects a surface voxel of a 3D differential image and applies an additional voxel conversion to that part at a specified test resolution. That is, the selected surface voxel is subdivided into smaller voxels. The size of those “fine voxels” is determined according to a test resolution, which may be specified previously by the user. This test resolution may initially be set to several times as coarse as the measurement resolution, and later be varied stepwise to evaluate images at a finer resolution.
  • the voxel difference evaluator 40 makes a comparison between the reconverted 3D surface and surface/section measurement data in the fine voxel domain and counts their voxel mismatches. The details of this comparison process will be discussed later.
  • the voxel difference evaluator 40 determines a color depth as a representation scheme to be used, so that the voxel of interest will be colored.
  • the voxel difference evaluator 40 determines whether all surface voxels are colored in this way, meaning that it checks whether the comparison of voxel-converted 3D images is completed.
  • the voxel overlaying processor 50, coupled to the voxel difference evaluator 40, aligns and combines a plurality of resulting voxel images processed by the voxel difference evaluator 40.
  • the first embodiment reduces computation time by comparing 3D shapes in the voxel domain.
  • consider a CAD-generated image of an object having 3D curved surfaces. The image space is divided into uniform, minute cubes, or voxels, and a 3D image of that object is created by setting every voxel belonging to the object to the ON state and every other, outer voxel to the OFF state.
  • when the source image data is given as an output of an X-ray CT scanner, voxels are set to the ON state if they have high X-ray densities, while the others are set to the OFF state.
  • Boolean operations include an exclusive-OR (XOR) operation between voxels of 3D images B 21 and B 22 , which extracts voxel mismatches.
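In the voxel domain this XOR extraction reduces to set algebra. The following minimal Python sketch (representing each voxel image as a set of ON-voxel coordinates, an assumption for illustration) extracts the mismatches and also splits them by the image they belong to, matching the two-color scheme described later.

```python
def xor_mismatch(a, b):
    """Extract voxel mismatches between two voxel images given as sets of
    ON-voxel coordinates: XOR keeps voxels set in exactly one image."""
    return a ^ b

img1 = {(0, 0, 0), (1, 0, 0), (2, 0, 0)}   # reference image
img2 = {(1, 0, 0), (2, 0, 0), (3, 0, 0)}   # subject image

diff = xor_mismatch(img1, img2)
only_in_1 = img1 - img2   # mismatch belonging to the reference image
only_in_2 = img2 - img1   # mismatch belonging to the subject image
```

Set operations on coordinates keep the comparison independent of grid size, though a dense 3D bit array with bitwise XOR would serve equally well for large images.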
  • the initial voxel resolution of the 3D images is relatively coarse compared to the test resolution.
  • the comparison process then produces fine voxels for each coarse voxel in order to find and evaluate differences between the images in terms of the number of voxel mismatches.
  • This difference information in numerical form is then put on one of the two images in visual form, so that the user can recognize it visually. The following will describe the mechanism for this feature, with reference to FIG. 6 .
  • FIG. 6 shows a fine-voxel comparison of two 3D images, which is made after an initial coarse-voxel comparison is finished.
  • the process assumes that a 3D differential image VB 21 has been produced in coarse voxel form as shown in FIG. 6 .
  • Voxels on the surface of this 3D differential image VB 21 are subdivided into fine voxels (3D fine differential image VB 22 ), so that the difference between two images can be evaluated in terms of the number of mismatches in fine voxels observed therein.
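The subdivision of a surface voxel into fine voxels, and the counting of mismatches inside it, can be sketched like this (hypothetical helper names; integer fine-voxel coordinates are assumed to be the coarse coordinates scaled by the subdivision factor):

```python
def subdivide(coarse, factor):
    """Return the fine-voxel coordinates covered by one coarse voxel when
    each edge is split into `factor` fine voxels (factor**3 cells total)."""
    x, y, z = coarse
    return {(x * factor + i, y * factor + j, z * factor + k)
            for i in range(factor)
            for j in range(factor)
            for k in range(factor)}

def mismatch_count(coarse, factor, only_ref, only_subj):
    """Count fine-voxel mismatches falling inside one coarse voxel,
    split by the image each mismatch belongs to."""
    cells = subdivide(coarse, factor)
    return len(cells & only_ref), len(cells & only_subj)

# three subject-only mismatches inside coarse voxel (0,0,0);
# (4,0,0) lies in the next coarse voxel and is not counted
only_subj = {(0, 0, 0), (1, 0, 0), (2, 0, 0), (4, 0, 0)}
counts = mismatch_count((0, 0, 0), 4, set(), only_subj)
```

With a factor of 4 the per-coarse-voxel count is bounded by 64 (4 × 4 × 4), which is exactly the maximum mismatch magnitude the text derives for FIG. 6.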
  • the 3D fine differential image VB 22 of one coarse voxel contains eight fine voxels that are different between two source images 121 and 122 .
  • voxels Bx 11 , Bx 12 , and Bx 13 are given particular colors (e.g., several blues with different depths), indicating that the voxel mismatches belong to the second image 122 , but not to the first image 121 .
  • Another voxel Bx 16 is given another particular color (e.g., red with a certain depth), indicating that this mismatch belongs to the first image 121 , but not to the second image 122 .
  • the remaining voxels Bx 14 and Bx 15 have no colors assigned, meaning that no mismatches are found at that portion.
  • Voxels are subdivided at a ratio of four fine voxels per edge of one coarse voxel in the example of FIG. 6. This means that the magnitude of a three-dimensional mismatch in a coarse voxel can take a value of 64 (4×4×4) at maximum. Referring next to FIGS. 7 to 9, the following will describe how a color is determined when a mismatch count is given.
  • FIG. 7 shows a color for common voxels, which is selected in a process of setting colors according to voxel mismatch count.
  • the control boxes in area ST 1 indicate that maximum intensities (e.g., 255, 255, 255) are given to the three RGB colors to represent common voxels (i.e., voxels with no mismatch).
  • FIG. 8 shows a color for a voxel mismatch belonging to the reference image, which is selected in the process of setting colors according to voxel mismatch count.
  • the central control box in area ST 2 indicates that the intensity of green (G) is decreased in accordance with the number of voxel mismatches.
  • the colors determined in this way are used to color-code voxels at which the reference image lies inside the subject image.
  • FIG. 9 shows a color for a voxel mismatch belonging to the subject image, which is selected in the process of setting colors according to voxel mismatch count.
  • the topmost control box in area ST 3 indicates that the intensity of red (R) is decreased in accordance with the number of voxel mismatches, as shown in FIG. 9 .
  • the colors determined in this way are used to color-code voxels at which the subject image lies inside the reference image.
  • the color values are determined in proportion to the number of mismatched fine voxels of each image.
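The proportional color scheme of FIGS. 7 to 9 can be sketched as a count-to-RGB mapping (a simplified illustration under the stated conventions: common voxels stay white, a reference-image mismatch lowers green, a subject-image mismatch lowers red, each in proportion to the count up to the 64-cell maximum):

```python
def mismatch_color(n_ref, n_subj, max_count=64):
    """Map fine-voxel mismatch counts to an RGB triple.
    n_ref:  mismatched fine voxels belonging to the reference image
    n_subj: mismatched fine voxels belonging to the subject image"""
    r = g = b = 255                      # no mismatch: full intensity (white)
    if n_ref > n_subj:                   # mismatch belongs to the reference image
        g = 255 - int(255 * min(n_ref, max_count) / max_count)
    elif n_subj > n_ref:                 # mismatch belongs to the subject image
        r = 255 - int(255 * min(n_subj, max_count) / max_count)
    return (r, g, b)
```

A fully mismatched coarse voxel (count 64) thus reaches the strongest color, while small counts produce only a light tint, giving the depth-coded visualization the text describes.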
  • Referring to FIGS. 10 to 12, the structure of data used in the 3D image comparison program, method, and device will be described below.
  • FIG. 10 shows the relationship between voxel element numbers and articulation point numbers.
  • FIG. 10 assumes that a 3D image is converted to eight voxels Bx 1 to Bx 8 .
  • Voxel Bx 1 is defined as a set of articulation points (1, 2, 5, 4, 10, 11, 14, 13), where the points are ordered counterclockwise, first on the bottom plane and then on the top plane.
  • the voxel data structure contains coordinate data for each of such articulation points. Referring now to FIGS. 11 and 12 , the following will describe this coordinate data in detail.
  • FIG. 11 shows a typical data structure representing articulation points of a 3D image, which is, more particularly, a typical data structure used in a finite element analysis.
  • Voxel data structure D 1 of FIG. 11 is formed from a data type field D 11 representing classification of data, a data number field D 12 giving serial numbers in each data type, and a location data field D 13 providing location data for each data type.
  • the data type field D 11 contains symbols such as “GRID” and “CHEXA” as shown in FIG. 11 .
  • GRID refers to articulation points
  • CHEXA refers to segments constituting a 3D image.
  • the data number field D 12 contains serial numbers. For example, GRID entries are serially numbered, from “1” to “27,” as FIG. 11 shows. These serial numbers correspond to what have been mentioned in FIG. 10 as articulation point numbers.
  • the data number field D 12 also contains serial numbers “1” to “8” for CHEXA entries shown in FIG. 11 . These serial numbers “1” to “8” are what have been mentioned in FIG. 10 as voxel element numbers.
  • the location data field D 13 contains the coordinates of each articulation point, such as “0.000000, 0.000000, 50.00000” for the GRID entry numbered “1,” as shown in FIG. 11 .
  • Each data number (referred to as “articulation point ID” for GRID entries) is followed by coordinates (x, y, z) in floating-point notation.
  • FIG. 11 shows other entries of the same data type.
  • the location data field D 13 further contains articulation point IDs of each CHEXA entry, such as “17, 21, 27, 22, 7, 10, 26, 14” for the entry numbered “1.” That is, each CHEXA entry is defined as a data number (referred to as “element ID”) accompanied by such a series of articulation point IDs.
  • FIG. 11 also shows other similar entries.
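The GRID/CHEXA layout of FIG. 11 can be read with a few lines of Python. The sketch below is a deliberately simplified parser (real finite-element bulk-data files carry more fields and fixed-width formats; the comma-separated sample records here are assumptions for illustration): GRID records map an articulation-point ID to its (x, y, z) coordinates, and CHEXA records map an element ID to its eight articulation-point IDs.

```python
def parse_fe_records(lines):
    """Parse simplified GRID/CHEXA records into two dictionaries:
    points[id] -> (x, y, z) and elements[id] -> [8 articulation-point IDs]."""
    points, elements = {}, {}
    for line in lines:
        fields = line.split(",")
        kind = fields[0].strip()      # data type field: "GRID" or "CHEXA"
        num = int(fields[1])          # data number field (serial number)
        if kind == "GRID":
            points[num] = tuple(float(v) for v in fields[2:5])
        elif kind == "CHEXA":
            elements[num] = [int(v) for v in fields[2:10]]
    return points, elements

records = [
    "GRID, 1, 0.000000, 0.000000, 50.00000",
    "GRID, 2, 25.00000, 0.000000, 50.00000",
    "CHEXA, 1, 17, 21, 27, 22, 7, 10, 26, 14",
]
points, elements = parse_fe_records(records)
```

Each CHEXA element is thus resolved into eight concrete coordinates by looking up its articulation-point IDs in the GRID table, which is all a voxel renderer needs.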
  • FIG. 12 shows a data structure of voxels using articulation points, which enables a 3D image space to be represented in the voxel domain.
  • the data structure D 2 defines voxels in two separate records that are previously defined, one containing element sizes and the other containing articulation points of each element. Element sizes are represented by the lengths of x, y, and z edges in floating-point form. Each element is defined as a series of articulation points.
  • the illustrated data structure has no particular record for element IDs because those element definitions are arranged in an orderly sequence. In the example of FIG. 12 , RECORD 7 defines the element size as “2.50000000e+001, 2.50000000e+001, 2.50000004e+001” in scientific (floating-point) notation.
  • RECORD 9 contains articulation points “1, 2, 5, 4, 10, 11, 14, 13” to “14, 15, 18, 17, 23, 24, 27, 26” to define multiple elements.
  • 3D images are compared by a 3D image comparison device having the above-described structure. The following will explain in detail what the 3D image comparison process actually performs.
  • FIG. 13 is a flowchart showing the entire process flow of 3D image comparison. This process is executed by the CPU 101 upon power-up of the 3D image comparison device, upon activation of a relevant program, or in response to other predetermined events. The following will describe the flowchart of FIG. 13 in the order of step numbers, with reference to the functions explained earlier in FIG. 4 .
  • Step S 10 The comparison data entry unit 10 performs a comparison data entry process to input CAD data and measurement data, which are 3D images to be compared. Details of this comparison data entry process will be described later with reference to FIG. 14 .
  • Step S 20 The voxel processor 20 performs voxel processing on the 3D images given at step S 10 to generate images in voxel form. Details of this voxel processing will be described later with reference to FIG. 15 .
  • Step S 30 With the two images converted into voxel form at step S 20 , the image positioning processor 30 performs an image positioning process to align the two images properly. Details of this image positioning process will be described later with reference to FIG. 16 .
  • Step S 40 The voxel difference evaluator 40 executes an image comparison process to compare and evaluate the two images that have been aligned properly at step S 30 . Details of this image comparison process will be described later with reference to FIG. 17 .
  • Step S 50 The voxel difference evaluator 40 draws, in a particular representation scheme, a plurality of images produced at the comparison and evaluation step S 40 .
  • the term “particular representation scheme” refers herein to a technique of representing image differences by using, for example, RGB colors with different depths.
  • Step S 60 The voxel overlaying processor 50 aligns and combines the plurality of images shown at step S 50 , thereby overlaying comparison results on a specified 3D image. More specifically, a 3D differential image is obtained from a comparison between two voxel-converted images, with exclusive-OR operations to selectively visualize a mismatched portion of them. In this case, the voxel overlaying processor 50 can display the original images and their mismatched portions distinguishably by using different colors. This is accomplished through superimposition of the two voxel-converted images, 3D differential image, and/or 3D fine differential image.
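The exclusive-OR comparison mentioned in step S 60 can be sketched with boolean occupancy grids. This is an illustrative Python fragment, not the patented implementation; the array shapes and values are made up:

```python
import numpy as np

# Two voxel-converted 3D images as boolean occupancy grids (True = voxel ON).
a = np.zeros((4, 4, 4), dtype=bool)
b = np.zeros((4, 4, 4), dtype=bool)
a[1:3, 1:3, 1:3] = True          # a 2x2x2 solid
b[1:3, 1:3, 1:4] = True          # the same solid, one voxel layer thicker in z

diff = a ^ b                     # exclusive OR: mismatched voxels only
print(int(diff.sum()))           # number of mismatched voxels -> 4
```

The `diff` grid is the 3D differential image; rendering it in a distinct color over the two source grids gives the superimposed display described above.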
  • FIG. 14 is a flowchart of a comparison data entry process as part of the 3D image comparison process of FIG. 13 .
  • Step S 10 (comparison data entry process) of FIG. 13 calls up the following sequence of steps. This comparison data entry process is performed by the comparison data entry unit 10 of the 3D image comparison device.
  • Step S 101 The measurement resolution entry unit 11 in the comparison data entry unit 10 receives input of measurement resolution from the user.
  • Step S 102 The measurement resolution memory 12 stores the measurement resolution parameter received at step S 101 .
  • Step S 103 The comparison data entry unit 10 receives a 3D image entry command from the user.
  • Step S 104 The comparison data entry unit 10 determines the image type of the 3D image specified at step S 103 . It then proceeds to step S 105 if the image type is “3D surface model,” to step S 107 if it is “surface measurement data,” or to step S 109 if it is “section measurement data.” If the image type is either “surface measurement data” or “section measurement data,” the point set process switch 15 activates the surface point set entry unit 16 or the volume point set entry unit 18 , accordingly.
  • Step S 105 Since the image type identified at step S 104 is “3D surface model,” the 3D surface entry unit 13 in the comparison data entry unit 10 receives a 3D surface entry from a 3D CAD image file DB 11 .
  • Step S 106 The image memory 14 stores the 3D surface received at step S 105 in its internal storage device.
  • Step S 107 Since the image type identified at step S 104 is “surface measurement data,” the surface point set entry unit 16 receives a surface measurement data entry from a 3D measured image file DB 12 .
  • Step S 108 The surface point set memory 17 stores the surface measurement data received at step S 107 in its internal storage device.
  • Step S 109 Since the image type identified at step S 104 is “section measurement data,” the volume point set entry unit 18 receives a section measurement data entry from a 3D measured image file DB 12 .
  • Step S 110 The volume point set memory 19 stores the section measurement data received at step S 109 in its internal storage device.
  • Step S 111 The comparison data entry unit 10 determines whether all necessary 3D images have been stored at steps S 106 , S 108 , and S 110 . If more 3D images are needed, the comparison data entry unit 10 goes back to step S 104 to repeat another cycle of processing in a similar fashion. If a sufficient number of 3D images are ready, the comparison data entry unit 10 exits from the present process, thus returning to step S 10 of FIG. 13 .
  • FIG. 15 is a flowchart of voxel processing as part of the 3D image comparison process of FIG. 13 .
  • Step S 20 (voxel processing) of FIG. 13 calls up the following sequence of steps. This voxel processing is performed by the voxel processor 20 in the 3D image comparison device.
  • Step S 201 The voxel processor 20 selects the comparison data stored at step S 10 .
  • Step S 202 Based on the measurement resolution received and stored at step S 10 , the voxel processor 20 determines voxel size.
  • Step S 203 The voxel processor 20 is supplied with the types of 3D images stored at step S 10 .
  • Step S 204 The voxel processor 20 determines which image type has been supplied at step S 203 . The voxel processor 20 then proceeds to step S 205 if the image type is “3D surface model,” or to step S 207 if it is “surface measurement data,” or to S 209 if it is “section measurement data.”
  • Step S 205 Since the image type identified at step S 204 is “3D surface model,” the voxel processor 20 selects and reads 3D surface data from the image memory 14 and delivers it to the first voxel generator 21 .
  • Step S 206 With the 3D surface data delivered at step S 205 , the first voxel generator 21 in the voxel processor 20 produces a 3D image in voxel form.
  • Step S 207 Since the image type identified at step S 204 is “surface measurement data,” the voxel processor 20 selects and reads surface measurement data from the surface point set memory 17 and delivers it to the second voxel generator 22 .
  • Step S 208 With the surface measurement data delivered at step S 207 , the second voxel generator 22 in the voxel processor 20 creates a 3D image in voxel form.
  • Step S 209 Since the image type identified at step S 204 is “section measurement data,” the voxel processor 20 selects and reads section measurement data from the volume point set memory 19 and delivers it to the second voxel generator 22 .
  • Step S 210 With the section measurement data delivered at step S 209 , the second voxel generator 22 in the voxel processor 20 creates a 3D image in voxel form.
  • Step S 211 The voxel processor 20 determines whether all required 3D images have been created at steps S 206 , S 208 , and/or S 210 . If more 3D images are needed, the voxel processor 20 goes back to step S 204 to repeat another cycle of processing in a similar fashion. If all required 3D images are ready, the voxel processor 20 exits from the present process, thus returning to step S 20 of FIG. 13 .
  • FIG. 16 is a flowchart of an image positioning process as part of the 3D image comparison process of FIG. 13 .
  • Step S 30 (image positioning process) of FIG. 13 calls up the following sequence of steps. This image positioning process is performed by an image positioning processor 30 in the 3D image comparison device.
  • Step S 301 The image positioning processor 30 selects predefined fiducial points of two 3D images.
  • One of the two 3D images is CAD data of a 3D surface model, and the other is surface measurement data or section measurement data obtained through measurement.
  • Step S 302 The image positioning processor 30 calculates the offsets of images from the fiducial points selected at step S 301 .
  • Step S 303 Based on the offsets calculated at step S 302 , the image positioning processor 30 moves the voxel-converted 3D images.
  • Step S 304 The image positioning processor 30 stores the 3D images moved at step S 303 in its internal storage space. It then exits from the current process, thus returning to the step S 30 of FIG. 13 .
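The positioning steps S 301 to S 303 amount to translating one voxel image by the offset between fiducial points. Below is a minimal sketch (translation only; any rotation handling the device may perform is not shown, and the function name is an assumption):

```python
import numpy as np

def align_by_fiducials(points, fid_src, fid_dst):
    """Translate a point set so that its fiducial point coincides with
    the fiducial point of the reference image (steps S301-S303)."""
    offset = np.asarray(fid_dst, float) - np.asarray(fid_src, float)
    return np.asarray(points, float) + offset

pts = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
moved = align_by_fiducials(pts, fid_src=[0.0, 0.0, 0.0], fid_dst=[5.0, 0.0, 0.0])
print(moved[0])   # -> [5. 0. 0.]
```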
  • FIG. 17 is a flowchart of the first image comparison process as part of the 3D image comparison process of FIG. 13 .
  • Step S 40 (image comparison process) of FIG. 13 calls up the following sequence of steps.
  • This first image comparison process is executed by the voxel difference evaluator 40 of the proposed 3D image comparison device.
  • Step S 411 The voxel difference evaluator 40 produces a 3D differential image by comparing the two given 3D images in the voxel domain.
  • Step S 412 The voxel difference evaluator 40 selects a surface voxel of the 3D differential image. This selection can easily be made, since a surface voxel adjoins at least one voxel in the OFF state while its other surrounding voxels are in the ON state.
  • Step S 413 Now that a surface voxel is selected at step S 412 , the voxel difference evaluator 40 performs an additional voxel conversion at a given test resolution. That is, the selected surface voxel is subdivided into fine voxels at a higher resolution than that of the 3D differential image.
  • the resulting 3D fine differential image contains fine-voxel images that provide details of the original 3D images (i.e., 3D surface model and surface/section measurement data) within the selected surface voxel.
  • Step S 414 The voxel difference evaluator 40 counts fine-voxel mismatches found in the (coarse) surface voxel.
  • Step S 415 Based on the number of mismatches counted at step S 414 , the voxel difference evaluator 40 determines a color as a representation scheme for the selected surface voxel.
  • Step S 416 The voxel difference evaluator 40 determines whether all surface voxels are colored. In other words, the voxel difference evaluator 40 determines whether the voxel-converted 3D images are completely compared. If there are still uncolored surface voxels, the voxel difference evaluator 40 returns to step S 412 to repeat a similar process. If all surface voxels are colored, the voxel difference evaluator 40 exits from the present process and returns to step S 40 of FIG. 13 .
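The surface-voxel test used in step S 412 — an ON voxel that adjoins an OFF voxel — can be sketched with a 6-neighbor check on a boolean grid. This fragment is illustrative; the function name and the 6-connectivity choice are assumptions:

```python
import numpy as np

def surface_voxels(v):
    """Mark ON voxels that touch at least one OFF neighbor (6-connectivity).
    Cells beyond the array edge count as OFF."""
    p = np.pad(v, 1, constant_values=False)
    interior = (                       # True where all six neighbors are ON
        p[2:, 1:-1, 1:-1] & p[:-2, 1:-1, 1:-1] &
        p[1:-1, 2:, 1:-1] & p[1:-1, :-2, 1:-1] &
        p[1:-1, 1:-1, 2:] & p[1:-1, 1:-1, :-2]
    )
    return v & ~interior               # ON voxels with at least one OFF neighbor

solid = np.ones((3, 3, 3), dtype=bool)
print(int(surface_voxels(solid).sum()))   # every voxel but the center -> 26
```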
  • the computer first converts 3D images B 21 and B 22 into voxel form and compares them to create a 3D differential image VB 21 .
  • With respect to a surface voxel of the 3D differential image VB 21 , the computer produces a 3D fine differential image VB 22 containing fine-voxel images that provide details of the original two 3D images B 21 and B 22 at a higher resolution than in the initial comparison.
  • the computer determines a representation scheme for visualizing a dissimilar part of the 3D differential image VB 21 , based on detailed differences between the 3D images B 21 and B 22 .
  • the color of a surface voxel is determined from the number of mismatches found in that voxel.
  • the computer then outputs voxel comparison results on a display screen or the like by drawing the 3D differential image VB 21 , including surface voxels depicted in their respective colors.
  • the present invention enables given 3D images to be compared more quickly and accurately.
  • the proposed comparison process handles images as a limited number of 3D voxels, rather than a set of countless points, thus making it possible to improve memory resource usage and reduce the number of computational cycles.
  • a significant reduction of computational cycles can be achieved by omitting comparison of voxels other than those on the surface.
  • step S 412 of FIG. 17 will include the substeps of determining whether the selected surface voxel is flagged as a mismatch and finding another one if it is not.
  • each surface of a reference voxel is displayed in a particular representation scheme that reflects the thickness of a dissimilar layer of voxels on that voxel surface after the 3D images are compared in the voxel domain.
  • Referring to FIGS. 18 and 19 , the following will describe the second embodiment in detail.
  • FIG. 18 gives an overview of a comparison process performed with a 3D image comparison program according to the second embodiment.
  • the comparison process starts with creating voxels in the same way as in the first embodiment, and it then assigns a particular color to each surface of voxels in one 3D image to visualize its differences from the other.
  • the color is determined in accordance with the number of voxel mismatches counted perpendicularly, or in the thickness direction, from each surface of matched voxels.
  • given 3D images are compared in voxel form to create a 3D differential image VB 21 in the same way as in the first embodiment (step S 1 b ).
  • a dissimilar part is discovered and separated from the 3D differential image VB 21 (step S 2 b ).
  • an outermost layer of matched voxels appears under the removed dissimilar part.
  • One such matched voxel is then designated as a reference voxel BB 1 , and the number of voxel mismatches is counted in the direction perpendicular to each 3D geometric surface (z-y plane, z-x plane, y-x plane) of the reference voxel BB 1 .
  • a representation scheme for visualizing that dissimilar part is determined (step S 3 b ). More specifically, think of y-z and z-x planes in the illustrated cross section A-A, for example. The mismatch is evaluated in this case as three voxels on the y-z plane and two voxels on the z-x plane. Representation schemes for reference voxel surfaces are determined from those count values. Each surface of the reference voxel is displayed in the determined representation schemes (step S 4 b ). Such representation schemes include color designations in RGB or CMY format mentioned earlier.
  • the proposed 3D image comparison program causes a computer to compare given 3D images in the voxel domain in the same way as in the first embodiment. It separates a dissimilar part from the resulting 3D differential image VB 21 , thus permitting an outermost layer of matched voxels to appear. Then the computer selects a voxel BB 1 (reference voxel) from that layer and counts voxel mismatches perpendicularly from each 3D geometric surface (z-y plane, z-x plane, y-x plane) of the selected reference voxel BB 1 .
  • the voxel mismatch count means the thickness of a dissimilar part, and based on that count, the computer determines a representation scheme for depicting the corresponding reference voxel surface. The computer renders each surface of the reference voxel in the determined representation scheme.
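Counting voxel mismatches perpendicularly from a reference voxel surface, as described above, reduces to a walk along one axis of the differential grid. A minimal sketch, with illustrative function and parameter names:

```python
import numpy as np

def mismatch_thickness(diff, ref, axis, direction=1):
    """Count consecutive mismatched voxels starting next to the reference
    voxel `ref`, moving perpendicularly to one of its surfaces, i.e. the
    thickness of the dissimilar layer on that surface."""
    idx = list(ref)
    count = 0
    idx[axis] += direction
    while 0 <= idx[axis] < diff.shape[axis] and diff[tuple(idx)]:
        count += 1
        idx[axis] += direction
    return count

diff = np.zeros((6, 1, 1), dtype=bool)
diff[2:5, 0, 0] = True                                    # a layer three voxels thick
print(mismatch_thickness(diff, ref=(1, 0, 0), axis=0))    # -> 3
```

The count returned for each surface direction is what determines the representation scheme (color) of that reference voxel surface.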
  • the reference voxel has been explained as a matched surface voxel that adjoins a dissimilar part of the images under test.
  • the following explanation will use the same term to refer to one of fine voxels that are produced by reconverting a range of coarse voxels at a given test resolution.
  • the voxel difference evaluator 40 compares two images that have been properly positioned by the image positioning processor 30 .
  • the voxel difference evaluator 40 selects a surface voxel of the resulting 3D differential image. If this surface voxel indicates a mismatch of the 3D images under test, then the voxel difference evaluator 40 determines which range of voxels to reconvert. That is, it identifies a range containing an image mismatch for more detailed comparison.
  • the voxel difference evaluator 40 performs an additional voxel conversion at a given test resolution. That is, voxels in the determined range are subdivided into fine voxels according to a given test resolution. The voxel difference evaluator 40 then counts fine voxel mismatches by examining the 3D surface model and surface/section measurement data now in the form of fine voxels.
  • an outermost fine voxel of a matched part of the compared images is designated as a reference voxel, and the number of voxel mismatches (i.e., the thickness of a dissimilar part) is counted in each direction perpendicular to different 3D geometric surfaces (e.g., z-y, z-x, and y-x planes) of that voxel.
  • the voxel difference evaluator 40 determines color depths as representation schemes, so that the surfaces will be colored. That is, the voxel difference evaluator 40 assigns a particular representation scheme (color) to each surface of the reference voxel, which is a fine voxel belonging to a matched part of the compared images. The voxel difference evaluator 40 repeats the above until the voxel-converted 3D images are completely compared.
  • the following will provide more specifics about the comparison process implemented in a 3D image comparison program according to the second embodiment of the invention.
  • the first to third embodiments of the invention use their respective versions of the 3D image comparison process.
  • the version for the second embodiment is now referred to as the second image comparison process.
  • FIG. 19 is a flowchart of the second image comparison process as part of the 3D image comparison process of FIG. 13 .
  • Step S 40 (image comparison process) of FIG. 13 calls up the following sequence of steps.
  • This second image comparison process is executed by the voxel difference evaluator 40 of the proposed 3D image comparison device. Note that the flowchart of FIG. 19 includes reconversion of a range of coarse voxels.
  • Step S 421 The voxel difference evaluator 40 produces a 3D differential image by comparing the two given 3D images in the voxel domain.
  • Step S 422 The voxel difference evaluator 40 selects a surface voxel. This selection can easily be made, since a surface voxel of a voxel-converted 3D image adjoins at least one voxel in the OFF state while its other surrounding voxels are in the ON state.
  • Step S 423 Now that a surface voxel is selected at step S 422 , the voxel difference evaluator 40 determines what range of voxels need an additional conversion. That is, it identifies a range of voxels surrounding an image mismatch, if any, for the purpose of more detailed comparison.
  • Step S 424 Within the range determined at step S 423 , the voxel difference evaluator 40 performs an additional voxel conversion at a given test resolution. Voxels in the determined range are thus subdivided into fine voxels according to a given test resolution.
  • the voxel difference evaluator 40 compares the reconverted 3D surface model and surface/section measurement data and counts their voxel mismatches. More specifically, an outermost fine voxel of a matched part of images in the reconversion range is designated as a reference voxel, and the number of fine voxel mismatches is counted in each direction perpendicular to different 3D geometric surfaces (e.g., z-y plane, z-x plane, y-x plane) of that voxel. The resulting mismatch counts indicate the thicknesses of the dissimilar part measured in different directions.
  • Step S 426 Based on the voxel mismatch counts obtained at step S 425 , the voxel difference evaluator 40 determines color depths, so that voxel surfaces will be colored. That is, the voxel difference evaluator 40 assigns a particular representation scheme (color) to each surface of the reference voxel, which is a fine voxel belonging to a matched part of the compared images.
  • Step S 427 The voxel difference evaluator 40 determines whether the voxel-converted 3D images are completely evaluated. If there are still uncolored voxels, the voxel difference evaluator 40 returns to step S 422 to repeat a similar process. If all voxels are colored, the voxel difference evaluator 40 exits from the present process and returns to step S 40 of FIG. 13 .
  • the comparison process according to the second embodiment visualizes the difference with colors assigned on each surface of a 3D object, rather than drawing voxels with some depth, in order to reduce the consumption of computational resources (e.g., memory capacity, disk area, computation time).
  • the proposed technique reduces resource consumption to a few tenths.
  • the depth of a color representing a mismatch is proportional to the thickness of a dissimilar layer of voxels.
  • the color depth of each voxel surface is normalized with respect to the maximum depth of all colors assigned.
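The normalization just described can be sketched as scaling each per-surface mismatch count by the largest count. The exact formula is an assumption; the text only states that color depths are normalized with respect to the maximum:

```python
def normalized_depths(mismatch_counts, levels=255):
    """Scale per-surface mismatch counts so that the largest count maps
    to the full color depth (simple linear normalization)."""
    peak = max(mismatch_counts) or 1   # avoid division by zero when all counts are 0
    return [round(c * levels / peak) for c in mismatch_counts]

print(normalized_depths([1, 2, 4]))   # -> [64, 128, 255]
```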
  • a matched voxel immediately adjoining a dissimilar part is selected as a reference voxel.
  • the reference voxel is as large as other voxels constituting a 3D differential image.
  • the reference voxel is a fine voxel selected from among those produced by subdividing voxels in a range that contains a dissimilar part.
  • This section describes a third embodiment of the invention, which differs from the first embodiment in how a comparison is implemented. Specifically, the third embodiment compares 3D images on an individual voxel basis, counts all surface voxels that constitute a dissimilar part of the images, calculates the volume of that dissimilar part, and displays those voxels using an appropriate representation scheme that reflects the ratio of the number of surface voxels to the volume.
  • FIG. 20 shows a first example of a difference that is evaluated by a 3D image comparison program according to a third embodiment of the invention.
  • the illustrated difference between two images A and B is identified as a dissimilar part of voxels produced in the same way as in the first embodiment.
  • a dissimilar part, or “island,” consists of one or more voxels, and the comparison method of the third embodiment gives each island a particular color that represents a ratio of surface voxels of that island to the total number of voxels forming that island.
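The island coloring criterion just described — the ratio of surface voxels to all voxels in an island — can be sketched by combining a surface test with a voxel count. The function names and the 6-neighbor surface criterion are assumptions:

```python
import numpy as np

def surface_count(v):
    """Number of ON voxels with at least one OFF neighbor (6-connectivity)."""
    p = np.pad(v, 1, constant_values=False)
    interior = (p[2:, 1:-1, 1:-1] & p[:-2, 1:-1, 1:-1] &
                p[1:-1, 2:, 1:-1] & p[1:-1, :-2, 1:-1] &
                p[1:-1, 1:-1, 2:] & p[1:-1, 1:-1, :-2])
    return int((v & ~interior).sum())

island = np.ones((3, 3, 3), dtype=bool)       # one dissimilar "island" of 27 voxels
ratio = surface_count(island) / island.sum()  # 26 surface voxels out of 27
print(round(ratio, 3))                        # -> 0.963
```

A bulky island has a low ratio (many interior voxels), while a thin shell of mismatches has a ratio near its maximum, which is what drives the color choice below.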
  • the following will provide more specifics about how to determine a color for each island, or each dissimilar part of images under test.
  • two 3D images A and B are converted into voxel form and compared in the same way as in the first embodiment.
  • the comparison method of the third embodiment quantifies the dissimilarity in the following way:
  • a color representing a dissimilarity of images is specified in the RGB domain.
  • the intensity of blue (B) is modified to express the degree of dissimilarity.
  • the control boxes in area ST 11 are set to (255(R), 255(G), 223(B)) to indicate the presence of a thin layer of voxel mismatches on a certain portion of the surface.
  • the value of 223(B) gives a light color, which indicates that the dissimilar part is a thin layer.
  • If the blue level were set to, for example, 96(B), it would indicate the presence of a thick layer of voxel mismatches as a dissimilar part of images. The following will discuss more about this case, with reference to FIG. 22 .
  • FIG. 22 shows a second example of a difference that is evaluated by the 3D image comparison program of the third embodiment, and particularly the case where the mismatch is greater than the one explained in FIG. 20 .
  • the comparison method of the third embodiment quantifies the dissimilarity in the following way:
  • FIG. 23 shows a deep color determined by the 3D image comparison program of the third embodiment.
  • a color representing a dissimilarity of images is specified in the RGB domain.
  • the intensity of blue (B) is modified to express the degree of dissimilarity.
  • the control boxes in area ST 12 are set to (255(R), 255(G), 96(B)) to indicate the presence of a thick layer of voxel mismatches on a certain portion of the surface.
  • the value of 96(B) gives a deep color, which indicates that the dissimilar part is a thick layer.
  • FIG. 24 shows a third example of a difference that is evaluated by the 3D image comparison program of the third embodiment.
  • all coarse voxels containing a mismatch are subdivided into fine voxels for detailed comparison.
  • the boldest line is a demarcation line that separates the discovered dissimilar part from other voxels.
  • FIG. 24 shows a result of a fine voxel comparison made with the voxels shown in FIG. 20 . This detailed comparison is similar to what has been explained in FIGS. 20 and 22 , and that explanation is not repeated here.
  • the voxel difference evaluator 40 compares two images that have been properly positioned by the image positioning processor 30 .
  • the voxel difference evaluator 40 selects a surface voxel of the resulting 3D differential image. If this surface voxel indicates a mismatch of the 3D images under test, then the voxel difference evaluator 40 determines which range of voxels to reconvert. That is, it identifies a range containing an image mismatch for more detailed comparison.
  • the voxel difference evaluator 40 performs an additional voxel conversion at a given test resolution. That is, voxels in the determined range are subdivided into fine voxels according to a given test resolution. Subsequently, the voxel difference evaluator 40 then counts mismatches by examining the 3D surface and surface/section measurement data now in the form of fine voxels. It then calculates the ratio of surface voxels to total voxels for each dissimilar part of the 3D differential image.
  • the voxel difference evaluator 40 determines a color depth as a representation scheme, so that relevant voxels will be colored. That is, the voxel difference evaluator 40 determines a representation scheme (i.e., color) for every dissimilar part consisting of voxels flagged as being a mismatch. The voxel difference evaluator 40 repeats the above until the voxel-converted 3D images are completely compared.
  • the following will provide more specifics about the comparison process implemented in a 3D image comparison program according to the third embodiment of the invention.
  • the first to third embodiments of the invention use their respective versions of the 3D image comparison process.
  • the version for the third embodiment is now referred to as the third image comparison process.
  • FIG. 25 is a flowchart of a third image comparison process as part of the 3D image comparison process of FIG. 13 .
  • Step S 40 (image comparison process) of FIG. 13 calls up the following sequence of steps.
  • This third image comparison process is executed by the voxel difference evaluator 40 of the proposed 3D image comparison device.
  • Step S 431 The voxel difference evaluator 40 produces a 3D differential image by comparing the two given 3D images in the voxel domain.
  • Step S 432 The voxel difference evaluator 40 selects a surface voxel. This selection can easily be made, since a surface voxel of a voxel-converted 3D image adjoins at least one voxel in the OFF state while its other surrounding voxels are in the ON state.
  • Step S 433 Now that a surface voxel is selected at step S 432 , the voxel difference evaluator 40 determines what range of voxels need an additional conversion. That is, it identifies a range of voxels surrounding an image mismatch, if any, for the purpose of more detailed comparison.
  • Step S 434 Within the range determined at step S 433 , the voxel difference evaluator 40 performs an additional voxel conversion at a given test resolution. Voxels in the determined range are thus subdivided into fine voxels according to a given test resolution.
  • Step S 435 The voxel difference evaluator 40 compares the reconverted 3D surface and surface/section measurement data and counts their voxel mismatches. It then calculates the ratio of surface voxels to total voxels for each dissimilar part of the 3D differential image.
  • Step S 436 Based on the calculated ratio of voxel mismatches, the voxel difference evaluator 40 determines a color depth, so that relevant voxels will be colored. That is, the voxel difference evaluator 40 determines a representation scheme (i.e., color) for every dissimilar part consisting of voxels flagged as being a mismatch.
  • Step S 437 The voxel difference evaluator 40 determines whether the voxel-converted 3D images are completely evaluated. If there are still uncolored voxels, the voxel difference evaluator 40 returns to step S 432 to repeat a similar process. If all voxels are colored, the voxel difference evaluator 40 exits from the present process and returns to step S 40 of FIG. 13 .
  • the comparison process according to the third embodiment visualizes differences between given 3D images by giving a particular color to each dissimilar part as a whole, rather than drawing voxels with some depth, in order to reduce the consumption of computational resources (e.g., memory capacity, disk area, computation time).
  • the proposed technique reduces resource consumption to a few tenths.
  • the ratio of surface voxels to total voxels is in a range of nearly zero to two. Accordingly, the lightest color is assigned to the maximum ratio, and the deepest color to the minimum ratio. Note that the size of a dissimilar part is not reflected in the selection of colors, since the user can see it from a 3D picture on the screen.
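The assignment of the lightest color to the maximum ratio and the deepest to the minimum can be sketched as a linear interpolation between the 223(B) and 96(B) levels quoted in the earlier examples. The linear form itself is an assumption, since no explicit formula is given:

```python
def blue_level(ratio, ratio_max=2.0, light=223, deep=96):
    """Map a surface-to-total voxel ratio to a blue intensity:
    lightest color at the maximum ratio, deepest at ratio zero.
    The 223/96 endpoints come from the RGB examples in the text."""
    t = max(0.0, min(ratio / ratio_max, 1.0))   # clamp to [0, 1]
    return round(deep + t * (light - deep))

print(blue_level(2.0))   # -> 223 (light: thin shell of mismatches)
print(blue_level(0.0))   # -> 96  (deep: bulky dissimilar part)
```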
  • the first embodiment of the invention offers the following advantages:
  • computational resource usage is evaluated as follows. Recall the 3D surface model given by equation (1), and suppose that a conventional method is used to calculate minimum distances for many measurement points shown in Table (1). The amount of computation is then estimated, considering the following factors:
  • the method according to the first embodiment of the invention would require only 50 billion floating point operations. This is because of its spatial simplification; the proposed method places a multiple-surface body in a computational space, divides the entire space into voxels with a given measurement resolution, and applies voxel-by-voxel logical operations.
  • the estimated computation time is 25 seconds, under the assumption that the same computer is used.
  • the first embodiment of the present invention allows a considerable reduction of computation time, in comparison to conventional methods. Also, the proposed method according to the first embodiment is comparable to conventional methods in terms of accuracy, since its entire computational space is divided into voxels as fine as the measurement resolution.
  • All the processes described so far can be implemented as computer programs.
  • a computer system executes such programs to provide intended functions of the present invention.
  • Program files are installed on a computer's hard disk drive or other local mass storage device, and they can be executed after being loaded into the main memory.
  • Such computer programs are stored in a computer-readable medium for the purpose of storage and distribution.
  • Suitable computer-readable storage media include magnetic storage media, optical discs, magneto-optical storage media, and semiconductor memory devices.
  • Magnetic storage media include hard disks, floppy disks (FD), ZIP disks (a type of magnetic disk from Iomega Corporation, USA), and magnetic tapes.
  • Optical discs include digital versatile discs (DVD), DVD random access memory (DVD-RAM), compact disc read-only memory (CD-ROM), CD-Recordable (CD-R), and CD-Rewritable (CD-RW).
  • Magneto-optical storage media include magneto-optical discs (MO).
  • Semiconductor memory devices include flash memories.
  • Portable storage media, such as DVDs and CD-ROMs, are suitable for the distribution of program products. It is also possible to upload computer programs to a server computer for distribution to client computers over a network.
  • two voxel-converted 3D images are compared, and a particular surface voxel identified in this comparison is subdivided for further comparison at a higher resolution.
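A surface voxel identified in the first comparison can be subdivided into smaller voxels for the finer second comparison. The following Python sketch is illustrative only: the patent does not tie itself to an octree-style split into eight half-size children, and the function name and voxel representation (minimum corner plus edge length) are assumptions.

```python
def subdivide(voxel, size):
    """Split a voxel, given by its minimum corner (x, y, z) and edge
    length, into eight child voxels of half the edge length.

    The children can then be re-tested against both images, giving a
    higher-resolution comparison only where the surfaces disagree.
    """
    x, y, z = voxel
    h = size / 2.0
    return [(x + dx * h, y + dy * h, z + dz * h)
            for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]
```

Refining only the surface voxels found to mismatch keeps the second pass cheap: the interior of each image never needs re-examination.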
  • the comparison results are displayed in a representation scheme that is determined from the result of the second comparison.
  • the proposed technique enables 3D images to be compared quickly and accurately.
  • given 3D images are compared, and a representation scheme is determined from the number of voxel mismatches counted perpendicularly to each surface of a reference voxel.
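Counting voxel mismatches perpendicular to a surface of a reference voxel amounts to stepping outward along a face normal and counting consecutive mismatched voxels. The Python sketch below is a hypothetical rendering of that idea; the set-of-coordinates representation and the search limit are assumptions not taken from the patent.

```python
def mismatch_depth(diff_voxels, start, direction, limit=64):
    """Count consecutive mismatched voxels along one face normal.

    diff_voxels: set of integer (x, y, z) coordinates where the two
    images disagree.  start: the reference voxel.  direction: a unit
    axis step such as (1, 0, 0), perpendicular to one of the six
    faces of the reference voxel.
    """
    x, y, z = start
    dx, dy, dz = direction
    depth = 0
    while depth < limit:
        x, y, z = x + dx, y + dy, z + dz
        if (x, y, z) not in diff_voxels:
            break
        depth += 1
    return depth
```

Taking the maximum depth over the six face normals gives one plausible scalar from which a representation scheme (e.g., color depth) could be chosen.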
  • the comparison results are displayed in the determined representation scheme.
  • given 3D images are compared, and a representation scheme is determined from a surface-to-volume ratio of a dissimilar part of the 3D images under test.
  • the comparison results are displayed in the determined representation scheme.
US10/989,464 2002-06-28 2004-11-17 Program, method, and device for comparing three-dimensional images in voxel form Abandoned US20050068317A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2002/006621 WO2004003850A1 (ja) 2002-06-28 2002-06-28 Three-dimensional image comparison program, three-dimensional image comparison method, and three-dimensional image comparison device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2002/006621 Continuation WO2004003850A1 (ja) 2002-06-28 2002-06-28 Three-dimensional image comparison program, three-dimensional image comparison method, and three-dimensional image comparison device

Publications (1)

Publication Number Publication Date
US20050068317A1 true US20050068317A1 (en) 2005-03-31

Family

ID=29808175

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/989,464 Abandoned US20050068317A1 (en) 2002-06-28 2004-11-17 Program, method, and device for comparing three-dimensional images in voxel form

Country Status (6)

Country Link
US (1) US20050068317A1 (ja)
EP (1) EP1519318A4 (ja)
JP (1) JP4001600B2 (ja)
KR (1) KR100639139B1 (ja)
CN (1) CN100388317C (ja)
WO (1) WO2004003850A1 (ja)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100419794C (zh) * 2005-12-16 2008-09-17 Vimicro Corporation Method for detecting image problems
CN102598053A (zh) * 2009-06-24 2012-07-18 Koninklijke Philips Electronics N.V. Spatial and shape characterization of an implanted device within an object
JP5766936B2 (ja) * 2010-11-11 2015-08-19 The University of Tokyo Three-dimensional environment reconstruction device, three-dimensional environment reconstruction method, and robot
GB201308866D0 (en) * 2013-05-16 2013-07-03 Siemens Medical Solutions System and methods for efficient assessment of lesion development
JP6073429B2 (ja) * 2015-08-31 2017-02-01 Obayashi Corporation Structure condition assessment support device, structure condition assessment support method, and program
KR101644426B1 (ko) * 2015-09-30 2016-08-02 Sangmyung University Seoul Industry-Academy Cooperation Foundation Method for recognizing an original 3D model corresponding to a deformed 3D model
JP6765666B2 (ja) * 2016-07-12 2020-10-07 Keio University Three-dimensional object manufacturing apparatus, three-dimensional object manufacturing method, and program
EP3355279A1 (en) * 2017-01-30 2018-08-01 3D Repo Ltd Method and computer programs for identifying differences between 3-dimensional scenes
JP6888386B2 (ja) * 2017-04-17 2021-06-16 Fujitsu Limited Difference detection program, difference detection device, and difference detection method
KR102238911B1 (ko) * 2020-04-23 2021-04-16 BIMilligram Co., Ltd. Device, method, and system for calculating the construction progress status of a building

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5273040A (en) * 1991-11-14 1993-12-28 Picker International, Inc. Measurement of ventricle volumes with cardiac MRI
US5301271A (en) * 1991-07-11 1994-04-05 Matsushita Electric Industrial Co., Ltd. Image processing separately processing two-density-level image data and multi-level image data
US6175371B1 (en) * 1995-06-02 2001-01-16 Philippe Schoulz Process for transforming images into stereoscopic images, images and image series obtained by this process

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62219075A (ja) * 1986-03-20 1987-09-26 Hitachi Medical Corp Semi-transparent display method for three-dimensional images
JPH04276513A (ja) * 1991-03-04 1992-10-01 Matsushita Electric Ind Co Ltd Shape measurement processing method
JPH0516435A (ja) * 1991-07-11 1993-01-26 Matsushita Electric Ind Co Ltd Image forming apparatus
AU6341396A (en) * 1995-06-30 1997-02-05 Ultrapointe Corporation Method for characterizing defects on semiconductor wafers
JPH0935058A (ja) * 1995-07-17 1997-02-07 Nec Corp Image recognition method
JP4082718B2 (ja) * 1996-01-25 2008-04-30 Hitachi, Ltd. Image recognition method, image display method, and image recognition device
JP3564531B2 (ja) * 1999-05-07 2004-09-15 Monolith Co., Ltd. Method and device for tracking image regions
EP1054384A3 (en) * 1999-05-20 2003-09-24 TeraRecon, Inc., A Delaware Corporation Method and apparatus for translating and interfacing voxel memory addresses
DE10000185A1 (de) * 2000-01-05 2001-07-12 Philips Corp Intellectual Pty Method for displaying the temporal course of blood flow in an examination object

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060152510A1 (en) * 2002-06-19 2006-07-13 Jochen Dick Cross-platform and data-specific visualisation of 3d data records
US20080167969A1 (en) * 2005-02-24 2008-07-10 Dolphin Software Ltd. System and Method For Computerized Ordering Among Replaceable or Otherwise Associated Products
US20090167761A1 (en) * 2005-12-16 2009-07-02 Ihi Corporation Self-position identifying method and device, and three-dimensional shape measuring method and device
US20090184961A1 (en) * 2005-12-16 2009-07-23 Ihi Corporation Three-dimensional shape data recording/display method and device, and three-dimensional shape measuring method and device
US20090202155A1 (en) * 2005-12-16 2009-08-13 Ihi Corporation Three-dimensional shape data position matching method and device
US8116558B2 (en) * 2005-12-16 2012-02-14 Ihi Corporation Three-dimensional shape data position matching method and device
US8121399B2 (en) * 2005-12-16 2012-02-21 Ihi Corporation Self-position identifying method and device, and three-dimensional shape measuring method and device
US8300048B2 (en) 2005-12-16 2012-10-30 Ihi Corporation Three-dimensional shape data recording/display method and device, and three-dimensional shape measuring method and device
US20090110297A1 (en) * 2007-10-25 2009-04-30 Fujitsu Limited Computer readable recording medium storing difference emphasizing program, difference emphasizing method, and difference emphasizing apparatus
US8311320B2 (en) 2007-10-25 2012-11-13 Fujitsu Limited Computer readable recording medium storing difference emphasizing program, difference emphasizing method, and difference emphasizing apparatus
US20100322535A1 (en) * 2009-06-22 2010-12-23 Chunghwa Picture Tubes, Ltd. Image transformation method adapted to computer program product and image display device
US8422824B2 (en) * 2009-06-22 2013-04-16 Chunghwa Picture Tubes, Ltd. Image transformation method device for obtaining a three dimensional image
US20150019178A1 (en) * 2010-06-11 2015-01-15 Assemble Systems Llc System And Method For Managing Changes In Building Information Models
US10617477B2 (en) * 2010-10-20 2020-04-14 Medtronic Navigation, Inc. Selected image acquisition technique to optimize patient model construction
US20160338780A1 (en) * 2010-10-20 2016-11-24 Medtronic Navigation, Inc. Selected Image Acquisition Technique To Optimize Patient Model Construction
US11213357B2 (en) 2010-10-20 2022-01-04 Medtronic Navigation, Inc. Selected image acquisition technique to optimize specific patient model reconstruction
US10629002B2 (en) 2011-04-08 2020-04-21 Koninklijke Philips N.V. Measurements and calibration utilizing colorimetric sensors
US10373375B2 (en) * 2011-04-08 2019-08-06 Koninklijke Philips N.V. Image processing system and method using device rotation
US9715761B2 (en) 2013-07-08 2017-07-25 Vangogh Imaging, Inc. Real-time 3D computer vision processing engine for object recognition, reconstruction, and analysis
US20150022521A1 (en) * 2013-07-17 2015-01-22 Microsoft Corporation Sparse GPU Voxelization for 3D Surface Reconstruction
US9984498B2 (en) * 2013-07-17 2018-05-29 Microsoft Technology Licensing, Llc Sparse GPU voxelization for 3D surface reconstruction
US9710960B2 (en) 2014-12-04 2017-07-18 Vangogh Imaging, Inc. Closed-form 3D model generation of non-rigid complex objects from incomplete and noisy scans
US10007996B2 (en) * 2015-03-02 2018-06-26 Lawrence Livermore National Security, Llc System for detecting objects in streaming 3D images formed from data acquired with a medium penetrating sensor
US20160260222A1 (en) * 2015-03-02 2016-09-08 Lawrence Livermore National Security, Llc System for detecting objects in streaming 3d images formed from data acquired with a medium penetrating sensor
US11216947B2 (en) * 2015-04-20 2022-01-04 Mars Bioimaging Limited Material identification using multi-energy CT image data
US20220028074A1 (en) * 2015-04-20 2022-01-27 Mars Bioimaging Limited Material identification using multi-energy ct image data
US20190026953A1 (en) * 2016-05-16 2019-01-24 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3d object
US11043042B2 (en) * 2016-05-16 2021-06-22 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3D object
US10366179B2 (en) 2016-07-08 2019-07-30 Fujitsu Limited Computer-readable storage medium and information processing device
US10380762B2 (en) 2016-10-07 2019-08-13 Vangogh Imaging, Inc. Real-time remote collaboration and virtual presence using simultaneous localization and mapping to construct a 3D model and update a scene based on sparse data
US11040491B2 (en) * 2016-10-19 2021-06-22 Shapeways, Inc. Systems and methods for identifying three-dimensional printed objects
EP3529023A4 (en) * 2016-10-19 2020-07-01 Shapeways, Inc. SYSTEMS AND METHODS FOR IDENTIFYING 3D PRINTED OBJECTS
US20180104898A1 (en) * 2016-10-19 2018-04-19 Shapeways, Inc. Systems and methods for identifying three-dimensional printed objects
WO2018075628A1 (en) 2016-10-19 2018-04-26 Shapeways, Inc. Systems and methods for identifying three-dimensional printed objects
US10839585B2 (en) 2018-01-05 2020-11-17 Vangogh Imaging, Inc. 4D hologram: real-time remote avatar creation and animation control
US11080540B2 (en) 2018-03-20 2021-08-03 Vangogh Imaging, Inc. 3D vision processing using an IP block
US10810783B2 (en) 2018-04-03 2020-10-20 Vangogh Imaging, Inc. Dynamic real-time texture alignment for 3D models
US10847262B2 (en) * 2018-04-10 2020-11-24 Ziosoft, Inc. Medical image processing apparatus, medical image processing method and medical image processing system
US11170224B2 (en) 2018-05-25 2021-11-09 Vangogh Imaging, Inc. Keyframe-based object scanning and tracking
US20200410712A1 (en) * 2018-10-30 2020-12-31 Liberty Reach Inc. Machine Vision-Based Method and System for Measuring 3D Pose of a Part or Subassembly of Parts
US11461926B2 (en) * 2018-10-30 2022-10-04 Liberty Reach Inc. Machine vision-based method and system for measuring 3D pose of a part or subassembly of parts
US11232633B2 (en) 2019-05-06 2022-01-25 Vangogh Imaging, Inc. 3D object capture and object reconstruction using edge cloud computing resources
US11170552B2 (en) 2019-05-06 2021-11-09 Vangogh Imaging, Inc. Remote visualization of three-dimensional (3D) animation with synchronized voice in real-time
US11654571B2 (en) * 2019-07-22 2023-05-23 Fanuc Corporation Three-dimensional data generation device and robot control system
US20210023718A1 (en) * 2019-07-22 2021-01-28 Fanuc Corporation Three-dimensional data generation device and robot control system
US11335063B2 (en) 2020-01-03 2022-05-17 Vangogh Imaging, Inc. Multiple maps for 3D object scanning and reconstruction

Also Published As

Publication number Publication date
CN100388317C (zh) 2008-05-14
EP1519318A4 (en) 2008-11-19
KR20050010982A (ko) 2005-01-28
KR100639139B1 (ko) 2006-10-30
WO2004003850A1 (ja) 2004-01-08
JP4001600B2 (ja) 2007-10-31
CN1630886A (zh) 2005-06-22
JPWO2004003850A1 (ja) 2005-10-27
EP1519318A1 (en) 2005-03-30

Similar Documents

Publication Publication Date Title
US20050068317A1 (en) Program, method, and device for comparing three-dimensional images in voxel form
Attene et al. Polygon mesh repairing: An application perspective
US6867772B2 (en) 3D computer modelling apparatus
US20080225044A1 (en) Method and Apparatus for Editing Three-Dimensional Images
EP1687777A2 (en) Method and system for distinguishing surfaces in 3d data sets ("dividing voxels")
US10650587B2 (en) Isosurface generation method and visualization system
CN101248459A (zh) 一种生成3d对象的2d图像的方法
Catalucci et al. State-of-the-art in point cloud analysis
JP7017852B2 (ja) 記述子を用いた3dオブジェクトの位置特定
Cucchiara et al. An image analysis approach for automatically re-orienteering CT images for dental implants
WO2009101577A2 (en) Interactive selection of a region of interest and segmentation of image data
US20050162418A1 (en) Boundary data inside/outside judgment method and program thereof
Valenzuela et al. FISICO: fast image segmentation correction
WO2005109255A1 (ja) ボリュームデータのセルラベリング方法とそのプログラム
JP3643237B2 (ja) 近似ポリゴン表面スムース化方法
Bhattacharya et al. Interactive exploration and visualization using MetaTracts extracted from carbon fiber reinforced composites
Glaßer et al. Visualization of 3D cluster results for medical tomographic image data
Nyström On quantitative shape analysis of digital volume images
Klein et al. Volume-of-interest specification on arbitrarily resliced volume datasets
Dietze et al. Visualization of deviations between different geometries using a multi-level voxel-based representation
US8698829B2 (en) Discrete element texture synthesis
Laidlaw et al. Classification of material mixtures in volume data for visualization and modeling
Chatterjee et al. End-to-End GPU-Accelerated Low-Poly Remeshing using Curvature Map and Voronoi Tessellation
Sourin et al. Segmentation of MRI brain data using a haptic device
Dang From micro-CT TO A NURBS-based interface-enriched generalized finite element method

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AMAKAI, MAKOTO;REEL/FRAME:016003/0252

Effective date: 20041018

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE