WO2004003850A1 - Three-dimensional image comparison program, three-dimensional image comparison method, and three-dimensional image comparison device - Google Patents
- Publication number
- WO2004003850A1 (PCT/JP2002/006621)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
- G06T7/0012—Biomedical image inspection
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/36—Level of detail
Definitions
- 3D image comparison program, 3D image comparison method, and 3D image comparison device
- The present invention relates to a three-dimensional image comparison program, a three-dimensional image comparison method, and a three-dimensional image comparison device for comparing three-dimensional images and displaying their differences, and in particular to a program, method, and device for comparing voxel differences between three-dimensional images with high accuracy.
- the three-dimensional images include, for example, a design image by three-dimensional CAD (Computer Aided Design) in various manufacturing industries, and a three-dimensional image of a diseased part by a three-dimensional ultrasonic diagnostic imaging apparatus in the medical field.
- The images to be compared are a design model consisting of three-dimensional free-form surfaces created by CAD and, for a product or part manufactured from that model, either point cloud information obtained by measurement with a three-dimensional shape measuring instrument or surface information created from that point cloud information.
- the difference was evaluated based on the distance between a point on one surface and a point on the other surface.
- These points can be measured with a computed tomography (CT) scanner or 3D digitizer, or obtained by dividing a given surface according to certain rules.
- The distance between the surfaces is the perpendicular distance from each point to the surface. This distance information is plotted at a point on the surface and its value is displayed, or the color density or luminance is made proportional to the distance, so that the difference is rendered visible. A specific description is given below with reference to FIG. 26.
- FIG. 26 is a diagram showing a three-dimensional image (CAD data: three-dimensional curved surface) and a three-dimensional point cloud (measurement data) of the object.
- The distance between a certain point (x0, y0, z0) and the surface is the minimum of {(x − x0)² + (y − y0)² + (z − z0)²}^(1/2) over the points (x, y, z) on the surface.
- This calculation is performed for the point (x0, y0, z0) against all of the innumerable surfaces in its vicinity, and the smallest value among them is taken as the distance di between the point and the image.
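The conventional distance evaluation just described can be sketched in a few lines. This is an illustrative sketch only (the function name and data layout are ours, not the patent's), treating the surface as a set of sampled points:

```python
import math

def min_distance(point, surface_points):
    # Minimum of {(x - x0)^2 + (y - y0)^2 + (z - z0)^2}^(1/2) over all
    # sampled surface points, as in the conventional evaluation.
    x0, y0, z0 = point
    return min(math.sqrt((x - x0) ** 2 + (y - y0) ** 2 + (z - z0) ** 2)
               for x, y, z in surface_points)

# One measured point against three points sampled on a surface.
d = min_distance((0.0, 0.0, 0.0),
                 [(3.0, 4.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)])
```

Because this minimum must be taken for every point against every nearby surface, the cost grows with the product of both counts, which is exactly the bottleneck described next.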
- FIG. 27 is a conceptual diagram of a conventional comparison method using a three-dimensional image.
- In the conventional comparison method, if the surface and the point cloud are to be compared with high accuracy, the number of target points becomes innumerable.
- Moreover, an actual CAD model has many small surfaces, so the number of surfaces to be compared is enormous. In other words, in an actual CAD model there are innumerable combinations of points and surfaces to be calculated. As a result, the following problems arise.
- The number of measurement points required in actual manufacturing is as shown in Table 1 below. Regardless of the method used, the number of measurement points becomes a bottleneck, making the comparison practically impossible to implement.
- Here, the number of measurement points is assumed to be five times the surface size (400 x 400 mm) of the above parts.
- Disclosure of the invention
- The present invention has been made in view of the above points, and its object is to provide a three-dimensional image comparison program, a three-dimensional image comparison method, and a three-dimensional image comparison device that can compare three-dimensional images with high accuracy and at high speed.
- In order to solve the above problem, a three-dimensional image comparison program as shown in FIG. 1 is provided.
- the three-dimensional image comparison program of the present invention is applied when comparing three-dimensional images and displaying differences. Further, the three-dimensional image comparison program of the present invention causes a computer to execute the following processing.
- The first three-dimensional image, converted into voxels in advance, is compared with the second three-dimensional image to generate a three-dimensional difference image (step S1).
- On the surface of the three-dimensional difference image, a fine three-dimensional difference image between the first three-dimensional image and the second three-dimensional image is generated with higher precision than the comparison in step S1 (step S2).
- By comparing the fine three-dimensional difference image of the first three-dimensional image with the fine three-dimensional difference image of the second three-dimensional image, the display format of the difference portion of the three-dimensional difference image is determined (step S3).
- The three-dimensional difference image is displayed in that display format as the result of the voxel comparison (step S4).
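To make the four steps concrete, the following minimal Python sketch represents a voxel image as a set of integer (i, j, k) indices. All names, the 2 x 2 x 2 refinement factor, and the red/green color choice are illustrative assumptions, not the patent's implementation:

```python
def compare_pipeline(vox_a, vox_b):
    # Step S1: coarse voxel comparison -> three-dimensional difference image.
    diff = vox_a ^ vox_b
    # Step S2: re-voxelize the difference at a finer precision
    # (here each coarse voxel is split into 2 x 2 x 2 fine voxels).
    fine = {(2 * i + di, 2 * j + dj, 2 * k + dk)
            for (i, j, k) in diff
            for di in (0, 1) for dj in (0, 1) for dk in (0, 1)}
    # Step S3: determine a display format for each difference voxel
    # (red: present only in image A; green: present only in image B).
    colours = {v: ("red" if v in vox_a else "green") for v in diff}
    # Step S4: the caller renders `diff` using `colours`.
    return diff, fine, colours

a = {(0, 0, 0), (1, 0, 0)}
b = {(1, 0, 0), (2, 0, 0)}
diff, fine, colours = compare_pipeline(a, b)
```

The set representation keeps the coarse comparison a single Boolean operation, which is what makes the voxel approach fast compared with point-to-surface distances.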
- In this way, the computer first compares the first three-dimensional image and the second three-dimensional image, which have been converted into voxels in advance, and generates a three-dimensional difference image.
- Next, on the surface of the three-dimensional difference image, a fine three-dimensional difference image between the first three-dimensional image and the second three-dimensional image is generated with finer precision than that comparison.
- Then, a display format indicating the voxel differences is determined for the three-dimensional difference image and the fine three-dimensional difference image, and the three-dimensional difference image is displayed in that display format as the result of the voxel comparison.
- In addition, a computer compares a first three-dimensional image, converted into voxels in advance, with a second three-dimensional image to generate a three-dimensional difference image; partially extracts the difference portion of the three-dimensional difference image; counts, for the matching surface voxels, the number of non-matching voxels in the direction perpendicular to each face of the three-dimensional shape; determines a display format representing the voxel differences based on the counted number; and displays the three-dimensional difference image in the determined display format.
- A three-dimensional image comparison program that causes the computer to execute this processing is provided.
- That is, the computer first compares the first and second three-dimensional images, which have been converted into voxels in advance, to generate a three-dimensional difference image. The difference portion of the three-dimensional difference image is then partially extracted, and for the matching surface voxels the number of non-matching voxels is counted in the direction perpendicular to each face of the three-dimensional shape. A display format indicating the voxel differences is determined based on the counted number, and the three-dimensional difference image for which the display format has been determined is displayed in that format.
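The counting of non-matching voxels behind one face of a matching surface voxel can be illustrated as follows. This is a hedged sketch assuming voxels are integer triples and the face normal is a unit step; all names are hypothetical:

```python
def depth_count(diff_voxels, surface_voxel, normal, max_depth=10):
    # Count consecutive non-matching voxels behind one face of a matching
    # surface voxel, stepping along the face normal.
    x, y, z = surface_voxel
    dx, dy, dz = normal
    count = 0
    for step in range(1, max_depth + 1):
        if (x + dx * step, y + dy * step, z + dz * step) in diff_voxels:
            count += 1
        else:
            break
    return count

# Two non-matching voxels stacked behind the +z face of voxel (0, 0, 0).
depth = depth_count({(0, 0, 1), (0, 0, 2)}, (0, 0, 0), (0, 0, 1))
```

The resulting count per face is what the display format (for example, a color depth) is derived from.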
- Further, a three-dimensional image comparison program is provided that causes a computer to compare a first three-dimensional image and a second three-dimensional image, converted into voxels in advance, to generate a three-dimensional difference image; to calculate the ratio of the number of surface voxels in the difference portion of the three-dimensional difference image to the total number of voxels; to determine, based on that ratio, a display format representing the voxel differences for the three-dimensional difference image; and to display the difference portion of the three-dimensional difference image in a predetermined display format corresponding to the ratio.
- That is, the computer first compares the first and second three-dimensional images, which have been converted into voxels in advance, to generate a three-dimensional difference image. Next, the ratio between the number of surface voxels in the difference portion of the three-dimensional difference image and the total number of voxels is calculated, and a display format representing the voxel differences is determined for the three-dimensional difference image based on that ratio. Then, the difference portion of the three-dimensional difference image is displayed in a predetermined display format corresponding to the ratio.
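The ratio-based display determination might be sketched as below; the threshold values and color names are illustrative assumptions only, not values from the patent:

```python
def difference_ratio(diff_surface_count, total_count):
    # Ratio of the number of surface voxels in the difference portion
    # to the total number of voxels.
    return diff_surface_count / total_count

def colour_for_ratio(ratio):
    # Hypothetical thresholds: a deeper colour for a larger difference.
    if ratio < 0.01:
        return "light"
    if ratio < 0.05:
        return "medium"
    return "dark"

shade = colour_for_ratio(difference_ratio(2, 1000))
```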
- FIG. 1 is a principle configuration diagram in the first embodiment.
- FIG. 2 is a conceptual diagram of a method for comparing three-dimensional images according to the first embodiment.
- FIG. 3 is a diagram illustrating an example of a hardware configuration of a computer capable of executing a three-dimensional image comparison program.
- FIG. 4 is a functional block diagram showing a functional configuration of the three-dimensional image comparison device.
- FIG. 5 is a diagram showing an outline of the comparison processing of the three-dimensional image comparison program.
- FIG. 6 is a diagram showing a procedure for comparing the compared three-dimensional images with finer voxels.
- FIG. 7 is a diagram showing color determination for a common voxel in a method of setting a color determined by the number of voxel differences.
- FIG. 8 is a diagram showing color determination for a difference voxel (reference image) in a method of setting a color determined by the number of voxel differences.
- FIG. 9 is a diagram showing color determination for a difference voxel (comparison target image) in a method of setting a color determined by the number of voxel differences.
- FIG. 10 is a diagram showing the relationship between nodes and element numbers of nodes.
- FIG. 11 is a diagram showing a general data structure at a node on a three-dimensional image.
- FIG. 12 is a diagram showing the data structure of the voxel method at a voxel node.
- FIG. 13 is a flowchart showing the overall flow of the three-dimensional image comparison processing.
- FIG. 14 is a flowchart showing comparison information input processing in the three-dimensional image comparison processing of FIG.
- FIG. 15 is a flowchart showing the voxel processing in the three-dimensional image comparison processing of FIG.
- FIG. 16 is a flowchart showing the image registration processing in the three-dimensional image comparison processing of FIG.
- FIG. 17 is a flowchart showing a first image comparison process in the three-dimensional image comparison process of FIG.
- FIG. 18 is a diagram showing an outline of the comparison processing of the three-dimensional image comparison program in the second embodiment.
- FIG. 19 is a flowchart showing a second image comparison process in the three-dimensional image comparison process of FIG.
- FIG. 20 is a diagram showing a difference evaluation (1) of the comparison processing of the three-dimensional image comparison program in the third embodiment.
- FIG. 21 is a diagram showing colors (light colors) determined by the comparison processing of the three-dimensional image comparison program in the third embodiment.
- FIG. 22 is a diagram showing a difference evaluation (2) of the comparison processing of the three-dimensional image comparison program in the third embodiment.
- FIG. 23 is a diagram showing colors (dark colors) determined by the comparison processing of the three-dimensional image comparison program in the third embodiment.
- FIG. 24 is a diagram showing the difference evaluation (3) of the comparison processing of the three-dimensional image comparison program in the third embodiment.
- FIG. 25 is a flowchart showing a third image comparison process in the three-dimensional image comparison process of FIG.
- FIG. 26 is a diagram showing a three-dimensional image (CAD data: three-dimensional curved surface) of an object and a three-dimensional point cloud (measurement data).
- FIG. 27 is a conceptual diagram of a conventional comparison method using a three-dimensional image.
- The present invention can be applied, for example, to a method of comparing the geometric shapes of three-dimensional images as follows.
- The two three-dimensional images are both expressed as uniform cubes or rectangular parallelepipeds (that is, voxels), a Boolean operation is performed for each individual voxel, and the two images are compared by evaluating their agreement and disagreement.
- The two images are aligned based on reference points or characteristic shapes.
- Image differences are represented by the number of unmatched voxels.
- The information of the unmatched voxels is displayed on the surface.
- The number of difference voxels in the depth direction from each face (x direction, y direction, z direction) of a reference voxel is displayed in a predetermined display format for each face of the reference voxel.
- The difference between the three-dimensional images is represented on each face of the reference voxel (by a color corresponding to the depth, i.e., the number of difference voxels).
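The per-voxel Boolean comparison above can be expressed directly with set operations when each voxelized image is stored as a set of occupied cells (an illustrative sketch, not the patent's data structure):

```python
def voxel_match(vox_a, vox_b):
    # Boolean operations per voxel: AND gives the agreeing voxels,
    # XOR gives the unmatched (difference) voxels.
    match = vox_a & vox_b
    mismatch = vox_a ^ vox_b
    return match, mismatch

match, mismatch = voxel_match({(0, 0, 0), (1, 0, 0)},
                              {(1, 0, 0), (2, 0, 0)})
# The image difference is represented by the number of unmatched voxels.
difference = len(mismatch)
```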
- In the first embodiment, a three-dimensional image comparison program capable of expressing the difference between three-dimensional images point by point (by the color of a voxel), together with the corresponding three-dimensional image comparison method and three-dimensional image comparison device, is described.
- In the second embodiment, a three-dimensional image comparison program capable of expressing the difference between three-dimensional images on each face of a reference voxel (by a color corresponding to the depth, i.e., the number of difference voxels), together with the corresponding comparison method and three-dimensional image comparison device, is described.
- In the third embodiment, a three-dimensional image comparison program capable of expressing the difference between three-dimensional images by the entire difference portion (by a color corresponding to the ratio) is described.
- In each case, the three-dimensional images are compared in units of voxels, and the differences between the three-dimensional images are displayed in a predetermined display format.
- FIG. 1 is a principle configuration diagram in the first embodiment.
- The three-dimensional image comparison program according to the first embodiment causes a computer to perform comparison and difference display of three-dimensional images.
- The comparison and difference display by the computer is executed in the following procedure, in which a computer compares a three-dimensional image B1, in which a design model has been converted into voxels, with a three-dimensional image B2, in which measured data has been converted into voxels, and displays the result in a predetermined display format.
- First, the computer generates a three-dimensional difference image VB1 by comparing the three-dimensional image B1 and the three-dimensional image B2, which have been converted into voxels in advance (step S1).
- Next, on the surface of the three-dimensional difference image VB1, the computer generates a fine three-dimensional difference image VB2 of the three-dimensional image B1 and the three-dimensional image B2 with finer accuracy than the comparison in step S1 (step S2).
- The computer then determines the display format of the difference portion of the three-dimensional difference image VB1 by comparing the fine three-dimensional difference image of the three-dimensional image B1 with that of the three-dimensional image B2 (step S3).
- This display format is, for example, an additive color mixture (RGB) based on the three primary colors red, green, and blue.
- Alternatively, the display format may be a subtractive color mixture (CMY) based on the three primary colors cyan, magenta, and yellow. If subtractive color mixing is used, black can be added in addition to the three primary colors cyan, magenta, and yellow.
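For reference, an additive RGB color relates to its subtractive CMY counterpart by complementing each channel; a minimal sketch, assuming normalized channels in the range 0..1:

```python
def rgb_to_cmy(r, g, b):
    # Subtractive CMY is the channel-wise complement of additive RGB.
    return (1.0 - r, 1.0 - g, 1.0 - b)

cyan_magenta_yellow = rgb_to_cmy(1.0, 0.0, 0.0)  # pure red in RGB
```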
- Finally, the computer displays the three-dimensional difference image VB1, whose display format has been determined, on a display or the like in the predetermined display format as the voxel comparison result (step S4).
- The voxel comparison result can be output not only to a display or the like but also to a printer or a plotter.
- In this way, the computer compares the three-dimensional image B1, converted into voxels in advance, with the three-dimensional image B2 to generate the three-dimensional difference image VB1.
- Next, on the surface of the three-dimensional difference image VB1, the computer generates the fine three-dimensional difference image VB2 of the three-dimensional image B1 and the three-dimensional image B2 with finer precision than the comparison in step S1.
- The computer then compares the fine three-dimensional difference image of the three-dimensional image B1 with that of the three-dimensional image B2 to determine the display format of the difference portion of the three-dimensional difference image VB1.
- Finally, the computer displays the three-dimensional difference image VB1, whose display format has been determined, on a display or the like in the predetermined display format as the result of the voxel comparison. This makes it possible to perform the comparison of three-dimensional images with high accuracy and at high speed.
- FIG. 2 is a conceptual diagram of a method for comparing three-dimensional images according to the first embodiment.
- In the comparison method of the first embodiment, as shown in FIG. 2, it is not possible to directly compare the surface data Im11, which is a design model, with the other surface data Im12. Therefore, in the comparison method according to the first embodiment, the volumes enclosed by the surface data Im11 and Im12 are converted into voxels to generate the voxels B11 and B12, respectively, and the comparison is performed between the voxel B11 and the voxel B12. This makes it possible to easily compare the two three-dimensional images.
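Voxelizing the volume enclosed by a surface can be sketched by sampling cell centers against an inside/outside test. The predicate-based interface below is an illustrative assumption (a real implementation would derive the test from the CAD surface or measured data):

```python
def voxelize(inside, bounds, pitch):
    # Sample the bounding box on a grid of the given pitch and keep the
    # cells whose centre lies inside the enclosed volume.
    (x0, x1), (y0, y1), (z0, z1) = bounds
    nx = round((x1 - x0) / pitch)
    ny = round((y1 - y0) / pitch)
    nz = round((z1 - z0) / pitch)
    vox = set()
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                cx = x0 + (i + 0.5) * pitch
                cy = y0 + (j + 0.5) * pitch
                cz = z0 + (k + 0.5) * pitch
                if inside(cx, cy, cz):
                    vox.add((i, j, k))
    return vox

# A unit cube sampled at pitch 0.5 yields 2 x 2 x 2 = 8 voxels.
cube = voxelize(lambda x, y, z: True,
                ((0.0, 1.0), (0.0, 1.0), (0.0, 1.0)), 0.5)
```

The pitch corresponds to the measurement accuracy described later, which sets the minimum voxel unit.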
- FIG. 3 is a diagram illustrating an example of a hardware configuration of a computer that can execute a three-dimensional image comparison program.
- the entire computer 100 is controlled by a CPU (Central Processing Unit) 101.
- The CPU 101 is connected, via a bus 107, to a random access memory (RAM) 102, a hard disk drive (HDD) 103, a graphic processing device 104, an input interface 105, and a communication interface 106.
- The RAM 102 temporarily stores at least a part of the OS (Operating System) program and the application programs to be executed by the CPU 101. The RAM 102 also stores various data required for processing by the CPU 101.
- the HDD 103 stores an OS and application programs such as a three-dimensional image comparison program.
- A monitor P111 is connected to the graphic processing device 104, and the graphic processing device 104 displays images on the screen of the monitor P111 according to commands from the CPU 101.
- A keyboard P112 and a mouse P113 are connected to the input interface 105. The input interface 105 transmits the signals sent from the keyboard P112 or the mouse P113 to the CPU 101 via the bus 107.
- Communication interface 106 is connected to network 110.
- the communication interface 106 transmits and receives data to and from another computer via the network 110.
- By causing the computer 100 having the above hardware configuration to execute the three-dimensional image comparison program, the computer 100 functions as a three-dimensional image comparison device.
- the processing functions of the three-dimensional image comparison device realized by the computer executing the three-dimensional image comparison program will be described below.
- FIG. 4 is a functional block diagram showing a functional configuration of the three-dimensional image comparison device.
- The three-dimensional image comparison device includes a comparison information input unit 10 for inputting comparison information, a voxel processing unit 20 for generating voxels from the input three-dimensional images, an image alignment processing unit 30 for aligning the voxelized three-dimensional images, a voxel difference evaluation processing unit 40 for evaluating and displaying voxel differences, and a voxel superimposition processing unit 50 for superimposing the displayed voxels.
- In addition to its own function, the comparison information input unit 10 has a measurement accuracy input unit 11, a measurement accuracy storage unit 12, a three-dimensional surface input unit 13, an image storage unit 14, a point cloud processing switching section 15, a point cloud (surface) input section 16, a point cloud (surface) storage section 17, a point cloud (volume) input section 18, and a point cloud (volume) storage section 19.
- In addition to its own function, the voxel processing section 20 has a voxel generating section 21, a voxel generating section 22, and a voxel storage section 23. The details of these functions are described below.
- The comparison information input unit 10 is connected to the voxel processing unit 20, the three-dimensional CAD image DB 11, and the three-dimensional measurement image DB 12. The comparison information input unit 10 accepts a user's instruction to input a three-dimensional image and determines the image type of the received three-dimensional image. Based on that type, it determines whether the three-dimensional image is to be stored in the image storage unit 14, the point cloud (surface) storage unit 17, or the point cloud (volume) storage unit 19.
- the measurement accuracy input unit 11 is connected to the measurement accuracy storage unit 12 and receives an input of the measurement accuracy by the user.
- The measurement accuracy can be freely determined by the user, and is the minimum unit of the three-dimensional image to be converted into voxels.
- As the measurement accuracy, it is preferable to use, for example, the required accuracy of the product to be manufactured (0.01 mm or the like).
- the measurement accuracy storage unit 12 is connected to the measurement accuracy input unit 11 and stores the measurement accuracy received by the measurement accuracy input unit 11.
- the three-dimensional surface input unit 13 is connected to the image storage unit 14 and the three-dimensional CAD image DB 11, and if the image type of the three-dimensional image specified by the user is a three-dimensional surface. For example, input of a three-dimensional surface is received from the three-dimensional CAD image DB 11.
- the image storage unit 14 is connected to the three-dimensional surface input unit 13 and stores the three-dimensional surface received by the three-dimensional surface input unit 13 in an internal storage device.
- The point cloud processing switching section 15 is connected to the point cloud (surface) input section 16 and the point cloud (volume) input section 18, and switches the point cloud processing.
- Depending on the image type, the point cloud processing switching unit 15 switches between the point cloud (surface) input section 16 and the point cloud (volume) input section 18.
- the point cloud (surface) input unit 16 is connected to the point cloud processing switching unit 15, point cloud (surface) storage unit 17, and three-dimensional measurement image DB 12, and accepts input of measurement surface data.
- the point cloud (surface) input unit 16 receives the input of the measurement surface data from the three-dimensional measurement image DB 12 if the image type of the three-dimensional image specified by the user is the measurement surface data.
- the point cloud (surface) storage unit 17 is connected to the point cloud (surface) input unit 16 and stores the measured surface data.
- the point cloud (surface) storage unit 17 stores the measured surface data received by the point cloud (surface) input unit 16 in an internal storage device.
- The point cloud (volume) input unit 18 is connected to the point cloud processing switching unit 15, the point cloud (volume) storage unit 19, and the three-dimensional measurement image DB 12, and accepts input of measurement cross-section data.
- the point cloud (volume) input unit 18 accepts input of measured cross-sectional data from the three-dimensional measured image DB 12 if the image type of the three-dimensional image specified by the user is the measured cross-sectional data.
- the point cloud (volume) storage unit 19 is connected to the point cloud (volume) input unit 18 and stores the measured section data.
- the point cloud (volume) storage unit 19 stores the measured cross-sectional data received by the point cloud (volume) input unit 18 in an internal storage device.
- The voxel processing unit 20 is connected to the comparison information input unit 10 and the image registration processing unit 30, and generates a voxel image from the three-dimensional image input by the comparison information input unit 10.
- First, the voxel processing unit 20 selects the comparison information recorded in the comparison information input unit 10.
- Next, the voxel processing unit 20 determines the voxel size based on the measurement accuracy recorded in the comparison information input unit 10. The voxel processing unit 20 then receives the image type of the three-dimensional image stored in the comparison information input unit 10 and determines the next process.
- If the image type is a three-dimensional surface, the voxel processing unit 20 selects and reads the three-dimensional surface from the image storage unit 14 of the comparison information input unit 10 and passes it to the voxel generator 21. If the image type is measurement surface data, the voxel processing unit 20 selects and reads the measurement surface data from the point cloud (surface) storage unit 17 of the comparison information input unit 10 and passes it to the voxel generator 22.
- If the image type is measurement cross-section data, the voxel processing section 20 selects and reads the measurement cross-section data from the point cloud (volume) storage section 19 of the comparison information input section 10 and passes it to the voxel generator 22.
- The voxel processing section 20 then determines whether or not the voxels have been generated by the voxel generating section 21 and the voxel generating section 22.
- The voxel generator 21 generates a voxel image based on the passed three-dimensional surface.
- The voxel generator 22 generates a voxel image based on the passed measurement surface data. The voxel generating section 22 also generates a voxel image based on the passed measurement cross-section data.
- The image alignment processing unit 30 is connected to the voxel processing unit 20 and the voxel difference evaluation processing unit 40, and aligns the two images converted into voxels by the voxel processing unit 20.
- the image registration processing unit 30 selects a reference point that has been added to the two three-dimensional images in advance.
- the two three-dimensional images refer to a three-dimensional surface that is CAD data, and measurement surface data or measurement cross-section data that is measurement data. Further, the image registration processing unit 30 calculates an offset from the selected reference point. Further, based on the calculated offset, the image registration processing unit 30 moves the three-dimensional surface converted into the poxels. Then, the image registration processing unit 30 stores the moved three-dimensional surface in an internal storage device.
- the poxel difference evaluation processing unit 40 is connected to the image alignment processing unit 30 and the poxel overlay processing unit 50, compares and evaluates the two images aligned by the image alignment processing unit 30, and displays the two evaluated images in a predetermined display format.
- the predetermined display format is, for example, a format in which the depth of each RGB color is varied according to the difference between the images.
- the poxel difference evaluation processing unit 40 selects a surface poxel. With this selection, the poxel difference evaluation processing unit 40 determines whether the selected poxel is a surface poxel. Then, if a surface poxel is selected by this determination, the poxel difference evaluation processing unit 40 performs re-poxelization with the evaluation accuracy. That is, according to the evaluation accuracy, the surface poxels are subdivided into finer poxels.
- the evaluation accuracy can be determined in advance by the user. In addition, the evaluation accuracy can be refined stepwise from several times the measurement accuracy.
- the poxel difference evaluation processing unit 40 compares and counts the difference poxels based on the poxelized three-dimensional surface and the measurement surface/cross-section data. The details of this comparison processing will be described later.
- the poxel difference evaluation processing section 40 determines the color depth as a display format based on the counted difference poxels, and colors the poxels.
- the poxel difference evaluation processing unit 40 determines whether or not the poxels are colored. That is, the poxel difference evaluation processing unit 40 determines whether or not the three-dimensional images converted into the poxels have been compared.
- the poxel overlay processing section 50 is connected to the poxel difference evaluation processing section 40, and overlays a plurality of images displayed by the poxel difference evaluation processing section 40.
- the three-dimensional CAD image DB 11 is connected to the comparison information input unit 10 and stores a three-dimensional surface that is CAD data.
- the three-dimensional measurement image DB12 is connected to the comparison information input unit 10, and stores measurement surface data and measurement cross-section data that are measurement data.
- the calculation time is reduced by comparing the three-dimensional shapes with the poxels.
- the image is divided into fine and uniform cubes (poxels), the inside of which is turned on, and the outside is turned off.
- poxels: fine and uniform cubes
- in an image measured by an X-ray CT scanner, poxels with high X-ray density are turned on, and those with low X-ray density are turned off.
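As an illustrative sketch only (the patent gives no code), the on/off poxel representation described above can be modeled as a set of "on" cube coordinates; the grid size, poxel edge length, and ball-shaped test object below are hypothetical choices for the example.

```python
def poxelize(inside, size, n):
    """Divide an n x n x n region into uniform cubes (poxels) of edge
    `size`; a poxel is 'on' when its center lies inside the shape."""
    on = set()
    for i in range(n):
        for j in range(n):
            for k in range(n):
                center = ((i + 0.5) * size, (j + 0.5) * size, (k + 0.5) * size)
                if inside(center):
                    on.add((i, j, k))
    return on

# Hypothetical example shape: a ball of radius 2.5 in a 5 x 5 x 5 unit grid
def ball(p):
    return (p[0] - 2.5) ** 2 + (p[1] - 2.5) ** 2 + (p[2] - 2.5) ** 2 <= 2.5 ** 2

image = poxelize(ball, 1.0, 5)
```

Here a poxel is on exactly when its center lies inside the shape; any other inclusion rule (such as partial overlap) would fit the same scheme.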
- FIG. 5 is a diagram showing an outline of the comparison processing of the three-dimensional image comparison program.
- the comparison process converts the two images into poxels, as in the three-dimensional image B21 and the three-dimensional image B22, and obtains the non-matching part (3D difference image VB21) by a Boolean operation.
- the difference can be expressed visually.
- the computational complexity of the comparison is about one several-tenth (1/several tens) of that of the conventional method.
- non-matching poxels can be extracted by performing an exclusive OR (XOR) between the poxels of the three-dimensional image B21 and the poxels of the three-dimensional image B22.
- XOR: exclusive OR
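The XOR extraction can be sketched as follows; representing each poxelized image as a set of "on" coordinates is an assumption made for illustration, under which XOR is simply the symmetric difference of the two sets.

```python
def difference_image(image_a, image_b):
    """Non-matching poxels: on in exactly one of the two images
    (exclusive OR). With set-of-coordinates poxel images, XOR is
    the symmetric difference operator ^."""
    return image_a ^ image_b

# Two small hypothetical poxel images sharing one 'on' poxel
b21 = {(0, 0, 0), (1, 0, 0)}
b22 = {(1, 0, 0), (2, 0, 0)}
vb21 = difference_image(b21, b22)  # the three-dimensional difference image
```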
- FIG. 6 is a diagram showing a procedure for comparing the compared three-dimensional images with finer poxels.
- a coarse poxel (3D difference image VB21) is generated.
- the poxel including the surface is finely divided (three-dimensional difference fine image VB22), and the difference between the two images is evaluated based on the number of difference poxels.
- in the illustrated example, the difference between the two images is "8", that is, the number of differing fine poxels in the three-dimensional difference fine image VB22 for the coarse-poxel images I21 and I22.
- the color depth of the poxels is determined based on the number of differences.
- the overall difference becomes visually understandable.
- similarly, the image I22 can be determined to contain a difference.
- for the poxel BX16, since the color of this portion (for example, a depth of red) is determined, it can be seen that the image I21 contains a difference.
- where the color of a part is not determined, it can be concluded that there is no difference in that part.
- the poxels are further subdivided into poxels whose side length is 1/4 that of the original.
- the difference per coarse poxel is thus evaluated with up to 64 (4 × 4 × 4) fine poxels.
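A minimal sketch of this subdivision, assuming set-of-coordinates poxel images and a fixed subdivision factor of 4 (both illustrative choices):

```python
def refine(coarse, factor=4):
    """All fine-poxel indices inside one coarse poxel when each side is
    subdivided by `factor` (64 fine poxels for factor 4)."""
    i, j, k = coarse
    return {(i * factor + di, j * factor + dj, k * factor + dk)
            for di in range(factor)
            for dj in range(factor)
            for dk in range(factor)}

def count_difference(coarse, fine_a, fine_b, factor=4):
    """Number of differing fine poxels (XOR) that fall inside one
    coarse surface poxel."""
    return len((fine_a ^ fine_b) & refine(coarse, factor))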
- FIG. 7 is a diagram showing color determination for a common poxel in a method of setting a color determined by the number of poxel differences.
- the brightness is set to the maximum value (for example, 255, 255, 255) for the three RGB colors at the setting position ST1.
- FIG. 8 is a diagram showing color determination for a difference poxel (reference image) in a method of setting a color determined by the number of poxel differences.
- FIG. 9 is a diagram showing color determination for a difference poxel (comparison target image) in a method of setting a color determined by the number of poxel differences.
- the color values are determined in proportion to the number of difference poxels in each image, as shown in FIGS. 8 and 9.
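One way to realize such a proportional color, sketched under the assumption of a simple linear mapping (the patent's exact color scale may differ):

```python
def poxel_color(diff_count, max_count, keep='R'):
    """Color a poxel so that the channel `keep` stays at full brightness
    and the other two darken in proportion to the difference count:
    zero differences gives white (255, 255, 255), the maximum count a
    fully saturated color. The linear mapping is an assumption."""
    depth = 255 - round(255 * diff_count / max_count)
    return tuple(255 if c == keep else depth for c in 'RGB')
```

With this mapping a common (zero-difference) poxel is white, while a reference-image difference could use red depth and a comparison-target difference another channel, matching the split of FIGS. 8 and 9.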
- FIG. 10 is a diagram showing the relationship between the element numbers of nodes and nodes.
- the poxels converted from the three-dimensional image are, for example, eight poxels BX1 to BX8.
- the poxel BX1 is composed of nodes (1, 2, 5, 4, 10, 11, 14, 13).
- the node numbers are counted counterclockwise on each of the upper and lower surfaces.
- coordinate information is registered in the poxel data structure for each such node. The details of this coordinate information will be described with reference to FIG. 11 and FIG. 12.
- FIG. 11 is a diagram showing a general data structure at a node of a three-dimensional image.
- FIG. 11 shows a data structure in a typical finite element analysis.
- the data structure D1 of the poxels comprises a data type area D11 indicating the data type, a data number area D12 indicating a serial number within each data type, and a position information area D13 indicating position information.
- the data type area D11 takes one of two items, "GRID" and "CHEXA", where "GRID" indicates a node and "CHEXA" indicates an element obtained by dividing the 3D image.
- in the data number area D12, the data type "GRID" is assigned serial numbers 1 to 27.
- the serial numbers 1 to 27 are the node numbers described above with reference to FIG. 10.
- in the data number area D12, as shown in FIG. 11, the data type "CHEXA" is assigned serial numbers 1 to 8.
- these serial numbers 1 to 8 are the element numbers of the poxels described above with reference to FIG. 10.
- in the position information area D13, for example, the coordinate information "0.000000, 0.000000, 50.00000" is set for data number 1 of the data type "GRID". That is, for each data number of "GRID", the x, y, and z coordinates of the node are set as floating-point numbers, as shown in FIG. 11.
- likewise, for data number 1 of the data type "CHEXA", the node IDs "17, 21, 27, 22, 7, 10, 26, 1" of the element are set. That is, for each data number of "CHEXA", the eight node IDs composing the element are set.
- FIG. 12 is a diagram showing the data structure of the poxel method at the nodes of the poxels.
- FIG. 12 shows the data structure in the case of expressing three dimensions by the poxel method.
- the poxel data structure D2 sets the size of the element and the node of each element for each of the predefined records.
- as the size of the element, the lengths of the sides in x, y, and z are represented as floating-point numbers. The nodes that define each element are set as the nodes of each element. The element IDs described above need not be defined, because the elements are regularly arranged.
- the record (RECORD 7) has an element size expressed in exponential (floating-point) notation, "2.500000e+001, 2.500000e+001, 2.500000e+001".
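A hedged sketch of reading such records; the comma-separated layout below is a simplification for illustration, not the exact record format of FIGS. 11 and 12.

```python
def parse_record(line):
    """Parse a simplified comma-separated record: data type, data
    number, then position information (GRID: x, y, z coordinates as
    floats; CHEXA: the eight node IDs of the element)."""
    fields = [f.strip() for f in line.split(',')]
    dtype, number = fields[0], int(fields[1])
    if dtype == 'GRID':
        return dtype, number, tuple(float(v) for v in fields[2:5])
    if dtype == 'CHEXA':
        return dtype, number, tuple(int(v) for v in fields[2:10])
    raise ValueError('unknown data type: ' + dtype)

grid = parse_record('GRID, 1, 0.000000, 0.000000, 50.00000')
hexa = parse_record('CHEXA, 1, 17, 21, 27, 22, 7, 10, 26, 1')
```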
- FIG. 13 is a flowchart showing the overall flow of the three-dimensional image comparison processing. This process is started when the three-dimensional image comparison device is activated by turning on the power, by starting a program, or at any predetermined timing, and is executed by the CPU 101. Hereinafter, the processing illustrated in FIG. 13 will be described along the step numbers. The functional units named in this flowchart are those of FIG. 4.
- Step S10 The comparison information input unit 10 performs a comparison information input process of inputting the CAD data and the measurement data of three-dimensional images, which are the comparison information. The details of the comparison information input processing will be described later with reference to FIG. 14.
- Step S20 The poxel processing unit 20 performs a poxel process of generating images using the poxels from the three-dimensional images input in step S10. The details of the poxel processing will be described later with reference to FIG. 15.
- Step S30 The image alignment processing unit 30 performs image alignment processing to align the two images converted into poxels in step S20. The details of the image registration process will be described later with reference to FIG. 16.
- Step S40 The poxel difference evaluation processing unit 40 performs an image comparison process of comparing and evaluating the two images aligned in step S30. The details of the image comparison process will be described later with reference to FIG. 17.
- the poxel difference evaluation processing unit 40 displays the plurality of images compared and evaluated in step S40 in a predetermined display format.
- the predetermined display format is, for example, a format in which the depth of each RGB color is varied according to the difference between the images.
- the poxel overlay processing unit 50 overlays the plurality of images displayed in step S50.
- the comparison result is superimposed on the 3D image.
- a three-dimensional difference image, which is the comparison result, is generated by an exclusive OR of the two poxelized images, so that only the difference part can be visualized.
- the poxel superimposition processing unit 50 superimposes the two poxelized images and the three-dimensional difference image (or the three-dimensional difference fine image), and can display the original images and the difference part in different colors.
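The overlay step can be sketched as follows, again assuming set-based poxel images; the color names are placeholders for the different display colors.

```python
def overlay(image_a, image_b):
    """Superimpose two poxelized images: the difference part (XOR) in
    one color, the matching original part in another. The specific
    colors are illustrative, not those of the patent."""
    diff = image_a ^ image_b
    colours = {}
    for p in image_a | image_b:
        colours[p] = 'red' if p in diff else 'grey'
    return colours

view = overlay({(0, 0, 0), (1, 0, 0)}, {(1, 0, 0), (2, 0, 0)})
```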
- FIG. 14 is a flowchart showing comparison information input processing in the three-dimensional image comparison processing of FIG.
- in step S10 (comparison information input processing) of FIG. 13, processing is performed according to the following flow. This comparison information input processing is performed by the comparison information input unit 10 of the three-dimensional image comparison device.
- Step S101 The measurement accuracy input unit 11 of the comparison information input unit 10 accepts the input of the measurement accuracy by the user.
- Step S102 The measurement accuracy storage unit 12 of the comparison information input unit 10 stores the measurement accuracy received in step S101.
- Step S103 The comparison information input unit 10 receives the user's input instruction of a three-dimensional image.
- Step S104 The comparison information input unit 10 determines the image type of the three-dimensional image accepted in step S103.
- here, the comparison information input unit 10 proceeds to step S105 if the image type is a three-dimensional surface, to step S107 if the image type is measurement surface data, and to step S109 if the image type is measurement cross-section data. If the image type is measurement surface data or measurement cross-section data, the point group processing switching section 15 of the comparison information input section 10 switches to the point group (surface) input section 16 or the point group (volume) input section 18, respectively.
- Step S105 Since the image type is a three-dimensional surface in step S104, the 3D surface input unit 13 of the comparison information input unit 10 accepts input of a three-dimensional surface from the three-dimensional CAD image DB 11.
- Step S106 The image storage unit 14 of the comparison information input unit 10 stores the three-dimensional surface received in step S105 in an internal storage device.
- Step S107 Since the image type is measurement surface data in step S104, the point group (surface) input unit 16 of the comparison information input unit 10 accepts input of measurement surface data from the three-dimensional measurement image DB 12.
- Step S108 The point group (surface) storage unit 17 of the comparison information input unit 10 stores the measurement surface data received in step S107 in an internal storage device.
- Step S109 Since the image type is measurement cross-section data in step S104, the point group (volume) input part 18 of the comparison information input part 10 accepts input of measurement cross-section data from the three-dimensional measurement image DB 12.
- Step S110 The point cloud (volume) storage unit 19 of the comparison information input unit 10 stores the measured cross-sectional data received in step S109 in an internal storage device.
- Step S111 The comparison information input unit 10 determines whether a three-dimensional image has been stored in step S106, step S108, or step S110. Here, the comparison information input unit 10 returns to step S104 and repeats the same processing if a three-dimensional image has not been recorded. If a three-dimensional image has been recorded, this processing ends and the process returns to step S10 in FIG. 13.
- FIG. 15 is a flowchart showing the poxel processing in the three-dimensional image comparison processing of FIG.
- in step S20 (poxel processing) of FIG. 13, processing is performed according to the following flow. This poxel processing is performed by the poxel processing unit 20 of the three-dimensional image comparison device.
- Step S201 The poxel processing unit 20 selects the comparison information stored in step S10.
- Step S202 The poxel processing unit 20 determines the poxel size based on the measurement accuracy stored in step S10.
- Step S203 The poxel processing unit 20 receives the type of the three-dimensional image stored in step S10.
- Step S204 The poxel processing unit 20 determines the image type of the three-dimensional image received in step S203.
- here, the poxel processing unit 20 proceeds to step S205 if the image type is a three-dimensional surface, to step S207 if the image type is measurement surface data, and to step S209 if the image type is measurement cross-section data.
- Step S205 Since the image type is a three-dimensional surface in step S204, the poxel processing unit 20 selects and reads the three-dimensional surface from the image storage unit 14 of the comparison information input unit 10, and passes this three-dimensional surface to the poxel generator 21.
- Step S206 The poxel generation unit 21 of the poxel processing unit 20 generates a three-dimensional image using the poxels based on the three-dimensional surface passed in step S205.
- Step S207 Since the image type is measurement surface data in step S204, the poxel processing unit 20 selects and reads the measurement surface data from the point group (surface) storage unit 17 of the comparison information input unit 10, and passes this measurement surface data to the poxel generator 22.
- Step S208 The poxel generating unit 22 of the poxel processing unit 20 generates a three-dimensional image using the poxels based on the measurement surface data passed in step S207.
- Step S209 Since the image type is measurement cross-section data in step S204, the poxel processing unit 20 selects and reads the measurement cross-section data from the point group (volume) storage unit 19 of the comparison information input unit 10, and passes this measurement cross-section data to the poxel generator 22.
- Step S210 The poxel generation unit 22 of the poxel processing unit 20 generates a three-dimensional image using the poxels based on the measurement cross-sectional data passed in step S209.
- Step S211 The poxel processing section 20 determines whether or not three-dimensional images have been generated using the poxels. Here, if poxel images have not been generated, the process returns to step S204 and repeats the same processing. If poxel images have been generated, the poxel processing unit 20 ends this process and returns to step S20 in FIG. 13.
- FIG. 16 is a flowchart showing an image registration process in the three-dimensional image comparison process of FIG.
- in step S30 (image registration processing) of FIG. 13, processing is performed according to the following flow. Note that this image registration processing is performed by the image registration processing unit 30 of the three-dimensional image comparison device.
- Step S301 The image alignment processing unit 30 selects a reference point added in advance to the two three-dimensional images.
- the two three-dimensional images refer to a three-dimensional surface that is CAD data, and measurement surface data or measurement cross-section data that is measurement data.
- Step S302 The image registration processing unit 30 calculates an offset from the reference point selected in step S301.
- Step S303 The image alignment processing unit 30 moves the poxel-transformed three-dimensional image based on the offset calculated in step S302.
- Step S304 The image registration processing unit 30 stores the three-dimensional image moved in step S303 in an internal storage device, ends this processing, and returns to step S30 in FIG. 13.
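Steps S301 to S304 can be sketched in miniature as follows; whole-poxel (integer) offsets and set-based poxel images are simplifying assumptions made for illustration.

```python
def align(image, ref_src, ref_dst):
    """Compute the offset between the two reference points (step S302)
    and translate the poxelized image by that offset (step S303)."""
    offset = tuple(d - s for s, d in zip(ref_src, ref_dst))
    return {(i + offset[0], j + offset[1], k + offset[2])
            for (i, j, k) in image}

# Move a two-poxel image so its reference point (0,0,0) lands on (1,2,3)
moved = align({(0, 0, 0), (1, 0, 0)}, (0, 0, 0), (1, 2, 3))
```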
- FIG. 17 is a flowchart showing a first image comparison process in the three-dimensional image comparison process of FIG.
- in step S40 (image comparison processing) of FIG. 13, processing is performed according to the following flow. Note that this first image comparison processing is performed by the poxel difference evaluation processing unit 40 of the three-dimensional image comparison device.
- Step S411 The poxel difference evaluation processing section 40 selects a surface poxel.
- the surface poxels can be identified easily because, in a poxelized three-dimensional image, the poxels inside the shape are on and those outside are off.
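That on/off test can be sketched like this; checking the six face neighbours is an adjacency assumption made for illustration.

```python
def surface_poxels(image):
    """An 'on' poxel is a surface poxel when at least one of its six
    face neighbours is 'off' (absent from the set)."""
    neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                  (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    return {p for p in image
            if any((p[0] + dx, p[1] + dy, p[2] + dz) not in image
                   for dx, dy, dz in neighbours)}

# A solid 3 x 3 x 3 cube: every poxel except the centre is on the surface
cube = {(i, j, k) for i in range(3) for j in range(3) for k in range(3)}
```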
- Step S412 The poxel difference evaluation processing section 40 determines whether the poxel selected in step S411 is a surface poxel. Here, the poxel difference evaluation processing unit 40 proceeds to step S413 if a surface poxel is selected, and returns to step S411 if a surface poxel is not selected.
- Step S413 Since a surface poxel has been selected in step S412, the poxel difference evaluation processing section 40 performs re-poxelization with the evaluation accuracy. That is, the surface poxels are further refined according to the evaluation accuracy.
- Step S414 The poxel difference evaluation processing unit 40 compares and counts the difference poxels based on the poxelized three-dimensional surface and the measurement surface/cross-section data.
- Step S415 The poxel difference evaluation processing section 40 determines the color depth of the display format based on the difference poxels counted in step S414, and colors the poxels.
- Step S416 The poxel difference evaluation processing section 40 determines whether or not the poxels have been colored in step S415. That is, the poxel difference evaluation processing unit 40 determines whether or not the poxelized three-dimensional images have been compared. Here, if the poxels are not colored, the poxel difference evaluation processing unit 40 returns to step S411 and repeats the same processing. If the poxels are colored, this process ends and returns to step S40 in FIG. 13.
- first, the computer compares the three-dimensional image B21, previously converted into poxels, with the three-dimensional image B22 to generate the three-dimensional difference image VB21.
- next, for the surface of the three-dimensional difference image VB21, the computer generates the three-dimensional difference fine image VB22 between the three-dimensional image B21 and the three-dimensional image B22 with higher precision than in the previous comparison.
- the computer compares the three-dimensional difference fine image of the three-dimensional image B21 with that of the three-dimensional image B22, and determines the display format of the difference part of the three-dimensional difference image VB21. In this determination, the color is determined according to the number of difference poxels.
- the computer displays the three-dimensional difference image VB21 whose display format has been determined on a display or the like in a predetermined display format as a result of the poxel comparison.
- This makes it possible to perform the comparison processing with high accuracy and high speed in comparison of three-dimensional images.
- since the image comparison target is a three-dimensional poxel image, it is not necessary to compare innumerable point clouds, which improves the efficiency of memory resources and reduces the number of operations.
- the number of operations can be significantly reduced.
- the coarse poxels on the 3D image surface are compared first, and a 3D difference fine image is generated only for the coarse poxels that show differences, so that the fine comparison is limited to those poxels.
- the second embodiment differs from the first embodiment in the form of the comparison processing. That is, in the second embodiment, three-dimensional images are compared on a per-poxel basis, and each surface of the reference poxel is displayed in a predetermined display format corresponding to the depth (number) of difference poxels from that surface.
- FIG. 18 is a diagram illustrating an outline of a comparison process of the three-dimensional image comparison program in the second embodiment.
- the comparison process in the second embodiment uses the poxels created in the first embodiment, and displays the difference in color for each face of the poxels in one of the three-dimensional images.
- the color is determined by the number of poxels in the vertical direction (depth) from each surface as follows.
- a three-dimensional image is compared using the poxels created in the first embodiment (step S1b).
- the mismatched part of the compared three-dimensional difference image VB21 is partially extracted (step S2b).
- the number of unmatched poxels (vertical direction) from each face (zy plane, zx plane, yx plane) of the three-dimensional shape is determined (step S3b).
- the computer compares the three-dimensional images using the poxels created in the first embodiment. Also, the mismatched part of the compared three-dimensional difference image VB21 is partially extracted.
- the computer counts the number of non-matching poxels in the vertical direction from each face (zy plane, zx plane, yx plane) of the three-dimensional shape with respect to the matching poxel (reference poxel BB1), and determines the display format.
- the computer displays each of the matching poxels (reference poxels) in a predetermined display format corresponding to the count number.
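A hedged sketch of that per-face depth count; the axis/direction encoding of the zy, zx, and yx faces and the set-based difference image are illustrative assumptions.

```python
def face_depth(ref, diff, axis, direction, limit=100):
    """Count consecutive difference poxels along one axis, starting
    from one face of the reference poxel `ref` (the depth perpendicular
    to that face)."""
    p = list(ref)
    depth = 0
    for _ in range(limit):
        p[axis] += direction
        if tuple(p) not in diff:
            break
        depth += 1
    return depth

# Hypothetical example: two difference poxels stacked on the +x face
depth = face_depth((0, 0, 0), {(1, 0, 0), (2, 0, 0)}, axis=0, direction=1)
```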
- the functions in the above outline differ from those of the function block diagram of FIG. 4 only in the function of the poxel difference evaluation processing unit 40. Therefore, in the second embodiment, only the new functions (including overlapping ones) of the poxel difference evaluation processing unit 40 will be described.
- in FIG. 18, the poxel on the surface that is adjacent to and coincides with the non-matching portion is described as the reference poxel; in the following description, however, a surface poxel adjacent to and coinciding with the difference portion is selected and a range is determined, and the poxel obtained when this range is re-poxelized with the evaluation accuracy is set as the reference poxel.
- a new function of the poxel difference evaluation processing unit 40 is to perform an image comparison process of comparing the two images aligned by the image alignment processing unit 30, and to display the two compared images in a predetermined display format.
- the predetermined display format is, for example, a format in which the depth of each RGB color is varied according to the difference between the images.
- the poxel difference evaluation processing unit 40 selects a surface poxel. With this selection, the poxel difference evaluation processing unit 40 determines whether the selected poxel is a surface poxel. Then, if a surface poxel is selected by this determination, the poxel difference evaluation processing unit 40 determines the range of re-poxelization. That is, the part where the image does not match is determined as the range to be investigated in detail.
- the poxel difference evaluation processing unit 40 re-poxelizes the determined range with the evaluation accuracy. That is, the surface poxels for which the range has been determined are further refined according to the evaluation accuracy. Then, the poxel difference evaluation processing unit 40 counts the difference poxels based on the three-dimensional surface converted into fine poxels and the measurement surface/cross-section data. In other words, this corresponds to counting the number of non-matching poxels (difference poxels) from each face (zy plane, zx plane, yx plane) of the three-dimensional shape of the reference poxel with respect to the fine poxels of the matching surface (reference poxels).
- the poxel difference evaluation processing section 40 determines the color depth as a display format based on the counted difference poxels, and colors the poxels. That is, the poxel difference evaluation processing unit 40 determines the display format (color) of each face of the reference poxel with respect to the fine poxels (reference poxels) of the matching surfaces.
- the poxel difference evaluation processing unit 40 determines whether or not the poxels are colored. That is, the poxel difference evaluation processing unit 40 determines whether or not the three-dimensional images converted into the poxels have been compared.
- FIG. 19 is a flowchart showing a second image comparison process in the three-dimensional image comparison process of FIG.
- in step S40 (image comparison processing) of FIG. 13, processing is performed according to the following flow. Note that this second image comparison processing is performed by the poxel difference evaluation processing unit 40 of the three-dimensional image comparison device.
- in FIG. 18, the poxel on the surface adjacent to and coinciding with the non-matching (difference) portion is described as the reference poxel; in the following description, however, a surface poxel adjacent to and coinciding with the difference portion is selected and a range is determined, and the poxel obtained when this range is re-poxelized with the evaluation accuracy is set as the reference poxel.
- Step S421 The poxel difference evaluation processing section 40 selects a surface poxel.
- the surface poxels can be identified easily because, in a poxelized three-dimensional image, the poxels inside the shape are on and those outside are off.
- Step S422 The poxel difference evaluation processing unit 40 determines whether the poxel selected in step S421 is a surface poxel. Here, the poxel difference evaluation processing unit 40 proceeds to step S423 if a surface poxel is selected, and returns to step S421 if a surface poxel is not selected.
- Step S423 Since a surface poxel has been selected in step S422, the poxel difference evaluation processing section 40 determines the range of re-poxelization. That is, the part where the images do not match is determined as the range to be investigated in detail.
- Step S424 The poxel difference evaluation processing unit 40 performs re-poxelization with the evaluation accuracy on the range determined in step S423. In other words, the surface poxels for which the range has been determined are further refined into poxels according to the evaluation accuracy.
- Step S425 The poxel difference evaluation processing unit 40 counts the difference poxels again based on the poxelized three-dimensional surface and the measurement surface/cross-section data. That is, with respect to the matching surface poxel (reference poxel), the number of non-matching poxels (the depth of the number of difference poxels) is counted from each face (zy plane, zx plane, yx plane) of the three-dimensional shape of the reference poxel.
- Step S426 The poxel difference evaluation processing unit 40 determines the color depth, which is the display format, based on the difference poxels counted in step S425, and colors the poxels. That is, the poxel difference evaluation processing unit 40 determines the display format (color) of each face of the poxel with respect to the matching surface poxel (reference poxel).
- Step S427 The poxel difference evaluation processing section 40 determines whether or not the poxels have been colored in step S426. That is, the poxel difference evaluation processing unit 40 determines whether or not the poxelized three-dimensional images have been compared. Here, if the poxels are not colored, the poxel difference evaluation processing unit 40 returns to step S421 and repeats the same processing. If the poxels are colored, this process ends and returns to step S40 in FIG. 13.
- in order to reduce the computer resources (memory amount, disk area, and calculation time) required for the comparison, the comparison processing in the second embodiment replaces the poxels, including those in the depth direction, with a display in which the difference is shown by the color of each face of the three-dimensional shape. This reduces the computational resources to a fraction.
- the color that expresses the difference is given a depth proportional to the number of poxels in the depth direction. Furthermore, this depth is averaged based on the maximum value of the total depth.
- although the poxels on the surface that are adjacent to and coincide with the non-matching part are described as the reference poxels, and the poxels obtained when the determined range is re-poxelized with the evaluation accuracy are used as the reference poxels, the poxels on the matching surface may simply be used as the reference poxels instead.
- the third embodiment differs from the first embodiment in the form of the comparison processing. That is, in the third embodiment, three-dimensional images are compared on a per-poxel basis, and the poxels are displayed in a predetermined display format corresponding to the ratio between the number of surface poxels and the volume (total number of poxels) of each difference part of the three-dimensional image.
- FIG. 20 is a diagram showing a difference evaluation (1) of the comparison processing of the three-dimensional image comparison program in the third embodiment.
- The difference voxels generated in the first embodiment exist only in the parts where the two images differ.
- Each such difference part is composed of one or more voxels and forms one or more connected islands.
- The ratio of the number of surface voxels in an island to the total number of voxels in that island is displayed as a color.
- The color is determined as follows. In the following description, such an island is simply referred to as a difference part.
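- The grouping of difference voxels into connected islands can be sketched as follows (a minimal illustration using 6-connectivity and Python sets of (x, y, z) integer coordinates; the names are illustrative, not from the patent):

```python
from collections import deque

NEIGHBORS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def find_islands(diff_voxels):
    """Split the set of difference voxels into connected islands, where two
    voxels are connected if they share a face (6-connectivity)."""
    remaining = set(diff_voxels)
    islands = []
    while remaining:
        seed = remaining.pop()
        island, queue = {seed}, deque([seed])
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in NEIGHBORS:
                n = (x + dx, y + dy, z + dz)
                if n in remaining:
                    remaining.discard(n)
                    island.add(n)
                    queue.append(n)
        islands.append(island)
    return islands
```

Each returned island can then be evaluated independently, as in the examples below.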
- Here, a three-dimensional image A and an image B are compared using the voxels created in the first embodiment.
- The shape of the difference part is expressed as follows. • Volume: 12 voxels
- In this case, the ratio of the number of surface voxels / the total number of voxels was 25/12, indicating that the voxel difference of this difference part is shallow.
- This ratio of surface area to volume is assigned a color as shown in Fig. 21.
- FIG. 21 is a diagram showing colors (light colors) determined by the comparison processing of the three-dimensional image comparison program in the third embodiment.
- the colors for displaying the different portions are described using RGB as an example.
- Here, the blue (B) component is made variable to express the voxel difference of the entire difference part.
- The RGB values at setting position ST11 are 255 (R), 255 (G), and 223 (B).
- This 223 (B) gives a pale color, indicating that the voxel difference of this difference part is shallow. If it were, for example, 96 (B), the voxel difference across the difference part would be shown as deep.
- FIG. 22 shows the evaluation of the voxel difference when the difference is deep.
- FIG. 22 is a diagram showing a difference evaluation (2) of the comparison processing of the three-dimensional image comparison program in the third embodiment. Note that FIG. 22 describes a case where the voxel difference is deeper than in FIG. 20.
- Again, a three-dimensional image A and an image B are compared using the voxels created in the first embodiment.
- The shape of the difference part is expressed as follows. • Volume: 43 voxels
- In this case, the ratio of the number of surface voxels / the total number of voxels was 32/43, indicating that the voxel difference of this difference part is deep.
- This ratio of surface area to volume is assigned a color as shown in Fig. 23.
- FIG. 23 is a diagram showing colors (dark colors) determined by the comparison processing of the three-dimensional image comparison program in the third embodiment.
- the colors for displaying the different portions are described using RGB as an example.
- Here too, the blue (B) component is made variable to express the voxel difference of the entire difference part.
- The RGB values at setting position ST12 are 255 (R), 255 (G), and 96 (B). This 96 (B) gives a dark color, indicating that the voxel difference of the entire difference part is deep.
- FIG. 24 is a diagram showing the difference evaluation (3) of the comparison processing of the three-dimensional image comparison program in the third embodiment.
- The thick dividing lines in Fig. 24 indicate where the original difference voxels are subdivided into finer voxels for comparison.
- FIG. 24 shows a case where the voxel difference is evaluated in more detail than in FIGS. 20 and 22. The description of the difference evaluation method is omitted because the same processing as in FIGS. 20 and 22 is simply applied to the finer voxels.
- The functions outlined above differ from the first embodiment only in the voxel difference evaluation processing unit 40 of the functional block diagram. Therefore, in the third embodiment, only the new functions (including overlapping ones) of the voxel difference evaluation processing unit 40 are described.
- The surface voxels that are adjacent to and coincide with the non-coincidence part are selected, the range is determined, and the voxels obtained by re-voxelizing that range at the evaluation accuracy are used as the reference voxels.
- Alternatively, a surface voxel that is adjacent to and coincides with the non-coincidence part may be used as the reference voxel.
- Alternatively, a matching surface voxel may simply be used as the reference voxel.
- A new function of the voxel difference evaluation processing unit 40 is to compare and evaluate the two images aligned by the image alignment processing unit 30 and to display the two evaluated images in a predetermined display format.
- The predetermined display format is, for example, one in which the density of each RGB color is varied according to the difference between the images.
- The voxel difference evaluation processing unit 40 selects a surface voxel. With this selection, it determines whether the selected voxel is a surface voxel. Once a surface voxel has been selected, the unit determines the range for re-voxelization; that is, the part where the images do not match is determined as the range to be examined in detail.
- The voxel difference evaluation processing unit 40 then re-voxelizes the determined range at the evaluation accuracy; that is, the surface voxels whose range has been determined are subdivided into finer voxels according to the evaluation accuracy. The unit then counts the number of difference voxels based on the voxelized three-dimensional surface and the measured surface / cross-section data; that is, the ratio of the number of surface voxels in the entire difference part of the three-dimensional difference image to the total number of voxels is obtained.
- The voxel difference evaluation processing unit 40 determines the color density that serves as the display format based on the obtained difference voxel ratio, and colors the voxels. In other words, it determines the display format (color) of the entire difference part composed of unmatched voxels.
- The voxel difference evaluation processing unit 40 then determines whether the voxels have been colored; in other words, whether the voxelized three-dimensional images have been compared.
- FIG. 25 is a flowchart showing the third image comparison processing in the three-dimensional image comparison processing.
- Step S40: image comparison processing
- The processing of FIG. 25 proceeds along the following flow. Note that the third image comparison processing is performed by the voxel difference evaluation processing unit 40 of the three-dimensional image comparison device.
- Step S431: The voxel difference evaluation processing unit 40 selects a surface voxel.
- Whether a voxel is a surface voxel can easily be determined because, in a voxelized three-dimensional image, the voxels inside the surface are on and those outside are off, for example.
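- The on/off test above can be sketched as follows (a minimal illustration, assuming the solid is a Python set of on-voxel coordinates; names are illustrative, not from the patent):

```python
FACE_NEIGHBORS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def is_surface_voxel(v, solid):
    """A voxel is a surface voxel if it is on (inside the solid) and at
    least one of its six face-neighbors is off (outside the solid)."""
    if v not in solid:
        return False
    x, y, z = v
    return any((x + dx, y + dy, z + dz) not in solid
               for dx, dy, dz in FACE_NEIGHBORS)

# A voxelized 3x3x3 cube: only the center voxel is not on the surface
cube = {(x, y, z) for x in range(3) for y in range(3) for z in range(3)}
```

For the cube above, (1, 1, 1) is interior while every other on-voxel touches the outside.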
- Step S432: The voxel difference evaluation processing unit 40 determines whether the voxel selected in step S431 is a surface voxel. If a surface voxel has been selected, the unit proceeds to step S433; if not, it returns to step S431.
- Step S433: Since a surface voxel was selected in step S432, the voxel difference evaluation processing unit 40 determines the range for re-voxelization. That is, the part where the images do not match is determined as the range to be examined in detail.
- Step S434: The voxel difference evaluation processing unit 40 re-voxelizes the range determined in step S433 at the evaluation accuracy. In other words, the surface voxels whose range has been determined are subdivided into finer voxels according to the evaluation accuracy.
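- Re-voxelization at the evaluation accuracy can be sketched as follows (a minimal illustration in which each coarse voxel is split into factor³ finer voxels; the function name and the integer subdivision factor are illustrative assumptions):

```python
def revoxelize(voxels, factor):
    """Subdivide each voxel in the given range into factor**3 finer voxels,
    refining the measurement-accuracy grid down to the evaluation accuracy."""
    fine = set()
    for (x, y, z) in voxels:
        for dx in range(factor):
            for dy in range(factor):
                for dz in range(factor):
                    fine.add((x * factor + dx, y * factor + dy, z * factor + dz))
    return fine
```

The finer grid is then used for the counting in the following steps.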
- Step S435: The voxel difference evaluation processing unit 40 counts the number of difference voxels based on the voxelized three-dimensional surface and the measured surface / cross-section data. That is, the ratio of the number of surface voxels in the entire difference part of the three-dimensional difference image to the total number of voxels is obtained.
- Step S436: The voxel difference evaluation processing unit 40 determines the color density of the display format based on the ratio of difference voxels obtained in step S435, and colors the voxels. That is, the unit determines the display format (color) of the entire difference part composed of the unmatched voxels.
- Step S437: The voxel difference evaluation processing unit 40 determines whether the voxels were colored in step S436. That is, it determines whether the voxelized three-dimensional images have been compared. If the voxels are not colored, the unit returns to step S431 and repeats the same processing; if they are colored, the image comparison processing (step S40) ends.
- In this way, the comparison processing of the third embodiment reduces the computer resources required for comparison (memory, disk space, and computation time): instead of handling the voxels in the depth direction individually, the difference is displayed as the color of the entire difference part.
- Note that the ratio of the number of surface voxels / the total number of voxels is at most 2 and at least 0 (more precisely, a very small value close to 0). The color density is therefore set so that a ratio of 2 gives the lightest color and a ratio of 0 gives the darkest color.
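- A linear mapping of this ratio to the blue component can be sketched as follows (the darkest and lightest endpoint values are illustrative assumptions, not values fixed by the patent):

```python
def blue_channel(ratio, b_dark=64, b_light=255):
    """Map the surface/volume ratio (0 .. 2) to a blue value: a ratio of 2
    (a thin, shallow difference) gives the lightest blue, while a ratio near
    0 (a thick, deep difference) gives the darkest blue."""
    r = max(0.0, min(2.0, ratio))  # clamp to the valid range
    return int(round(b_dark + (r / 2.0) * (b_light - b_dark)))
```

With this mapping, a shallow difference such as the 25/12 example receives a lighter blue than a deep difference such as 32/43.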
- The size of a difference part itself can be grasped from the overall size of the part when it is displayed in three dimensions.
- computational resources are evaluated as follows.
- In the conventional method, finding one shortest distance requires tens of thousands of floating-point operations, and this is repeated for 100 billion points. Ultimately, the calculation takes about 30 hours, assuming a 1 GHz CPU that can perform 2 floating-point operations simultaneously.
- In contrast, the method according to the embodiment of the present invention merely divides the entire space containing the surfaces into voxels at the measurement accuracy and performs a logical operation on each voxel, which amounts to about 100 million floating-point operations. The same computer therefore takes about 25 seconds.
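- The per-voxel logical operation can be sketched as follows (a minimal illustration using Python sets of voxel coordinates, where the difference image is a per-voxel XOR, i.e. a symmetric difference of the occupied sets; names are illustrative, not from the patent):

```python
def voxel_difference(image_a, image_b):
    """One logical operation per voxel: a voxel belongs to the difference
    image if it is occupied in exactly one of the two voxelized images."""
    return image_a ^ image_b  # symmetric difference of the occupied voxel sets

# A 2x2x2 voxelized cube and the same cube shifted one voxel along x
a = {(x, y, z) for x in range(2) for y in range(2) for z in range(2)}
b = {(x + 1, y, z) for x in range(2) for y in range(2) for z in range(2)}
diff = voxel_difference(a, b)
```

Because the comparison is a single boolean operation per voxel rather than a shortest-distance search per point, the total operation count stays proportional to the number of voxels.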
- Thus, the method according to the first embodiment of the present invention greatly reduces the computer's processing time compared with the conventional method.
- Moreover, because the method divides the entire space into voxels at the measurement accuracy, the comparison can be calculated with the same accuracy as the conventional method.
- Computer-readable recording media include magnetic recording media, optical discs, magneto-optical recording media, and semiconductor memories. Magnetic recording media include hard disks, flexible disks (FD), ZIP (a type of magnetic disk), and magnetic tape.
- Optical discs include DVD (Digital Versatile Disc), DVD-RAM (DVD Random Access Memory), CD-ROM (Compact Disc Read Only Memory), CD-R (CD Recordable), and CD-RW (CD Rewritable).
- Magneto-optical recording media include MO (Magneto-Optical disk).
- Semiconductor memories include flash memory. When distributing computer programs, portable recording media such as DVDs and CD-ROMs on which the computer programs are recorded are sold.
- the computer program can be stored in the storage device of the server, and the program can be transferred from the server to the client via the network.
- As described above, according to the present invention, two voxelized three-dimensional images are compared, the surfaces involved in the comparison are regenerated at a still finer accuracy, and a display format determined from that comparison is used for display, so that three-dimensional images can be compared with high accuracy and at high speed.
- Likewise, a display format corresponding to the ratio obtained by dividing the surface area of the entire difference part of the three-dimensional image by its volume is determined, and displaying in that format enables three-dimensional images to be compared with high accuracy and at high speed.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Radiology & Medical Imaging (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Geometry (AREA)
- Image Generation (AREA)
- Image Processing (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2002/006621 WO2004003850A1 (ja) | 2002-06-28 | 2002-06-28 | 3次元イメージ比較プログラム、3次元イメージ比較方法、および3次元イメージ比較装置 |
KR1020047021408A KR100639139B1 (ko) | 2002-06-28 | 2002-06-28 | 3차원 이미지 비교 프로그램을 기록한 컴퓨터로 판독 가능한 기록 매체, 3차원 이미지 비교 방법 및 3차원 이미지 비교 장치 |
EP02736203A EP1519318A4 (en) | 2002-06-28 | 2002-06-28 | PROGRAM AND METHOD FOR COMPARING THREE-DIMENSIONAL IMAGE AND APPARATUS THEREFOR |
JP2004517224A JP4001600B2 (ja) | 2002-06-28 | 2002-06-28 | 3次元イメージ比較プログラム、3次元イメージ比較方法、および3次元イメージ比較装置 |
CNB028292359A CN100388317C (zh) | 2002-06-28 | 2002-06-28 | 三维图象的比较程序、比较方法及比较装置 |
US10/989,464 US20050068317A1 (en) | 2002-06-28 | 2004-11-17 | Program, method, and device for comparing three-dimensional images in voxel form |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2002/006621 WO2004003850A1 (ja) | 2002-06-28 | 2002-06-28 | 3次元イメージ比較プログラム、3次元イメージ比較方法、および3次元イメージ比較装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/989,464 Continuation US20050068317A1 (en) | 2002-06-28 | 2004-11-17 | Program, method, and device for comparing three-dimensional images in voxel form |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004003850A1 true WO2004003850A1 (ja) | 2004-01-08 |
Family
ID=29808175
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2002/006621 WO2004003850A1 (ja) | 2002-06-28 | 2002-06-28 | 3次元イメージ比較プログラム、3次元イメージ比較方法、および3次元イメージ比較装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20050068317A1 (ja) |
EP (1) | EP1519318A4 (ja) |
JP (1) | JP4001600B2 (ja) |
KR (1) | KR100639139B1 (ja) |
CN (1) | CN100388317C (ja) |
WO (1) | WO2004003850A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100419794C (zh) * | 2005-12-16 | 2008-09-17 | 北京中星微电子有限公司 | 一种图像问题的检测方法 |
JP2009104515A (ja) * | 2007-10-25 | 2009-05-14 | Fujitsu Ltd | 差異強調プログラム、差異強調処理方法及び差異強調処理装置 |
JP2012103144A (ja) * | 2010-11-11 | 2012-05-31 | Univ Of Tokyo | 3次元環境復元装置、3次元環境復元方法、及びロボット |
JP2016103263A (ja) * | 2015-08-31 | 2016-06-02 | 株式会社大林組 | 構造物状況把握支援装置、構造物状況把握支援方法及びプログラム |
JP2018005862A (ja) * | 2016-07-08 | 2018-01-11 | 富士通株式会社 | ファセット化処理プログラム、ファセット抽出プログラム、ファセット化処理方法、ファセット抽出方法および情報処理装置 |
JP2018008403A (ja) * | 2016-07-12 | 2018-01-18 | 学校法人慶應義塾 | 立体物製造装置、立体物製造方法及びプログラム |
JP2018181056A (ja) * | 2017-04-17 | 2018-11-15 | 富士通株式会社 | 差分検知プログラム、差分検知装置、差分検知方法 |
JP2021021672A (ja) * | 2019-07-30 | 2021-02-18 | 日本電気通信システム株式会社 | 距離計測装置、システム、方法、及びプログラム |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1669056A (zh) * | 2002-06-19 | 2005-09-14 | 西门子公司 | 3d数据组的平台交叉和针对数据的可视化 |
US7685024B2 (en) * | 2005-02-24 | 2010-03-23 | Dolphin Software Ltd. | System and method for computerized ordering |
CN101331380B (zh) * | 2005-12-16 | 2011-08-03 | 株式会社Ihi | 三维形状数据的存储/显示方法和装置以及三维形状的计测方法和装置 |
DE112006003363B4 (de) * | 2005-12-16 | 2016-05-04 | Ihi Corporation | Verfahren und Vorrichtung zur Identifizierung der Eigenposition, und Verfahren und Vorrichtung zur Messung einer dreidimensionalen Gestalt |
CN101331381B (zh) * | 2005-12-16 | 2011-08-24 | 株式会社Ihi | 三维形状数据的位置对准方法和装置 |
TWI409717B (zh) * | 2009-06-22 | 2013-09-21 | Chunghwa Picture Tubes Ltd | 適用於電腦產品與影像顯示裝置的影像轉換方法 |
CN107481217A (zh) * | 2009-06-24 | 2017-12-15 | 皇家飞利浦电子股份有限公司 | 对象内的植入设备的空间和形状表征 |
US20110307281A1 (en) * | 2010-06-11 | 2011-12-15 | Satterfield & Pontikes Construction, Inc. | Model inventory manager |
US8768029B2 (en) | 2010-10-20 | 2014-07-01 | Medtronic Navigation, Inc. | Selected image acquisition technique to optimize patient model construction |
US10373375B2 (en) * | 2011-04-08 | 2019-08-06 | Koninklijke Philips N.V. | Image processing system and method using device rotation |
GB201308866D0 (en) * | 2013-05-16 | 2013-07-03 | Siemens Medical Solutions | System and methods for efficient assessment of lesion developemnt |
US9715761B2 (en) | 2013-07-08 | 2017-07-25 | Vangogh Imaging, Inc. | Real-time 3D computer vision processing engine for object recognition, reconstruction, and analysis |
US9984498B2 (en) * | 2013-07-17 | 2018-05-29 | Microsoft Technology Licensing, Llc | Sparse GPU voxelization for 3D surface reconstruction |
US9710960B2 (en) | 2014-12-04 | 2017-07-18 | Vangogh Imaging, Inc. | Closed-form 3D model generation of non-rigid complex objects from incomplete and noisy scans |
US10007996B2 (en) * | 2015-03-02 | 2018-06-26 | Lawrence Livermore National Security, Llc | System for detecting objects in streaming 3D images formed from data acquired with a medium penetrating sensor |
AU2016250935B2 (en) * | 2015-04-20 | 2020-12-03 | Mars Bioimaging Limited | Improving material identification using multi-energy CT image data |
KR101644426B1 (ko) * | 2015-09-30 | 2016-08-02 | 상명대학교서울산학협력단 | 변형3d모델에 대응되는 원본3d모델 인식방법 |
WO2017200527A1 (en) * | 2016-05-16 | 2017-11-23 | Hewlett-Packard Development Company, L.P. | Generating a shape profile for a 3d object |
US10380762B2 (en) | 2016-10-07 | 2019-08-13 | Vangogh Imaging, Inc. | Real-time remote collaboration and virtual presence using simultaneous localization and mapping to construct a 3D model and update a scene based on sparse data |
WO2018075628A1 (en) | 2016-10-19 | 2018-04-26 | Shapeways, Inc. | Systems and methods for identifying three-dimensional printed objects |
EP3355279A1 (en) * | 2017-01-30 | 2018-08-01 | 3D Repo Ltd | Method and computer programs for identifying differences between 3-dimensional scenes |
US10839585B2 (en) | 2018-01-05 | 2020-11-17 | Vangogh Imaging, Inc. | 4D hologram: real-time remote avatar creation and animation control |
US11080540B2 (en) | 2018-03-20 | 2021-08-03 | Vangogh Imaging, Inc. | 3D vision processing using an IP block |
US10810783B2 (en) | 2018-04-03 | 2020-10-20 | Vangogh Imaging, Inc. | Dynamic real-time texture alignment for 3D models |
JP7084193B2 (ja) * | 2018-04-10 | 2022-06-14 | ザイオソフト株式会社 | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム |
US11170224B2 (en) | 2018-05-25 | 2021-11-09 | Vangogh Imaging, Inc. | Keyframe-based object scanning and tracking |
US10776949B2 (en) | 2018-10-30 | 2020-09-15 | Liberty Reach Inc. | Machine vision-based method and system for measuring 3D pose of a part or subassembly of parts |
US11232633B2 (en) | 2019-05-06 | 2022-01-25 | Vangogh Imaging, Inc. | 3D object capture and object reconstruction using edge cloud computing resources |
US11170552B2 (en) | 2019-05-06 | 2021-11-09 | Vangogh Imaging, Inc. | Remote visualization of three-dimensional (3D) animation with synchronized voice in real-time |
JP7376268B2 (ja) * | 2019-07-22 | 2023-11-08 | ファナック株式会社 | 三次元データ生成装置及びロボット制御システム |
US11335063B2 (en) | 2020-01-03 | 2022-05-17 | Vangogh Imaging, Inc. | Multiple maps for 3D object scanning and reconstruction |
KR102238911B1 (ko) * | 2020-04-23 | 2021-04-16 | 비아이밀리그램㈜ | 건축물의 공정진행상태 산출 장치, 건축물의 공정진행상태 산출 방법 및 건축물의 공정진행상태 산출시스템 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5273040A (en) * | 1991-11-14 | 1993-12-28 | Picker International, Inc. | Measurement of vetricle volumes with cardiac MRI |
US5301271A (en) * | 1991-07-11 | 1994-04-05 | Matsushita Electric Industrial Co., Ltd. | Image processing separately processing two-density-level image data and multi-level image data |
JPH0935058A (ja) * | 1995-07-17 | 1997-02-07 | Nec Corp | 画像認識方法 |
JPH09204532A (ja) * | 1996-01-25 | 1997-08-05 | Hitachi Ltd | 画像認識方法および画像表示方法 |
EP1054384A2 (en) * | 1999-05-20 | 2000-11-22 | Mitsubishi Denki Kabushiki Kaisha | Method and apparatus for translating and interfacing voxel memory addresses |
US6175371B1 (en) * | 1995-06-02 | 2001-01-16 | Philippe Schoulz | Process for transforming images into stereoscopic images, images and image series obtained by this process |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62219075A (ja) * | 1986-03-20 | 1987-09-26 | Hitachi Medical Corp | 三次元画像の半透明表示方法 |
JPH04276513A (ja) * | 1991-03-04 | 1992-10-01 | Matsushita Electric Ind Co Ltd | 形状測定処理方法 |
JPH0516435A (ja) * | 1991-07-11 | 1993-01-26 | Matsushita Electric Ind Co Ltd | 画像形成装置 |
AU6341396A (en) * | 1995-06-30 | 1997-02-05 | Ultrapointe Corporation | Method for characterizing defects on semiconductor wafers |
JP3564531B2 (ja) * | 1999-05-07 | 2004-09-15 | 株式会社モノリス | 画像領域の追跡方法および装置 |
DE10000185A1 (de) * | 2000-01-05 | 2001-07-12 | Philips Corp Intellectual Pty | Verfahren zur Darstellung des zeitlichen Verlaufs des Blutflusses in einem Untersuchungsobjekt |
-
2002
- 2002-06-28 WO PCT/JP2002/006621 patent/WO2004003850A1/ja active Application Filing
- 2002-06-28 EP EP02736203A patent/EP1519318A4/en not_active Withdrawn
- 2002-06-28 CN CNB028292359A patent/CN100388317C/zh not_active Expired - Fee Related
- 2002-06-28 KR KR1020047021408A patent/KR100639139B1/ko not_active IP Right Cessation
- 2002-06-28 JP JP2004517224A patent/JP4001600B2/ja not_active Expired - Fee Related
-
2004
- 2004-11-17 US US10/989,464 patent/US20050068317A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5301271A (en) * | 1991-07-11 | 1994-04-05 | Matsushita Electric Industrial Co., Ltd. | Image processing separately processing two-density-level image data and multi-level image data |
US5273040A (en) * | 1991-11-14 | 1993-12-28 | Picker International, Inc. | Measurement of vetricle volumes with cardiac MRI |
US6175371B1 (en) * | 1995-06-02 | 2001-01-16 | Philippe Schoulz | Process for transforming images into stereoscopic images, images and image series obtained by this process |
JPH0935058A (ja) * | 1995-07-17 | 1997-02-07 | Nec Corp | 画像認識方法 |
JPH09204532A (ja) * | 1996-01-25 | 1997-08-05 | Hitachi Ltd | 画像認識方法および画像表示方法 |
EP1054384A2 (en) * | 1999-05-20 | 2000-11-22 | Mitsubishi Denki Kabushiki Kaisha | Method and apparatus for translating and interfacing voxel memory addresses |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100419794C (zh) * | 2005-12-16 | 2008-09-17 | 北京中星微电子有限公司 | 一种图像问题的检测方法 |
JP2009104515A (ja) * | 2007-10-25 | 2009-05-14 | Fujitsu Ltd | 差異強調プログラム、差異強調処理方法及び差異強調処理装置 |
US8311320B2 (en) | 2007-10-25 | 2012-11-13 | Fujitsu Limited | Computer readable recording medium storing difference emphasizing program, difference emphasizing method, and difference emphasizing apparatus |
JP2012103144A (ja) * | 2010-11-11 | 2012-05-31 | Univ Of Tokyo | 3次元環境復元装置、3次元環境復元方法、及びロボット |
JP2016103263A (ja) * | 2015-08-31 | 2016-06-02 | 株式会社大林組 | 構造物状況把握支援装置、構造物状況把握支援方法及びプログラム |
JP2018005862A (ja) * | 2016-07-08 | 2018-01-11 | 富士通株式会社 | ファセット化処理プログラム、ファセット抽出プログラム、ファセット化処理方法、ファセット抽出方法および情報処理装置 |
JP2018008403A (ja) * | 2016-07-12 | 2018-01-18 | 学校法人慶應義塾 | 立体物製造装置、立体物製造方法及びプログラム |
JP2018181056A (ja) * | 2017-04-17 | 2018-11-15 | 富士通株式会社 | 差分検知プログラム、差分検知装置、差分検知方法 |
JP2021021672A (ja) * | 2019-07-30 | 2021-02-18 | 日本電気通信システム株式会社 | 距離計測装置、システム、方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
KR100639139B1 (ko) | 2006-10-30 |
CN1630886A (zh) | 2005-06-22 |
JP4001600B2 (ja) | 2007-10-31 |
EP1519318A4 (en) | 2008-11-19 |
EP1519318A1 (en) | 2005-03-30 |
CN100388317C (zh) | 2008-05-14 |
US20050068317A1 (en) | 2005-03-31 |
KR20050010982A (ko) | 2005-01-28 |
JPWO2004003850A1 (ja) | 2005-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004003850A1 (ja) | 3次元イメージ比較プログラム、3次元イメージ比較方法、および3次元イメージ比較装置 | |
US8059120B2 (en) | Image process apparatus for three-dimensional model | |
US7576737B2 (en) | Image processing device and program | |
JPH1196374A (ja) | 3次元モデリング装置、3次元モデリング方法および3次元モデリングプログラムを記録した媒体 | |
CN102549622B (zh) | 用于处理体图像数据的方法 | |
KR20140142470A (ko) | 나무 모델과 숲 모델 생성 방법 및 장치 | |
CN110809788B (zh) | 深度图像的融合方法、装置和计算机可读存储介质 | |
US20200082618A1 (en) | Isosurface generation method and visualization system | |
Besançon et al. | An evaluation of visualization methods for population statistics based on choropleth maps | |
JP3611583B2 (ja) | 特徴線の抽出を伴う三次元イメージ情報の処理装置 | |
GB2399703A (en) | Volumetric representation of a 3D object | |
CN111666788B (zh) | 一种图像处理方法、装置及设备、存储介质 | |
Rasoulzadeh et al. | Strokes2Surface: Recovering Curve Networks From 4D Architectural Design Sketches | |
US7388584B2 (en) | Method and program for determining insides and outsides of boundaries | |
JP4014140B2 (ja) | 三次元モデリング装置及びその方法及びそのプログラム | |
EP3467764A1 (en) | Image processing method and image processing apparatus | |
Balzer et al. | Volumetric reconstruction applied to perceptual studies of size and weight | |
Xia et al. | A novel approach for computing exact visual hull from silhouettes | |
Le Van et al. | An effective RGB color selection for complex 3D object structure in scene graph systems | |
CN111462199A (zh) | 基于gpu的快速散斑图像匹配方法 | |
CN118334094B (zh) | 一种基于三维点云的模型配准方法 | |
CN117495693A (zh) | 用于内窥镜的图像融合方法、系统、介质及电子设备 | |
Tattersall et al. | Reconstructing Creative Lego Models | |
CN117765115A (zh) | 体数据可视化方法、系统和存储介质 | |
WO2021024367A1 (ja) | 形状データ処理装置、形状データ処理方法及び形状データ処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004517224 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002736203 Country of ref document: EP Ref document number: 10989464 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020047021408 Country of ref document: KR Ref document number: 20028292359 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020047021408 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2002736203 Country of ref document: EP |