WO2020165976A1 - Simulation device, simulation method, and simulation program - Google Patents

Simulation device, simulation method, and simulation program

Info

Publication number
WO2020165976A1
Authority
WO
WIPO (PCT)
Prior art keywords
measurement
image
simulation
captured image
generation unit
Prior art date
Application number
PCT/JP2019/005156
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
亮輔 川西
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to CN201980091525.1A (CN113412500B)
Priority to DE112019006855.5T (DE112019006855T5)
Priority to JP2020571967A (JP7094400B2)
Priority to PCT/JP2019/005156 (WO2020165976A1)
Publication of WO2020165976A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/06 Ray-tracing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models

Definitions

  • The present invention relates to a simulation device, a simulation method, and a simulation program capable of simulating the measurement result of a three-dimensional measurement device.
  • Patent Document 1 discloses a three-dimensional measurement device that performs three-dimensional measurement by the active stereo method.
  • The three-dimensional measurement device disclosed in Patent Document 1 projects a geometric pattern from a projection device onto a measurement target, sets a plurality of detection points on the geometric pattern in a captured image output by an imaging device, and calculates, based on the principle of triangulation, the three-dimensional position of the point on the surface of the measurement target that corresponds to each detection point.
  • The present invention has been made in view of the above, and an object of the present invention is to obtain a simulation device capable of evaluating a measurement result even when no actual environment is available or when only a small number of samples of the measurement target are available.
  • In order to achieve this object, a simulation device according to the present invention includes: a measurement condition acquisition unit that acquires measurement condition information indicating measurement conditions of a three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space including the measurement target irradiated with the projection light from the projection device; a virtual captured image generation unit that generates, based on the measurement condition information, a virtual captured image reproducing a captured image output by the imaging device; a three-dimensional measurement calculation unit that obtains a measurement value by performing, using the virtual captured image, a three-dimensional measurement process of measuring the three-dimensional position of the surface of the measurement target; and an output unit that outputs a simulation result including the measurement value.
  • FIG. 3 is a diagram showing a functional configuration of a virtual captured image generation unit according to a second embodiment of the present invention.
  • FIG. 5 is a flowchart showing the operation of the simulation device having the virtual captured image generation unit shown in FIG.
  • A diagram showing the functional configuration of the virtual captured image generation unit shown in FIG.; a flowchart showing the operation of the simulation apparatus shown in FIG.
  • A flowchart showing the details of step S304 shown in FIG.; a flowchart showing the details of step S305 shown in FIG.
  • A diagram showing the detailed functional configuration of the sensor viewpoint data generation unit shown in FIG.; a diagram showing the detailed functional configuration of the map generation unit shown in FIG.; FIG. 4 is a flowchart showing the operation of the simulation apparatus according to the fourth embodiment of the present invention.
  • FIG. 23 is a flowchart showing the operation of the simulation apparatus shown in FIG.; a diagram showing an example of the display screen of the simulation apparatus shown in FIG.; a diagram showing the functional configuration of the simulation apparatus according to a seventh embodiment of the present invention.
  • FIG. 3 is a diagram showing a configuration of a control circuit for realizing the functions of the simulation apparatus according to the first to eighth embodiments of the present invention.
  • FIG. 2 is a diagram showing an example of a hardware configuration for realizing the functions of the simulation apparatus according to the first to eighth embodiments of the present invention.
  • FIG. 1 is a diagram showing a functional configuration of a simulation device 10 according to the first exemplary embodiment of the present invention.
  • the simulation device 10 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, and an output unit 104.
  • the simulation device 10 simulates the measurement result of the three-dimensional measurement device based on the measurement condition information indicating the measurement condition of the three-dimensional measurement device that measures the three-dimensional position of the surface of the measurement target using the active stereo method.
  • the three-dimensional measurement device assumed here includes a projection device that projects light onto a measurement target object, and an imaging device that captures an imaging space including the measurement target object irradiated with projection light from the projection device.
  • The projection light emitted from the projection device represents a projection pattern used for three-dimensional measurement.
  • In the following description, projection light refers to light that represents such a projection pattern.
  • the three-dimensional measurement device can perform a three-dimensional measurement process of measuring the three-dimensional position of the surface of the measurement target based on the projection pattern included in the captured image output by the imaging device. Further, the simulation device 10 may simulate the output result of the system using the three-dimensional measuring device.
  • the measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measuring device. Details of the measurement condition information will be described later.
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the virtual captured image generation unit 102 is a virtual computer graphics (CG) image that reproduces a captured image output by the imaging device included in the three-dimensional measurement device based on the measurement condition information input from the measurement condition acquisition unit 101. Generate a captured image.
  • the virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
  • the three-dimensional measurement calculation unit 103 uses the virtual captured image input from the virtual captured image generation unit 102 to perform the three-dimensional measurement processing executed by the three-dimensional measurement device and acquire the measurement value.
  • the three-dimensional measurement calculation unit 103 inputs the acquired measurement value to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement value input from the three-dimensional measurement calculation unit 103.
  • FIG. 2 is a flowchart showing the operation of the simulation device 10 shown in FIG.
  • the measurement condition acquisition unit 101 of the simulation device 10 acquires measurement condition information (step S101).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the virtual captured image generation unit 102 generates a virtual captured image based on the measurement condition information (step S102).
  • the virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
  • the three-dimensional measurement calculation unit 103 executes the calculation of three-dimensional measurement using the virtual captured image and acquires the measurement value (step S103).
  • the three-dimensional measurement calculation unit 103 inputs the acquired measurement value to the output unit 104.
  • the output unit 104 outputs the simulation result including the measured value (step S104).
  • The simulation device 10 can generate a virtual captured image that reproduces a captured image output by the imaging device based on the measurement condition information of the three-dimensional measurement device, and can perform the three-dimensional measurement process using the virtual captured image. With this configuration, three-dimensional measurement can be verified in a simulation without actually installing the projection device and the imaging device and acquiring actual data. It is therefore unnecessary to install the three-dimensional measurement device on site, adjust its hardware and software, and collect actual data of the measurement target, which reduces the time required for these tasks and suppresses human and material costs. In addition, an appropriate measurement condition can be identified by running the simulation while varying the measurement conditions of the three-dimensional measurement device, which shortens the design period of the three-dimensional measurement device and the trial-and-error verification period for introducing it to the site and putting it into operation.
  • the measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measuring device.
  • the measurement condition acquisition unit 101 includes a projection condition information acquisition unit 201, a shooting condition information acquisition unit 202, a measurement target object information acquisition unit 203, and a non-measurement target object information acquisition unit 204.
  • the projection condition information acquisition unit 201 acquires projection condition information indicating the projection condition of the projection device.
  • the projection condition information can include information that can specify at least one of the performance and the usage state of the projection device.
  • The performance of the projection device includes, for example, the resolution and the angle of view, and the usage state of the projection device includes, for example, the position and orientation of the projection device and its focus state.
  • the projection condition information can include information indicating a projection pattern.
  • the information indicating the projection pattern includes information indicating the pattern of the projection pattern. When there are a plurality of projection patterns, the information indicating the projection pattern may further include information indicating the number of projection patterns.
  • Examples of the pattern of the projection pattern include a stripe pattern in which stripes, each having a thickness predetermined for each projection pattern, are arranged according to a certain rule, a dot pattern in which dots are arranged irregularly, a gradation pattern in which the intensity of the projection light changes smoothly within the projection pattern, and combinations thereof.
  • the above pattern is an example, and projection patterns of any pattern can be used.
  • the number of projection patterns is an arbitrary number of one or more.
  • the information indicating the pattern of the projection pattern may be, for example, information indicating the type of the pattern, or information indicating the light intensity distribution on a predetermined projection plane.
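  • As an illustrative, non-authoritative sketch of how a projection pattern could be represented as a light intensity distribution, the following Python code generates a set of binary Gray-code stripe patterns, one common choice for active stereo; the function name, the resolution values, and the use of Gray coding are assumptions for illustration and are not taken from the disclosure.

        import numpy as np

        def gray_code_stripe_patterns(width=1024, height=768, num_bits=10):
            """One pattern per bit of the binary-reflected Gray code of each projector column."""
            columns = np.arange(width)
            gray = columns ^ (columns >> 1)                  # Gray code of each column index
            patterns = []
            for bit in range(num_bits - 1, -1, -1):          # coarsest stripes first
                stripe_row = ((gray >> bit) & 1).astype(np.float32)
                patterns.append(np.tile(stripe_row, (height, 1)))
            return patterns                                  # list of (height, width) intensity images

        patterns = gray_code_stripe_patterns()
        print(len(patterns), patterns[0].shape)              # 10 patterns of shape (768, 1024)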
  • the image capturing condition information acquisition unit 202 acquires image capturing condition information indicating the image capturing conditions of the image capturing apparatus.
  • the shooting condition information can include information that can specify at least one of the performance and the usage state of the imaging device.
  • the performance of the imaging device is, for example, the resolution and the angle of view, and the state of the imaging device is, for example, the position and orientation of the imaging device and the focus state.
  • The measurement target object information acquisition unit 203 acquires measurement target object information, which is information about the measurement target, for example information indicating the shape, the position and orientation, and the characteristics of the measurement target.
  • the characteristic of the measurement target is, for example, the reflection characteristic of the measurement target.
  • the information indicating the reflection characteristic of the measurement target includes, for example, at least one of the color, the diffuse reflectance, and the specular reflectance of the measurement target.
  • the non-measurement target information acquisition unit 204 acquires non-measurement target information that is information about a non-measurement target that is an object other than the measurement target or ambient light other than the projection light existing in the shooting space.
  • the non-measurement target is, for example, a container, a gantry, or a jig that holds the measurement target.
  • The non-measurement target is not limited to the above examples, and includes any object other than the measurement target that can appear in a captured image captured by an actual imaging device, such as a wall, a window, or other scenery, as well as light such as illumination light that irradiates these objects.
  • the non-measurement target information can include at least one of the position, shape, and characteristic of the non-measurement target and information indicating the state of ambient light.
  • The information indicating the shape of an object included in the measurement target object information and the non-measurement target object information can be expressed as a combination of meshes such as 3D-CAD (3-Dimensional Computer-Assisted Drawing) data, as a primitive such as a sphere or a rectangular parallelepiped, or as an aggregate thereof.
  • the reflection characteristic of an object is used to reproduce the appearance of the object when it is exposed to light.
  • the information indicating the state of the ambient light is used to obtain a shadow effect such as an image actually taken by the imaging device.
  • The object may be set to float in the air, or a floor surface and a box for accommodating the object may be set so that the object exists inside the box. It is preferable that reflection characteristics can be set for the floor surface and the box as well as for the object.
  • Since the projection condition information and the shooting condition information, including performances such as the resolutions and the angles of view of the projection device and the imaging device, can be set individually, the measurement error that may occur in the actual three-dimensional measurement device can be reproduced more accurately.
  • If the resolution of the imaging device is lower than the resolution of the projection device, the pattern of the projection pattern cannot be finely discriminated from the image obtained by capturing the projection pattern even if the projection pattern is made finer, which may cause errors and missing points in the three-dimensional measurement.
  • Since the virtual captured image generation unit 102 generates the virtual captured image based on the resolution of the imaging device, the fineness of the projection pattern that can be discriminated by the imaging device can be determined by analyzing the virtual captured image.
  • The limit value of the fineness of the projection pattern that can be discriminated by the imaging device can be estimated as the upper limit of the resolution of the projection device.
  • Conversely, the performance of an imaging device that matches the resolution of the projection device can also be examined. That is, the projection device and the imaging device included in the three-dimensional measurement device need to have performances that match each other, and it is therefore preferable to study the performances of the projection device and the imaging device using the simulation results.
  • the virtual captured image generation unit 102 generates a virtual captured image that reproduces the captured image output by the imaging device based on the measurement condition information.
  • The virtual captured image generation unit 102 reproduces, in a virtual shooting space, the arrangement of the objects existing in the shooting space based on the measurement target object information and the non-measurement target object information included in the measurement condition information, and can specify, based on the shooting condition information, the portion of the shooting space represented by the virtual captured image.
  • the virtual captured image generation unit 102 can reproduce the shadow generated by the projection light in the capturing space by using the reflection model based on the projection condition information, the measurement target object information, and the non-measurement target object information.
  • the virtual captured image generation unit 102 can reproduce the position indicating the boundary of the shadow generated by the projection light in the captured image output by the image capturing apparatus, with a pixel value.
  • An error included in the position indicating the boundary of a shadow reduces the visibility of the light in the captured image.
  • A decrease in the visibility of light means that the three-dimensional position irradiated with the projection light in the shooting space does not match the detection result of the irradiation position of the projection light on the captured image, or that a shadow boundary of the light, for example a pattern boundary that is the boundary between a region irradiated with the projection light and a region not irradiated with it, cannot be detected correctly. For this reason, if the visibility of light is reduced, an error occurs in the measurement result of the three-dimensional measurement or the measurement becomes impossible, and the quality of the three-dimensional measurement deteriorates.
  • the factors that cause the error in the position of the shadow boundary of light in the captured image may include those caused by the projection device and those caused by the imaging device.
  • The error factors caused by the projection device include mutual reflection, in which the projection light from the projection device is reflected on a first surface and then enters a second surface to illuminate it, and optical blur caused by the projection device being out of focus.
  • the second surface may be the surface of an object different from the first surface, or may be a different part of the same object as the first surface.
  • the phenomenon treated as an error factor is not limited to the above, and may include any phenomenon that causes a change in the projection state of light or a change in a captured image.
  • the light beam emitted from the projection device may be reflected on the surface of the first object in the shooting space, and the reflected light may illuminate a second object different from the first object. It is also conceivable that the light reflected by the site X of the first object existing in the imaging space illuminates a site Y different from the site X of the same first object.
  • A light ray is not limited to being reflected once, and may be reflected multiple times. However, since the energy of the light is absorbed at each reflection, after the reflection is repeated a number of times the light often falls below the observation sensitivity of the imaging device and no longer affects the captured image. Therefore, mutual reflection caused by light reflected a predetermined number of times or more may be ignored.
  • The light intensity due to mutual reflection can be determined from the intensity of the light before reflection, the reflection characteristics at the reflection point, and the traveling path of the light, in addition to a reflection model similar to the one used when the virtual captured image is generated; a simplified sketch of such a calculation is given below.
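  • The following Python function is a hedged sketch of that calculation: it estimates the intensity of single-bounce reflected light from the intensity before reflection, the diffuse reflectance at the reflection point, and the length of the traveling path. The Lambertian cosine term, the inverse-square falloff, and the cutoff for ignoring weak bounces are simplifying assumptions, not the exact model of the disclosure.

        import numpy as np

        def reflected_intensity(incident_intensity, diffuse_reflectance, normal,
                                incoming_dir, path_length, cutoff=1e-3):
            """Intensity of single-bounce reflected light reaching a second surface."""
            normal = normal / np.linalg.norm(normal)
            incoming_dir = incoming_dir / np.linalg.norm(incoming_dir)
            cos_term = max(0.0, float(np.dot(normal, -incoming_dir)))   # grazing light reflects less
            intensity = incident_intensity * diffuse_reflectance * cos_term / (path_length ** 2)
            return 0.0 if intensity < cutoff else intensity             # drop bounces too weak to observe

        print(reflected_intensity(1.0, 0.5, np.array([0.0, 0.0, 1.0]),
                                  np.array([0.0, 0.0, -1.0]), 0.5))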
  • When mutual reflection occurs, shadow boundaries of light may be observed in addition to the pattern boundaries that should originally be observed in the image.
  • As a result, the light irradiation direction cannot be calculated correctly, which causes a measurement error in the three-dimensional measurement.
  • In the measurement value output by the three-dimensional measurement calculation unit 103, the measurement error that can occur when three-dimensional measurement is actually performed on a measurement target that may cause mutual reflection is reproduced. Therefore, it becomes possible to grasp the magnitude and the appearance tendency of the measurement error before actually configuring the three-dimensional measurement device. If the appearance tendency of the measurement error can be grasped in advance, it is also possible to consider ways of reducing the influence of mutual reflection by changing measurement conditions such as the positional relationship between the projection device and the imaging device or the way the measurement target is arranged. Therefore, the accuracy of the three-dimensional measurement can be improved.
  • When the projection device is out of focus, the shadow boundary of the light is blurred and the pattern boundary becomes unclear, so the accuracy of the analysis of the projection pattern on the captured image, which is performed to specify the irradiation direction of the light, decreases and the direction may be calculated incorrectly.
  • When the shooting space has a certain depth or more, it may not be possible to bring the entire shooting space into the focus of the projection device. In such a case, it is necessary to consider a calculation method capable of accurately analyzing the projection pattern even when the projection device is out of focus.
  • To evaluate such a method with actual data, which is data obtained by actually operating the projection device and the imaging device and observing the measurement target, it is necessary to know the true position of the pattern boundary in the actual data.
  • In contrast, the simulation device 10 can acquire the true position of the pattern boundary based on the measurement condition information, and generates a virtual captured image that reproduces the blur of the pattern boundary caused by the focus shift of the projection device. It is therefore easy to determine whether the detection result of the pattern boundary is improved as a result of improving the method of analyzing the projection pattern, which makes it possible to streamline the development of the projection pattern analysis algorithm required for three-dimensional measurement.
  • the simulation device 10 has an effect that the design of the three-dimensional measuring device can be facilitated.
  • the virtual captured image generation unit 102 can calculate the image distortion as an effect of the distortion aberration caused by the lens of the imaging device.
  • In the image distortion model, the correspondence between pixels before and after the image distortion can be obtained using the following mathematical expression (1).
  • (x_u, y_u) are the image coordinates in the undistorted image,
  • (x_d, y_d) are the image coordinates in the distorted image,
  • K is a coefficient indicating the degree of distortion, and
  • r is the distance from the center of the image to the pixel of interest.
  • the pixel of interest is a pixel at an arbitrary coordinate (x d , y d ) of the distorted image.
  • the virtual captured image generation unit 102 may sequentially select all the pixels of the distorted image as the target pixel.
  • Many image distortion models have been proposed, ranging from simplified models to detailed ones.
  • the simulation apparatus 10 can use an arbitrary model including the mathematical expression (1) as the calculation expression of the image distortion.
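  • Mathematical expression (1) itself is not reproduced in this text; the sketch below therefore uses a common single-coefficient radial distortion model that is consistent with the variables listed above ((x_u, y_u), (x_d, y_d), K, r), but it is an assumed form rather than the expression of the disclosure.

        import numpy as np

        def undistort_point(x_d, y_d, cx, cy, k):
            """Map a distorted pixel (x_d, y_d) to undistorted coordinates (x_u, y_u)
            with an assumed single-coefficient radial model."""
            r2 = (x_d - cx) ** 2 + (y_d - cy) ** 2     # squared distance from the image center
            scale = 1.0 + k * r2
            x_u = cx + (x_d - cx) * scale
            y_u = cy + (y_d - cy) * scale
            return x_u, y_u

        print(undistort_point(400.0, 300.0, cx=320.0, cy=240.0, k=1e-7))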
  • the virtual captured image generation unit 102 can reproduce random noise by setting the appearance probability and intensity of noise for each pixel or for each fixed area on the image.
  • The virtual captured image generation unit 102 determines, using the appearance probability, whether or not to add noise to a pixel or region, and when it determines to add noise, changes the color of the pixel or region based on the set intensity.
  • As the intensity, a rate of change with respect to the color of the original pixel or region may be used, or an integer value may be used.
  • The intensity may be indicated by a fixed rate of change or integer value, or may have a certain range. Further, the intensity can take both a positive value, which means an increase in the brightness of the pixel, and a negative value, which means a decrease in the brightness of the pixel.
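  • A minimal sketch of this noise reproduction, assuming NumPy and placeholder parameter values, is shown below; noise is added only to pixels selected by the appearance probability, and the intensity offset can be positive or negative.

        import numpy as np

        def add_random_noise(image, appearance_prob=0.01, intensity_range=(-30, 30), seed=None):
            """Perturb randomly selected pixels of a grayscale or color 8-bit image."""
            rng = np.random.default_rng(seed)
            noisy = image.astype(np.float32).copy()
            mask = rng.random(image.shape[:2]) < appearance_prob            # pixels that receive noise
            offsets = rng.uniform(*intensity_range, size=image.shape[:2])   # per-pixel intensity change
            noisy[mask] += offsets[mask][..., None] if noisy.ndim == 3 else offsets[mask]
            return np.clip(noisy, 0, 255).astype(np.uint8)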
  • the virtual captured image generation unit 102 can reproduce the image blur caused by the focus shift of the imaging device by using the color information of the pixels around the target pixel. Any pixel on the image may be selected as the pixel of interest.
  • As methods of calculating the color after image blur, there are a method of taking the average value of the colors of the pixel of interest and its surrounding pixels, and Gaussian smoothing, in which pixels closer to the pixel of interest are combined at a higher ratio.
  • Gaussian smoothing has the advantage that image blur can be reproduced more accurately than by taking the average value, while taking the average value has the advantage of a shorter processing time.
  • the image blur also changes depending on the distance from the imaging device to the object.
  • Objects that are in focus have less blur, and objects that are out of focus are captured as a larger, blurred image.
  • By taking this into account, the virtual captured image generation unit 102 can reproduce blur that is closer to that in the actual captured image.
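  • The following sketch contrasts the two approaches using SciPy; the sigma and kernel size are placeholder values, and in a fuller implementation the blur strength would be chosen per object from its distance to the focal plane, as noted above.

        import numpy as np
        from scipy.ndimage import gaussian_filter, uniform_filter

        def blur_image(image, method="gaussian", sigma=1.5, kernel_size=5):
            """Reproduce defocus blur on a grayscale image: Gaussian smoothing weights
            nearby pixels more heavily; the mean filter is cheaper but less faithful."""
            img = image.astype(np.float32)
            if method == "gaussian":
                return gaussian_filter(img, sigma=sigma)
            return uniform_filter(img, size=kernel_size)     # plain average over the neighborhood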
  • the information may be acquired as the measurement condition information or may be calculated from other information included in the measurement condition information.
  • a specific method for the virtual photographed image generation unit 102 to reproduce the error factor will be described in the second and third embodiments, but the reproduction method is not limited thereto.
  • the three-dimensional measurement calculation unit 103 executes a three-dimensional measurement process using the virtual captured image as an input.
  • the three-dimensional measurement process performed by the three-dimensional measurement calculation unit 103 may be the same as the three-dimensional measurement process performed by the three-dimensional measurement device using a captured image captured in an actual environment.
  • the three-dimensional measurement process includes a process of identifying the projection pattern irradiated on the measurement target and a process of measuring the distance.
  • In one projection pattern identification process, the brightness information of the pixels at the same position is acquired from a plurality of captured images, it is determined whether each pixel is illuminated by the projection device, and the combination of these determinations is acquired.
  • As another pattern identification process, there is a method of analyzing the pattern of the local projection pattern around the pixel of interest.
  • Since the local projection pattern is designed to be unique within the entire pattern, it is possible to specify where, in the entire projection pattern projected from the projection device, the portion illuminating the surroundings of the pixel of interest is located. In any of the pattern identification methods, the purpose is to uniquely obtain the vector from the projection device toward the point imaged at the pixel of interest.
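  • The sketch below illustrates the first kind of identification process for the Gray-code stripe patterns assumed earlier: each camera pixel is classified as lit or unlit in every captured image, and the combination of decisions is decoded back into a projector column index. The thresholding scheme and the use of Gray coding are assumptions for illustration, not the specific method of the disclosure.

        import numpy as np

        def decode_gray_code(captured_stack, bright_ref, dark_ref):
            """Recover, per camera pixel, the projector column code from a stack of
            captured Gray-code images; bright_ref/dark_ref are all-on / all-off captures."""
            threshold = (bright_ref.astype(np.float32) + dark_ref.astype(np.float32)) / 2.0
            bits = [(img.astype(np.float32) > threshold).astype(np.uint32) for img in captured_stack]
            gray = np.zeros_like(bits[0])
            for b in bits:                       # most significant bit first
                gray = (gray << 1) | b
            binary = gray.copy()                 # convert Gray code back to a binary column index
            shift = gray >> 1
            while shift.any():
                binary ^= shift
                shift >>= 1
            return binary                        # per-pixel projector column index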
  • Using the sensor information included in the measurement condition information, which indicates arrangement conditions such as the positions and orientations of the projection device and the imaging device, the three-dimensional measurement calculation unit 103 performs the distance measurement process based on the principle of triangulation.
  • The details of the distance measurement process differ depending on the projection pattern used. Since the simulation device 10 may use any projection pattern, the distance measurement process is selected according to the projection pattern that is used.
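  • As a minimal sketch of the triangulation step, assuming a rectified camera-projector pair with a horizontal baseline (a simplification; the actual geometry depends on the sensor information), the depth of a surface point follows from the disparity between its camera and projector coordinates:

        def triangulate_depth(x_cam, x_proj, baseline, focal_length):
            """Depth from disparity for a rectified camera-projector pair.
            x_cam, x_proj are horizontal coordinates (pixels) of the same surface point,
            baseline is in metres, focal_length in pixels."""
            disparity = x_cam - x_proj
            if disparity == 0:
                return float("inf")              # rays are parallel, no intersection
            return baseline * focal_length / disparity

        print(triangulate_depth(412.0, 380.0, baseline=0.1, focal_length=800.0))   # 2.5 m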
  • FIG. 4 is a diagram showing an example of the display screen 20 output by the simulation device 10 shown in FIG.
  • the display screen 20 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, and an end button 27. Including.
  • the simulation result is displayed in the processing result display area 21.
  • In the measurement condition display area 22, individual measurement conditions are displayed, and the displayed measurement conditions can be changed and entered.
  • The measurement condition list display area 23 displays a list of saved measurement conditions. When one condition is selected from the list displayed in the measurement condition list display area 23, the selected measurement condition is displayed in the measurement condition display area 22.
  • In the display content selection area 24, an operation unit for selecting the content to be displayed in the processing result display area 21 is displayed.
  • The captured image reproduced by the virtual captured image generation unit 102 and the measurement data are displayed as the selectable options.
  • the execution button 25 is an operation unit for executing a simulation process using the measurement condition displayed in the measurement condition display area 22.
  • the save button 26 is an operation unit for saving the measurement condition displayed in the measurement condition display area 22.
  • the end button 27 is an operation unit for ending the simulation processing of three-dimensional measurement.
  • When the execute button 25 is operated, the measurement condition acquisition unit 101 acquires the measurement condition displayed in the measurement condition display area 22 and inputs it to the virtual captured image generation unit 102, the virtual captured image generation unit 102 generates a virtual captured image and inputs it to the three-dimensional measurement calculation unit 103, the three-dimensional measurement calculation unit 103 executes the three-dimensional measurement process and inputs the measurement result to the output unit 104, and the output unit 104 outputs, to the processing result display area 21, processing results such as the virtual captured image and the measurement values obtained in the course of the simulation processing, input data to the simulation processing such as the measurement condition information, and the like.
  • the output unit 104 may perform processing such as emphasizing a point of interest, adjusting contrast, and removing noise so that the output is easy for the user to see.
  • the display screen 20 shown in FIG. 4 is an example, and the display screen 20 is not limited to the example of FIG. By using the display screen 20 shown in FIG. 4, the user can confirm repeated simulation results while sequentially adjusting the measurement conditions.
  • As described above, a virtual captured image that reproduces a captured image output by an actual imaging device is generated, and the three-dimensional measurement process is performed based on the virtual captured image.
  • The simulation device 10 can thereby reproduce the errors in the position of the light included in the captured image acquired by an actual three-dimensional measurement device.
  • Such errors occur due to, for example, image distortion caused by distortion aberration, random noise, image blur, and mutual reflection. When an actual three-dimensional measurement device is configured, it is difficult to add these errors to the actual captured image afterwards, but by reproducing them in the virtual captured image, the accuracy of the simulation result of the three-dimensional measurement can be improved. Further, if the projection pattern and the errors can be reproduced faithfully in the virtual captured image, the usual processing can be used for the three-dimensional measurement process.
  • FIG. 5 is a diagram showing a functional configuration of the virtual captured image generation unit 102 according to the second exemplary embodiment of the present invention.
  • the virtual captured image generation unit 102 has an optical reproduction image generation unit 301 and an image quality deterioration processing unit 302.
  • a device including the virtual captured image generation unit 102 shown in FIG. 5 and having the same configuration as the simulation device 10 shown in FIG. 1 is referred to as a simulation device 12 according to the second embodiment.
  • the configuration of the simulation apparatus 12 is the same as that of the first embodiment shown in FIG. 1 except for the virtual photographed image generation unit 102, and therefore detailed description thereof will be omitted here.
  • the same components as those in the first embodiment will be described using the reference numerals shown in FIG. 1, and the differences from the first embodiment will be mainly described.
  • the optical reproduction image generation unit 301 performs an optical simulation based on the measurement condition information and generates an optical reproduction image that reproduces a captured image.
  • the image quality deterioration processing unit 302 performs an image quality deterioration process on the optically reproduced image according to an error factor.
  • the virtual captured image generation unit 102 sets the image after the deterioration processing as a virtual captured image.
  • FIG. 6 is a flowchart showing the operation of the simulation device 12 having the virtual captured image generation unit 102 shown in FIG.
  • the measurement condition acquisition unit 101 acquires measurement condition information (step S201).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the optical reproduction image generation unit 301 of the virtual captured image generation unit 102 generates an optical reproduction image based on the measurement condition information (step S202).
  • the optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
  • the image quality deterioration processing unit 302 executes image quality deterioration processing on the optical reproduction image (step S203).
  • the image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as a virtual captured image.
  • the three-dimensional measurement calculation unit 103 performs three-dimensional measurement processing using the virtual captured image and obtains a measurement result (step S204).
  • the three-dimensional measurement calculation unit 103 inputs the measurement result to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement result (step S205).
  • The optical reproduction image generation unit 301 calculates, for each pixel in the image, the imaged position in the shooting space corresponding to that pixel, and determines whether the calculated position is irradiated with the projection light, that is, whether the projection light of the projection device reaches the imaged position. First, based on the sensor information and the information indicating the arrangement and characteristics of the objects existing in the shooting space, the optical reproduction image generation unit 301 calculates a vector V_cam passing from the optical center O_cam of the imaging device through each pixel. The optical reproduction image generation unit 301 then detects the point P_obj on the surface of an object that first intersects the vector V_cam. By this calculation, the object imaged at each pixel can be identified.
  • Next, the optical reproduction image generation unit 301 determines whether the point P_obj on the surface of the object is illuminated by the projection device.
  • To do so, the optical reproduction image generation unit 301 first calculates a vector V_proj from the optical center O_proj of the projection device to the point P_obj.
  • Then, the optical reproduction image generation unit 301 uses the sensor information and the pattern information indicating the projection pattern of the projection device to determine whether or not the vector V_proj is included in the range irradiated with the projection pattern from the projection device. If it is included, the point P_obj on the surface of the object can be determined to be illuminated by the projection device.
  • the optical reproduction image generation unit 301 can determine the color of each pixel by using the information indicating the reflection characteristics and the state of ambient light included in the measurement condition information in addition to the above calculation result.
  • Typical reflection models used to determine colors include Lambertian reflection for diffuse reflection and Phong's reflection model for specular reflection.
  • the reflection model used by the optical reproduction image generation unit 301 is not limited to these, and any reflection model can be used.
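  • The sketch below combines the visibility test and a Lambertian diffuse term for a single pixel; treating the projector as a simple cone with a half-angle of half the field of view, as well as the ambient and albedo values, are illustrative assumptions rather than the reflection model of the disclosure.

        import numpy as np

        def pixel_color(p_obj, normal, o_proj, proj_dir, proj_fov_deg,
                        albedo, light_intensity, ambient=0.05):
            """Shade the surface point P_obj seen by a pixel: check whether V_proj from the
            projector optical centre O_proj falls inside the projection cone, then apply
            a Lambertian diffuse term."""
            v_proj = p_obj - o_proj
            v_proj_n = v_proj / np.linalg.norm(v_proj)
            proj_dir = proj_dir / np.linalg.norm(proj_dir)
            cos_angle = float(np.clip(np.dot(v_proj_n, proj_dir), -1.0, 1.0))
            illuminated = np.degrees(np.arccos(cos_angle)) <= proj_fov_deg / 2.0
            n = normal / np.linalg.norm(normal)
            diffuse = max(0.0, float(np.dot(n, -v_proj_n))) * light_intensity if illuminated else 0.0
            return albedo * (ambient + diffuse)

        print(pixel_color(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, -1.0]),
                          np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                          proj_fov_deg=40.0, albedo=0.8, light_intensity=1.0))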
  • The boundary of an object and the boundary of the projection pattern of the projection device may both be included in the range captured by a single pixel. In that case, it is more natural to use, as the pixel color, a color in which the colors of the two or more objects forming the boundary are mixed.
  • However, the above-described vector V_cam can detect only one intersection in the shooting space. Therefore, among the objects forming the boundary, only the color of the object intersecting the vector V_cam is determined as the color of the pixel. As a result, an image in which the boundaries of objects and of the projection pattern look unnatural may be generated. This is a kind of phenomenon caused by so-called quantization error, and is a phenomenon that often becomes a problem in the field of image processing.
  • To mitigate this, an optical reproduction image is created at a resolution higher than that of the virtual captured image, and a process of reducing the image to the final image size is performed as part of the image quality deterioration process.
  • the optical reproduction image generation unit 301 generates an optical reproduction image with a resolution that is four times the resolution of the virtual captured image that is finally output.
  • the color of each pixel of the virtual captured image can be determined using the color information of the four pixels closest to that pixel in the optically reproduced image.
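  • Interpreting "four times the resolution" as four times the pixel count (double in each dimension), a minimal downsampling sketch that averages each 2x2 block of the high-resolution optical reproduction image is shown below; both the interpretation and the plain block average are assumptions made for illustration.

        import numpy as np

        def downsample_2x2(highres):
            """Average each 2x2 block of an image rendered at double the linear resolution,
            softening quantization artefacts at object and pattern boundaries."""
            h, w = highres.shape[:2]
            h, w = h - h % 2, w - w % 2                      # drop odd edge rows/columns if any
            blocks = highres[:h, :w].reshape(h // 2, 2, w // 2, 2, -1)
            return blocks.mean(axis=(1, 3)).squeeze()

        lowres = downsample_2x2(np.random.rand(960, 1280))   # -> shape (480, 640)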
  • the optical reproduction image generation unit 301 generates an optical reproduction image by using, among the error factors, for example, information on mutual reflection and focus shift of the projection device, which are error factors caused by the projection device. Since these affect the projection state of the projection pattern, for example, when a focus shift occurs, the optical reproduction image generation unit 301 blurs the shadow boundary of light including the pattern boundary in the optical reproduction image.
  • the optical reproduction image generation unit 301 makes, for example, the brightness of the pixel including the incident point where the reflected light reflected by the first surface enters the second surface higher than the brightness when the influence of mutual reflection is not considered. By doing so, mutual reflection can be reproduced.
  • the optical reproduction image generation unit 301 can reproduce the blur of the light shadow boundary by adjusting the brightness of the pixel corresponding to the light shadow boundary.
  • As a specific reproduction method, for example, the method described in the third embodiment can be used.
  • the image quality deterioration processing unit 302 deteriorates the image quality of the optical reproduction image by using information such as image distortion, random noise, and focus deviation of the imaging device, which are error factors caused by the imaging device. For example, the image quality deterioration processing unit 302 performs a filtering process for flattening the brightness change in the image to blur the outline of an object, the shadow boundary of light, or the like, and changes the brightness of a pixel at a randomly selected position. By doing so, the image quality of the optically reproduced image can be deteriorated.
  • As described above, an optical reproduction image, which is an image reproducing optical effects including shadows caused by light being blocked by objects, can be obtained by an optical simulation. Furthermore, by performing the image quality deterioration process on the optical reproduction image, the actual captured image can be reproduced accurately.
  • FIG. 7 is a diagram showing a functional configuration of the simulation apparatus 13 according to the third exemplary embodiment of the present invention.
  • the simulation device 13 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, and an error information acquisition unit 105.
  • parts different from the simulation device 12 will be mainly described.
  • the simulation device 13 has an error information acquisition unit 105 in addition to the configuration of the simulation device 12.
  • the error information acquisition unit 105 externally acquires error information indicating an error factor represented by a virtual captured image.
  • the error information includes at least one of error intensity and addition order for each type of error factor.
  • the error information acquisition unit 105 inputs the acquired error information to the virtual captured image generation unit 102.
  • FIG. 8 is a diagram showing a functional configuration of the virtual photographed image generation unit 102 shown in FIG. 7.
  • the virtual captured image generation unit 102 includes an optical reproduction image generation unit 301, an image quality deterioration processing unit 302, and an error factor determination unit 303.
  • Based on the error information, the error factor determination unit 303 determines the processing conditions of the error factor addition process, including at least one of the strength of each error factor and the order in which the error factors are added, for generating the optical reproduction image and performing the image quality deterioration process.
  • the error factor determination unit 303 inputs the determined processing condition to the optical reproduction image generation unit 301.
  • the optical reproduction image generation unit 301 generates an optical reproduction image based on the measurement condition information, the error information, and the processing condition input from the error factor determination unit 303.
  • the image quality deterioration processing unit 302 performs image quality deterioration processing on the optically reproduced image based on the measurement condition information and the error information.
  • FIG. 9 is a flowchart showing the operation of the simulation device 13 shown in FIG.
  • the measurement condition acquisition unit 101 acquires measurement condition information (step S301).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the error information acquisition unit 105 acquires error information (step S302).
  • the error information acquisition unit 105 inputs the acquired error information to the virtual captured image generation unit 102.
  • the process of step S301 and the process of step S302 may be performed concurrently in parallel.
  • the error factor determination unit 303 of the virtual captured image generation unit 102 determines the processing condition of the error factor addition process including at least one of the strength of the error factor and the addition order (step S303).
  • the error factor determination unit 303 inputs the determined processing condition to the optical reproduction image generation unit 301.
  • the optical reproduction image generation unit 301 generates an optical reproduction image by an optical simulation based on the processing condition and the measurement condition information (step S304).
  • the optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
  • the image quality deterioration processing unit 302 executes the image quality deterioration process of the optical reproduction image based on the measurement condition information and the error information (step S305).
  • the image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as a virtual captured image.
  • the three-dimensional measurement calculation unit 103 executes three-dimensional measurement processing using the virtual captured image and obtains a measurement result (step S306).
  • the three-dimensional measurement calculation unit 103 inputs the measurement result to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement result (step S307).
  • FIG. 10 is a flowchart showing details of step S304 shown in FIG.
  • The optical reproduction image generation unit 301 first generates an optical reproduction image that does not include any error factor (step S401). As described above, the optical reproduction image generation unit 301 discriminates between the projection area, which is the area directly illuminated by the projection light, and the non-projection area, which is the area not illuminated, and draws the projection area in a brighter color than the non-projection area.
  • the optical reproduction image generation unit 301 acquires the order of addition of error factors from the error information (step S402). Further, the optical reproduction image generation unit 301 acquires the type of error factor and the intensity for each type from the error information (step S403).
  • the optical reproduction image generation unit 301 determines whether or not to add an error to the optical reproduction image according to the addition order acquired in step S402. The optical reproduction image generation unit 301 first determines whether to add mutual reflection to the optical reproduction image based on the error information (step S404).
  • When mutual reflection is to be added (step S404: Yes), the optical reproduction image generation unit 301 adds mutual reflection to the optical reproduction image (step S405). Specifically, when mutual reflection occurs, the reflected light reflected by the first surface of an object enters the second surface. The optical reproduction image generation unit 301 therefore treats the pixel containing the incident point where the reflected light enters the second surface as a pixel whose brightness is increased by mutual reflection. The optical reproduction image generation unit 301 determines the amount of increase in brightness based on the positions and orientations of the object containing the first surface and the object containing the second surface and on the surface characteristics of the first and second surfaces. The brightness of each pixel of the optical reproduction image is a second brightness obtained by adding the increase due to mutual reflection to the first brightness of the pixel calculated without considering the influence of mutual reflection.
  • the optical reproduction image generation unit 301 does not necessarily have to calculate the increase in luminance due to mutual reflection in the reflection of all light in the process of step S405.
  • For example, for a surface whose specular reflectance is below a threshold, the amount of increase in brightness may be smaller than the gradation resolution of the image, so no increase in brightness due to mutual reflection needs to be applied to such a surface.
  • Next, the optical reproduction image generation unit 301 determines whether or not the error addition process is finished (step S406).
  • When the error addition process is finished (step S406: Yes), the optical reproduction image generation unit 301 ends the error addition process and inputs the optical reproduction image to the image quality deterioration processing unit 302.
  • When the error addition process is not finished (step S406: No), the optical reproduction image generation unit 301 returns to the process of step S403.
  • If mutual reflection is not added (step S404: No), the optical reproduction image generation unit 301 determines whether or not to add the optical blur of the projection device (step S407).
  • When adding the optical blur (step S407: Yes), the optical reproduction image generation unit 301 adds the optical blur of the projection device to the optical reproduction image (step S408).
  • the blurring of the light shadow boundary can be represented by the brightness of the pixel.
  • Based on the measurement condition information, the optical reproduction image generation unit 301 determines, in the optical reproduction image, a first area, which is the area onto which light is projected when it is assumed that the projection device has no focus shift, and a second area, which is the area onto which light is not projected under the same assumption.
  • the optical reproduction image generation unit 301 specifies pixels within a predetermined distance from the boundary between the first area and the second area based on the determination result.
  • the maximum brightness of these pixels is the brightness of the area where the light is projected, and the minimum brightness is the brightness of the area where the light is not projected.
  • the brightness of the pixel may be determined near the boundary based on the distance between the pixel of interest and the boundary. The brightness of the pixel may be determined so that the closer the target pixel is to the first area, the closer to the maximum brightness, and the closer the target pixel is to the second area, the closer to the minimum brightness. In this way, by changing the brightness based on the shortest distance between the pixel of interest and the first region, it is possible to reproduce the state where the projection device is out of focus.
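  • A sketch of this brightness ramp, assuming SciPy's Euclidean distance transform and placeholder brightness values, is given below: pixel brightness moves from the unlit minimum to the lit maximum according to the signed distance to the boundary between the first (projected) and second (unprojected) regions.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def blur_pattern_boundary(projection_mask, bright=255.0, dark=40.0, blur_width=3.0):
            """Soften the pattern boundary to mimic a defocused projector."""
            dist_inside = distance_transform_edt(projection_mask)      # distance to the unlit region
            dist_outside = distance_transform_edt(~projection_mask)    # distance to the lit region
            signed = np.where(projection_mask, dist_inside, -dist_outside)
            t = np.clip((signed + blur_width) / (2.0 * blur_width), 0.0, 1.0)
            return dark + t * (bright - dark)

        mask = np.zeros((100, 100), dtype=bool)
        mask[:, 50:] = True                                            # right half is projected
        blurred = blur_pattern_boundary(mask)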
  • the optical reproduction image generation unit 301 proceeds to step S406 after finishing the process of step S408.
  • When the optical blur of the projection device is not added (step S407: No), the optical reproduction image generation unit 301 determines whether to add ambient light to the optical reproduction image (step S409).
  • When adding ambient light (step S409: Yes), the optical reproduction image generation unit 301 adds ambient light to the optical reproduction image (step S410). Ambient light is represented by at least one of the brightness and the color of a pixel. After finishing the process of step S410, the optical reproduction image generation unit 301 proceeds to step S406. When no ambient light is added (step S409: No), the optical reproduction image generation unit 301 ends the error addition process.
  • FIG. 11 is a flowchart showing details of step S305 shown in FIG.
  • the image quality deterioration processing unit 302 first acquires the order of addition of error factors based on the error information (step S501).
  • the image quality deterioration processing unit 302 further acquires the type of error factor and the strength of the error based on the error information (step S502).
  • the image quality deterioration processing unit 302 determines whether or not to add an error according to the addition order acquired in step S501.
  • the image quality deterioration processing unit 302 determines whether to add image distortion (step S503). When it is determined that the image distortion is added (step S503: Yes), the image quality deterioration processing unit 302 adds the image distortion to the optical reproduction image (step S504). After adding the image distortion, the image quality deterioration processing unit 302 determines whether or not to end the image quality deterioration process (step S505). When it is determined that the image quality deterioration process is to be ended (step S505: Yes), the image quality deterioration processing unit 302 ends the process. When it is determined that the image quality deterioration process is not to be ended (step S505: No), the image quality deterioration processing unit 302 returns to the process of step S502.
  • When it is determined that the image distortion is not added (step S503: No), the image quality deterioration processing unit 302 subsequently determines whether to add random noise (step S506). When it is determined that random noise is to be added (step S506: Yes), the image quality deterioration processing unit 302 adds random noise to the optical reproduction image (step S507). When the process of step S507 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
  • When it is determined that random noise is not added (step S506: No), the image quality deterioration processing unit 302 determines whether or not to add image blur (step S508). When it is determined that image blur is not added (step S508: No), the image quality deterioration processing unit 302 ends the process. When it is determined that image blur is to be added (step S508: Yes), the image quality deterioration processing unit 302 adds the image blur to the optical reproduction image (step S509). When the process of step S509 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
  • the image quality deterioration processing unit 302 may add only one type of error factor to the optical reproduction image, or may add the same error factor to the optical reproduction image multiple times. For example, when the addition of random noise and the addition of image blurring are alternately executed a plurality of times, it is possible to obtain the effect of causing color unevenness in the appearance of the target object. As described above, various image quality effects can be reproduced by combining the order of addition of the error factors and the intensity.
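  • The following sketch shows one way such a configurable addition order could be driven by the error information; the factor names, the Gaussian noise and blur used as stand-ins for the individual degradations, and the strength semantics are assumptions for illustration, not the method of the disclosure.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def apply_error_factors(image, steps, seed=0):
            """Apply degradation steps in the user-specified order and strength.
            `steps` is a list of (factor_name, strength) tuples; the same factor may repeat,
            e.g. alternating noise and blur to produce colour unevenness on the target."""
            rng = np.random.default_rng(seed)
            out = image.astype(np.float32)
            for name, strength in steps:
                if name == "random_noise":                    # strength = noise standard deviation
                    out = out + rng.normal(0.0, strength, out.shape)
                elif name == "image_blur":                    # strength = Gaussian sigma
                    out = gaussian_filter(out, sigma=strength)
                else:
                    raise ValueError(f"unknown error factor: {name}")
            return np.clip(out, 0, 255)

        degraded = apply_error_factors(np.full((64, 64), 128.0),
                                       [("random_noise", 8.0), ("image_blur", 1.0)] * 2)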
  • FIG. 12 is a diagram showing an example of the display screen 30 output by the simulation device 13 shown in FIG. 7.
  • the display screen 30 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, and an end button 27. And an error factor setting area 28.
  • the display screen 30 has an error factor setting area 28 in addition to the components of the display screen 20 described in the first embodiment.
  • description of the same parts as the display screen 20 will be omitted, and parts different from the display screen 20 will be mainly described.
  • the error factor setting area 28 is an area for setting the error intensity and error addition order for each type of error factor.
  • the user can input and set the error intensity and the addition order in the error factor setting area 28.
  • the error information acquisition unit 105 can acquire the error information displayed in the error factor setting area 28 when the execute button 25 is operated.
  • the simulation device 13 can execute the simulation reflecting the error information set by the user by the error information acquisition unit 105.
  • the user can evaluate the three-dimensional measurement executed under the desired operating condition.
  • Since it is possible to select the type of error factor, the order in which errors are added to the optical reproduction image, and their strength, the variation of reproducible image quality can be increased. Since it is also possible to test by randomly changing the operating conditions and usage environment, tests can be run with combinations of conditions that the designer could not anticipate in advance, and the test contents become comprehensive. As a result, the effect of increasing the reliability of the three-dimensional measuring device can be expected.
  • FIG. 13 is a diagram showing a functional configuration of the optical reproduction image generation unit 301 according to the fourth embodiment of the present invention.
  • the optical reproduction image generation unit 301 includes a sensor viewpoint data generation unit 401, a map generation unit 402, and an image synthesis unit 403.
  • A simulation apparatus that has the same configuration as the simulation apparatus 13 shown in FIG. 7 but includes the optical reproduction image generation unit 301 shown in FIG. 13 is called the simulation apparatus 14 according to the fourth embodiment.
  • the simulation device 14 has the same configuration as the simulation device 13 shown in FIG. 7, and the configuration of the optical reproduction image generation unit 301 of the virtual captured image generation unit 102 is different from that of the simulation device 13.
  • description of components similar to those of the simulation device 13 will be omitted, and portions different from those of the simulation device 13 will be mainly described.
  • the sensor viewpoint data generation unit 401 generates sensor viewpoint data including a first image showing a shooting space viewed from the image pickup device and distance data from each of the image pickup device and the projection device to an object in the shooting space.
  • the sensor viewpoint data generation unit 401 inputs the generated sensor viewpoint data to the map generation unit 402.
  • FIG. 14 is a diagram showing a detailed functional configuration of the sensor viewpoint data generation unit 401 shown in FIG. 13.
  • The first image generated by the sensor viewpoint data generation unit 401 includes a bright image, in which the entire shooting space is represented by the brightness when irradiated with the projection light, and a dark image, in which the entire shooting space is represented by the brightness when not irradiated with the projection light.
  • the sensor viewpoint data generation unit 401 includes a bright image generation unit 501 that generates a bright image, a dark image generation unit 502 that generates a dark image, and a distance data generation unit 503 that generates distance data.
  • The sensor viewpoint data generation unit 401 outputs sensor viewpoint data including the first image, which consists of the bright image generated by the bright image generation unit 501 and the dark image generated by the dark image generation unit 502, and the distance data generated by the distance data generation unit 503.
  • the sensor viewpoint data output by the sensor viewpoint data generation unit 401 is input to the map generation unit 402.
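  • The sensor viewpoint data can be pictured as a simple container of these elements. The following Python sketch only illustrates the data structure; the class and field names are assumptions and do not define the embodiment's data format.

      # Assumed container for sensor viewpoint data (names are illustrative only).
      from dataclasses import dataclass
      import numpy as np

      @dataclass
      class SensorViewpointData:
          bright_image: np.ndarray        # shooting space lit by the projection light
          dark_image: np.ndarray          # shooting space without the projection light
          camera_distance: np.ndarray     # per-pixel distance from the imaging device
          projector_distance: np.ndarray  # per-pixel distance from the projection device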
  • The map generation unit 402 generates an irradiation map, which is a map in which the first region illuminated by the light from the projection device and the second region not illuminated by the light from the projection device are indicated by different numerical values, based on the measurement condition information. For example, in the irradiation map, the first region can be represented by "1" and the second region by "0".
  • the map generation unit 402 inputs the generated irradiation map to the image synthesis unit 403.
  • FIG. 15 is a diagram showing a detailed functional configuration of the map generation unit 402 shown in FIG. 13.
  • the map generation unit 402 includes an irradiation area calculation unit 601, an irradiation map generation unit 602, a light blurring reproduction unit 603, a reflection area calculation unit 604, and a reflection map generation unit 605.
  • The irradiation area calculation unit 601 calculates the area that can be illuminated by the light from the projection device based on the measurement condition information.
  • the irradiation area calculation unit 601 inputs information indicating the calculated area to the irradiation map generation unit 602.
  • The irradiation map generation unit 602 uses the information from the irradiation area calculation unit 601 and the measurement condition information to generate an irradiation map, which is a map in which the first region illuminated by the light from the projection device and the second region not illuminated by the light from the projection device are indicated by different numerical values. The irradiation map generation unit 602 inputs the generated irradiation map to the light blurring reproduction unit 603.
  • The reflection region calculation unit 604 uses the measurement condition information and the sensor viewpoint data to calculate the reflection region, which is the region in the first image illuminated by reflected light rays, that is, light emitted from the projection device whose intensity and direction have changed by being reflected once by an object in the imaging space. It does so by associating the three-dimensional positions of the points illuminated by the reflected light with coordinates on the first image.
  • the reflection area calculation unit 604 inputs information indicating the calculated reflection area to the reflection map generation unit 605.
  • The reflection map generation unit 605 uses the information indicating the reflection region and the measurement condition information to generate, for each projection pattern of the projection device, a reflection map, which is a map in which the reflection region and the region not illuminated by the reflected light are indicated by different values. For example, the reflection map can represent the reflection region by "1" and the region not illuminated by the reflected light by "0".
  • the reflection map generation unit 605 inputs the generated reflection map to the light blurring reproduction unit 603.
  • The light blurring reproduction unit 603 reproduces the effect of light blur on the irradiation map. Specifically, the light blurring reproduction unit 603 adjusts the values of the irradiation map to real numbers between 0 and 1 according to the degree of blurring at the light-shadow boundary. The light blurring reproduction unit 603 reproduces the effect of light blur on the reflection map in the same manner as on the irradiation map, adjusting the values of the reflection map according to the degree of blurring at the light-shadow boundary. The light blurring reproduction unit 603 outputs the irradiation map and the reflection map after adjusting their values.
  • The image synthesis unit 403 generates an optical reproduction image by combining the bright image and the dark image included in the first image based on the information of the irradiation map and the reflection map. Specifically, the image synthesis unit 403 combines the bright image and the dark image by setting the brightness of each pixel of the optical reproduction image to the brightness acquired from the pixel at the same position in the bright image or the dark image, based on the value of the irradiation map. Further, when the values of the irradiation map and the reflection map have been adjusted by the light blurring reproduction unit 603, the image synthesis unit 403 can weight the brightness of each pixel based on the adjusted values.
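  • A minimal Python sketch of this synthesis step is shown below. It assumes that the irradiation map acts as a per-pixel blend weight between the bright image and the dark image, and that mutual reflection is handled by an additional weighting term; the reflection_gain parameter is purely an illustrative assumption.

      # Sketch of the image synthesis step (a plausible reading of the text,
      # not the patented algorithm).
      import numpy as np

      def synthesize(bright, dark, irradiation_map, reflection_map=None, reflection_gain=0.5):
          weight = irradiation_map.astype(float)  # values in 0..1 after light-blur adjustment
          if reflection_map is not None:
              # mutual reflection adds some bright-image appearance to reflected areas
              weight = np.clip(weight + reflection_gain * reflection_map, 0.0, 1.0)
          return weight * bright + (1.0 - weight) * dark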
  • The number of virtual captured images is equal to the number of projection patterns. Therefore, when the number of projection patterns increases, there is a problem that the processing time for creating the virtual captured images becomes long. It is therefore effective to extract the processing that is common to the creation of the virtual captured images for the different projection patterns so that it is performed only once. Examples of common processing include the creation of the bright image and the dark image of the shooting space viewed from the viewpoint of the imaging device, and the calculation of the distance data from the imaging device and the projection device to the objects existing in the shooting space. On the other hand, the calculation result of the region illuminated by the projection light differs when the projection pattern differs. However, the calculation of the area that falls into shadow because the projection light is blocked by an object is the same regardless of the projection pattern, and can therefore also be extracted as common processing.
  • When the focus of the projection device shifts, the pattern projected on the scene is blurred, which may reduce the pattern recognition accuracy and, consequently, the three-dimensional measurement accuracy. This blur of the pattern light does not occur when the image is captured by the imaging device, but occurs when the pattern is projected from the projection device. Therefore, the phenomenon cannot be reproduced by the method of adding blur to the virtual captured image.
  • the irradiation map can be defined as a two-dimensional array having the same number of elements as the virtual captured image, and each element has a real number value of 0 to 1.
  • The area illuminated by the projection light is expressed as 1, the area that the projection light does not reach is expressed as 0, and an area illuminated at a lower intensity than usual due to blur of the projection light can be expressed as a value greater than 0 and less than 1.
  • When this is expressed as a mathematical expression, it becomes the following expression (2).
  • In expression (2), I_{i,j} is the luminance at the image coordinates (i, j) of the virtual captured image, P_{i,j} is the value at the coordinates (i, j) of the irradiation map, B_{i,j} is the brightness at the image coordinates (i, j) of the bright image, and S_{i,j} is the brightness at the image coordinates (i, j) of the dark image.
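  • Expression (2) itself is not reproduced in this text. Assuming a simple per-pixel linear blend of the bright image and the dark image weighted by the irradiation map, a form consistent with the definitions above would be:

      I_{i,j} = P_{i,j} \, B_{i,j} + \left( 1 - P_{i,j} \right) S_{i,j} \qquad (2)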
  • When creating an irradiation map, it is first assumed that the projection device is in focus, that is, that there is no blur of the projection light, and the area illuminated by the projection pattern is calculated. At this point, the value of each element of the irradiation map is 0 or 1. Blurring of the projection light is reproduced by applying a smoothing filter, such as Gaussian smoothing, to this irradiation map. Gaussian smoothing makes the change in the value of the irradiation map near the boundary between the area illuminated by the pattern light and the area not illuminated by the pattern light smooth. By smoothing the change in value at the boundary portion, the effect of blurring the boundary of the projection light is obtained, and the blur of the projection light can be reproduced.
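  • A minimal sketch of this blur reproduction, assuming Gaussian smoothing with an illustrative sigma value, is shown below.

      # Smooth a binary irradiation map so that values near the light-shadow
      # boundary become real numbers between 0 and 1 (projection-light blur).
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def blur_irradiation_map(binary_map, sigma=1.5):
          # binary_map: 1 where the projection pattern illuminates, 0 elsewhere
          return gaussian_filter(binary_map.astype(float), sigma=sigma)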
  • FIG. 16 is a flowchart showing the operation of the simulation device 14 according to the fourth exemplary embodiment of the present invention.
  • the sensor viewpoint data generation unit 401 generates a bright image and a dark image in the bright image generation unit 501 and the dark image generation unit 502 (step S601).
  • the sensor viewpoint data generation unit 401 generates distance data of each viewpoint in the distance data generation unit 503 concurrently with step S601 (step S602).
  • the sensor viewpoint data generation unit 401 inputs the sensor viewpoint data including the generated bright image, dark image, and distance data to the map generation unit 402.
  • the map generation unit 402 calculates the irradiation area in the irradiation area calculation unit 601 (step S603). Subsequently, the map generation unit 402 generates an irradiation map in the irradiation map generation unit 602 (step S604).
  • The map generation unit 402 acquires the order of addition of error factors (step S605). Subsequently, the map generation unit 402 acquires the type and strength of the error factor (step S606). The map generation unit 402 determines whether or not to add mutual reflection (step S607). When it is determined that mutual reflection is to be added (step S607: Yes), the map generation unit 402 causes the reflection region calculation unit 604 to calculate the reflection region (step S608) and causes the reflection map generation unit 605 to generate the reflection map (step S609).
  • When it is determined that mutual reflection is not to be added (step S607: No), the processes of steps S608 and S609 are omitted. Subsequently, the map generation unit 402 determines whether or not to add light blur (step S610). When it is determined that light blur is to be added (step S610: Yes), the light blurring reproduction unit 603 of the map generation unit 402 reproduces the light blur on the irradiation map (step S611).
  • When it is determined that light blur is not to be added (step S610: No), the process of step S611 is omitted. Subsequently, the map generation unit 402 determines whether or not to add ambient light (step S612). When it is determined that ambient light is to be added (step S612: Yes), the irradiation map generation unit 602 of the map generation unit 402 adds ambient light to the irradiation map (step S613). When it is determined that ambient light is not to be added (step S612: No), the process of step S613 is omitted.
  • the map generation unit 402 determines whether or not to finish the error addition process (step S614). When it is determined that the error adding process is not to be ended (step S614: No), the process returns to step S606. When it is determined that the error addition process is to be ended (step S614: Yes), the image composition unit 403 executes the image composition process (step S615).
  • According to the simulation device 14, by creating the first image, the irradiation map, and the reflection map, which are images in which optical phenomena are individually reproduced, it is possible to simplify the processing and the data structure for generating the optical reproduction images when a plurality of projection patterns are used.
  • By using the irradiation map to represent the blur of the light-shadow boundary caused by focus shift and the reflection map to express the effect of mutual reflection, the simulation apparatus 14 can individually reproduce each of a plurality of types of optical phenomena. Therefore, it is possible to know in detail which factor caused a measurement error in the three-dimensional measurement. With such a configuration, there is an effect that the design and performance evaluation of the three-dimensional measuring device become easy.
  • Embodiment 5. The simulation device 15 (not shown) according to the fifth embodiment has the same configuration as the simulation device 13 shown in FIG. 7.
  • the output screen of the simulation device 15 is different from that of the simulation device 13.
  • description of the same parts as those of the simulation device 13 will be omitted, and parts different from those of the simulation device 13 will be mainly described.
  • FIG. 17 is a diagram showing an example of the display screen 40 output by the simulation device 15 according to the fifth embodiment of the present invention.
  • The simulation device 15 can output a simulation result that further includes at least a part of the measurement condition information and at least one of the virtual captured images.
  • the display screen 40 can include an adjustment item display area 41, a save button 42, an end button 43, and a measurement result display area 44.
  • the adjustment item display area 41 is an area in which set values of items to be adjusted are displayed in a list.
  • the measurement result display area 44 is an area for displaying the measurement values when the items of the measurement condition to be adjusted are set to the respective set values listed.
  • the save button 42 is an operation unit for performing an operation of saving the measurement result
  • the end button 43 is an operation unit for performing an operation of ending the process.
  • The setting values of the measurement condition items to be adjusted, the measurement values that are the processing results, and the virtual captured images and irradiation maps obtained as intermediate results in the course of processing are displayed side by side. Further, a difference result based on the processing result for one of the setting values may be displayed together, or a portion where the difference between the processing results is large may be highlighted.
  • By outputting the measurement condition information, the virtual captured images, and the like together with the measurement values, the user can confirm, for each measurement condition, the process by which the three-dimensional measurement result was obtained.
  • By displaying the processing results and the intermediate results corresponding to a plurality of measurement conditions side by side, it becomes possible to compare and examine the setting values of the measurement conditions. Therefore, the arrangement and performance of the projection device and the imaging device can be easily examined.
  • FIG. 18 is a diagram showing a functional configuration of the simulation device 16 according to the sixth exemplary embodiment of the present invention.
  • The simulation device 16 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, an error information acquisition unit 105, an evaluation reference data generation unit 106, and a measurement evaluation unit 107.
  • the simulation device 16 has an evaluation reference data generation unit 106 and a measurement evaluation unit 107 in addition to the configuration of the simulation device 13.
  • detailed description of the same configuration as that of the simulation device 13 will be omitted, and the differences from the simulation device 13 will be mainly described.
  • the evaluation reference data generation unit 106 uses the measurement condition information to generate evaluation reference data that is an evaluation reference for simulation results.
  • the evaluation reference data generation unit 106 inputs the generated evaluation reference data to the measurement evaluation unit 107.
  • the measurement evaluation unit 107 evaluates the simulation result using the evaluation reference data and acquires the simulation evaluation.
  • the output unit 104 outputs the simulation evaluation result in addition to the simulation result.
  • FIG. 19 is a flowchart showing the operation of the simulation device 16 shown in FIG. 18.
  • the operation shown in FIG. 19 is the same as that of FIG. 9 from step S301 to step S307, and detailed description thereof will be omitted.
  • the evaluation reference data generating unit 106 generates the evaluation reference data after the three-dimensional measurement process is completed (step S701).
  • the measurement evaluation unit 107 performs a measurement evaluation process of comparing the measurement data obtained by the simulation with the evaluation reference data to calculate a quantitative evaluation value (step S702).
  • the output unit 104 outputs the simulation evaluation (step S703).
  • As the evaluation reference data, the distance data from the imaging device included in the sensor viewpoint data, the processing result of the three-dimensional measurement when the virtual captured image before error factors are added is input, data obtained in the course of processing such as the irradiation map, and actual measurement data obtained by actually measuring the measurement object can be considered.
  • The measurable region can be, for example, the region in which no loss occurs in the measurement data obtained by inputting the virtual captured image before error factors are added. Further, statistics such as the average value, variance, standard deviation, and maximum value obtained by comparing the evaluation reference data and the simulation result may be used as evaluation indices.
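  • The following Python sketch illustrates such an evaluation, computing the listed statistics from the differences between simulated measurement data and evaluation reference data; treating non-finite values as missing measurements is an assumption made for the example.

      # Compare simulated measurement data with evaluation reference data and
      # report the measurable ratio and basic error statistics.
      import numpy as np

      def evaluate(simulated, reference):
          valid = np.isfinite(simulated) & np.isfinite(reference)  # measurable region
          error = np.abs(simulated[valid] - reference[valid])
          return {"measurable_ratio": float(valid.mean()),
                  "mean": float(error.mean()), "variance": float(error.var()),
                  "std": float(error.std()), "max": float(error.max())}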
  • As a feedback method, for example, it is possible to generate a plurality of different simulation results and acquire the setting values with the best simulation evaluation. To generate a plurality of simulation results, for example, there is a method of changing the value of the surface characteristic of the measurement object in the measurement conditions or changing the strength of an error factor.
  • Parameters such as the surface characteristics of the measurement target and the strength of error factors when the simulation evaluation is the best can be defined as the optimum simulation settings. By providing the user with this optimum simulation setting, more accurate verification experiments can be performed.
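  • A sketch of this feedback loop is shown below; run_simulation and evaluate are placeholder functions standing in for the simulation and the measurement evaluation process, and using the mean error as the evaluation score is an assumption.

      # Sweep candidate settings (e.g. surface characteristics, error strengths)
      # and keep the setting with the best simulation evaluation.
      def find_optimum_settings(candidate_settings, run_simulation, evaluate):
          best_setting, best_score = None, float("inf")
          for setting in candidate_settings:
              result = run_simulation(setting)   # simulated measurement data
              score = evaluate(result)["mean"]   # smaller mean error = better evaluation
              if score < best_score:
                  best_setting, best_score = setting, score
          return best_setting, best_score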
  • FIG. 20 is a diagram showing an example of the display screen 50 of the simulation device 16 shown in FIG. 18.
  • the display screen 50 includes an evaluation reference data display area 51 in addition to the components of the display screen 40.
  • The evaluation reference data is displayed along with the plurality of setting values of the adjustment target items of the measurement conditions.
  • the intermediate result obtained from the simulation of the three-dimensional measurement process using the reference value may be displayed.
  • the display screen 50 can further display the simulation evaluation.
  • the simulation device 16 can obtain evaluation reference data in addition to the simulation result.
  • the evaluation reference data is data serving as a reference when evaluating the simulation result, and is actual measurement data or the like.
  • the simulation device 16 can facilitate examination of the simulation result by displaying the measured value included in the simulation result and the evaluation reference data side by side.
  • the simulation device 16 can obtain a simulation evaluation in which the simulation result is evaluated using the evaluation reference data. With such a configuration, it is possible to quantitatively grasp the measurement error and the loss. For this reason, it is easy to study the performance of the imaging device and the projection device, and to determine the suitability of three-dimensional measurement for each measurement object. Further, when the performance evaluation method is defined by the standard, it is possible to easily perform the suitability judgment by obtaining the simulation evaluation by using the performance evaluation method.
  • FIG. 21 is a diagram showing a functional configuration of the simulation apparatus 17 according to the seventh embodiment of the present invention.
  • the simulation device 17 has an object recognition processing unit 108 and a recognition evaluation unit 109 in addition to the configuration of the simulation device 16 according to the sixth embodiment.
  • a part different from the simulation device 16 will be mainly described.
  • The object recognition processing unit 108 receives the simulation result output by the three-dimensional measurement calculation unit 103 and the measurement condition information output by the measurement condition acquisition unit 101, and acquires a recognition result including at least one of the position and orientation of an object existing in the imaging space and a gripping position, which is a position at which the object can be gripped. The object recognition processing unit 108 inputs the recognition result to the recognition evaluation unit 109.
  • the recognition evaluation unit 109 evaluates the recognition result based on the measurement condition information. Specifically, the recognition evaluation unit 109 acquires a recognition evaluation result that includes at least one of the position and orientation estimation accuracy included in the recognition result and the gripping position estimation accuracy. The recognition evaluation unit 109 inputs the recognition result of the object and the recognition evaluation result to the output unit 104. The output unit 104 outputs the recognition result and the recognition evaluation result in addition to the simulation result and the simulation evaluation.
  • FIG. 22 is a flowchart showing the operation of the simulation apparatus 17 shown in FIG. 21.
  • the operations in steps S301 to S307 and steps S701 to S703 are the same as those in FIG. 19, and thus description thereof will be omitted here.
  • the part different from FIG. 19 will be mainly described.
  • the object recognition processing unit 108 executes the object recognition process and acquires the recognition result (step S801).
  • the recognition evaluation unit 109 executes a recognition evaluation process for evaluating the recognition result (step S802).
  • the output unit 104 outputs the recognition evaluation result (step S803).
  • the recognition result may be output in addition to the recognition evaluation result.
  • One of the purposes of three-dimensional measurement of objects is recognition of the position and orientation of the measurement target.
  • For example, an application in which a robot grips an object is applicable. Since the position and orientation of the object to be gripped are not known in advance, it is necessary to sense the object on the spot and recognize the position and orientation of the object or a position where the robot can grip it. For this reason, the simulation device 17 includes the object recognition processing unit 108, which performs object recognition processing with the simulation result of the three-dimensional measurement as an input.
  • An arbitrary algorithm may be used as an algorithm for object recognition in the object recognition processing unit 108.
  • the object recognition algorithm may be input with a three-dimensional point group that handles the result of the three-dimensional measurement as a set of points in a three-dimensional space, or with a depth image that represents the three-dimensional measurement result as a two-dimensional image.
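  • As an illustration of the depth-image form of this input, the following sketch converts a depth image into a three-dimensional point group using a standard pinhole camera model; the intrinsic parameters fx, fy, cx, cy are assumed to be available from the measurement condition information.

      # Convert a depth image (3D measurement result as a 2D image) into a point group.
      import numpy as np

      def depth_to_point_cloud(depth, fx, fy, cx, cy):
          h, w = depth.shape
          v, u = np.mgrid[0:h, 0:w]
          x = (u - cx) * depth / fx
          y = (v - cy) * depth / fy
          points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
          # drop missing measurements (non-finite or non-positive depth)
          return points[np.isfinite(points).all(axis=1) & (points[:, 2] > 0)]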
  • The recognition result is evaluated in terms of, for example, the estimation accuracy of the position and orientation of the recognized object and the estimation accuracy of the gripping position.
  • In an actual environment, a problem in many cases is that the true value of the position and orientation of the object is unknown. Since the true value is unknown, even if the recognition result of the position and orientation of the object is output, it is difficult to quantitatively determine its quality. However, since the position and orientation of the object are known in the simulation, it is possible to quantitatively evaluate the recognition result.
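  • A sketch of such a quantitative evaluation is shown below, comparing an estimated pose with the true pose known in the simulation; the translation and rotation error metrics are common choices and are not prescribed by the embodiment.

      # Pose error between an estimated pose (R_est, t_est) and the true pose (R_true, t_true).
      import numpy as np

      def pose_error(R_est, t_est, R_true, t_true):
          translation_error = float(np.linalg.norm(t_est - t_true))
          # rotation error as the angle of the relative rotation R_est @ R_true.T
          cos_angle = (np.trace(R_est @ R_true.T) - 1.0) / 2.0
          rotation_error = float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))
          return translation_error, rotation_error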
  • As described above, according to the simulation device 17 of the seventh embodiment of the present invention, it is possible to verify object recognition using the simulation result. With such a configuration, it is possible to evaluate the performance of object recognition without actually configuring a three-dimensional measuring device. Since the position and orientation of the recognition target are known in the simulation space, the result of object recognition can be compared with the true value, which has the effect of facilitating quantitative evaluation of the object recognition performance.
  • FIG. 23 is a diagram showing the functional configuration of the simulation apparatus 18 according to the eighth embodiment of the present invention.
  • the simulation device 18 has an object grip evaluation unit 110 in addition to the configuration of the simulation device 17 according to the seventh embodiment.
  • The object grip evaluation unit 110 acquires a grip evaluation result, in which the grip success probability of the object is evaluated, based on the measurement condition information output by the measurement condition acquisition unit 101, the simulation result output by the three-dimensional measurement calculation unit 103, and the recognition result output by the object recognition processing unit 108.
  • the object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104.
  • the output unit 104 also outputs the grip evaluation result.
  • FIG. 24 is a flowchart showing the operation of the simulation device 18 shown in FIG. 23.
  • the operations in steps S301 to S307, steps S701 to S703, and steps S801 to S803 are the same as those in FIG. 22, and thus the description thereof is omitted here.
  • a part different from FIG. 22 will be mainly described.
  • The object grip evaluation unit 110 executes the object grip evaluation (step S901).
  • the object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104.
  • the output unit 104 outputs the grip evaluation result (step S902).
  • the operation of step S901 can be executed concurrently with the recognition evaluation process.
  • From the recognition result of the object, it is possible to obtain information on the position at which the robot hand grips the object. Using this information, when the robot hand is moved to the gripping position and the gripping operation is executed, the position of contact with the object and the magnitude and direction of the force generated in the robot hand or the object can be simulated. If the magnitude and direction of the force generated in the robot hand and the object are known, it is possible to estimate whether the gripping will succeed. For example, consider a case where the robot hand is a parallel hand having two claws. Gripping fails when the gripped object slips out from the gap between the claws of the robot hand. A situation corresponding to this case is one in which the force applied to the gripped object in the direction perpendicular to the closing direction of the parallel hand is larger than the frictional force between the robot hand and the gripped object. Therefore, it is possible to determine whether or not the grip succeeds if information such as the friction coefficient of the surfaces of the robot hand and the gripped object, the weight of the gripped object, and the gripping force of the robot hand is available.
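  • The following sketch expresses this reasoning as a simple static check for a two-claw parallel hand; the inputs (gripping force, friction coefficient, object mass, lateral force) are assumed to be obtainable from the simulation, and the friction model is deliberately simplified.

      # Gripping succeeds if the friction from both claws can support gravity plus
      # any force applied perpendicular to the closing direction of the parallel hand.
      def grip_succeeds(grip_force, friction_coefficient, object_mass, lateral_force=0.0, g=9.81):
          max_friction = 2.0 * friction_coefficient * grip_force   # both claws contribute
          required = object_mass * g + lateral_force
          return max_friction >= required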
  • The simulation device 18 according to the eighth embodiment of the present invention includes the object grip evaluation unit 110, which has a function of simulating such object gripping by a robot.
  • FIG. 25 shows an output screen example according to the eighth embodiment.
  • FIG. 25 is a diagram showing an example of the display screen 60 output by the simulation device 18 shown in FIG.
  • the display screen 60 has a recognition and grip evaluation result display area 61 in addition to the display content of the display screen 30.
  • As a method of outputting the recognition result and the grip evaluation result, different symbols may be output depending on whether the recognition or grip succeeds or fails, the cause of a failure may be output when it fails, or any quantitative value used for the evaluation may be displayed.
  • With this configuration, the grip success rate can be evaluated in the simulation, so that there is an effect that preliminary verification before building the robot system becomes easy.
  • the cause of failure can be isolated and verified by simulation, so that the cause can be investigated and improved in a short period of time.
  • The measurement condition acquisition unit 101, virtual captured image generation unit 102, three-dimensional measurement calculation unit 103, output unit 104, error information acquisition unit 105, evaluation reference data generation unit 106, measurement evaluation unit 107, object recognition processing unit 108, recognition evaluation unit 109, and object grip evaluation unit 110 are realized by a processing circuit. These processing circuits may be realized by dedicated hardware, or may be control circuits using a CPU (Central Processing Unit).
  • FIG. 26 is a diagram showing dedicated hardware for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • the processing circuit 90 is a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • FIG. 27 is a diagram showing the configuration of the control circuit 91 for realizing the functions of the simulation apparatuses 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • the control circuit 91 includes a processor 92 and a memory 93.
  • the processor 92 is a CPU and is also called a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), or the like.
  • The memory 93 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (registered trademark) (Electrically EPROM), or a magnetic disk, flexible disk, optical disk, compact disc, mini disc, or DVD (Digital Versatile Disk).
  • When the above processing circuit is realized by the control circuit 91, it is realized by the processor 92 reading and executing a program that is stored in the memory 93 and corresponds to the processing of each component.
  • the memory 93 is also used as a temporary memory in each process executed by the processor 92.
  • FIG. 28 is a diagram showing an example of a hardware configuration for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • The functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention may also be realized using the input device 94 and the output device 95 in addition to the processor 92 and the memory 93.
  • the input device 94 is an input interface such as a keyboard, a mouse, and a touch sensor that receives an input operation from a user.
  • the output device 95 is, for example, a display device and can display an output screen to the user. When a touch panel is used, the display device of the touch panel is the output device 95 and the touch sensor superimposed on the display device is the input device 94.
  • The functions of the measurement condition acquisition unit 101 and the output unit 104 of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 may be realized by the processor 92 alone, or by the processor 92 together with the input device 94, the output device 95, or interfaces to them.
  • The functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention may be realized by one piece of hardware, or may be distributed among and processed by a plurality of pieces of hardware.
  • the display screens 20, 30, 40, 50, 60 shown above are examples, and various changes can be made.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)
PCT/JP2019/005156 2019-02-13 2019-02-13 シミュレーション装置、シミュレーション方法およびシミュレーションプログラム WO2020165976A1 (ja)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201980091525.1A CN113412500B (zh) 2019-02-13 2019-02-13 模拟装置及模拟方法
DE112019006855.5T DE112019006855T5 (de) 2019-02-13 2019-02-13 Simulationsvorrichtung und simulationsverfahren
JP2020571967A JP7094400B2 (ja) 2019-02-13 2019-02-13 シミュレーション装置およびシミュレーション方法
PCT/JP2019/005156 WO2020165976A1 (ja) 2019-02-13 2019-02-13 シミュレーション装置、シミュレーション方法およびシミュレーションプログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/005156 WO2020165976A1 (ja) 2019-02-13 2019-02-13 シミュレーション装置、シミュレーション方法およびシミュレーションプログラム

Publications (1)

Publication Number Publication Date
WO2020165976A1 true WO2020165976A1 (ja) 2020-08-20

Family

ID=72044404

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/005156 WO2020165976A1 (ja) 2019-02-13 2019-02-13 シミュレーション装置、シミュレーション方法およびシミュレーションプログラム

Country Status (4)

Country Link
JP (1) JP7094400B2 (zh)
CN (1) CN113412500B (zh)
DE (1) DE112019006855T5 (zh)
WO (1) WO2020165976A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114202981A (zh) * 2021-12-10 2022-03-18 新疆工程学院 一种用于摄影测量实验的仿真平台
KR20220053936A (ko) * 2020-10-23 2022-05-02 엘아이지넥스원 주식회사 영상센서 모사방법
CN115802014A (zh) * 2021-09-09 2023-03-14 卡西欧计算机株式会社 记录介质、设置模拟方法和设置模拟装置
US20230189024A1 (en) * 2021-12-10 2023-06-15 T-Mobile Usa, Inc. Location simulation for wireless devices

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7380661B2 (ja) * 2021-09-21 2023-11-15 セイコーエプソン株式会社 投射方法、及び投射システム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015203652A (ja) * 2014-04-15 2015-11-16 キヤノン株式会社 情報処理装置および情報処理方法
WO2016186211A1 (ja) * 2015-05-21 2016-11-24 国立大学法人 鹿児島大学 3次元計測システム、3次元計測方法及び3次元計測プログラム
JP2018144158A (ja) * 2017-03-03 2018-09-20 株式会社キーエンス ロボットシミュレーション装置、ロボットシミュレーション方法、ロボットシミュレーションプログラム及びコンピュータで読み取り可能な記録媒体並びに記録した機器

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200907764A (en) * 2007-08-01 2009-02-16 Unique Instr Co Ltd Three-dimensional virtual input and simulation apparatus
JP5198078B2 (ja) * 2008-01-24 2013-05-15 株式会社日立製作所 計測装置および計測方法
CN108369089B (zh) * 2015-11-25 2020-03-24 三菱电机株式会社 3维图像测量装置及方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015203652A (ja) * 2014-04-15 2015-11-16 キヤノン株式会社 情報処理装置および情報処理方法
WO2016186211A1 (ja) * 2015-05-21 2016-11-24 国立大学法人 鹿児島大学 3次元計測システム、3次元計測方法及び3次元計測プログラム
JP2018144158A (ja) * 2017-03-03 2018-09-20 株式会社キーエンス ロボットシミュレーション装置、ロボットシミュレーション方法、ロボットシミュレーションプログラム及びコンピュータで読み取り可能な記録媒体並びに記録した機器

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220053936A (ko) * 2020-10-23 2022-05-02 엘아이지넥스원 주식회사 영상센서 모사방법
KR102507277B1 (ko) * 2020-10-23 2023-03-07 엘아이지넥스원 주식회사 영상센서 모사방법
CN115802014A (zh) * 2021-09-09 2023-03-14 卡西欧计算机株式会社 记录介质、设置模拟方法和设置模拟装置
CN114202981A (zh) * 2021-12-10 2022-03-18 新疆工程学院 一种用于摄影测量实验的仿真平台
US20230189024A1 (en) * 2021-12-10 2023-06-15 T-Mobile Usa, Inc. Location simulation for wireless devices
CN114202981B (zh) * 2021-12-10 2023-06-16 新疆工程学院 一种用于摄影测量实验的仿真平台

Also Published As

Publication number Publication date
JP7094400B2 (ja) 2022-07-01
JPWO2020165976A1 (ja) 2021-09-30
DE112019006855T5 (de) 2021-11-04
CN113412500B (zh) 2023-12-29
CN113412500A (zh) 2021-09-17

Similar Documents

Publication Publication Date Title
WO2020165976A1 (ja) シミュレーション装置、シミュレーション方法およびシミュレーションプログラム
JP6745173B2 (ja) 画像検査装置、画像検査方法、画像検査プログラム及びコンピュータで読み取り可能な記録媒体並びに記録した機器
US9563954B2 (en) Method for capturing the three-dimensional surface geometry of objects
JP7422689B2 (ja) 投影角度の動的選択による物品検査
CN107525479A (zh) 针对特定光学测量来确定物体性质
Sun et al. An empirical evaluation of factors influencing camera calibration accuracy using three publicly available techniques
JP6519265B2 (ja) 画像処理方法
CN107025663A (zh) 视觉系统中用于3d点云匹配的杂波评分系统及方法
JP6054576B2 (ja) 測定対象物の少なくとも1つの仮想画像を生成する方法及び装置
TWI836146B (zh) 多成像模式影像對準
CN103021061B (zh) 用于评估机动车的物体识别装置的方法
WO2020075252A1 (ja) 情報処理装置、プログラム及び情報処理方法
JP6614905B2 (ja) 三次元計測装置およびその制御方法
CN106415198B (zh) 图像记录方法和执行该方法的坐标测量机
KR20230136291A (ko) 사영 공간에서의 구조적 유사성을 이용한 3차원 매쉬 퀄리티 평가 방법, 이를 수행하는 장치 및 컴퓨터 프로그램
US20020065637A1 (en) Method and apparatus for simulating the measurement of a part without using a physical measurement system
JP2015206654A (ja) 情報処理装置、情報処理方法、プログラム
JP2018200328A (ja) 検査装置、検査方法およびプログラム
JP4764963B2 (ja) 画像処理装置
US20200234458A1 (en) Apparatus and method for encoding in structured depth camera system
US20040258311A1 (en) Method for generating geometric models for optical partial recognition
JP2003168129A (ja) 三次元画像処理方法、三次元画像処理プログラム、三次元画像処理装置および三次元画像処理システム
JP6864722B2 (ja) 検査装置、検査方法およびプログラム
KR102502029B1 (ko) 표면 반사 정보를 이용한 광학적 반사 특성 산출 방법
US11875447B1 (en) Systems and methods for color correcting three-dimensional objects formed by point cloud data points

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19915567

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020571967

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 19915567

Country of ref document: EP

Kind code of ref document: A1