WO2020165976A1 - Simulation device, simulation method, and simulation program - Google Patents

Simulation device, simulation method, and simulation program

Info

Publication number
WO2020165976A1
Authority
WO
WIPO (PCT)
Prior art keywords
measurement
image
simulation
captured image
generation unit
Application number
PCT/JP2019/005156
Other languages
French (fr)
Japanese (ja)
Inventor
亮輔 川西
Original Assignee
三菱電機株式会社
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2019/005156 priority Critical patent/WO2020165976A1/en
Priority to CN201980091525.1A priority patent/CN113412500B/en
Priority to DE112019006855.5T priority patent/DE112019006855T5/en
Priority to JP2020571967A priority patent/JP7094400B2/en
Publication of WO2020165976A1 publication Critical patent/WO2020165976A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models

Definitions

  • the present invention relates to a simulation device, a simulation method, and a simulation program capable of simulating the measurement result of a three-dimensional measurement device.
  • Patent Document 1 discloses a three-dimensional measuring device that performs three-dimensional measurement by the active stereo method.
  • the three-dimensional measurement device disclosed in Patent Document 1 projects a geometric pattern from a projection device onto a measurement target, sets a plurality of detection points on the geometric pattern in the captured image output by the imaging device, and calculates the three-dimensional position of the surface of the measurement target corresponding to each detection point based on triangulation.
  • the present invention has been made in view of the above, and an object of the present invention is to obtain a simulation device capable of evaluating a measurement result even when there is no actual environment or when few samples of the measurement target are available.
  • a simulation device according to the present invention includes: a measurement condition acquisition unit that acquires measurement condition information indicating measurement conditions of a three-dimensional measurement device, the three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space including the measurement target irradiated with projection light from the projection device; a virtual captured image generation unit that generates, based on the measurement condition information, a virtual captured image that reproduces the captured image output by the imaging device; a three-dimensional measurement calculation unit that obtains a measurement value by performing three-dimensional measurement processing that measures the three-dimensional position of the surface of the measurement target using the virtual captured image; and an output unit that outputs a simulation result including the measurement value.
  • FIG. 3 is a diagram showing a functional configuration of a virtual captured image generation unit according to a second embodiment of the present invention.
  • FIG. 5 is a flowchart showing the operation of the simulation device having the virtual captured image generation unit described above.
  • diagram showing the functional configuration of the generation unit referenced above; flowchart showing the operation of the corresponding simulation device.
  • flowchart showing the details of step S304; flowchart showing the details of step S305.
  • diagram showing the detailed functional configuration of the sensor viewpoint data generation unit; diagram showing the detailed functional configuration of the map generation unit; flowchart showing the operation of the simulation device according to a fourth embodiment of the present invention.
  • FIG. 23 is a flowchart showing the operation of the corresponding simulation device; a diagram shows an example of the display screen of that simulation device; and a further diagram shows the functional configuration of the simulation device according to a seventh embodiment of the present invention.
  • FIG. 3 is a diagram showing a configuration of a control circuit for realizing the functions of the simulation apparatus according to the first to eighth embodiments of the present invention.
  • FIG. 2 is a diagram showing an example of a hardware configuration for realizing the functions of the simulation apparatus according to the first to eighth embodiments of the present invention.
  • FIG. 1 is a diagram showing a functional configuration of a simulation device 10 according to the first exemplary embodiment of the present invention.
  • the simulation device 10 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, and an output unit 104.
  • the simulation device 10 simulates the measurement result of the three-dimensional measurement device based on the measurement condition information indicating the measurement condition of the three-dimensional measurement device that measures the three-dimensional position of the surface of the measurement target using the active stereo method.
  • the three-dimensional measurement device assumed here includes a projection device that projects light onto a measurement target object, and an imaging device that captures an imaging space including the measurement target object irradiated with projection light from the projection device.
  • the projection light from the projection device indicates a projection pattern used for three-dimensional measurement; in other words, projection light refers to light that represents a projection pattern.
  • the three-dimensional measurement device can perform three-dimensional measurement processing of measuring the three-dimensional position of the surface of the measurement target based on the projection pattern included in the captured image output by the imaging device. Further, the simulation device 10 may simulate the output result of a system that uses the three-dimensional measurement device.
  • the measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measuring device. Details of the measurement condition information will be described later.
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the virtual captured image generation unit 102 generates, based on the measurement condition information input from the measurement condition acquisition unit 101, a virtual captured image, which is a virtual computer graphics (CG) image that reproduces the captured image output by the imaging device included in the three-dimensional measurement device.
  • the virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
  • the three-dimensional measurement calculation unit 103 uses the virtual captured image input from the virtual captured image generation unit 102 to perform the three-dimensional measurement processing executed by the three-dimensional measurement device and acquire the measurement value.
  • the three-dimensional measurement calculation unit 103 inputs the acquired measurement value to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement value input from the three-dimensional measurement calculation unit 103.
  • FIG. 2 is a flowchart showing the operation of the simulation device 10 shown in FIG.
  • the measurement condition acquisition unit 101 of the simulation device 10 acquires measurement condition information (step S101).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the virtual captured image generation unit 102 generates a virtual captured image based on the measurement condition information (step S102).
  • the virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
  • the three-dimensional measurement calculation unit 103 executes the calculation of three-dimensional measurement using the virtual captured image and acquires the measurement value (step S103).
  • the three-dimensional measurement calculation unit 103 inputs the acquired measurement value to the output unit 104.
  • the output unit 104 outputs the simulation result including the measured value (step S104).
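  • as a concrete illustration of this flow, the following is a minimal Python sketch of the pipeline in steps S101 to S104; the function names and signatures are illustrative assumptions, not interfaces defined by the disclosure.

        # Minimal sketch of the simulation pipeline (steps S101-S104).
        # All names are illustrative; the patent does not define this API.
        def run_simulation(acquire_conditions, generate_virtual_image,
                           measure_3d, output):
            conditions = acquire_conditions()                    # step S101
            virtual_image = generate_virtual_image(conditions)   # step S102
            measurement = measure_3d(virtual_image, conditions)  # step S103
            return output(measurement)                           # step S104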
  • as described above, the simulation device 10 can generate a virtual captured image that reproduces the captured image output by the imaging device based on the measurement condition information of the three-dimensional measurement device, and can perform the three-dimensional measurement processing using the virtual captured image. With such a configuration, three-dimensional measurement can be verified in simulation without actually installing the projection device and the imaging device and acquiring real data. It therefore becomes unnecessary to install the three-dimensional measurement device on site, adjust its hardware and software, and collect real data of the measurement target, which reduces the time required for this work and suppresses human and material costs. In addition, appropriate measurement conditions can be identified by running simulations under various measurement conditions of the three-dimensional measurement device, which shortens the design period of the three-dimensional measurement device and the trial-and-error verification period for introducing it on site and operating it.
  • the measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measuring device.
  • the measurement condition acquisition unit 101 includes a projection condition information acquisition unit 201, a shooting condition information acquisition unit 202, a measurement target object information acquisition unit 203, and a non-measurement target object information acquisition unit 204.
  • the projection condition information acquisition unit 201 acquires projection condition information indicating the projection condition of the projection device.
  • the projection condition information can include information that can specify at least one of the performance and the usage state of the projection device.
  • the performance of the projection device is, for example, the resolution and the angle of view, and the usage state of the projection device is, for example, its position, orientation, and focus state.
  • the projection condition information can include information indicating a projection pattern.
  • the information indicating the projection pattern includes information indicating the pattern of the projection pattern. When there are a plurality of projection patterns, the information indicating the projection pattern may further include information indicating the number of projection patterns.
  • the pattern of the projection pattern may be a stripe pattern in which stripes of a thickness predetermined for each projection pattern are arranged according to a fixed rule, a dot pattern in which dots are arranged irregularly, a gradation pattern in which the intensity of the projection light changes smoothly, or a combination thereof.
  • the above pattern is an example, and projection patterns of any pattern can be used.
  • the number of projection patterns is an arbitrary number of one or more.
  • the information indicating the pattern of the projection pattern may be, for example, information indicating the type of the pattern, or information indicating the light intensity distribution on a predetermined projection plane.
  • the image capturing condition information acquisition unit 202 acquires image capturing condition information indicating the image capturing conditions of the image capturing apparatus.
  • the shooting condition information can include information that can specify at least one of the performance and the usage state of the imaging device.
  • the performance of the imaging device is, for example, the resolution and the angle of view, and the state of the imaging device is, for example, the position and orientation of the imaging device and the focus state.
  • the measurement target information acquisition unit 203 acquires measurement target information, which is information about the measurement target indicating, for example, its shape, position, orientation, and characteristics.
  • the characteristic of the measurement target is, for example, the reflection characteristic of the measurement target.
  • the information indicating the reflection characteristic of the measurement target includes, for example, at least one of the color, the diffuse reflectance, and the specular reflectance of the measurement target.
  • the non-measurement target information acquisition unit 204 acquires non-measurement target information, which is information about non-measurement targets, that is, objects other than the measurement target or ambient light other than the projection light existing in the shooting space.
  • the non-measurement target is, for example, a container, a gantry, or a jig that holds the measurement target.
  • the non-measurement target is not limited to the above examples, and may be any object other than the measurement target that can appear in a captured image of an actual imaging device, such as a wall, a window, or other scenery, or light such as illumination light that irradiates these objects.
  • the non-measurement target information can include at least one of the position, shape, and characteristic of the non-measurement target and information indicating the state of ambient light.
  • the information indicating the shape of an object included in the measurement target information and the non-measurement target information can be expressed as a mesh such as 3D-CAD (3-Dimensional Computer-Assisted Drawing) data, a primitive such as a sphere or a rectangular parallelepiped, or an aggregate thereof.
  • the reflection characteristic of an object is used to reproduce the appearance of the object when it is exposed to light.
  • the information indicating the state of the ambient light is used to obtain shading effects similar to those in an image actually captured by the imaging device.
  • the object may be set in a state of floating in the air, or a floor surface and a box for accommodating the object may be set so that the object exists inside the box. It is preferable that reflection characteristics can be set for the floor surface and the box as well as for the object.
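  • to make the grouping of the measurement condition information above concrete, the following Python sketch shows one possible organization; every field name here is an assumption chosen for illustration, not terminology from the disclosure.

        from dataclasses import dataclass, field

        # Illustrative grouping of the measurement condition information.
        @dataclass
        class ProjectionConditions:          # acquired by unit 201
            resolution: tuple                # (width, height)
            angle_of_view_deg: float
            pose: tuple                      # position and orientation
            focus_state: float
            patterns: list                   # one intensity map per pattern

        @dataclass
        class ShootingConditions:            # acquired by unit 202
            resolution: tuple
            angle_of_view_deg: float
            pose: tuple
            focus_state: float

        @dataclass
        class ObjectInfo:                    # units 203 and 204
            shape: object                    # mesh, primitive, or aggregate
            pose: tuple
            color: tuple
            diffuse_reflectance: float
            specular_reflectance: float

        @dataclass
        class MeasurementConditions:
            projection: ProjectionConditions
            shooting: ShootingConditions
            targets: list = field(default_factory=list)      # measurement targets
            non_targets: list = field(default_factory=list)  # container, floor, box
            ambient_light: float = 0.0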
  • since the projection condition information and the shooting condition information, including performance figures such as the resolution and the angle of view of the projection device and the imaging device, can be set individually, measurement errors that may occur in an actual three-dimensional measurement device can be reproduced more accurately.
  • if the resolution of the imaging device is lower than the resolution of the projection device, the pattern of the projection pattern cannot be finely discriminated from the captured image even if the projection pattern is made finer, which may cause errors and data loss in three-dimensional measurement.
  • since the virtual captured image generation unit 102 generates the virtual captured image based on the resolution of the imaging device, analyzing the virtual captured image reveals the fineness of the projection pattern that the imaging device can discriminate.
  • the limit value of the fineness of the projection pattern that the imaging device can discriminate can be estimated as the upper limit of the effective resolution of the projection device.
  • conversely, the performance of an imaging device that matches the resolution of the projection device can also be studied. That is, the projection device and the imaging device included in the three-dimensional measurement device need to have mutually matched performance, and it is therefore preferable to study the performance of both devices using the simulation results.
  • the virtual captured image generation unit 102 generates a virtual captured image that reproduces the captured image output by the imaging device based on the measurement condition information.
  • the virtual captured image generation unit 102 reproduces, in a virtual shooting space, the arrangement of objects existing in the shooting space based on the measurement target information and the non-measurement target information included in the measurement condition information, and can specify, based on the shooting condition information, the portion of the shooting space represented by the virtual captured image.
  • the virtual captured image generation unit 102 can reproduce the shadow generated by the projection light in the capturing space by using the reflection model based on the projection condition information, the measurement target object information, and the non-measurement target object information.
  • the virtual captured image generation unit 102 can reproduce, as pixel values, the position of the boundary of the shadow generated by the projection light in the captured image output by the imaging device.
  • an error included in the position of the shadow boundary reduces the visibility of the light in the captured image.
  • a decrease in the visibility of light means that the three-dimensional position irradiated by the projection light in the shooting space does not match the detected irradiation position of the projection light on the captured image, so that a shadow boundary of light, for example a pattern boundary, which is the boundary between a region irradiated with projection light and a region not irradiated with it, cannot be detected correctly. For this reason, if the visibility of light decreases, errors occur in the three-dimensional measurement result or measurement becomes impossible, and the quality of the three-dimensional measurement deteriorates.
  • the factors that cause the error in the position of the shadow boundary of light in the captured image may include those caused by the projection device and those caused by the imaging device.
  • the error factors caused by the projection device include mutual reflection, in which the projection light from the projection device is reflected by a first surface and then strikes and illuminates a second surface, and optical blur caused by the projection device being out of focus.
  • the second surface may be the surface of an object different from the first surface, or may be a different part of the same object as the first surface.
  • the phenomenon treated as an error factor is not limited to the above, and may include any phenomenon that causes a change in the projection state of light or a change in a captured image.
  • the light beam emitted from the projection device may be reflected on the surface of the first object in the shooting space, and the reflected light may illuminate a second object different from the first object. It is also conceivable that the light reflected by the site X of the first object existing in the imaging space illuminates a site Y different from the site X of the same first object.
  • a light ray is not necessarily reflected only once, and may be reflected multiple times. However, since the energy of light is absorbed at each reflection, after many reflections the intensity falls below the observation sensitivity of the imaging device and often no longer affects the captured image. Therefore, mutual reflection due to light reflected a predetermined number of times or more may be ignored.
  • the light intensity due to mutual reflection can be determined based on the intensity of the light before reflection, the reflection characteristics at the reflection point, and the traveling path of the light, in addition to a reflection model similar to the one used when generating the virtual captured image.
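  • the bounce-limited attenuation described above can be sketched as follows in Python; scene.trace_ray() and the surface attributes are hypothetical stand-ins for whatever ray-casting machinery an implementation provides.

        import numpy as np

        MAX_BOUNCES = 3        # ignore mutual reflection beyond this depth
        MIN_INTENSITY = 1e-3   # below the camera's observation sensitivity

        def reflect(d, n):
            # Mirror direction d about the unit surface normal n: d - 2(d.n)n.
            d = np.asarray(d, float)
            n = np.asarray(n, float)
            return d - 2.0 * d.dot(n) * n

        def follow_reflections(origin, direction, intensity, scene, depth=0):
            # Collect the points lit by a ray and by its successive reflections.
            if depth >= MAX_BOUNCES or intensity < MIN_INTENSITY:
                return []
            hit = scene.trace_ray(origin, direction)   # hypothetical query
            if hit is None:
                return []
            point, surface = hit
            lit = [(point, intensity)]
            # Energy is absorbed at every reflection, so attenuate the
            # intensity before following the mirrored ray.
            lit += follow_reflections(point,
                                      reflect(direction, surface.normal),
                                      intensity * surface.specular_reflectance,
                                      scene, depth + 1)
            return lit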
  • when mutual reflection occurs, shadow boundaries of light may be observed in addition to the pattern boundaries that should be observed on the image. As a result, the light irradiation direction cannot be calculated correctly, which causes measurement errors in three-dimensional measurement.
  • in the measurement values output by the three-dimensional measurement calculation unit 103, the measurement errors that can occur when three-dimensional measurement is performed on a measurement target that may actually cause mutual reflection are reproduced. It therefore becomes possible to grasp the magnitude and the tendency of appearance of measurement errors before actually configuring the three-dimensional measurement device. If the tendency of measurement errors can be grasped in advance, it is also possible to consider methods of reducing the influence of mutual reflection, such as changing measurement conditions like the positional relationship between the projection device and the imaging device or the arrangement of the measurement target. The accuracy of three-dimensional measurement can thus be improved.
  • when the projection device is out of focus, the shadow boundary of the light is blurred and the pattern boundary becomes unclear, so the accuracy of the analysis of the projection pattern on the captured image, which is performed to specify the irradiation direction of the light, decreases, and the direction may be calculated incorrectly.
  • if the shooting space has more than a certain depth, the projection device may not be able to focus on the entire shooting space. In such a case, it is necessary to consider a calculation method that can accurately analyze the projection pattern even when the projection device is out of focus.
  • to evaluate such a method using actual data, which is data obtained by actually operating the projection device and the imaging device and observing the measurement target, it is necessary to know the true position of the pattern boundary in the actual data.
  • in contrast, the simulation device 10 can obtain the true position of the pattern boundary from the measurement condition information, and generates a virtual captured image that reproduces the blur of the pattern boundary caused by the focus shift of the projection device. It is therefore easy to determine whether the detection result of the pattern boundary has improved as a result of improving the method of analyzing the projection pattern, which streamlines the development of the projection pattern analysis algorithms required for three-dimensional measurement.
  • in this way, the simulation device 10 has the effect of facilitating the design of the three-dimensional measurement device.
  • the virtual captured image generation unit 102 can calculate the image distortion as an effect of the distortion aberration caused by the lens of the imaging device.
  • in a typical image distortion model, the correspondence between pixels before and after distortion is given by the following expression (1):

        x_u = x_d (1 + K r^2),  y_u = y_d (1 + K r^2)   ... (1)

  • (x_u, y_u) are the image coordinates in the undistorted image, (x_d, y_d) are the image coordinates in the distorted image, K is a coefficient indicating the degree of distortion, and r is the distance from the center of the image to the pixel of interest.
  • the pixel of interest is a pixel at an arbitrary coordinate (x_d, y_d) of the distorted image.
  • the virtual captured image generation unit 102 may sequentially select all the pixels of the distorted image as the target pixel.
  • many image distortion models have been proposed, ranging from simplified models to detailed ones.
  • the simulation apparatus 10 can use an arbitrary model including the mathematical expression (1) as the calculation expression of the image distortion.
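  • as one concrete instance, the following sketch applies the first-order radial model of expression (1) to a grayscale or color image; the nearest-neighbor sampling and the sign convention of K are assumptions of this sketch.

        import numpy as np

        def distort_image(undistorted, K):
            # For each pixel (x_d, y_d) of the distorted output, look up the
            # undistorted coordinates given by expression (1):
            #   x_u = x_d (1 + K r^2), y_u = y_d (1 + K r^2).
            h, w = undistorted.shape[:2]
            cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
            yd, xd = np.mgrid[0:h, 0:w].astype(float)
            xd -= cx
            yd -= cy
            r2 = xd ** 2 + yd ** 2                   # r measured from centre
            xu = np.clip(xd * (1 + K * r2) + cx, 0, w - 1).round().astype(int)
            yu = np.clip(yd * (1 + K * r2) + cy, 0, h - 1).round().astype(int)
            return undistorted[yu, xu]               # nearest-neighbour sample

  • because r is measured in pixels in this sketch, realistic values of K are very small (on the order of 1e-7 for a VGA-sized image, since r^2 can reach about 1e5).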
  • the virtual captured image generation unit 102 can reproduce random noise by setting the appearance probability and intensity of noise for each pixel or for each fixed area on the image.
  • the virtual captured image generation unit 102 determines, using the appearance probability, whether to add noise to a pixel or region, and when it determines to add noise, changes the color of the pixel or region based on the set intensity.
  • the intensity may be expressed as a rate of change with respect to the original color of the pixel or region, or as an integer value, and it may be fixed or have a certain range. The intensity can take both positive values, which mean an increase in the brightness of the pixel, and negative values, which mean a decrease in the brightness of the pixel.
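  • a minimal sketch of this noise injection, assuming an 8-bit grayscale image and treating both the appearance probability and the signed intensity range as free parameters:

        import numpy as np

        def add_random_noise(image, probability=0.01, max_intensity=30):
            # image: 2-D uint8 array; returns a noisy copy.
            rng = np.random.default_rng()
            noisy = image.astype(np.int16)
            # Decide per pixel whether noise appears at all.
            mask = rng.random(image.shape) < probability
            # The intensity is signed: positive brightens, negative darkens.
            delta = rng.integers(-max_intensity, max_intensity + 1,
                                 size=image.shape)
            noisy[mask] += delta[mask]
            return np.clip(noisy, 0, 255).astype(np.uint8)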
  • the virtual captured image generation unit 102 can reproduce the image blur caused by the focus shift of the imaging device by using the color information of the pixels around the target pixel. Any pixel on the image may be selected as the pixel of interest.
  • as methods of calculating the color after image blur, there are a method of taking the average value of the colors of the pixel of interest and its surrounding pixels, and Gaussian smoothing, in which pixels closer to the pixel of interest are combined with higher weight.
  • using Gaussian smoothing has the advantage that image blur can be reproduced more accurately than with the averaging method, while the averaging method has the advantage of a shorter processing time.
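  • both calculation methods can be written in a few lines with SciPy; the kernel size and sigma below are placeholder values.

        import numpy as np
        from scipy.ndimage import gaussian_filter, uniform_filter

        def blur_mean(image, size=5):
            # Average of the pixel of interest and its surrounding pixels:
            # fast, but a coarser approximation of defocus.
            return uniform_filter(image.astype(float), size=size)

        def blur_gaussian(image, sigma=1.5):
            # Pixels closer to the pixel of interest get higher weight:
            # slower, but closer to real defocus blur.
            return gaussian_filter(image.astype(float), sigma=sigma)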
  • the image blur also changes depending on the distance from the imaging device to the object.
  • objects that are in focus show little blur, while objects that are out of focus are captured larger and more blurred.
  • by taking this distance dependence into account, the virtual captured image generation unit 102 can reproduce blur that is closer to that of an actual captured image.
  • the information required for this may be acquired as part of the measurement condition information or may be calculated from other information included in the measurement condition information.
  • specific methods by which the virtual captured image generation unit 102 reproduces the error factors will be described in the second and third embodiments, but the reproduction method is not limited thereto.
  • the three-dimensional measurement calculation unit 103 executes a three-dimensional measurement process using the virtual captured image as an input.
  • the three-dimensional measurement process performed by the three-dimensional measurement calculation unit 103 may be the same as the three-dimensional measurement process performed by the three-dimensional measurement device using a captured image captured in an actual environment.
  • the three-dimensional measurement process includes a process of identifying the projection pattern irradiated on the measurement target and a process of measuring the distance.
  • in the projection pattern identification processing, the brightness information of pixels at the same position is acquired from a plurality of captured images, whether each pixel is illuminated by the projection device is determined, and the combination of these determinations is acquired.
  • as another pattern identification processing, there is a method of analyzing the pattern of the local projection pattern around the pixel of interest.
  • if the local projection pattern is designed to be unique within the entire pattern, it is possible to specify where, in the entire projection pattern projected from the projection device, the portion illuminating the surroundings of the pixel of interest is located. In any of the pattern identification methods, the purpose is to uniquely obtain the vector from the projection device to the pixel of interest.
  • the three-dimensional measurement calculation unit 103 performs the distance measurement processing based on the principle of triangulation, using the sensor information included in the measurement condition information, which indicates arrangement conditions such as the positions and orientations of the projection device and the imaging device. The details of the distance measurement processing differ depending on the projection pattern used; since the simulation device 10 may use any projection pattern, the distance measurement processing is selected according to the projection pattern used.
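  • the core of the triangulation step can be sketched as follows: once pattern identification has yielded, for a pixel of interest, the camera's viewing ray and the matching projector ray, the surface point is taken as the midpoint of the shortest segment between the two rays. The ray origins and directions here are assumed inputs.

        import numpy as np

        def triangulate(cam_origin, cam_dir, proj_origin, proj_dir):
            # Closest-point triangulation of two skew rays.
            o1 = np.asarray(cam_origin, float)
            o2 = np.asarray(proj_origin, float)
            d1 = np.asarray(cam_dir, float)
            d1 = d1 / np.linalg.norm(d1)
            d2 = np.asarray(proj_dir, float)
            d2 = d2 / np.linalg.norm(d2)
            w0 = o1 - o2
            a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
            d, e = d1 @ w0, d2 @ w0
            denom = a * c - b * b          # zero when the rays are parallel
            if abs(denom) < 1e-12:
                raise ValueError("rays are (nearly) parallel")
            t1 = (b * e - c * d) / denom   # parameter along the camera ray
            t2 = (a * e - b * d) / denom   # parameter along the projector ray
            return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0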
  • FIG. 4 is a diagram showing an example of the display screen 20 output by the simulation device 10 shown in FIG.
  • the display screen 20 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, and an end button 27.
  • the simulation result is displayed in the processing result display area 21.
  • the measurement condition display area 22 individual measurement conditions are displayed and the displayed measurement conditions can be changed and input.
  • the measurement condition list display area 23 displays a list of saved measurement conditions. When one of the measurement conditions displayed in the measurement condition list display area 23 is selected, the selected measurement condition is displayed in the measurement condition display area 22.
  • in the display content selection area 24, an operation unit for selecting the content to be displayed in the processing result display area 21 is displayed; for example, the captured image reproduced by the virtual captured image generation unit 102 and the measurement data are displayed as options.
  • the execution button 25 is an operation unit for executing a simulation process using the measurement condition displayed in the measurement condition display area 22.
  • the save button 26 is an operation unit for saving the measurement condition displayed in the measurement condition display area 22.
  • the end button 27 is an operation unit for ending the simulation processing of three-dimensional measurement.
  • when the execute button 25 is operated, the measurement condition acquisition unit 101 acquires the measurement conditions displayed in the measurement condition display area 22 and inputs them to the virtual captured image generation unit 102. The virtual captured image generation unit 102 generates a virtual captured image and inputs it to the three-dimensional measurement calculation unit 103, and the three-dimensional measurement calculation unit 103 executes the three-dimensional measurement processing and inputs the measurement result to the output unit 104. The output unit 104 outputs, to the processing result display area 21, processing results such as the virtual captured image and the measurement values obtained in the course of the simulation processing, as well as input data to the simulation processing such as the measurement condition information.
  • the output unit 104 may perform processing such as emphasizing points of interest, adjusting contrast, and removing noise so that the output is easy for the user to see.
  • the display screen 20 shown in FIG. 4 is an example, and the display screen is not limited to the example of FIG. 4. By using the display screen 20 shown in FIG. 4, the user can check the simulation results repeatedly while adjusting the measurement conditions step by step.
  • as described above, in the first embodiment, a virtual captured image that reproduces the captured image output by an actual imaging device is generated, and the three-dimensional measurement processing is performed based on the virtual captured image.
  • the simulation device 10 can thereby reproduce the errors in the position of light that are included in captured images acquired by an actual three-dimensional measurement device.
  • such errors arise, for example, from image distortion due to distortion aberration, random noise, image blur, and mutual reflection. It is difficult to add these errors to an actual captured image after the fact when actually configuring a three-dimensional measurement device, but by reproducing them on the virtual captured image, the accuracy of the simulation result of the three-dimensional measurement can be improved. Further, if the projection pattern and the errors can be reproduced faithfully in the virtual captured image, ordinary processing can be used for the three-dimensional measurement processing.
  • FIG. 5 is a diagram showing a functional configuration of the virtual captured image generation unit 102 according to the second exemplary embodiment of the present invention.
  • the virtual captured image generation unit 102 has an optical reproduction image generation unit 301 and an image quality deterioration processing unit 302.
  • a device including the virtual captured image generation unit 102 shown in FIG. 5 and having the same configuration as the simulation device 10 shown in FIG. 1 is referred to as a simulation device 12 according to the second embodiment.
  • the configuration of the simulation device 12 is the same as that of the first embodiment shown in FIG. 1 except for the virtual captured image generation unit 102, and therefore detailed description thereof is omitted here.
  • the same components as those in the first embodiment will be described using the reference numerals shown in FIG. 1, and the differences from the first embodiment will be mainly described.
  • the optical reproduction image generation unit 301 performs an optical simulation based on the measurement condition information and generates an optical reproduction image that reproduces a captured image.
  • the image quality deterioration processing unit 302 performs an image quality deterioration process on the optically reproduced image according to an error factor.
  • the virtual captured image generation unit 102 sets the image after the deterioration processing as a virtual captured image.
  • FIG. 6 is a flowchart showing the operation of the simulation device 12 having the virtual captured image generation unit 102 shown in FIG.
  • the measurement condition acquisition unit 101 acquires measurement condition information (step S201).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the optical reproduction image generation unit 301 of the virtual captured image generation unit 102 generates an optical reproduction image based on the measurement condition information (step S202).
  • the optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
  • the image quality deterioration processing unit 302 executes image quality deterioration processing on the optical reproduction image (step S203).
  • the image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as a virtual captured image.
  • the three-dimensional measurement calculation unit 103 performs three-dimensional measurement processing using the virtual captured image and obtains a measurement result (step S204).
  • the three-dimensional measurement calculation unit 103 inputs the measurement result to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement result (step S205).
  • the optical reproduction image generation unit 301 calculates, for each pixel in the image, the imaging position in the shooting space corresponding to that pixel, and determines whether the calculated imaging position is irradiated with projection light, that is, whether the projection light of the projection device reaches the imaging position. First, the optical reproduction image generation unit 301 calculates, based on the sensor information and the information indicating the arrangement and characteristics of the objects existing in the shooting space, a vector V_cam passing from the optical center O_cam of the imaging device through each pixel. The optical reproduction image generation unit 301 then detects the point P_obj on the surface of the object that first intersects the vector V_cam. By this calculation, the object imaged in each pixel can be identified.
  • next, the optical reproduction image generation unit 301 determines whether the point P_obj on the surface of the object is illuminated by the projection device.
  • to do so, the optical reproduction image generation unit 301 first calculates the vector V_proj from the optical center O_proj of the projection device to the point P_obj.
  • the optical reproduction image generation unit 301 then determines, using the sensor information and the pattern information indicating the projection pattern of the projection device, whether the vector V_proj is included in the range irradiated with the projection pattern from the projection device. If it is included, the point P_obj on the surface of the object can be determined to be illuminated by the projection device.
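  • putting these steps together, the per-pixel illumination test can be sketched as follows; camera.ray_through_pixel(), scene.first_intersection(), and the projector helpers are hypothetical names for the machinery described above.

        import numpy as np

        def pixel_is_illuminated(pixel, camera, projector, scene, eps=1e-6):
            # Step 1: surface point P_obj imaged by this pixel (via V_cam).
            v_cam = camera.ray_through_pixel(pixel)
            p_obj = scene.first_intersection(camera.center, v_cam)
            if p_obj is None:
                return False                    # the pixel images no object
            # Step 2: ray V_proj from the projector's optical centre to P_obj.
            v_proj = p_obj - projector.center
            u, v = projector.direction_to_pattern_coords(v_proj)
            if not projector.pattern_is_lit(u, v):
                return False                    # outside the lit pattern
            # Step 3: P_obj must also be the first surface hit from the
            # projector; otherwise it lies in the shadow of another surface.
            first_hit = scene.first_intersection(projector.center, v_proj)
            return (first_hit is not None and
                    np.linalg.norm(first_hit - p_obj) < eps)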
  • the optical reproduction image generation unit 301 can determine the color of each pixel by using the information indicating the reflection characteristics and the state of ambient light included in the measurement condition information in addition to the above calculation result.
  • Typical reflection models used to determine colors include Lambertian reflection for diffuse reflection and Phong's reflection model for specular reflection.
  • the reflection model used by the optical reproduction image generation unit 301 is not limited to these, and any reflection model can be used.
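  • as one example, a pixel intensity combining the two models named above (a Lambertian diffuse term plus a Phong specular term) can be computed as follows; all vectors are unit vectors, and the shininess exponent and ambient term are assumptions of the sketch.

        import numpy as np

        def shade(normal, to_light, to_camera, light_intensity,
                  diffuse_reflectance, specular_reflectance,
                  shininess=16.0, ambient=0.05):
            n_dot_l = max(float(np.dot(normal, to_light)), 0.0)
            diffuse = diffuse_reflectance * n_dot_l          # Lambertian
            # Phong: mirror the light direction about the normal and compare
            # it with the viewing direction.
            reflected = 2.0 * n_dot_l * np.asarray(normal, float) \
                        - np.asarray(to_light, float)
            r_dot_v = max(float(np.dot(reflected, to_camera)), 0.0)
            specular = specular_reflectance * (r_dot_v ** shininess)
            return light_intensity * (diffuse + specular) + ambient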
  • the boundary of an object and the boundary of the projection pattern of the projection device may both be included in the range captured by one pixel. In that case, it is more natural to use, as the pixel color, a color in which the colors of the two or more objects forming the boundary are mixed.
  • however, the vector V_cam described above can detect only one intersection in the shooting space, so among the objects forming the boundary, only the color of the object intersecting the vector V_cam is adopted as the color of the pixel. As a result, an image in which the boundary between the object and the projection pattern looks unnatural may be generated. This is a kind of phenomenon caused by so-called quantization error, and is a problem that frequently arises in the field of image processing.
  • to mitigate this, an optical reproduction image is created at a resolution higher than that of the virtual captured image, and a process of reducing it to the final image size is performed in the image quality deterioration processing.
  • for example, the optical reproduction image generation unit 301 generates an optical reproduction image with four times the resolution of the virtual captured image that is finally output.
  • the color of each pixel of the virtual captured image can then be determined using the color information of the four pixels of the optical reproduction image closest to that pixel.
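  • reading "four times the resolution" as twice the resolution along each axis, the reduction step simply averages each 2x2 block of the optical reproduction image; this block-average interpretation is an assumption of the sketch.

        import numpy as np

        def downsample_2x(high_res):
            # Each output pixel averages the 2x2 block of the four nearest
            # high-resolution pixels, softening quantization at boundaries.
            h, w = high_res.shape[:2]
            a = high_res[:h - h % 2, :w - w % 2].astype(float)
            return (a[0::2, 0::2] + a[1::2, 0::2] +
                    a[0::2, 1::2] + a[1::2, 1::2]) / 4.0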
  • the optical reproduction image generation unit 301 generates the optical reproduction image using, among the error factors, information on those caused by the projection device, for example mutual reflection and the focus shift of the projection device. Since these affect the projection state of the projection pattern, the optical reproduction image generation unit 301, for example, blurs the shadow boundaries of light, including the pattern boundaries, in the optical reproduction image when a focus shift occurs.
  • the optical reproduction image generation unit 301 can reproduce mutual reflection by, for example, making the brightness of the pixel containing the incident point, where the reflected light reflected by the first surface strikes the second surface, higher than the brightness obtained when the influence of mutual reflection is not considered.
  • the optical reproduction image generation unit 301 can reproduce the blur of a light shadow boundary by adjusting the brightness of the pixels corresponding to that boundary.
  • as a specific reproduction method, the method described in the third embodiment can be used.
  • the image quality deterioration processing unit 302 deteriorates the image quality of the optical reproduction image using information on the error factors caused by the imaging device, such as image distortion, random noise, and the focus shift of the imaging device. For example, the image quality deterioration processing unit 302 can deteriorate the image quality of the optical reproduction image by performing filtering that flattens brightness changes in the image to blur the outlines of objects and the shadow boundaries of light, and by changing the brightness of pixels at randomly selected positions.
  • as described above, according to the second embodiment, an optical reproduction image, which is an image reproducing optical effects including the shadows caused by light being blocked by objects, can be obtained by optical simulation. Furthermore, by performing the image quality deterioration processing on the optical reproduction image, an actual captured image can be reproduced accurately.
  • FIG. 7 is a diagram showing a functional configuration of the simulation apparatus 13 according to the third exemplary embodiment of the present invention.
  • the simulation device 13 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, and an error information acquisition unit 105.
  • parts different from the simulation device 12 will be mainly described.
  • the simulation device 13 has an error information acquisition unit 105 in addition to the configuration of the simulation device 12.
  • the error information acquisition unit 105 externally acquires error information indicating an error factor represented by a virtual captured image.
  • the error information includes at least one of error intensity and addition order for each type of error factor.
  • the error information acquisition unit 105 inputs the acquired error information to the virtual captured image generation unit 102.
  • FIG. 8 is a diagram showing a functional configuration of the virtual photographed image generation unit 102 shown in FIG. 7.
  • the virtual captured image generation unit 102 includes an optical reproduction image generation unit 301, an image quality deterioration processing unit 302, and an error factor determination unit 303.
  • the error factor determination unit 303 determines, based on the error information, the processing conditions of the error factor addition processing for generating the optical reproduction image and performing the image quality deterioration processing, the processing conditions including at least one of the intensity of the error factors and the order in which they are added.
  • the error factor determination unit 303 inputs the determined processing condition to the optical reproduction image generation unit 301.
  • the optical reproduction image generation unit 301 generates an optical reproduction image based on the measurement condition information, the error information, and the processing condition input from the error factor determination unit 303.
  • the image quality deterioration processing unit 302 performs image quality deterioration processing on the optically reproduced image based on the measurement condition information and the error information.
  • FIG. 9 is a flowchart showing the operation of the simulation device 13 shown in FIG.
  • the measurement condition acquisition unit 101 acquires measurement condition information (step S301).
  • the measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
  • the error information acquisition unit 105 acquires error information (step S302).
  • the error information acquisition unit 105 inputs the acquired error information to the virtual captured image generation unit 102.
  • the process of step S301 and the process of step S302 may be performed in parallel.
  • the error factor determination unit 303 of the virtual captured image generation unit 102 determines the processing condition of the error factor addition process including at least one of the strength of the error factor and the addition order (step S303).
  • the error factor determination unit 303 inputs the determined processing condition to the optical reproduction image generation unit 301.
  • the optical reproduction image generation unit 301 generates an optical reproduction image by an optical simulation based on the processing condition and the measurement condition information (step S304).
  • the optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
  • the image quality deterioration processing unit 302 executes the image quality deterioration process of the optical reproduction image based on the measurement condition information and the error information (step S305).
  • the image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as a virtual captured image.
  • the three-dimensional measurement calculation unit 103 executes three-dimensional measurement processing using the virtual captured image and obtains a measurement result (step S306).
  • the three-dimensional measurement calculation unit 103 inputs the measurement result to the output unit 104.
  • the output unit 104 outputs the simulation result including the measurement result (step S307).
  • FIG. 10 is a flowchart showing details of step S304 shown in FIG.
  • the optical reproduction image generation unit 301 first generates an optical reproduction image that does not include any error factor (step S401). As described above, the optical reproduction image generation unit 301 discriminates between the projection area, which is the area directly illuminated by the projection light, and the non-projection area, which is the area not illuminated, and draws the projection area in a brighter color than the non-projection area.
  • the optical reproduction image generation unit 301 acquires the order of addition of error factors from the error information (step S402). Further, the optical reproduction image generation unit 301 acquires the type of error factor and the intensity for each type from the error information (step S403).
  • the optical reproduction image generation unit 301 determines whether or not to add each error to the optical reproduction image according to the addition order acquired in step S402. The optical reproduction image generation unit 301 first determines, based on the error information, whether to add mutual reflection to the optical reproduction image (step S404).
  • when adding mutual reflection (step S404: Yes), the optical reproduction image generation unit 301 adds mutual reflection to the optical reproduction image (step S405). Specifically, when mutual reflection occurs, the reflected light reflected by the first surface of an object strikes the second surface. The optical reproduction image generation unit 301 therefore treats the pixel containing the incident point, where the reflected light strikes the second surface, as a pixel whose brightness increases due to mutual reflection. The optical reproduction image generation unit 301 determines the amount of increase in brightness based on the positions and orientations of the object including the first surface and the object including the second surface, and on the surface characteristics of the first surface and the second surface. The brightness of each pixel of the optical reproduction image is then the second brightness, obtained by adding the increase due to mutual reflection to the first brightness, which is the brightness of the pixel when the influence of mutual reflection is not considered.
  • note that the optical reproduction image generation unit 301 does not necessarily have to calculate the increase in brightness due to mutual reflection for every reflection of light in the process of step S405. For a surface whose specular reflectance is below a threshold, the amount of increase in brightness may be smaller than the resolution of the image, so the increase in brightness due to mutual reflection may be treated as zero.
  • next, the optical reproduction image generation unit 301 determines whether or not the error addition processing is finished (step S406).
  • when the error addition processing is finished (step S406: Yes), the optical reproduction image generation unit 301 ends the error addition processing and inputs the optical reproduction image to the image quality deterioration processing unit 302.
  • when the error addition processing is not finished (step S406: No), the optical reproduction image generation unit 301 returns to the process of step S403.
  • if mutual reflection is not added (step S404: No), the optical reproduction image generation unit 301 determines whether or not to add the optical blur of the projection device (step S407).
  • when adding the optical blur of the projection device (step S407: Yes), the optical reproduction image generation unit 301 adds the optical blur of the projection device to the optical reproduction image (step S408).
  • the blurring of the shadow boundary of light can be represented by the brightness of pixels.
  • specifically, the optical reproduction image generation unit 301 determines, based on the measurement condition information, a first area, which is the area onto which light would be projected if the projection device had no focus shift, and a second area, which is the area onto which light would not be projected.
  • based on this determination result, the optical reproduction image generation unit 301 identifies the pixels within a predetermined distance from the boundary between the first area and the second area.
  • among these pixels, the maximum brightness is the brightness of the area where the light is projected, and the minimum brightness is the brightness of the area where the light is not projected.
  • near the boundary, the brightness of each pixel may be determined based on the distance between the pixel of interest and the boundary: the closer the pixel of interest is to the first area, the closer its brightness is to the maximum, and the closer it is to the second area, the closer its brightness is to the minimum. By changing the brightness based on the shortest distance between the pixel of interest and the first area in this way, the state in which the projection device is out of focus can be reproduced.
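  • this distance-based interpolation can be sketched as follows; distance_to_region is a hypothetical helper returning, for every pixel, the shortest distance to the first (lit) area (scipy.ndimage.distance_transform_edt applied to the inverted lit mask is one way to obtain it).

        import numpy as np

        def defocus_boundary(brightness, lit_mask, blur_width,
                             lit_level, unlit_level, distance_to_region):
            # brightness: 2-D array; lit_mask: True inside the first area.
            out = brightness.astype(float)
            dist = distance_to_region(lit_mask)   # 0 inside the first area
            near = dist < blur_width              # pixels near the boundary
            t = np.clip(dist[near] / blur_width, 0.0, 1.0)
            # Closer to the first area -> closer to the lit (maximum) level;
            # closer to the second area -> closer to the unlit (minimum) level.
            out[near] = (1.0 - t) * lit_level + t * unlit_level
            return out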
  • the optical reproduction image generation unit 301 proceeds to step S406 after finishing the process of step S408.
  • when the optical blur of the projection device is not added (step S407: No), the optical reproduction image generation unit 301 determines whether to add ambient light to the optical reproduction image (step S409).
  • when adding ambient light (step S409: Yes), the optical reproduction image generation unit 301 adds ambient light to the optical reproduction image (step S410). Ambient light is represented by at least one of the brightness and the color of a pixel. After finishing the process of step S410, the optical reproduction image generation unit 301 proceeds to step S406. When ambient light is not added (step S409: No), the optical reproduction image generation unit 301 ends the error addition processing.
  • FIG. 11 is a flowchart showing details of step S305 shown in FIG.
  • the image quality deterioration processing unit 302 first acquires the order of addition of error factors based on the error information (step S501).
  • the image quality deterioration processing unit 302 further acquires the type of error factor and the strength of the error based on the error information (step S502).
  • the image quality deterioration processing unit 302 determines whether or not to add an error according to the addition order acquired in step S501.
  • the image quality deterioration processing unit 302 determines whether to add image distortion (step S503). When it is determined that the image distortion is added (step S503: Yes), the image quality deterioration processing unit 302 adds the image distortion to the optical reproduction image (step S504). After adding the image distortion, the image quality deterioration processing unit 302 determines whether or not to end the image quality deterioration process (step S505). When it is determined that the image quality deterioration process is to be ended (step S505: Yes), the image quality deterioration processing unit 302 ends the process. When it is determined that the image quality deterioration process is not to be ended (step S505: No), the image quality deterioration processing unit 302 returns to the process of step S502.
  • when it is determined that image distortion is not to be added (step S503: No), the image quality deterioration processing unit 302 subsequently determines whether to add random noise (step S506). When it is determined that random noise is to be added (step S506: Yes), the image quality deterioration processing unit 302 adds random noise to the optical reproduction image (step S507). When the process of step S507 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
  • when it is determined that random noise is not to be added (step S506: No), the image quality deterioration processing unit 302 determines whether or not to add image blur (step S508).
  • when image blur is not to be added (step S508: No), the image quality deterioration processing unit 302 ends the process.
  • when image blur is to be added (step S508: Yes), the image quality deterioration processing unit 302 adds the image blur to the optical reproduction image (step S509). When the process of step S509 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
  • the image quality deterioration processing unit 302 may add only one type of error factor to the optical reproduction image, or may add the same error factor multiple times. For example, when the addition of random noise and the addition of image blur are executed alternately a plurality of times, an effect of color unevenness in the appearance of the target object can be obtained. In this way, various image quality effects can be reproduced by combining the order and intensity of the error factors to be added.
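  • the combination of addition order and intensity described above amounts to a small, configurable pipeline; in the sketch below the schedule and the factor functions are illustrative stand-ins for steps S503 to S509.

        def degrade(image, schedule, factors):
            # schedule: list of (factor_name, intensity) in addition order;
            # factors: dict mapping factor_name -> function(image, intensity).
            # The same factor may appear several times in the schedule.
            for name, intensity in schedule:
                image = factors[name](image, intensity)
            return image

        # Example: alternating noise and blur several times can reproduce
        # colour unevenness in the appearance of the target object.
        # schedule = [("noise", 10), ("blur", 1.0), ("noise", 10), ("blur", 1.0)]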
  • FIG. 12 is a diagram showing an example of the display screen 30 output by the simulation device 13 shown in FIG. 7.
  • the display screen 30 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, an end button 27, and an error factor setting area 28.
  • the display screen 30 has an error factor setting area 28 in addition to the components of the display screen 20 described in the first embodiment.
  • description of the same parts as the display screen 20 will be omitted, and parts different from the display screen 20 will be mainly described.
  • the error factor setting area 28 is an area for setting the error intensity and error addition order for each type of error factor.
  • the user can input and set the error intensity and the addition order in the error factor setting area 28.
  • the error information acquisition unit 105 can acquire the error information displayed in the error factor setting area 28 when the execute button 25 is operated.
  • The simulation device 13 can execute a simulation reflecting the error information that the user has set and the error information acquisition unit 105 has acquired.
  • the user can evaluate the three-dimensional measurement executed under the desired operating condition.
  • Since it is possible to select the type of error factor, the order in which errors are added to the optical reproduction image, and their strength, the variation of reproducible image quality can be increased. It is also possible to test while randomly changing the operating conditions and usage environment, which allows testing with combinations of conditions that the designer could not have assumed in advance. The test contents therefore become comprehensive, and as a result an increase in the reliability of the three-dimensional measuring device can be expected.
  • FIG. 13 is a diagram showing a functional configuration of the optical reproduction image generation unit 301 according to the fourth embodiment of the present invention.
  • the optical reproduction image generation unit 301 includes a sensor viewpoint data generation unit 401, a map generation unit 402, and an image synthesis unit 403.
  • A simulation apparatus that includes the optical reproduction image generation unit 301 shown in FIG. 13 and otherwise has the same configuration as the simulation apparatus 13 shown in FIG. 7 is called the simulation apparatus 14 according to the fourth embodiment. That is, the simulation device 14 differs from the simulation device 13 only in the configuration of the optical reproduction image generation unit 301 of the virtual captured image generation unit 102.
  • description of components similar to those of the simulation device 13 will be omitted, and portions different from those of the simulation device 13 will be mainly described.
  • the sensor viewpoint data generation unit 401 generates sensor viewpoint data including a first image showing a shooting space viewed from the image pickup device and distance data from each of the image pickup device and the projection device to an object in the shooting space.
  • the sensor viewpoint data generation unit 401 inputs the generated sensor viewpoint data to the map generation unit 402.
  • FIG. 14 is a diagram showing a detailed functional configuration of the sensor viewpoint data generation unit 401 shown in FIG.
  • The first image generated by the sensor viewpoint data generation unit 401 includes a bright image, in which the entire shooting space is represented by its brightness when irradiated with the projection light, and a dark image, in which the entire shooting space is represented by its brightness when not irradiated with the projection light.
  • the sensor viewpoint data generation unit 401 includes a bright image generation unit 501 that generates a bright image, a dark image generation unit 502 that generates a dark image, and a distance data generation unit 503 that generates distance data.
  • the sensor viewpoint data generation unit 401 generates a first image including a bright image generated by the bright image generation unit 501 and a dark image generated by the dark image generation unit 502, and distance data generated by the distance data generation unit 503. Output the sensor viewpoint data including.
  • the sensor viewpoint data output by the sensor viewpoint data generation unit 401 is input to the map generation unit 402.
  • The map generation unit 402 generates, based on the measurement condition information, an irradiation map, which is a map showing different numerical values for the first region illuminated by the light from the projection device and the second region not illuminated by the light from the projection device. For example, in the irradiation map, the first region can be represented by "1" and the second region by "0".
  • the map generation unit 402 inputs the generated irradiation map to the image synthesis unit 403.
  • FIG. 15 is a diagram showing a detailed functional configuration of the map generation unit 402 shown in FIG.
  • the map generation unit 402 includes an irradiation area calculation unit 601, an irradiation map generation unit 602, a light blurring reproduction unit 603, a reflection area calculation unit 604, and a reflection map generation unit 605.
  • The irradiation area calculation unit 601 calculates the area that the light from the projection device can illuminate, based on the measurement condition information.
  • the irradiation area calculation unit 601 inputs information indicating the calculated area to the irradiation map generation unit 602.
  • The irradiation map generation unit 602 uses the information from the irradiation region calculation unit 601 and the measurement condition information to generate an irradiation map, which is a map indicating, with different numerical values, the first region illuminated by the light from the projection device and the second region not illuminated by it. The irradiation map generation unit 602 inputs the generated irradiation map to the light blurring reproduction unit 603.
  • The reflection region calculation unit 604 uses the measurement condition information and the sensor viewpoint data to handle reflected rays, that is, light emitted from the projection device whose intensity and direction have changed by being reflected once by an object in the imaging space. By associating the three-dimensional positions of the points illuminated by such reflected rays with coordinates on the first image, it calculates the reflection area, which is the area of the first image illuminated by reflected rays.
  • the reflection area calculation unit 604 inputs information indicating the calculated reflection area to the reflection map generation unit 605.
  • The reflection map generation unit 605 uses the information indicating the reflection area and the measurement condition information to generate, for each projection pattern of the projection device, a reflection map, which is a map showing different values for the reflection area and for the area not illuminated by reflected light. For example, the reflection map can represent the reflection area by "1" and the area not illuminated by reflected light by "0".
  • the reflection map generation unit 605 inputs the generated reflection map to the light blurring reproduction unit 603.
  • The light blurring reproduction unit 603 reproduces the effect of light blurring on the irradiation map. Specifically, the light blurring reproduction unit 603 adjusts the values of the irradiation map to real numbers between 0 and 1 according to the degree of blurring at the light-shadow boundary. The light blurring reproduction unit 603 reproduces the effect of light blurring on the reflection map in the same manner as on the irradiation map, adjusting the values of the reflection map according to the degree of blurring at the light-shadow boundary. The light blurring reproduction unit 603 outputs the irradiation map and the reflection map after adjusting their values.
  • the image combining unit 403 generates an optical reproduction image by combining the bright image and the dark image included in the first image based on the information of the irradiation map and the reflection map. Specifically, the image synthesizing unit 403 sets the brightness of each pixel of the optical reproduction image to the brightness acquired from the pixel at the same position in the bright image or the dark image based on the value of the irradiation map, thereby obtaining the bright image. And the dark image are combined. Further, when the values of the irradiation map and the reflection map are adjusted by the light blurring reproduction unit 603, the image composition unit 403 can weight the brightness of each pixel based on the adjusted values.
  • The number of virtual captured images is equal to the number of projection patterns. Therefore, when the number of projection patterns increases, the processing time for creating the virtual captured images becomes long. It is thus effective to implement the processing that is common to creating virtual captured images for different projection patterns so that it is performed only once. Examples of such common processing include the creation of the bright image and the dark image of the shooting space viewed from the viewpoint of the imaging device, and the calculation of distance data from the imaging device and the projection device to objects in the shooting space. Some calculation results differ when the projection pattern differs; however, the calculation of the area that falls in shadow because an object shields the projection light is the same regardless of the projection pattern, and can therefore be extracted as a common process.
  • When the focus of the projection device deviates, the pattern projected on the scene is blurred, which may reduce the pattern recognition accuracy and, consequently, the three-dimensional measurement accuracy. This blur of the pattern light does not arise when the image is captured by the imaging device but arises when the pattern is projected from the projection device. The phenomenon therefore cannot be reproduced simply by adding blur to the virtual captured image.
  • The irradiation map can be defined as a two-dimensional array having the same number of elements as the virtual captured image, in which each element takes a real value from 0 to 1: an area illuminated by the projection light is expressed as 1, an area the projection light does not reach is expressed as 0, and an area illuminated at lower intensity than usual due to blurring of the projection light is expressed as a value greater than 0 and less than 1.
  • When this is expressed as a mathematical expression, it becomes the following expression (2):

    I_{i,j} = P_{i,j} B_{i,j} + (1 - P_{i,j}) S_{i,j}    (2)

where I_{i,j} is the luminance at the image coordinates (i, j) of the virtual captured image, P_{i,j} is the value at the coordinates (i, j) of the irradiation map, B_{i,j} is the brightness at the image coordinates (i, j) of the bright image, and S_{i,j} is the brightness at the image coordinates (i, j) of the dark image.
  • When creating an irradiation map, first assume that the projection device is in focus, that is, that there is no blur of the projection light, and calculate the area illuminated by the projection pattern. At this point, the value of each element of the irradiation map is 0 or 1. Blurring of the projection light is then reproduced by applying a smoothing filter, such as Gaussian smoothing, to this irradiation map. Gaussian smoothing makes the change in the map values near the boundary between the area illuminated by the pattern light and the area not illuminated smooth. By smoothing the change in value at this boundary portion, the effect of blurring the boundary of the projection light is obtained, and the blur of the projection light can be reproduced, as in the following sketch.
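  • A minimal sketch of this procedure and of expression (2), assuming NumPy and SciPy; the stripe pattern, image size, and smoothing sigma are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

h, w = 480, 640

# Binary irradiation map: 1 where the projection pattern illuminates the
# scene, 0 elsewhere (here: vertical stripes 16 pixels wide, as an example).
P = np.zeros((h, w))
P[:, (np.arange(w) // 16) % 2 == 0] = 1.0

# Reproduce projector defocus: smoothing turns the 0/1 boundary into real
# values between 0 and 1 near the light-shadow edges.
P = gaussian_filter(P, sigma=2.0)

# Expression (2): per-pixel blend of the bright image B and the dark image S.
B = np.full((h, w), 200.0)  # stand-in bright image
S = np.full((h, w), 30.0)   # stand-in dark image
I = P * B + (1.0 - P) * S   # virtual captured image luminance
```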
  • FIG. 16 is a flowchart showing the operation of the simulation device 14 according to the fourth exemplary embodiment of the present invention.
  • the sensor viewpoint data generation unit 401 generates a bright image and a dark image in the bright image generation unit 501 and the dark image generation unit 502 (step S601).
  • the sensor viewpoint data generation unit 401 generates distance data of each viewpoint in the distance data generation unit 503 concurrently with step S601 (step S602).
  • the sensor viewpoint data generation unit 401 inputs the sensor viewpoint data including the generated bright image, dark image, and distance data to the map generation unit 402.
  • the map generation unit 402 calculates the irradiation area in the irradiation area calculation unit 601 (step S603). Subsequently, the map generation unit 402 generates an irradiation map in the irradiation map generation unit 602 (step S604).
  • The map generation unit 402 acquires the order of addition of error factors (step S605). Subsequently, the map generation unit 402 acquires the type and strength of the error factor (step S606). The map generation unit 402 determines whether to add mutual reflection (step S607). When it is determined that mutual reflection is added (step S607: Yes), the map generation unit 402 causes the reflection region calculation unit 604 to calculate the reflection region (step S608), and causes the reflection map generation unit 605 to generate the reflection map (step S609).
  • When it is determined that mutual reflection is not added (step S607: No), the processes of steps S608 and S609 are omitted. Subsequently, the map generation unit 402 determines whether to add light blur (step S610). When it is determined that light blur is added (step S610: Yes), the light blurring reproduction unit 603 of the map generation unit 402 reproduces the light blur on the irradiation map (step S611). When it is determined that light blur is not added (step S610: No), the process of step S611 is omitted. Subsequently, the map generation unit 402 determines whether to add ambient light (step S612). When it is determined that ambient light is added (step S612: Yes), the irradiation map generation unit 602 of the map generation unit 402 adds ambient light to the irradiation map (step S613). When it is determined that ambient light is not added (step S612: No), the process of step S613 is omitted.
  • the map generation unit 402 determines whether or not to finish the error addition process (step S614). When it is determined that the error adding process is not to be ended (step S614: No), the process returns to step S606. When it is determined that the error addition process is to be ended (step S614: Yes), the image composition unit 403 executes the image composition process (step S615).
  • According to the simulation device 14, by creating the first image, the irradiation map, and the reflection map, which are images in which optical phenomena are individually reproduced, it is possible to simplify the processing and the data structure for generating the optical reproduction images when a plurality of projection patterns are used.
  • The irradiation map can be used to represent the blurring of the light-shadow boundary caused by a focus shift, and the reflection map can be used to express the effect of mutual reflection. In this way, the simulation apparatus 14 can individually reproduce each of a plurality of types of optical phenomena, so it is possible to know in detail which factor caused a measurement error in the three-dimensional measurement. This configuration has the effect of making the design and performance evaluation of the three-dimensional measuring device easier.
  • Embodiment 5. The simulation device 15 (not shown) according to the fifth embodiment has the same configuration as the simulation device 13 shown in FIG. 7.
  • the output screen of the simulation device 15 is different from that of the simulation device 13.
  • description of the same parts as those of the simulation device 13 will be omitted, and parts different from those of the simulation device 13 will be mainly described.
  • FIG. 17 is a diagram showing an example of the display screen 40 output by the simulation device 15 according to the fifth embodiment of the present invention.
  • The simulation device 15 can output a simulation result that further includes at least one of the measurement condition information and the virtual captured image.
  • the display screen 40 can include an adjustment item display area 41, a save button 42, an end button 43, and a measurement result display area 44.
  • the adjustment item display area 41 is an area in which set values of items to be adjusted are displayed in a list.
  • the measurement result display area 44 is an area for displaying the measurement values when the items of the measurement condition to be adjusted are set to the respective set values listed.
  • the save button 42 is an operation unit for performing an operation of saving the measurement result
  • the end button 43 is an operation unit for performing an operation of ending the process.
  • In the measurement result display area 44, the setting values of the measurement condition items to be adjusted, the measurement values that are the processing results, and the virtual captured images and irradiation maps obtained during processing as intermediate results are displayed side by side. Further, difference results relative to the processing result for any one of the setting values may also be displayed, and portions where the difference between processing results is large may be highlighted.
  • By outputting the measurement condition information, the virtual captured image, and the like together with the measurement values, the user can confirm, for each measurement condition, the process by which the three-dimensional measurement result was obtained.
  • By displaying the processing results and the intermediate results corresponding to a plurality of measurement conditions side by side, it becomes possible to compare and examine the setting values of the measurement conditions. Therefore, the arrangement and performance of the projection device and the imaging device can be easily examined.
  • FIG. 18 is a diagram showing a functional configuration of the simulation device 16 according to the sixth exemplary embodiment of the present invention.
  • The simulation device 16 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, an error information acquisition unit 105, an evaluation reference data generation unit 106, and a measurement evaluation unit 107.
  • the simulation device 16 has an evaluation reference data generation unit 106 and a measurement evaluation unit 107 in addition to the configuration of the simulation device 13.
  • detailed description of the same configuration as that of the simulation device 13 will be omitted, and the differences from the simulation device 13 will be mainly described.
  • the evaluation reference data generation unit 106 uses the measurement condition information to generate evaluation reference data that is an evaluation reference for simulation results.
  • the evaluation reference data generation unit 106 inputs the generated evaluation reference data to the measurement evaluation unit 107.
  • the measurement evaluation unit 107 evaluates the simulation result using the evaluation reference data and acquires the simulation evaluation.
  • the output unit 104 outputs the simulation evaluation result in addition to the simulation result.
  • FIG. 19 is a flowchart showing the operation of the simulation device 16 shown in FIG.
  • the operation shown in FIG. 19 is the same as that of FIG. 9 from step S301 to step S307, and detailed description thereof will be omitted.
  • the evaluation reference data generating unit 106 generates the evaluation reference data after the three-dimensional measurement process is completed (step S701).
  • the measurement evaluation unit 107 performs a measurement evaluation process of comparing the measurement data obtained by the simulation with the evaluation reference data to calculate a quantitative evaluation value (step S702).
  • the output unit 104 outputs the simulation evaluation (step S703).
  • As the evaluation reference data, the following can be considered: the distance data from the imaging device included in the sensor viewpoint data, the processing result of the three-dimensional measurement when the virtual captured image before error factors are added is used as input, data obtained during processing such as the irradiation map, and actual measurement data obtained by actually measuring the measurement object.
  • As an evaluation index, a measurable region can be used; this can be, for example, the region in which no loss occurs in the measurement data obtained by inputting the virtual captured image before error factors are added. Further, statistics such as the average value, variance, standard deviation, and maximum value obtained by comparing the evaluation reference data and the simulation result may be used as evaluation indices.
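  • A minimal sketch of such a comparison, assuming both the simulated measurement and the evaluation reference data are depth maps with NaN marking missing values; the metric names mirror the statistics listed above, and the measurable-region ratio is an illustrative formulation.

```python
import numpy as np

def evaluate(measured, reference):
    """Compare simulated measurement data with evaluation reference data.

    Both inputs are depth maps of the same shape; NaN marks a missing
    (lost) measurement. Returns the statistics named in the text plus
    the fraction of pixels that were measurable in both maps.
    """
    valid = ~np.isnan(measured) & ~np.isnan(reference)
    err = np.abs(measured[valid] - reference[valid])
    return {
        "mean": float(err.mean()),
        "variance": float(err.var()),
        "std": float(err.std()),
        "max": float(err.max()),
        "measurable_ratio": float(valid.mean()),
    }
```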
  • As a feedback method, for example, it is possible to generate a plurality of different simulation results and acquire the setting values that give the best simulation evaluation. To generate a plurality of simulation results, for example, there is a method of changing the value of the surface characteristic of the measurement object in the measurement conditions or changing the strength of the error factors. Parameters such as the surface characteristics of the measurement target and the strength of the error factors when the simulation evaluation is best can be defined as the optimum simulation settings. By providing the user with these optimum simulation settings, more accurate verification experiments can be performed; a sketch of such a parameter sweep follows.
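  • The following sketch illustrates one way to implement this feedback loop as a grid search; the candidate values and the callable `run_and_evaluate`, which stands in for running the simulation and the measurement evaluation for one settings combination, are assumptions for illustration.

```python
import itertools

def search_best_settings(run_and_evaluate):
    """Generate multiple simulation results and keep the best-scoring settings.

    `run_and_evaluate` maps a settings dict to a scalar evaluation value
    (lower is better here, e.g. the mean error from the comparison above).
    """
    reflectances = [0.2, 0.5, 0.8]   # surface-characteristic candidates
    noise_levels = [1.0, 3.0, 5.0]   # error-factor strength candidates
    best_score, best = float("inf"), None
    for refl, noise in itertools.product(reflectances, noise_levels):
        settings = {"reflectance": refl, "noise_strength": noise}
        score = run_and_evaluate(settings)
        if score < best_score:
            best_score, best = score, settings
    return best, best_score
```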
  • FIG. 20 is a diagram showing an example of the display screen 50 of the simulation device 16 shown in FIG.
  • the display screen 50 includes an evaluation reference data display area 51 in addition to the components of the display screen 40.
  • In the evaluation reference data display area 51, the evaluation reference data is displayed together with the plurality of setting values of the measurement condition items to be adjusted.
  • the intermediate result obtained from the simulation of the three-dimensional measurement process using the reference value may be displayed.
  • the display screen 50 can further display the simulation evaluation.
  • the simulation device 16 can obtain evaluation reference data in addition to the simulation result.
  • the evaluation reference data is data serving as a reference when evaluating the simulation result, and is actual measurement data or the like.
  • the simulation device 16 can facilitate examination of the simulation result by displaying the measured value included in the simulation result and the evaluation reference data side by side.
  • The simulation device 16 can obtain a simulation evaluation in which the simulation result is evaluated using the evaluation reference data. With this configuration, measurement errors and losses can be grasped quantitatively. This makes it easy to study the performance of the imaging device and the projection device and to judge the suitability of three-dimensional measurement for each measurement object. Further, when a performance evaluation method is defined by a standard, the suitability judgment can easily be performed by obtaining the simulation evaluation using that performance evaluation method.
  • FIG. 21 is a diagram showing a functional configuration of the simulation apparatus 17 according to the seventh embodiment of the present invention.
  • the simulation device 17 has an object recognition processing unit 108 and a recognition evaluation unit 109 in addition to the configuration of the simulation device 16 according to the sixth embodiment.
  • a part different from the simulation device 16 will be mainly described.
  • The object recognition processing unit 108 receives the simulation result output by the three-dimensional measurement calculation unit 103 and the measurement condition information output by the measurement condition acquisition unit 101, and acquires a recognition result including at least one of the position and orientation of an object existing in the imaging space and a gripping position, that is, a position at which the object can be gripped. The object recognition processing unit 108 inputs the recognition result to the recognition evaluation unit 109.
  • the recognition evaluation unit 109 evaluates the recognition result based on the measurement condition information. Specifically, the recognition evaluation unit 109 acquires a recognition evaluation result that includes at least one of the position and orientation estimation accuracy included in the recognition result and the gripping position estimation accuracy. The recognition evaluation unit 109 inputs the recognition result of the object and the recognition evaluation result to the output unit 104. The output unit 104 outputs the recognition result and the recognition evaluation result in addition to the simulation result and the simulation evaluation.
  • FIG. 22 is a flowchart showing the operation of the simulation apparatus 17 shown in FIG.
  • the operations in steps S301 to S307 and steps S701 to S703 are the same as those in FIG. 19, and thus description thereof will be omitted here.
  • the part different from FIG. 19 will be mainly described.
  • the object recognition processing unit 108 executes the object recognition process and acquires the recognition result (step S801).
  • the recognition evaluation unit 109 executes a recognition evaluation process for evaluating the recognition result (step S802).
  • the output unit 104 outputs the recognition evaluation result (step S803).
  • the recognition result may be output in addition to the recognition evaluation result.
  • One of the purposes of three-dimensional measurement of objects is recognition of the position and orientation of the measurement target.
  • One example is an application program that causes a robot to grip an object. When the position and orientation of the object to be gripped are not known in advance, it is necessary to sense the object on the spot and recognize its position and orientation, or the position at which the robot can grip it.
  • For this reason, the simulation device 17 has the object recognition processing unit 108, which performs object recognition processing with the simulation result of the three-dimensional measurement as input.
  • An arbitrary algorithm may be used as an algorithm for object recognition in the object recognition processing unit 108.
  • The object recognition algorithm may take as input a three-dimensional point cloud, which treats the result of the three-dimensional measurement as a set of points in three-dimensional space, or a depth image, which represents the three-dimensional measurement result as a two-dimensional image. The recognition result is evaluated in terms of, for example, the estimation accuracy of the position and orientation of the recognized object and the estimation accuracy of the gripping position.
  • In an actual environment, a problem in many cases is that the true values of the position and orientation of the object are unknown. Since the true values are unknown, even if the recognition result of the position and orientation of the object is output, it is difficult to judge its quality quantitatively. In the simulation, however, the position and orientation of the object are known, so the recognition result can be evaluated quantitatively.
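  • A minimal sketch of such a quantitative check, assuming the recognition result and the simulation ground truth are each given as a rotation matrix and a translation vector; the error definitions (Euclidean translation error and the angle of the relative rotation) are standard choices, not ones prescribed by the embodiment.

```python
import numpy as np

def pose_error(R_est, t_est, R_true, t_true):
    """Translation error (same unit as t) and rotation error in degrees.

    The rotation error is the angle of the relative rotation
    R_est @ R_true.T, recovered from its trace.
    """
    t_err = float(np.linalg.norm(np.asarray(t_est) - np.asarray(t_true)))
    R_rel = np.asarray(R_est) @ np.asarray(R_true).T
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err = float(np.degrees(np.arccos(cos_theta)))
    return t_err, r_err
```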
  • As described above, according to the simulation device 17 of the seventh embodiment of the present invention, object recognition can be verified using the simulation result. With this configuration, the performance of object recognition can be evaluated without actually constructing a three-dimensional measuring device. Since the position and orientation of the recognition target are known in the simulation space, the result of object recognition can be compared with the true value, which has the effect of facilitating quantitative evaluation of the object recognition performance.
  • FIG. 23 is a diagram showing the functional configuration of the simulation apparatus 18 according to the eighth embodiment of the present invention.
  • the simulation device 18 has an object grip evaluation unit 110 in addition to the configuration of the simulation device 17 according to the seventh embodiment.
  • The object grip evaluation unit 110 acquires a grip evaluation result, in which the probability of successfully gripping the object is evaluated, based on the measurement condition information output by the measurement condition acquisition unit 101, the simulation result output by the three-dimensional measurement calculation unit 103, and the recognition result output by the object recognition processing unit 108.
  • the object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104.
  • the output unit 104 also outputs the grip evaluation result.
  • FIG. 24 is a flowchart showing the operation of the simulation device 18 shown in FIG.
  • the operations in steps S301 to S307, steps S701 to S703, and steps S801 to S803 are the same as those in FIG. 22, and thus the description thereof is omitted here.
  • a part different from FIG. 22 will be mainly described.
  • The object grip evaluation unit 110 executes the object grip evaluation process (step S901).
  • the object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104.
  • the output unit 104 outputs the grip evaluation result (step S902).
  • the operation of step S901 can be executed concurrently with the recognition evaluation process.
  • From the recognition result of the object, it is possible to obtain information on the position at which the robot hand grips the object. Using this information, when the robot hand is moved to the gripping position and the gripping operation is executed, the position of contact with the object and the magnitude and direction of the force generated in the robot hand or the object can be simulated.
  • If the magnitude and direction of the force generated in the robot hand and the object are known, it is possible to estimate whether gripping will succeed. For example, consider a case where the robot hand is a parallel hand having two claws. Gripping fails when the gripped object slips out of the gap between the claws of the robot hand. A situation corresponding to this case is one in which the force applied to the gripped object in the direction perpendicular to the closing direction of the parallel hand is larger than the frictional force between the robot hand and the gripped object. Therefore, whether or not the grip succeeds can be determined from information such as the friction coefficient of the surfaces of the robot hand and the gripped object, the weight of the gripped object, and the gripping force of the robot hand, as sketched below.
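  • A minimal sketch of that friction check, assuming gravity is the only force perpendicular to the closing direction and that each of the two contact surfaces contributes Coulomb friction; this simplified force model is an assumption for illustration.

```python
def grip_succeeds(mu, grip_force_n, object_weight_n, safety_factor=1.0):
    """Decide whether a two-claw parallel hand can hold an object.

    Gripping is judged successful when the friction the two contact
    surfaces can provide (2 * mu * grip force) is at least the force
    pulling the object out perpendicular to the closing direction
    (here, just the object's weight in newtons).
    """
    available_friction = 2.0 * mu * grip_force_n
    return available_friction >= safety_factor * object_weight_n

# Example: rubber-like contact (mu = 0.6), 20 N grip, 1.5 kg object.
print(grip_succeeds(0.6, 20.0, 1.5 * 9.81))  # True: 24 N friction vs ~14.7 N
```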
  • The simulation device 18 according to the eighth embodiment of the present invention therefore has the object grip evaluation unit 110, which has a function of simulating such gripping of an object by a robot.
  • FIG. 25 is a diagram showing an example of the display screen 60 output by the simulation device 18 shown in FIG. 23.
  • the display screen 60 has a recognition and grip evaluation result display area 61 in addition to the display content of the display screen 30.
  • As a method of outputting the recognition result and the grip evaluation result, different symbols may be output depending on whether the recognition or grip succeeds or fails, the cause of failure may be output when it fails, or any quantitative value used for the evaluation may be displayed.
  • According to the simulation device 18, the grip success rate can be evaluated in simulation, which has the effect of making preliminary verification easy before building a robot system.
  • the cause of failure can be isolated and verified by simulation, so that the cause can be investigated and improved in a short period of time.
  • The measurement condition acquisition unit 101, virtual captured image generation unit 102, three-dimensional measurement calculation unit 103, output unit 104, error information acquisition unit 105, evaluation reference data generation unit 106, measurement evaluation unit 107, object recognition processing unit 108, recognition evaluation unit 109, and object grip evaluation unit 110 are realized by processing circuits. These processing circuits may be realized by dedicated hardware, or may be control circuits using a CPU (Central Processing Unit).
  • FIG. 26 is a diagram showing dedicated hardware for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • the processing circuit 90 is a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • FIG. 27 is a diagram showing the configuration of the control circuit 91 for realizing the functions of the simulation apparatuses 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • the control circuit 91 includes a processor 92 and a memory 93.
  • the processor 92 is a CPU and is also called a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), or the like.
  • The memory 93 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (registered trademark) (Electrically EPROM), or a magnetic disk, flexible disk, optical disk, compact disc, mini disc, or DVD (Digital Versatile Disc).
  • When the above processing circuits are realized by the control circuit 91, they are realized by the processor 92 reading and executing programs that are stored in the memory 93 and correspond to the processing of each component.
  • the memory 93 is also used as a temporary memory in each process executed by the processor 92.
  • FIG. 28 is a diagram showing an example of a hardware configuration for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention.
  • The functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention may be realized using an input device 94 and an output device 95 in addition to the processor 92 and the memory 93.
  • the input device 94 is an input interface such as a keyboard, a mouse, and a touch sensor that receives an input operation from a user.
  • the output device 95 is, for example, a display device and can display an output screen to the user. When a touch panel is used, the display device of the touch panel is the output device 95 and the touch sensor superimposed on the display device is the input device 94.
  • The functions of the measurement condition acquisition unit 101 and the output unit 104 of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 may be realized by the processor 92 alone, or by the processor 92 together with the input device 94, the output device 95, or interfaces to them.
  • The functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention may be realized by a single piece of hardware, or may be distributed among and processed by a plurality of pieces of hardware.
  • the display screens 20, 30, 40, 50, 60 shown above are examples, and various changes can be made.

Abstract

A simulation device (10) is characterized by being provided with: a measurement condition acquisition unit (101) that acquires measurement condition information representing the measurement conditions of a three-dimensional measurement device which is provided with a projection device for projecting light to an object to be measured, and which is also provided with an image capture device for, when the object to be measured is illuminated with the light projected from the projection device, capturing an image of an imaging space containing the object to be measured; a virtual captured image generation unit (102) which, on the basis of the measurement condition information, generates a virtual captured image that is a reproduction of the captured image output by the image capture device; a three-dimensional measurement calculation unit (103) which uses the virtual captured image to perform a three-dimensional measurement process for measuring a three-dimensional position on the surface of the object to be measured, and obtain a measured value; and an output unit (104) which outputs simulation results including the measured value.

Description

Simulation device, simulation method, and simulation program
 The present invention relates to a simulation device, a simulation method, and a simulation program capable of simulating the measurement result of a three-dimensional measurement device.
 Conventionally, a device that performs three-dimensional measurement using a projection device that projects light onto a measurement target and an imaging device that images the measurement target is known. For example, Patent Document 1 discloses a three-dimensional measuring device that performs three-dimensional measurement by the active stereo method. The three-dimensional measurement device disclosed in Patent Document 1 projects a geometric pattern from a projection device onto a measurement target, sets a plurality of detection points on the geometric pattern in a captured image output by the imaging device, and calculates the three-dimensional positions of the surface of the measurement target corresponding to the detection points based on triangulation.
JP 2015-203652 A
 However, with the technique described in Patent Document 1, it was necessary to construct an actual environment and prepare samples of the measurement target in order to obtain measurement results. Therefore, there was a problem in that measurement results in the actual environment could not be evaluated when no actual environment was available or when there were few samples of the measurement target.
 The present invention has been made in view of the above, and an object of the present invention is to obtain a simulation device capable of evaluating measurement results even when there is no actual environment or when there are few samples of the measurement target.
 In order to solve the above problems and achieve the object, a simulation device according to the present invention includes: a measurement condition acquisition unit that acquires measurement condition information indicating measurement conditions of a three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space containing the measurement target irradiated with projection light from the projection device; a virtual captured image generation unit that generates, based on the measurement condition information, a virtual captured image reproducing the captured image output by the imaging device; a three-dimensional measurement calculation unit that executes three-dimensional measurement processing for measuring the three-dimensional position of the surface of the measurement target using the virtual captured image and obtains a measured value; and an output unit that outputs a simulation result including the measured value.
 According to the present invention, there is an effect that the measurement result can be evaluated even when there is no actual environment or when there are few samples of the measurement target.
FIG. 1 is a diagram showing the functional configuration of the simulation device according to Embodiment 1 of the present invention.
FIG. 2 is a flowchart showing the operation of the simulation device shown in FIG. 1.
FIG. 3 is a diagram showing the detailed functional configuration of the measurement condition acquisition unit shown in FIG. 1.
FIG. 4 is a diagram showing an example of the display screen output by the simulation device shown in FIG. 1.
FIG. 5 is a diagram showing the functional configuration of the virtual captured image generation unit according to Embodiment 2 of the present invention.
FIG. 6 is a flowchart showing the operation of the simulation device having the virtual captured image generation unit shown in FIG. 5.
FIG. 7 is a diagram showing the functional configuration of the simulation device according to Embodiment 3 of the present invention.
FIG. 8 is a diagram showing the functional configuration of the virtual captured image generation unit shown in FIG. 7.
FIG. 9 is a flowchart showing the operation of the simulation device shown in FIG. 7.
FIG. 10 is a flowchart showing the details of step S304 shown in FIG. 9.
FIG. 11 is a flowchart showing the details of step S305 shown in FIG. 9.
FIG. 12 is a diagram showing an example of the display screen output by the simulation device shown in FIG. 7.
FIG. 13 is a diagram showing the functional configuration of the optical reproduction image generation unit according to Embodiment 4 of the present invention.
FIG. 14 is a diagram showing the detailed functional configuration of the sensor viewpoint data generation unit shown in FIG. 13.
FIG. 15 is a diagram showing the detailed functional configuration of the map generation unit shown in FIG. 13.
FIG. 16 is a flowchart showing the operation of the simulation device according to Embodiment 4 of the present invention.
FIG. 17 is a diagram showing an example of the display screen output by the simulation device according to Embodiment 5 of the present invention.
FIG. 18 is a diagram showing the functional configuration of the simulation device according to Embodiment 6 of the present invention.
FIG. 19 is a flowchart showing the operation of the simulation device shown in FIG. 18.
FIG. 20 is a diagram showing an example of the display screen of the simulation device shown in FIG. 18.
FIG. 21 is a diagram showing the functional configuration of the simulation device according to Embodiment 7 of the present invention.
FIG. 22 is a flowchart showing the operation of the simulation device shown in FIG. 21.
FIG. 23 is a diagram showing the functional configuration of the simulation device according to Embodiment 8 of the present invention.
FIG. 24 is a flowchart showing the operation of the simulation device shown in FIG. 23.
FIG. 25 is a diagram showing an example of the display screen output by the simulation device shown in FIG. 23.
FIG. 26 is a diagram showing dedicated hardware for realizing the functions of the simulation devices according to Embodiments 1 to 8 of the present invention.
FIG. 27 is a diagram showing the configuration of a control circuit for realizing the functions of the simulation devices according to Embodiments 1 to 8 of the present invention.
FIG. 28 is a diagram showing an example of a hardware configuration for realizing the functions of the simulation devices according to Embodiments 1 to 8 of the present invention.
 Hereinafter, a simulation device, a simulation method, and a simulation program according to embodiments of the present invention will be described in detail with reference to the drawings. The present invention is not limited to these embodiments.
Embodiment 1.
 FIG. 1 is a diagram showing the functional configuration of the simulation device 10 according to the first embodiment of the present invention. The simulation device 10 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, and an output unit 104.
 The simulation device 10 has a function of simulating the measurement result of a three-dimensional measurement device, based on measurement condition information indicating the measurement conditions of the three-dimensional measurement device, which measures the three-dimensional position of the surface of a measurement target using the active stereo method. The three-dimensional measurement device assumed here includes a projection device that projects light onto the measurement target and an imaging device that captures an imaging space including the measurement target irradiated with projection light from the projection device. The projection light from the projection device represents a projection pattern used for the three-dimensional measurement; hereinafter, the term "projection light" refers to light representing a projection pattern. The three-dimensional measurement device can perform three-dimensional measurement processing that measures the three-dimensional position of the surface of the measurement target based on the projection pattern included in the captured image output by the imaging device. The simulation device 10 may also simulate the output result of a system that uses the three-dimensional measurement device.
 The measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measurement device. Details of the measurement condition information will be described later. The measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
 The virtual captured image generation unit 102 generates, based on the measurement condition information input from the measurement condition acquisition unit 101, a virtual captured image, which is a CG (Computer Graphics) image reproducing the captured image output by the imaging device included in the three-dimensional measurement device. The virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
 The three-dimensional measurement calculation unit 103 uses the virtual captured image input from the virtual captured image generation unit 102 to execute the three-dimensional measurement processing that the three-dimensional measurement device performs, and acquires measurement values. The three-dimensional measurement calculation unit 103 inputs the acquired measurement values to the output unit 104.
 The output unit 104 outputs a simulation result including the measurement values input from the three-dimensional measurement calculation unit 103.
 FIG. 2 is a flowchart showing the operation of the simulation device 10 shown in FIG. 1. The measurement condition acquisition unit 101 of the simulation device 10 acquires measurement condition information (step S101). The measurement condition acquisition unit 101 inputs the acquired measurement condition information to the virtual captured image generation unit 102.
 The virtual captured image generation unit 102 generates a virtual captured image based on the measurement condition information (step S102). The virtual captured image generation unit 102 inputs the generated virtual captured image to the three-dimensional measurement calculation unit 103.
 The three-dimensional measurement calculation unit 103 executes the three-dimensional measurement calculation using the virtual captured image and acquires measurement values (step S103). The three-dimensional measurement calculation unit 103 inputs the acquired measurement values to the output unit 104. The output unit 104 outputs a simulation result including the measurement values (step S104).
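 The overall flow of FIG. 2 can be summarized by the following sketch; the four callables are hypothetical stand-ins for the units of the simulation device 10, not APIs defined by the embodiment.

```python
def run_simulation_pipeline(acquire, generate, measure, output, source):
    """Sketch of the flow in FIG. 2 using caller-supplied callables."""
    conditions = acquire(source)           # step S101: measurement conditions
    virtual_images = generate(conditions)  # step S102: virtual captured images
    measurements = measure(virtual_images) # step S103: 3D measurement calculation
    output(measurements)                   # step S104: output simulation result
    return measurements
```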
 The simulation device 10 can generate a virtual captured image that reproduces the captured image output by the imaging device based on the measurement condition information of the three-dimensional measurement device, and can execute the three-dimensional measurement processing using the virtual captured image. With this configuration, three-dimensional measurement can be verified in simulation without actually installing a projection device and an imaging device and acquiring real data. Therefore, there is no need to install the three-dimensional measuring device on site, adjust its hardware and software, or collect real data of the measurement target, so the time required for such work is shortened and human and material costs can be suppressed. By running simulations while setting the measurement conditions of the three-dimensional measuring device in various ways, appropriate measurement conditions can be identified. Accordingly, it is possible to shorten the design period of the three-dimensional measuring device and the trial-and-error verification period from introducing the device on site to putting it into operation.
 FIG. 3 is a diagram showing the detailed functional configuration of the measurement condition acquisition unit 101 shown in FIG. 1. The measurement condition acquisition unit 101 acquires measurement condition information indicating the measurement conditions of the three-dimensional measuring device. The measurement condition acquisition unit 101 includes a projection condition information acquisition unit 201, a shooting condition information acquisition unit 202, a measurement target information acquisition unit 203, and a non-measurement target information acquisition unit 204.
 The projection condition information acquisition unit 201 acquires projection condition information indicating the projection conditions of the projection device. The projection condition information can include information that can specify at least one of the performance and the usage state of the projection device. The performance of the projection device is, for example, its resolution and angle of view; the state of the projection device is, for example, its position and orientation and its focus state. The projection condition information can also include information indicating the projection pattern, which includes information indicating the design of the pattern. When there are a plurality of projection patterns, the information indicating the projection pattern may further include information indicating the number of patterns. The design of a projection pattern may be a stripe pattern in which stripes of a thickness predetermined for each pattern are arranged according to a certain rule, a dot pattern arranged irregularly, a gradation pattern in which the intensity of the projection light changes smoothly within the pattern, or a combination of these. These designs are examples, and projection patterns of any design can be used. The number of projection patterns is an arbitrary number of one or more. The information indicating the design of a projection pattern may be, for example, information indicating the type of design, or information indicating the light intensity distribution on a predetermined projection plane.
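 As an illustration, the following sketch generates two of the pattern types mentioned above, a binary stripe pattern and a gradation pattern; the resolutions and the halving rule for multi-pattern sets are assumptions for illustration.

```python
import numpy as np

def stripe_pattern(width, height, stripe_px):
    """Binary pattern of vertical stripes, each stripe_px pixels wide."""
    cols = ((np.arange(width) // stripe_px) % 2) * 255
    return np.tile(cols, (height, 1)).astype(np.uint8)

def gradation_pattern(width, height):
    """Pattern whose projection intensity changes smoothly across the image."""
    ramp = np.linspace(0, 255, width)
    return np.tile(ramp, (height, 1)).astype(np.uint8)

# One common rule for a multi-pattern set: halve the stripe width each time.
patterns = [stripe_pattern(1024, 768, 2 ** k) for k in range(7, 3, -1)]
```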
 The shooting condition information acquisition unit 202 acquires shooting condition information indicating the shooting conditions of the imaging device. The shooting condition information can include information that can specify at least one of the performance and the usage state of the imaging device. The performance of the imaging device is, for example, its resolution and angle of view; the state of the imaging device is, for example, its position and orientation and its focus state.
 The measurement target information acquisition unit 203 acquires measurement target information, which is information about the measurement target, for example, information indicating the shape of the measurement target, its state such as its position and orientation, and its characteristics. A characteristic of the measurement target is, for example, its reflection characteristic. The information indicating the reflection characteristic of the measurement target includes, for example, at least one of the color, the diffuse reflectance, and the specular reflectance of the measurement target.
 The non-measurement target information acquisition unit 204 acquires non-measurement target information, which is information about non-measurement targets existing in the imaging space, that is, objects other than the measurement target or ambient light other than the projection light. A non-measurement target is, for example, a container, a stand, or a jig that holds the measurement target. Non-measurement targets are not limited to these examples and also include objects other than the measurement target that can appear in a captured image taken by an actual imaging device, such as walls, windows, and other background scenery, as well as light such as illumination light irradiating these objects. The non-measurement target information can include at least one of the position, shape, and characteristics of the non-measurement targets and information indicating the state of the ambient light.
 The information indicating the shape of an object included in the measurement target information and the non-measurement target information can be expressed as a combination of meshes such as 3D-CAD (3-Dimensional Computer-Assisted Drawing) data, as primitives such as spheres or rectangular parallelepipeds, or as an assembly thereof. The reflection characteristics of an object are used to reproduce how the object looks when light strikes it. The information indicating the state of the ambient light is used to obtain shading effects like those in an image actually captured by the imaging device. An object may be floating in the air, or a floor surface and a box containing the object may be set so that the object exists inside the box. It is preferable that reflection characteristics can be set for the floor surface and the box in the same way as for objects.
 As described above, since the projection condition information and the imaging condition information, including performance characteristics such as the resolution and angle of view of the projection device and the imaging device, can each be set individually, measurement errors that may occur in an actual three-dimensional measurement device can be reproduced more accurately. For example, when the resolution of the imaging device is lower than that of the projection device, even if the projection pattern of the projection device is made finer, the design of the projection pattern cannot be discriminated in detail from an image of the projected pattern, which causes errors and missing data in three-dimensional measurement. Since the virtual captured image generation unit 102 generates the virtual captured image based on the resolution of the imaging device, analyzing the virtual captured image reveals how fine a projection pattern the imaging device can discriminate. The limit of pattern fineness that the imaging device can discriminate can thus be estimated as the upper limit for the resolution of the projection device. Alternatively, the performance of an imaging device commensurate with the resolution of the projection device can be examined. In other words, because the projection device and the imaging device of a three-dimensional measurement device need to have mutually balanced performance, it is preferable to examine the performance of the projection device and the imaging device using the simulation results.
 The virtual captured image generation unit 102 generates, based on the measurement condition information, a virtual captured image that reproduces the captured image output by the imaging device. Based on the measurement target information and the non-measurement target information included in the measurement condition information, the virtual captured image generation unit 102 reproduces, in a virtual imaging space, the arrangement of the objects existing in the imaging space, and based on the imaging condition information, it can specify the portion of the imaging space shown in the virtual captured image. Furthermore, the virtual captured image generation unit 102 can reproduce the shading produced by the projection light in the imaging space by using a reflection model based on the projection condition information, the measurement target information, and the non-measurement target information. The virtual captured image generation unit 102 can reproduce, with pixel values, the positions of the shadow boundaries produced by the projection light in the captured image output by the imaging device. Here, an error included in the position of a shadow boundary reduces the visibility of the light in the captured image. Reduced visibility of light means that the three-dimensional position irradiated by the projection light in the imaging space no longer matches the detected irradiation position of the projection light in the captured image, or, when the projection device projects a two-dimensional projection pattern, that a shadow boundary of the light, for example a pattern boundary between a region irradiated with the projection light and a region not irradiated with it, can no longer be detected correctly. Therefore, when the visibility of light is reduced, errors arise in the measurement results of the three-dimensional measurement or measurement becomes impossible, and the quality of the three-dimensional measurement deteriorates.
 Factors causing errors in the positions of the shadow boundaries of light in the captured image can include those caused by the projection device and those caused by the imaging device. Error factors caused by the projection device include interreflection, in which projection light from the projection device is reflected by a first surface and then strikes and illuminates a second surface, and light blur due to the projection device being out of focus. The second surface may be a surface of an object different from the first surface, or may be a different part of the same object as the first surface.
 Error factors caused by the imaging device include image distortion due to, for example, the distortion aberration of the lens of the imaging device, random noise appearing in the image without regularity, and image blur due to the imaging device being out of focus. The phenomena treated as error factors are not limited to the above, and may include any phenomenon that causes the projection state of the light or the captured image to change.
 One example of a situation in which the above-mentioned interreflection is likely to occur is when two or more objects are placed close to each other. A light ray emitted from the projection device may be reflected by the surface of a first object in the imaging space, and the reflected light may illuminate a second object different from the first object. It is also conceivable that light reflected at a part X of a first object existing in the imaging space illuminates a part Y, different from part X, of the same first object.
 Interreflection can occur not only with the projection light from the projection device but also with ambient light such as indoor lighting and sunlight from outdoors. Moreover, a light ray is not necessarily reflected only once; multiple reflections can also occur. However, because light loses energy at each reflection, after repeated reflections its contribution often falls below the observation sensitivity of the imaging device and no longer affects the captured image. Therefore, interreflection by light that has been reflected more than a predetermined number of times may be ignored.
 The intensity of light due to interreflection can be determined based on the intensity of the light before reflection, the reflection characteristics at the reflection point, and the traveling path of the light, in addition to a reflection model similar to that used when generating the virtual captured image.
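 A minimal sketch of this idea follows, assuming a diffuse bounce model with inverse-square attenuation along the travel path and the bounce cutoff described above; the constants and the attenuation formula are illustrative assumptions, not values given by the embodiment.

    MAX_BOUNCES = 2        # reflections beyond this are taken to fall below sensor sensitivity
    MIN_INTENSITY = 1e-3   # contributions this weak no longer affect the captured image

    def bounce_intensity(incident, reflectance, cos_in, path_length):
        # One diffuse bounce: part of the energy is absorbed (1 - reflectance),
        # the remainder is weighted by the incidence angle and attenuated over
        # the travel path of the light (inverse-square falloff, assumed here).
        return incident * reflectance * max(cos_in, 0.0) / max(path_length, 1e-6) ** 2

    def accumulated_interreflection(incident, hops):
        # Sum the contributions of successive reflections, stopping at the
        # predetermined bounce limit or once the light is too weak to observe.
        total, level = 0.0, incident
        for _ in range(MAX_BOUNCES):
            level = bounce_intensity(level, *next(hops))
            if level < MIN_INTENSITY:
                break
            total += level
        return total

    # Each hop: (reflectance at the reflection point, cosine of incidence, path length in m)
    hops = iter([(0.5, 0.9, 0.4), (0.3, 0.7, 0.6)])
    print(accumulated_interreflection(1.0, hops))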
 When interreflection occurs, shadow boundaries of light may be observed in addition to the pattern boundaries that should be observed in the image. In portions where boundaries other than pattern boundaries are observed, the irradiation direction of the light cannot be calculated correctly, which causes measurement errors in three-dimensional measurement.
 Since the virtual captured image generation unit 102 generates the virtual captured image in consideration of the influence of the interreflection of light, the measurement values output by the three-dimensional measurement calculation unit 103 reproduce the measurement errors that can occur when three-dimensional measurement is actually performed on a measurement target in which interreflection can arise. It is therefore possible to grasp the magnitude and appearance tendency of the measurement errors before actually constructing a three-dimensional measurement device. If the appearance tendency of the measurement errors can be grasped in advance, methods of reducing the influence of interreflection can also be examined by changing the measurement conditions, such as the positional relationship between the projection device and the imaging device or the way the measurement target is arranged. The accuracy of three-dimensional measurement can thus be improved.
 When the projection device is out of focus, the shadow boundaries of the light blur and the pattern boundaries become indistinct, so the accuracy of analyzing the projection pattern in the captured image for the purpose of specifying the irradiation direction of the light decreases, and the irradiation direction may be calculated incorrectly. When the imaging space has more than a certain depth, it may be impossible to bring the entire imaging space into the focus of the projection device. In such cases, it becomes necessary to examine calculation methods that can analyze the projection pattern accurately even when the projection device is out of focus. However, even if one attempts to verify a projection pattern analysis method using real data, that is, data obtained by actually operating the projection device and the imaging device and observing a measurement target, it is difficult to know the true positions of the pattern boundaries in the real data. It is therefore impossible to judge rigorously whether a verification result is correct. In contrast, the simulation device 10 can obtain the true positions of the pattern boundaries based on the measurement condition information, and it generates a virtual captured image that reproduces the blurring of the pattern boundaries caused by the defocus of the projection device. As a result, it becomes easy to determine whether the detection results for the pattern boundaries have improved after the projection pattern analysis method is refined. Accordingly, the development of the projection pattern analysis algorithms required for three-dimensional measurement can be made more efficient.
 Furthermore, since the relationship between the degree of defocus of the projection device and the results of the three-dimensional measurement can be simulated, it is possible to verify how much defocus keeps the error of the measurement results within an allowable range, and to estimate required performance such as the depth range within which the projection device must remain in focus. The simulation device 10 therefore has the effect of facilitating the design of the three-dimensional measurement device.
 The virtual captured image generation unit 102 can also calculate image distortion as an effect of the distortion aberration caused by the lens of the imaging device. As an example of an image distortion model, the correspondence between pixels before and after the image is distorted can be obtained using the following formula (1).
    x_u = x_d (1 + K r^2)
    y_u = y_d (1 + K r^2)    ... (1)
 Here, (x_u, y_u) are the image coordinates in the undistorted image, (x_d, y_d) are the image coordinates in the distorted image, K is a coefficient indicating the degree of distortion, and r is the distance from the image center to the pixel of interest. The pixel of interest is the pixel at arbitrary coordinates (x_d, y_d) in the distorted image. The virtual captured image generation unit 102 may sequentially select every pixel of the distorted image as the pixel of interest. Many image distortion models have been proposed, ranging from simplified models to detailed ones. The simulation device 10 can use any model, including formula (1), as the formula for calculating image distortion.
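 As a hedged illustration of formula (1), the following sketch synthesizes a distorted image by selecting each pixel of the distorted image as the pixel of interest and sampling the undistorted image at the coordinates given by formula (1); the nearest-neighbor sampling and the value of K are assumptions made for illustration.

    import numpy as np

    def apply_radial_distortion(undistorted, K):
        # For every pixel (x_d, y_d) of the distorted output image, look up the
        # source position (x_u, y_u) = (x_d, y_d) * (1 + K * r^2) in the
        # undistorted input, with coordinates measured from the image center.
        h, w = undistorted.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        yd, xd = np.mgrid[0:h, 0:w]
        xd = xd - cx
        yd = yd - cy
        r2 = xd * xd + yd * yd
        xu = np.clip(np.rint(xd * (1 + K * r2) + cx), 0, w - 1).astype(int)
        yu = np.clip(np.rint(yd * (1 + K * r2) + cy), 0, h - 1).astype(int)
        return undistorted[yu, xu]

    distorted = apply_radial_distortion(np.random.rand(480, 640), K=1e-7)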
 The virtual captured image generation unit 102 can reproduce random noise by setting an appearance probability and an intensity of noise for each pixel or for each fixed region of the image. Using the appearance probability, the virtual captured image generation unit 102 determines whether to add noise to a pixel or region, and when it determines to add noise, it changes the color of the pixel or region based on the set intensity. The intensity may be specified as a rate of change relative to the color of the original pixel or region, or as an integer value. The intensity may be indicated by a fixed rate of change or integer value, or may have a certain range. The intensity can take both positive values, meaning an increase in the brightness of a pixel, and negative values, meaning a decrease in the brightness of a pixel.
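 A minimal sketch of this noise model follows, assuming per-pixel noise with a brightness change drawn uniformly from a symmetric range; the probability and strength values are illustrative only.

    import numpy as np

    def add_random_noise(image, probability, strength, seed=0):
        # Each pixel independently receives noise with the given appearance
        # probability; the brightness change is drawn from [-strength, +strength],
        # so both positive (brighter) and negative (darker) deviations occur.
        rng = np.random.default_rng(seed)
        mask = rng.random(image.shape) < probability
        delta = rng.uniform(-strength, strength, image.shape)
        return np.clip(image + mask * delta, 0.0, 1.0)

    noisy = add_random_noise(np.full((480, 640), 0.5), probability=0.01, strength=0.2)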
 The virtual captured image generation unit 102 can reproduce the image blur caused by the defocus of the imaging device by using the color information of the pixels surrounding the pixel of interest. Any pixel in the image may be selected as the pixel of interest. Methods of calculating the color after image blur include taking the average of the colors of the pixel of interest and its surrounding pixels, and Gaussian smoothing, in which pixels closer to the pixel of interest are blended with a higher weight. Gaussian smoothing has the advantage of reproducing image blur more accurately than the averaging method, while the averaging method has the advantage of a shorter processing time. Image blur also varies with the distance from the imaging device to an object: an object in focus shows little blur, while an object out of focus is captured as a larger, more blurred image. By further using information such as how the degree of blur changes with distance and the distance at which the image is in focus, the virtual captured image generation unit 102 can reproduce blur closer to that of an actual captured image. This information may be acquired as measurement condition information or may be calculated from other information included in the measurement condition information. Specific examples of the methods by which the virtual captured image generation unit 102 reproduces error factors are described in Embodiments 2 and 3, but the reproduction methods are not limited to these.
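 The following sketch illustrates a depth-dependent Gaussian smoothing of the kind described, assuming the blur radius grows linearly with the distance from the in-focus plane; the per-pixel loop and the parameter names are simplifications for illustration.

    import numpy as np

    def gaussian_kernel(sigma, radius):
        ax = np.arange(-radius, radius + 1)
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
        return k / k.sum()

    def depth_dependent_blur(image, depth, focus_distance, blur_per_meter):
        # The kernel width grows with the pixel's distance from the in-focus
        # plane, so objects far from the focus distance appear more blurred.
        out = np.empty_like(image)
        h, w = image.shape
        pad = 8
        padded = np.pad(image, pad, mode='edge')
        for y in range(h):
            for x in range(w):
                sigma = max(abs(depth[y, x] - focus_distance) * blur_per_meter, 1e-3)
                r = min(int(3 * sigma) + 1, pad)
                k = gaussian_kernel(sigma, r)
                win = padded[y + pad - r:y + pad + r + 1, x + pad - r:x + pad + r + 1]
                out[y, x] = (win * k).sum()
        return out

    img = np.random.rand(32, 32)
    depth = np.full((32, 32), 1.5)          # metres; everything 0.5 m off-focus
    blurred = depth_dependent_blur(img, depth, focus_distance=1.0, blur_per_meter=2.0)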
 The three-dimensional measurement calculation unit 103 executes the three-dimensional measurement processing with the virtual captured image as input. The three-dimensional measurement processing executed by the three-dimensional measurement calculation unit 103 may be the same as the three-dimensional measurement processing that a three-dimensional measurement device performs using captured images taken in a real environment. The three-dimensional measurement processing includes processing for identifying the projection pattern irradiated onto the measurement target and distance measurement processing. In the projection pattern identification processing, the brightness information of pixels at the same position is acquired from a plurality of captured images, whether each pixel is illuminated by the projection device is determined, and the resulting combination is obtained. Besides the above, another pattern identification method is to analyze the design of the local projection pattern around the pixel of interest. If the local projection pattern is designed to be unique within the whole pattern, it is possible to identify where, within the whole projection pattern emitted by the projection device, the portion irradiating the vicinity of the pixel of interest is located. In either pattern identification method, the purpose is to uniquely obtain the vector from the projection device to the pixel of interest.
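 As one hedged example of the first identification method, the sketch below classifies each pixel as lit or unlit in each of several captured images and packs the decisions into a code word; the thresholding rule and the bit order are assumptions, since the embodiment does not fix a specific coding scheme.

    import numpy as np

    def decode_binary_code(images, threshold):
        # For each pixel position, decide per captured image whether it is lit
        # by the projection device, and pack the on/off decisions into a code
        # word identifying which part of the pattern illuminated the pixel.
        code = np.zeros(images[0].shape, dtype=np.uint32)
        for img in images:
            bit = (img > threshold).astype(np.uint32)
            code = (code << 1) | bit
        return code

    imgs = [np.random.rand(480, 640) for _ in range(8)]   # 8 pattern images
    codes = decode_binary_code(imgs, threshold=0.5)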
 Once the vector from the projection device to the pixel of interest has been obtained by the projection pattern identification processing, the three-dimensional measurement calculation unit 103 performs the distance measurement processing based on the principle of triangulation, using the sensor information included in the measurement condition information, which indicates arrangement conditions such as the positions and orientations of the projection device and the imaging device. The details of the distance measurement processing differ depending on the projection pattern used. Since the projection pattern used by the simulation device 10 may be any projection pattern, the distance measurement processing is selected in accordance with the projection pattern used.
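 A minimal sketch of the triangulation step follows; the least-squares intersection of the camera ray and the projector ray shown here is one common formulation, not necessarily the computation used by the embodiment, and all names are illustrative.

    import numpy as np

    def triangulate(o_cam, v_cam, o_proj, v_proj):
        # Least-squares intersection of two rays: find s, t minimizing
        # |(o_cam + s*v_cam) - (o_proj + t*v_proj)| and return the midpoint
        # of the closest points as the measured surface position.
        a = np.stack([v_cam, -v_proj], axis=1)      # 3x2 linear system
        b = o_proj - o_cam
        (s, t), *_ = np.linalg.lstsq(a, b, rcond=None)
        p1 = o_cam + s * v_cam
        p2 = o_proj + t * v_proj
        return (p1 + p2) / 2.0

    point = triangulate(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                        np.array([0.1, 0.0, 0.0]), np.array([-0.05, 0.0, 1.0]))
    # -> approximately (0, 0, 2): both rays meet 2 m in front of the camera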
 FIG. 4 is a diagram showing an example of the display screen 20 output by the simulation device 10 shown in FIG. 1. The display screen 20 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, and an end button 27. Simulation results are displayed in the processing result display area 21. Individual measurement conditions are displayed in the measurement condition display area 22, where the displayed measurement conditions can also be changed and entered. Saved measurement conditions are listed in the measurement condition list display area 23. When one of the measurement conditions listed in the list display area 23 is selected, the selected measurement condition is displayed in the measurement condition display area 22. The display content selection area 24 displays an operation part for selecting the content to be displayed in the processing result display area 21; here, the captured image reproduced by the virtual captured image generation unit 102 and the measurement data are displayed as choices.
 The execute button 25 is an operation part for executing the simulation processing using the measurement conditions displayed in the measurement condition display area 22. The save button 26 is an operation part for saving the measurement conditions displayed in the measurement condition display area 22. The end button 27 is an operation part for ending the simulation processing of the three-dimensional measurement.
 When the execute button 25 is operated, the measurement condition acquisition unit 101 acquires the measurement conditions displayed in the measurement condition display area 22 and inputs them to the virtual captured image generation unit 102. When the virtual captured image generation unit 102 generates a virtual captured image and inputs it to the three-dimensional measurement calculation unit 103, and the three-dimensional measurement calculation unit 103 executes the three-dimensional measurement processing and inputs the measurement results to the output unit 104, the output unit 104 outputs, to the processing result display area 21, the processing results obtained in the course of the simulation processing, such as the virtual captured image and the measurement values, together with the input data to the simulation processing, such as the measurement condition information.
 The output unit 104 may perform processing such as emphasizing points of interest, adjusting contrast, and removing noise so that the output is easy for the user to view. The display screen 20 shown in FIG. 4 is an example, and the display screen is not limited to the example of FIG. 4. By using the display screen 20 shown in FIG. 4, the user can repeatedly check the simulation results while successively adjusting the measurement conditions.
 As described above, according to the first embodiment of the present invention, a virtual captured image that reproduces the captured image output by an actual imaging device is generated, and the three-dimensional measurement processing is performed based on the virtual captured image. With this configuration, the measurement results of three-dimensional measurement can be obtained without actually constructing a three-dimensional measurement device. The simulation device 10 can also reproduce the errors in the positions of light contained in the captured images acquired by an actual three-dimensional measurement device. Such errors arise from, for example, image distortion due to distortion aberration, random noise, image blur, and interreflection. When a three-dimensional measurement device is actually constructed, it is difficult to add these errors to real captured images afterward, but by reproducing them on the virtual captured image, the accuracy of the simulation results of the three-dimensional measurement can be improved. Moreover, if the projection pattern and the errors can be reproduced faithfully in the virtual captured image, ordinary processing can be used for the three-dimensional measurement processing.
Embodiment 2.
 FIG. 5 is a diagram showing the functional configuration of the virtual captured image generation unit 102 according to the second embodiment of the present invention. The virtual captured image generation unit 102 includes an optical reproduction image generation unit 301 and an image quality deterioration processing unit 302. Although not shown, a device that includes the virtual captured image generation unit 102 shown in FIG. 5 and otherwise has the same configuration as the simulation device 10 shown in FIG. 1 is referred to as the simulation device 12 according to Embodiment 2. Since the configuration of the simulation device 12 other than the virtual captured image generation unit 102 is the same as that of Embodiment 1 shown in FIG. 1, a detailed description is omitted here. In the following, components that are the same as in Embodiment 1 are described using the reference signs shown in FIG. 1, and the description focuses mainly on the differences from Embodiment 1.
 The optical reproduction image generation unit 301 performs an optical simulation based on the measurement condition information and generates an optical reproduction image that reproduces the captured image. The image quality deterioration processing unit 302 applies image quality deterioration processing corresponding to the error factors to the optical reproduction image. The virtual captured image generation unit 102 takes the image after the deterioration processing as the virtual captured image.
 FIG. 6 is a flowchart showing the operation of the simulation device 12 having the virtual captured image generation unit 102 shown in FIG. 5. The measurement condition acquisition unit 101 acquires the measurement condition information (step S201) and inputs the acquired measurement condition information to the virtual captured image generation unit 102.
 The optical reproduction image generation unit 301 of the virtual captured image generation unit 102 generates an optical reproduction image based on the measurement condition information (step S202). The optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
 The image quality deterioration processing unit 302 executes the image quality deterioration processing on the optical reproduction image (step S203). The image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as the virtual captured image.
 The three-dimensional measurement calculation unit 103 performs the three-dimensional measurement processing using the virtual captured image and obtains measurement results (step S204). The three-dimensional measurement calculation unit 103 inputs the measurement results to the output unit 104. The output unit 104 outputs the simulation results including the measurement results (step S205).
 The optical reproduction image generation unit 301 calculates, for each pixel of the image, the corresponding imaged position in the imaging space, and determines whether the calculated position is irradiated with the projection light, that is, whether the projection light of the projection device reaches that position. First, based on the sensor information and the information indicating the arrangement and characteristics of the objects existing in the imaging space, the optical reproduction image generation unit 301 calculates a vector V_cam passing from the optical center O_cam of the imaging device through each pixel. The optical reproduction image generation unit 301 detects the point P_obj on the surface of the object that the vector V_cam first intersects. Through this calculation, the object imaged at each pixel can be identified.
 Next, the optical reproduction image generation unit 301 determines whether the point P_obj on the surface of the object is illuminated by the projection device. The optical reproduction image generation unit 301 first calculates the vector V_proj from the optical center O_proj of the projection device toward the point P_obj. Using the sensor information and the pattern information indicating the projection pattern of the projection device, the optical reproduction image generation unit 301 determines whether the vector V_proj is included in the range over which the projection pattern is emitted from the projection device. If it is included, the point P_obj on the surface of the object can be determined to be illuminated by the projection device.
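 These two determinations can be sketched as follows, assuming for simplicity a scene composed of spheres and a projector with a circular cone of emission; the scene representation and the field-of-view test are illustrative assumptions, not elements of the embodiment.

    import numpy as np

    def first_hit(o_cam, v_cam, spheres):
        # Return the nearest intersection P_obj of the pixel ray with the scene;
        # v_cam is assumed unit-length, and the scene is a list of
        # (center, radius) spheres for simplicity.
        best_t = np.inf
        for c, r in spheres:
            oc = o_cam - c
            b = np.dot(oc, v_cam)
            disc = b * b - (np.dot(oc, oc) - r * r)
            if disc >= 0:
                t = -b - np.sqrt(disc)
                if 1e-6 < t < best_t:
                    best_t = t
        return o_cam + best_t * v_cam if np.isfinite(best_t) else None

    def is_lit(p_obj, o_proj, fov_deg, proj_axis):
        # P_obj is illuminated if the vector V_proj from the projector's optical
        # center toward it falls inside the projector's cone of emission.
        v_proj = p_obj - o_proj
        v_proj = v_proj / np.linalg.norm(v_proj)
        return np.dot(v_proj, proj_axis) > np.cos(np.radians(fov_deg) / 2)

    p = first_hit(np.zeros(3), np.array([0, 0, 1.0]),
                  [(np.array([0, 0, 3.0]), 1.0)])
    lit = p is not None and is_lit(p, np.array([0.2, 0, 0.0]), 60.0,
                                   np.array([0, 0, 1.0]))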
 In addition to the above calculation results, the optical reproduction image generation unit 301 can determine the color of each pixel by using the information included in the measurement condition information that indicates the reflection characteristics and the state of the ambient light. Typical reflection models used to determine color are the Lambertian reflection model for diffuse reflection and the Phong reflection model for specular reflection. However, the reflection models used by the optical reproduction image generation unit 301 are not limited to these, and any reflection model can be used.
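 A hedged sketch of such a shading computation follows, combining a Lambertian diffuse term with a Phong specular lobe; the coefficients k_d and k_s and the shininess exponent are illustrative assumptions, and an ambient term could be added analogously.

    import numpy as np

    def shade(normal, light_dir, view_dir, base_color,
              k_d=0.7, k_s=0.3, shininess=32):
        # Lambertian term for diffuse reflection plus a Phong lobe for
        # specular reflection at a single surface point.
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        v = view_dir / np.linalg.norm(view_dir)
        diffuse = max(np.dot(n, l), 0.0)
        r = 2.0 * np.dot(n, l) * n - l          # mirror direction of the light
        specular = max(np.dot(r, v), 0.0) ** shininess
        return np.clip(k_d * diffuse * base_color + k_s * specular, 0.0, 1.0)

    c = shade(np.array([0, 0, 1.0]), np.array([0, 1, 1.0]),
              np.array([0, 0, 1.0]), base_color=np.array([0.8, 0.2, 0.2]))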
 In generating the optical reproduction image, since an image is a collection of discrete pixels, the range imaged by a single pixel may contain the boundary of an object or a boundary of the projection pattern of the projection device. In that case, it is more natural for the pixel color to be a mixture of the colors of the two or more objects forming the boundary. However, the above-described vector V_cam can detect only one intersection point in the imaging space. Consequently, of the objects forming the boundary, only the color of the object intersected by the vector V_cam is determined as the pixel color. As a result, an image may be generated in which the boundaries of objects and of the projection pattern look unnatural. This is a kind of phenomenon caused by so-called quantization error and is often a problem in the field of image processing.
 To resolve the problem caused by quantization error, there is a method of generating the virtual captured image by creating the optical reproduction image at a resolution higher than that of the virtual captured image and adding, in the image quality deterioration processing, a process of reducing the image to the final image size. For example, the optical reproduction image generation unit 301 generates the optical reproduction image at four times the resolution of the virtual captured image to be finally output. In this case, the color of each pixel of the virtual captured image can be determined using the color information of the four pixels in the optical reproduction image closest to that pixel. This makes it possible to generate a virtual captured image in which the vicinity of the boundaries of objects and of the projection pattern is rendered naturally.
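 Interpreting "four times the resolution" as twice the resolution per axis (four high-resolution pixels per output pixel), a minimal sketch of the reduction step might look as follows; the 2x-per-axis factor and the simple averaging are assumptions made for illustration.

    import numpy as np

    def downsample_2x(hi_res):
        # The optical reproduction image is rendered at twice the resolution
        # per axis; each output pixel takes the mean of its four nearest
        # high-resolution pixels, smoothing boundary pixels naturally.
        h, w = hi_res.shape
        return hi_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    virtual = downsample_2x(np.random.rand(960, 1280))   # -> 480 x 640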
 Among the error factors, the optical reproduction image generation unit 301 generates the optical reproduction image using, for example, information on interreflection and on the defocus of the projection device, which are error factors caused by the projection device. Since these affect the projection state of the projection pattern, when defocus occurs, for example, the optical reproduction image generation unit 301 blurs the shadow boundaries of the light, including the pattern boundaries, in the optical reproduction image. The optical reproduction image generation unit 301 can reproduce interreflection by, for example, making the brightness of a pixel containing the point at which light reflected by a first surface strikes a second surface higher than the brightness it would have if the influence of interreflection were not considered. The optical reproduction image generation unit 301 can also reproduce the blurring of the shadow boundaries of the light by adjusting the brightness of the pixels corresponding to those boundaries. As specific methods of reproducing interreflection and of reproducing the blurring of the shadow boundaries of light caused by the defocus of the projection device, the methods described in Embodiment 3, for example, can be used.
 The image quality deterioration processing unit 302 deteriorates the image quality of the optical reproduction image using, for example, information on image distortion, random noise, and the defocus of the imaging device, which are error factors caused by the imaging device. For example, the image quality deterioration processing unit 302 can deteriorate the image quality of the optical reproduction image by performing filtering processing that flattens brightness changes in the image to blur the contours of objects, the shadow boundaries of light, and the like, or by changing the brightness of pixels at randomly selected positions.
 With the simulation device 12 according to the second embodiment of the present invention, an optical reproduction image, that is, an image that reproduces optical effects including the shadows created when light is blocked by objects, can be obtained by optical simulation. Furthermore, by applying the image quality deterioration processing to the optical reproduction image, an actual captured image can be reproduced accurately.
 In addition, by explicitly separating the processing that reproduces optical phenomena from the processing that reproduces image quality deterioration, each error factor of the three-dimensional measurement becomes easy to analyze independently. When verifying the measurement errors of three-dimensional measurement using an actual three-dimensional measurement device, it is difficult to isolate the proportion in which each error factor contributes, but by using the simulation device 12, the measurement errors caused by the projection device and those caused by the imaging device can be separated and verified. Accordingly, the design of the three-dimensional measurement device and the examination of the performance of equipment such as the projection device and the imaging device can be carried out accurately. Moreover, verification using an actual three-dimensional measurement device in a real environment requires preparing equipment and measurement targets corresponding to various measurement conditions, which takes a great deal of cost and time. With the simulation device 12, such examinations can be carried out by simulation, so the three-dimensional measurement device can be evaluated under a larger number of measurement conditions, which has the effect of facilitating product assurance.
Embodiment 3.
 FIG. 7 is a diagram showing the functional configuration of the simulation device 13 according to the third embodiment of the present invention. The simulation device 13 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, and an error information acquisition unit 105. In the following, the description focuses mainly on the differences from the simulation device 12.
 In addition to the configuration of the simulation device 12, the simulation device 13 has the error information acquisition unit 105. The error information acquisition unit 105 acquires, from outside, error information indicating the error factors to be expressed in the virtual captured image. The error information includes at least one of the intensity of the error and the order of addition for each type of error factor. The error information acquisition unit 105 inputs the acquired error information to the virtual captured image generation unit 102.
 FIG. 8 is a diagram showing the functional configuration of the virtual captured image generation unit 102 shown in FIG. 7. In the simulation device 13, the virtual captured image generation unit 102 includes an optical reproduction image generation unit 301, an image quality deterioration processing unit 302, and an error factor determination unit 303.
 Based on the error information, the error factor determination unit 303 determines the processing conditions of the error factor addition processing performed when generating the optical reproduction image and applying the image quality deterioration processing, the conditions including at least one of the intensity and the order of addition of the error factors. The error factor determination unit 303 inputs the determined processing conditions to the optical reproduction image generation unit 301.
 The optical reproduction image generation unit 301 generates the optical reproduction image based on the measurement condition information, the error information, and the processing conditions input from the error factor determination unit 303. The image quality deterioration processing unit 302 applies the image quality deterioration processing to the optical reproduction image based on the measurement condition information and the error information.
 FIG. 9 is a flowchart showing the operation of the simulation device 13 shown in FIG. 7. The measurement condition acquisition unit 101 acquires the measurement condition information (step S301) and inputs the acquired measurement condition information to the virtual captured image generation unit 102.
 The error information acquisition unit 105 acquires the error information (step S302) and inputs the acquired error information to the virtual captured image generation unit 102. The processing of step S301 and the processing of step S302 may be performed concurrently.
 The error factor determination unit 303 of the virtual captured image generation unit 102 determines the processing conditions of the error factor addition processing, including at least one of the intensity of the error factors and their order of addition (step S303). The error factor determination unit 303 inputs the determined processing conditions to the optical reproduction image generation unit 301.
 The optical reproduction image generation unit 301 generates an optical reproduction image by optical simulation based on the processing conditions and the measurement condition information (step S304). The optical reproduction image generation unit 301 inputs the generated optical reproduction image to the image quality deterioration processing unit 302.
 The image quality deterioration processing unit 302 executes the image quality deterioration processing on the optical reproduction image based on the measurement condition information and the error information (step S305). The image quality deterioration processing unit 302 inputs the image after the image quality deterioration processing to the three-dimensional measurement calculation unit 103 as the virtual captured image.
 The three-dimensional measurement calculation unit 103 executes the three-dimensional measurement processing using the virtual captured image and obtains measurement results (step S306). The three-dimensional measurement calculation unit 103 inputs the measurement results to the output unit 104. The output unit 104 outputs the simulation results including the measurement results (step S307).
 FIG. 10 is a flowchart showing the details of step S304 shown in FIG. 9. The optical reproduction image generation unit 301 first generates an optical reproduction image that contains no error factors (step S401). As described above, the optical reproduction image generation unit 301 distinguishes between projection regions, which are regions directly illuminated by the projection light, and non-projection regions, which are regions not illuminated, and draws the projection regions in brighter colors than the non-projection regions.
 Next, the optical reproduction image generation unit 301 acquires the order of addition of the error factors from the error information (step S402). The optical reproduction image generation unit 301 further acquires the types of error factors and the intensity for each type from the error information (step S403). The optical reproduction image generation unit 301 then determines, following the order of addition acquired in step S402, whether to add each error to the optical reproduction image. Based on the error information, the optical reproduction image generation unit 301 first determines whether to add interreflection to the optical reproduction image (step S404).
 When interreflection is to be added (step S404: Yes), the optical reproduction image generation unit 301 adds interreflection to the optical reproduction image (step S405). Specifically, when interreflection occurs, light reflected by a first surface of an object strikes a second surface. The optical reproduction image generation unit 301 then treats the pixels containing the points at which the reflected light strikes the second surface as pixels whose brightness increases due to interreflection. The optical reproduction image generation unit 301 determines the amount of the brightness increase based on the positions and orientations of the object containing the first surface and of the object containing the second surface, and on the surface characteristics of the first and second surfaces. The brightness of each such pixel of the optical reproduction image becomes a second brightness, obtained by adding the brightness increase due to interreflection to the first brightness the pixel has when the influence of interreflection is not considered.
 In the processing of step S405, the optical reproduction image generation unit 301 does not necessarily have to calculate the brightness increase due to interreflection for every reflection of light. For example, for a surface with a low specular reflectance, the brightness increase may be smaller than the resolution of the image, so a surface whose specular reflectance is at or below a threshold can be regarded as producing no brightness increase due to interreflection.
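 A hedged per-pixel sketch of step S405 together with this threshold rule follows; the form of the brightness-increase term and the threshold value are illustrative assumptions, since the embodiment specifies only the quantities on which the increase depends.

    SPECULAR_THRESHOLD = 0.05  # below this, the bounce is assumed invisible in the image

    def second_brightness(first_brightness, incident, specular_reflectance,
                          cos_in, path_length):
        # Second brightness = first brightness (interreflection not considered)
        # plus the increase contributed by light bounced off the first surface;
        # dim reflectors are skipped because their contribution would fall
        # below the resolution of the image.
        if specular_reflectance <= SPECULAR_THRESHOLD:
            return first_brightness
        increase = (incident * specular_reflectance * max(cos_in, 0.0)
                    / max(path_length, 1e-6) ** 2)
        return first_brightness + increase

    b = second_brightness(0.30, incident=1.0, specular_reflectance=0.4,
                          cos_in=0.9, path_length=0.5)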
 When the processing of adding interreflection in step S405 is finished, the optical reproduction image generation unit 301 determines whether the error addition processing is complete (step S406). If the error addition processing is complete (step S406: Yes), the optical reproduction image generation unit 301 ends the error addition processing and inputs the optical reproduction image to the image quality deterioration processing unit 302. If the error addition processing is not complete (step S406: No), the optical reproduction image generation unit 301 returns to the processing of step S403.
 When interreflection is not to be added (step S404: No), the optical reproduction image generation unit 301 determines whether to add the light blur of the projection device (step S407).
 When the light blur of the projection device is to be added (step S407: Yes), the optical reproduction image generation unit 301 adds the light blur of the projection device to the optical reproduction image (step S408). The blurring of the shadow boundaries of the light can be expressed by pixel brightness. Specifically, based on the measurement condition information, the optical reproduction image generation unit 301 distinguishes between a first region, which is the region onto which light would be projected in the optical reproduction image if the projection device had no defocus, and a second region, which is the region onto which light would not be projected under the same assumption. Based on the result, the optical reproduction image generation unit 301 identifies the pixels within a predetermined distance of the boundary between the first region and the second region. The maximum brightness of these pixels is the brightness of the region onto which light is projected, and the minimum brightness is the brightness of the region onto which light is not projected. Near the boundary, the brightness of a pixel may be determined based on the distance between the pixel of interest and the boundary; it may be set so that it approaches the maximum brightness the closer the pixel of interest is to the first region and approaches the minimum brightness the closer it is to the second region. By varying the brightness in this way based on the shortest distance between the pixel of interest and the first region, the out-of-focus state of the projection device can be reproduced.
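 A minimal sketch of this boundary ramp follows, assuming a binary mask of the first (lit) region and a linear brightness ramp with the shortest distance to it; the linear profile and the brute-force distance search are illustrative simplifications, and the mask is assumed to contain at least one lit pixel.

    import numpy as np

    def blur_shadow_boundary(lit_mask, max_dist):
        # Pixels of the second (unlit) region within max_dist of the first
        # (lit) region get a brightness that ramps from the maximum (fully lit)
        # down to the minimum (unlit) with the shortest distance to the lit
        # region, reproducing a defocused projector.
        h, w = lit_mask.shape
        ys, xs = np.nonzero(lit_mask)
        lit_pts = np.stack([ys, xs], axis=1)
        out = np.zeros((h, w), dtype=np.float32)
        for y in range(h):
            for x in range(w):
                if lit_mask[y, x]:
                    out[y, x] = 1.0                              # maximum brightness
                else:
                    d = np.sqrt(((lit_pts - (y, x)) ** 2).sum(axis=1)).min()
                    out[y, x] = max(1.0 - d / max_dist, 0.0)     # ramp to minimum
        return out

    blurred = blur_shadow_boundary(np.tri(64, 64, dtype=bool), max_dist=3.0)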
After finishing the process of step S408, the optical reproduction image generation unit 301 proceeds to step S406. If the optical blur of the projection device is not to be added (step S407: No), the optical reproduction image generation unit 301 determines whether to add ambient light to the optical reproduction image (step S409).
When adding ambient light (step S409: Yes), the optical reproduction image generation unit 301 adds ambient light to the optical reproduction image (step S410). Ambient light is expressed by at least one of the luminance and the color of each pixel. After finishing the process of step S410, the optical reproduction image generation unit 301 proceeds to step S406. If ambient light is not to be added (step S409: No), the optical reproduction image generation unit 301 ends the error addition process.
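As a sketch, ambient light expressed as a constant luminance or per-channel color offset could look like the following; the 8-bit value range and the offset parameter are assumptions.

```python
import numpy as np

def add_ambient(image: np.ndarray, ambient) -> np.ndarray:
    # ambient: scalar luminance offset or per-channel RGB offset (assumed model);
    # values are clipped to an assumed 8-bit range.
    return np.clip(image + np.asarray(ambient, dtype=float), 0.0, 255.0)
```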
FIG. 11 is a flowchart showing the details of step S305 shown in FIG. 9. The image quality deterioration processing unit 302 first acquires the order in which error factors are to be added, based on the error information (step S501). The image quality deterioration processing unit 302 then acquires the type of each error factor and the strength of the error, also based on the error information (step S502). Thereafter, the image quality deterioration processing unit 302 determines whether to add each error according to the addition order acquired in step S501.
First, the image quality deterioration processing unit 302 determines whether to add image distortion (step S503). When it determines that image distortion is to be added (step S503: Yes), the image quality deterioration processing unit 302 adds image distortion to the optical reproduction image (step S504). After adding the image distortion, the image quality deterioration processing unit 302 determines whether to end the image quality deterioration process (step S505). When it determines that the process is to be ended (step S505: Yes), the image quality deterioration processing unit 302 ends the process. When it determines that the process is not to be ended (step S505: No), the image quality deterioration processing unit 302 returns to the process of step S502.
When it determines that image distortion is not to be added (step S503: No), the image quality deterioration processing unit 302 next determines whether to add random noise (step S506). When it determines that random noise is to be added (step S506: Yes), the image quality deterioration processing unit 302 adds random noise to the optical reproduction image (step S507). When the process of step S507 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
When it determines that random noise is not to be added (step S506: No), the image quality deterioration processing unit 302 determines whether to add image blur (step S508). When it determines that image blur is not to be added (step S508: No), the image quality deterioration processing unit 302 ends the process. When it determines that image blur is to be added (step S508: Yes), the image quality deterioration processing unit 302 adds image blur to the optical reproduction image (step S509). When the process of step S509 ends, the image quality deterioration processing unit 302 proceeds to the process of step S505.
Note that the image quality deterioration processing unit 302 may add only one type of error factor to the optical reproduction image, or may add the same error factor to the optical reproduction image multiple times. For example, alternately adding random noise and image blur several times produces an effect resembling color unevenness in the appearance of the target object. In this way, various image quality effects can be reproduced by combining the addition order and the strength of the error factors.
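A minimal sketch of such a configurable degradation pipeline is given below; the three degradation helpers are simple stand-ins (a radial warp, Gaussian noise, Gaussian blur) for whichever distortion, noise, and blur models an implementation actually uses, and all names are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def add_distortion(img, strength):
    # Stand-in radial (barrel-style) distortion for a grayscale image.
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = ((yy - cy) / h) ** 2 + ((xx - cx) / w) ** 2
    scale = 1.0 + strength * r2
    return map_coordinates(img, [cy + (yy - cy) * scale,
                                 cx + (xx - cx) * scale], order=1)

def add_random_noise(img, strength, rng=np.random.default_rng(0)):
    return img + rng.normal(0.0, strength, img.shape)

def add_blur(img, strength):
    return gaussian_filter(img, sigma=strength)

def degrade(image, error_info):
    """Apply error factors in the configured order and strength (Fig. 11 sketch).

    error_info: list of (kind, strength) pairs; the same kind may repeat.
    """
    handlers = {"distortion": add_distortion,
                "random_noise": add_random_noise,
                "blur": add_blur}
    out = image
    for kind, strength in error_info:
        out = handlers[kind](out, strength)
    return out

# e.g. the color-unevenness-like effect described above:
# degrade(img, [("random_noise", 2.0), ("blur", 1.0)] * 3)
```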
FIG. 12 is a diagram showing an example of the display screen 30 output by the simulation device 13 shown in FIG. 7. The display screen 30 includes a processing result display area 21, a measurement condition display area 22, a measurement condition list display area 23, a display content selection area 24, an execute button 25, a save button 26, an end button 27, and an error factor setting area 28.
The display screen 30 has the error factor setting area 28 in addition to the components of the display screen 20 described in the first embodiment. Description of the parts common to the display screen 20 is omitted below, and the differences from the display screen 20 are mainly described.
The error factor setting area 28 is an area for setting, for each type of error factor, the strength of the error and the order in which the errors are added. The user can enter the error strengths and the addition order in the error factor setting area 28. The error information acquisition unit 105 can acquire the error information displayed in the error factor setting area 28 when the execute button 25 is operated.
As described above, the simulation device 13 according to the third embodiment of the present invention can, through the error information acquisition unit 105, execute a simulation that reflects error information set by the user. With this configuration, the user can evaluate three-dimensional measurement executed under desired operating conditions. Because the type of error factor, the order in which errors are added to the optical reproduction image, and their strength can all be selected, the variety of reproducible image qualities increases. It is also possible to run tests while randomly varying the operating conditions and usage environment, so combinations of conditions that the designer could not anticipate can be tested. Since the test coverage thereby becomes comprehensive, an increase in the reliability of the three-dimensional measurement device can also be expected.
Fourth Embodiment.
FIG. 13 is a diagram showing the functional configuration of the optical reproduction image generation unit 301 according to the fourth embodiment of the present invention. The optical reproduction image generation unit 301 includes a sensor viewpoint data generation unit 401, a map generation unit 402, and an image synthesis unit 403.
An apparatus (not shown) that includes the optical reproduction image generation unit 301 shown in FIG. 13 and otherwise has the same configuration as the simulation device 13 shown in FIG. 7 is referred to as the simulation device 14 according to the fourth embodiment. The simulation device 14 differs from the simulation device 13 only in the configuration of the optical reproduction image generation unit 301 of the virtual captured image generation unit 102. Description of the components common to the simulation device 13 is omitted below, and the differences are mainly described.
The sensor viewpoint data generation unit 401 generates sensor viewpoint data including a first image showing the shooting space as seen from the imaging device, and distance data from each of the imaging device and the projection device to the objects in the shooting space. The sensor viewpoint data generation unit 401 inputs the generated sensor viewpoint data to the map generation unit 402.
FIG. 14 is a diagram showing the detailed functional configuration of the sensor viewpoint data generation unit 401 shown in FIG. 13. The first image generated by the sensor viewpoint data generation unit 401 includes a bright image, in which the entire shooting space is rendered at the brightness it would have under the projection light, and a dark image, in which the shooting space is not irradiated with the projection light. The sensor viewpoint data generation unit 401 includes a bright image generation unit 501 that generates the bright image, a dark image generation unit 502 that generates the dark image, and a distance data generation unit 503 that generates the distance data.
The sensor viewpoint data generation unit 401 outputs sensor viewpoint data including the first image, which contains the bright image generated by the bright image generation unit 501 and the dark image generated by the dark image generation unit 502, and the distance data generated by the distance data generation unit 503.
Returning to the explanation of FIG. 13, the sensor viewpoint data output by the sensor viewpoint data generation unit 401 is input to the map generation unit 402. Based on the measurement condition information, the map generation unit 402 generates an irradiation map, which is a map in which a first region illuminated by light from the projection device and a second region not illuminated by light from the projection device are indicated by different values. For example, the irradiation map can represent the first region by "1" and the second region by "0". The map generation unit 402 inputs the generated irradiation map to the image synthesis unit 403.
FIG. 15 is a diagram showing the detailed functional configuration of the map generation unit 402 shown in FIG. 13. The map generation unit 402 includes an irradiation region calculation unit 601, an irradiation map generation unit 602, a light blur reproduction unit 603, a reflection region calculation unit 604, and a reflection map generation unit 605.
The irradiation region calculation unit 601 calculates, based on the measurement condition information, the region that can be illuminated by light from the projection device. The irradiation region calculation unit 601 inputs information indicating the calculated region to the irradiation map generation unit 602.
Using the information from the irradiation region calculation unit 601 and the measurement condition information, the irradiation map generation unit 602 generates the irradiation map, in which the first region illuminated by light from the projection device and the second region not illuminated by it are indicated by different values. The irradiation map generation unit 602 inputs the generated irradiation map to the light blur reproduction unit 603.
The reflection region calculation unit 604 uses the measurement condition information and the sensor viewpoint data to calculate a reflection region, which is the region in the first image illuminated by reflected light rays, that is, light from the projection device that has been reflected once by an object in the shooting space so that its intensity and direction have changed. It does so by associating the three-dimensional positions of the points illuminated by the reflected rays with coordinates on the first image. The reflection region calculation unit 604 inputs information indicating the calculated reflection region to the reflection map generation unit 605. Using this information and the measurement condition information, the reflection map generation unit 605 generates, for each projection pattern of the projection device, a reflection map in which the reflection region and the region not illuminated by reflected light are indicated by different values. For example, the reflection map can represent the reflection region by "1" and the region not illuminated by reflected light by "0". The reflection map generation unit 605 inputs the generated reflection map to the light blur reproduction unit 603.
The light blur reproduction unit 603 reproduces the effect of light blur in the irradiation map. Specifically, it adjusts the values of the irradiation map to real numbers between 0 and 1 according to the degree of blur at the light-shadow boundary. The light blur reproduction unit 603 reproduces the effect of light blur in the reflection map in the same way as in the irradiation map. The light blur reproduction unit 603 outputs the irradiation map and the reflection map after adjusting their values.
The image synthesis unit 403 generates the optical reproduction image by combining the bright image and the dark image included in the first image, based on the information in the irradiation map and the reflection map. Specifically, the image synthesis unit 403 combines the bright image and the dark image by setting the luminance of each pixel of the optical reproduction image to the luminance of the pixel at the same position in either the bright image or the dark image, according to the value of the irradiation map. When the values of the irradiation map and the reflection map have been adjusted by the light blur reproduction unit 603, the image synthesis unit 403 can weight the luminance of each pixel based on the adjusted values.
The number of virtual captured images equals the number of projection patterns. Therefore, as the number of projection patterns increases, the processing time for creating the virtual captured images grows. It is therefore effective to implement the processing that is common to creating the virtual captured images for the different projection patterns so that it is performed only once. Examples of such common processing include creating the bright image and the dark image of the shooting space viewed from the imaging device, and calculating the distance data from the imaging device and the projection device to the objects in the shooting space.
The calculation of the region illuminated by the projection light differs for each projection pattern. However, the calculation of the regions that fall into shadow because an object blocks the projection light is the same regardless of the projection pattern, and can therefore be extracted as common processing.
If the projection device is out of focus, the pattern projected onto the scene is blurred, which lowers the pattern recognition accuracy and can consequently lower the accuracy of the three-dimensional measurement. This blur of the pattern light arises not when the imaging device captures the image, but when the pattern is projected from the projection device. Therefore, the phenomenon cannot be reproduced simply by adding blur to the virtual captured image.
For this reason, an irradiation map indicating the region illuminated by the projection light in the virtual captured image is created. The irradiation map can be defined as a two-dimensional array with the same number of elements as the virtual captured image, each element holding a real value between 0 and 1. A region illuminated by the projection light is expressed as 1, a region the projection light does not reach as 0, and a region illuminated at lower than normal intensity due to the blur of the projection light as a value greater than 0 and less than 1. Expressed as a formula, this becomes equation (2) below.
I_{i,j} = P_{i,j} B_{i,j} + (1 - P_{i,j}) S_{i,j}    (2)
Here, I_{i,j} is the luminance at image coordinates (i,j) of the virtual captured image, P_{i,j} is the value of the irradiation map at coordinates (i,j), B_{i,j} is the luminance at image coordinates (i,j) of the bright image, and S_{i,j} is the luminance at image coordinates (i,j) of the dark image.
When creating the irradiation map, the region illuminated by the projection pattern is first calculated under the assumption that the projection device is in focus, that is, that the projection light has no blur. At this point each element of the irradiation map holds a value of 0 or 1. The blur of the projection light is then reproduced by applying a smoothing filter, such as Gaussian smoothing, to this irradiation map. Gaussian smoothing makes the values of the irradiation map change smoothly near the boundary between the region illuminated by the pattern light and the region not illuminated by it. This smooth change of values at the boundary yields the effect of a blurred projection-light boundary, making it possible to reproduce the blur of the projection light.
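Putting the two steps together, a minimal sketch of equation (2) with a Gaussian-smoothed irradiation map follows; lit_mask is assumed to be the in-focus 0/1 illumination result, and sigma is an assumed blur strength.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def compose(bright: np.ndarray, dark: np.ndarray,
            lit_mask: np.ndarray, sigma: float = 1.5) -> np.ndarray:
    # Irradiation map P: 1 where the pattern lights the scene, 0 elsewhere,
    # smoothed so boundary values fall between 0 and 1 (projector defocus).
    P = gaussian_filter(lit_mask.astype(float), sigma=sigma)
    # Equation (2): I = P * B + (1 - P) * S, evaluated per pixel.
    return P * bright + (1.0 - P) * dark
```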
FIG. 16 is a flowchart showing the operation of the simulation device 14 according to the fourth embodiment of the present invention. The sensor viewpoint data generation unit 401 generates the bright image and the dark image in the bright image generation unit 501 and the dark image generation unit 502 (step S601). In parallel with step S601, the sensor viewpoint data generation unit 401 generates the distance data for each viewpoint in the distance data generation unit 503 (step S602). The sensor viewpoint data generation unit 401 inputs the sensor viewpoint data, including the generated bright image, dark image, and distance data, to the map generation unit 402.
The map generation unit 402 calculates the irradiation region in the irradiation region calculation unit 601 (step S603). The map generation unit 402 then generates the irradiation map in the irradiation map generation unit 602 (step S604).
The map generation unit 402 acquires the order in which error factors are to be added (step S605). The map generation unit 402 then acquires the type and strength of each error factor (step S606). The map generation unit 402 determines whether to add mutual reflection (step S607). When it determines that mutual reflection is to be added (step S607: Yes), the map generation unit 402 causes the reflection region calculation unit 604 to calculate the reflection region (step S608) and causes the reflection map generation unit 605 to generate the reflection map (step S609).
When it determines that mutual reflection is not to be added (step S607: No), the processes of steps S608 and S609 are skipped. The map generation unit 402 then determines whether to add light blur (step S610). When it determines that light blur is to be added (step S610: Yes), the light blur reproduction unit 603 of the map generation unit 402 reproduces the light blur in the irradiation map (step S611).
When it determines that light blur is not to be added (step S610: No), the process of step S611 is skipped. The map generation unit 402 then determines whether to add ambient light (step S612). When it determines that ambient light is to be added (step S612: Yes), the irradiation map generation unit 602 of the map generation unit 402 adds ambient light to the irradiation map (step S613). When it determines that ambient light is not to be added (step S612: No), the process of step S613 is skipped.
The map generation unit 402 determines whether to end the error addition process (step S614). When it determines that the error addition process is not to be ended (step S614: No), the process returns to step S606. When it determines that the error addition process is to be ended (step S614: Yes), the image synthesis unit 403 executes the image synthesis process (step S615).
As described above, according to the simulation device 14 of the fourth embodiment of the present invention, creating the first image, the irradiation map, and the reflection map, which individually reproduce optical phenomena, simplifies the processing and the data structures for generating optical reproduction images when multiple projection patterns are used. Moreover, by expressing the blur of the light-shadow boundary caused by defocus with the irradiation map and expressing the effect of mutual reflection with the reflection map, a captured image in which complex optical phenomena such as projector defocus and mutual reflection are combined can be reproduced by simple arithmetic processing.
When an actual three-dimensional measurement device is assembled and evaluated, only results in which all optical phenomena are mixed together can be obtained. By contrast, the simulation device 14 can reproduce each of several types of optical phenomena individually. It is therefore possible to know in more detail which factor caused a measurement error arising in the three-dimensional measurement. This configuration has the effect of making the design and performance evaluation of the three-dimensional measurement device easier.
Fifth Embodiment.
The simulation device 15 (not shown) according to the fifth embodiment has the same configuration as the simulation device 13 shown in FIG. 7 but differs in the display screen it outputs. Description of the parts common to the simulation device 13 is omitted below, and the differences are mainly described.
FIG. 17 is a diagram showing an example of the display screen 40 output by the simulation device 15 according to the fifth embodiment of the present invention. The simulation device 15 can output a simulation result that further includes at least one of a part of the measurement condition information and the virtual captured image.
The display screen 40 can include an adjustment item display area 41, a save button 42, an end button 43, and a measurement result display area 44. The adjustment item display area 41 is an area in which the setting values of the items to be adjusted are displayed in a list. The measurement result display area 44 is an area that displays the measurement values obtained when the measurement condition item being adjusted is set to each of the listed setting values. The save button 42 is an operation element for saving the measurement results, and the end button 43 is an operation element for ending the process.
On the display screen 40, the setting values of the measurement condition item being adjusted, the measurement values that are the processing results, and intermediate results obtained during processing, such as virtual captured images and irradiation maps, are displayed side by side. A difference result relative to the processing result for one of the setting values may also be displayed, or portions where the processing results differ greatly may be highlighted.
As described above, according to the simulation device 15 of the fifth embodiment of the present invention, outputting the measurement condition information, the virtual captured images, and the like together with the measurement values allows the user to review the course of the three-dimensional measurement for each measurement condition. Displaying the processing results and intermediate results for multiple measurement conditions side by side makes it possible to compare and examine the setting values of the measurement conditions. This makes it easier to study the arrangement and performance of the projection device and the imaging device.
Sixth Embodiment.
FIG. 18 is a diagram showing the functional configuration of the simulation device 16 according to the sixth embodiment of the present invention. The simulation device 16 includes a measurement condition acquisition unit 101, a virtual captured image generation unit 102, a three-dimensional measurement calculation unit 103, an output unit 104, an error information acquisition unit 105, an evaluation reference data generation unit 106, and a measurement evaluation unit 107.
In addition to the configuration of the simulation device 13, the simulation device 16 includes the evaluation reference data generation unit 106 and the measurement evaluation unit 107. Detailed description of the configuration common to the simulation device 13 is omitted below, and the differences are mainly described.
The evaluation reference data generation unit 106 uses the measurement condition information to generate evaluation reference data that serves as the reference for evaluating simulation results. The evaluation reference data generation unit 106 inputs the generated evaluation reference data to the measurement evaluation unit 107. The measurement evaluation unit 107 evaluates the simulation result using the evaluation reference data and obtains a simulation evaluation. The output unit 104 outputs the simulation evaluation result in addition to the simulation result.
FIG. 19 is a flowchart showing the operation of the simulation device 16 shown in FIG. 18. Steps S301 through S307 of the operation shown in FIG. 19 are the same as in FIG. 9, so their detailed description is omitted. In the present embodiment, after the three-dimensional measurement process is completed, the evaluation reference data generation unit 106 generates the evaluation reference data (step S701). The measurement evaluation unit 107 then performs a measurement evaluation process that compares the measurement data obtained by the simulation with the evaluation reference data and calculates a quantitative evaluation value (step S702). The output unit 104 outputs the simulation evaluation (step S703).
Possible evaluation reference data include the distance data from the imaging device included in the sensor viewpoint data, the result of the three-dimensional measurement process when a virtual captured image without added error factors is used as input, data obtained in the course of processing such as the irradiation map, and actual measurement data obtained by measuring the real measurement object.
Possible simulation evaluations include the ratio of missing regions, where no measurement data is obtained, to the measurable region, where measurement data can theoretically be obtained, and the measurement error relative to the evaluation reference data. The measurable region can be defined, for example, as the region without missing data in the measurement data obtained using a virtual captured image without added error factors as input. Statistics obtained by comparing the evaluation reference data with the simulation result, such as the mean, variance, standard deviation, and maximum, may also be used as evaluation indices.
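A minimal sketch of these indices is shown below, assuming the measured and reference data are depth maps in which NaN marks a missing measurement and the measurable region is a boolean mask obtained from an error-free run; all names are illustrative.

```python
import numpy as np

def evaluate(measured: np.ndarray, reference: np.ndarray,
             measurable: np.ndarray) -> dict:
    missing = measurable & np.isnan(measured)     # no data where data was possible
    valid = measurable & ~np.isnan(measured)      # comparable measurements
    err = measured[valid] - reference[valid]
    return {
        "missing_ratio": float(missing.sum()) / float(measurable.sum()),
        "mean_error": float(np.mean(err)),
        "std_error": float(np.std(err)),
        "max_abs_error": float(np.max(np.abs(err))),
    }
```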
When actual measurement data is used as the evaluation reference data, it is also useful to provide feedback so that the simulation evaluation improves, that is, so that the difference between the actual measurement data and the simulation result becomes smaller.
One possible feedback method is to generate multiple different simulation results and adopt the setting values that yield the best simulation evaluation. To obtain multiple simulation results, one can, for example, vary the surface characteristic values of the measurement object in the measurement conditions or vary the strength of the error factors.
The parameters that yield the best simulation evaluation, such as the surface characteristics of the measurement object and the strength of the error factors, can be defined as the optimal simulation settings. Providing these optimal simulation settings to the user enables more accurate verification experiments.
FIG. 20 is a diagram showing an example of the display screen 50 of the simulation device 16 shown in FIG. 18. The display screen 50 includes an evaluation reference data display area 51 in addition to the components of the display screen 40.
On the display screen 50, the evaluation reference data is displayed alongside the multiple setting values of the measurement condition item being adjusted. Intermediate results obtained from the simulation of the three-dimensional measurement process using the reference values may also be displayed. The display screen 50 can further display the simulation evaluation.
As described above, the simulation device 16 according to the sixth embodiment of the present invention can obtain evaluation reference data in addition to the simulation result. The evaluation reference data, such as actual measurement data, serve as the reference when evaluating the simulation result. By displaying the measurement values included in the simulation result side by side with the evaluation reference data, the simulation device 16 makes examination of the simulation result easier.
The simulation device 16 can also obtain a simulation evaluation in which the simulation result is evaluated using the evaluation reference data. With this configuration, measurement errors and missing data can be grasped quantitatively. This facilitates tasks such as studying the performance of the imaging device and the projection device and judging, for each measurement object, whether three-dimensional measurement is suitable. When a performance evaluation method is prescribed by a standard, obtaining the simulation evaluation with that method makes it easy to judge conformity.
Seventh Embodiment.
FIG. 21 is a diagram showing the functional configuration of the simulation device 17 according to the seventh embodiment of the present invention. In addition to the configuration of the simulation device 16 according to the sixth embodiment, the simulation device 17 includes an object recognition processing unit 108 and a recognition evaluation unit 109. The differences from the simulation device 16 are mainly described below.
The object recognition processing unit 108 receives the simulation result output by the three-dimensional measurement calculation unit 103 and the measurement condition information output by the measurement condition acquisition unit 101 as inputs, and obtains a recognition result including at least one of the position and orientation of an object present in the shooting space and a grip position, which is a position where the object can be gripped. The object recognition processing unit 108 inputs the recognition result to the recognition evaluation unit 109.
The recognition evaluation unit 109 evaluates the recognition result based on the measurement condition information. Specifically, the recognition evaluation unit 109 obtains a recognition evaluation result including at least one of the estimation accuracy of the position and orientation included in the recognition result and the estimation accuracy of the grip position. The recognition evaluation unit 109 inputs the object recognition result and the recognition evaluation result to the output unit 104. The output unit 104 outputs the recognition result and the recognition evaluation result in addition to the simulation result and the simulation evaluation.
FIG. 22 is a flowchart showing the operation of the simulation device 17 shown in FIG. 21. The operations of steps S301 through S307 and steps S701 through S703 are the same as in FIG. 19, so their description is omitted here. The differences from FIG. 19 are mainly described below.
When the three-dimensional measurement process shown in step S306 ends, the object recognition processing unit 108 executes the object recognition process and obtains the recognition result (step S801). When the recognition result is obtained, the recognition evaluation unit 109 executes a recognition evaluation process that evaluates the recognition result (step S802). When the recognition evaluation result is obtained, the output unit 104 outputs the recognition evaluation result (step S803). In step S803, the recognition result may be output in addition to the recognition evaluation result.
One purpose of three-dimensional measurement of objects is recognition, for example of the position and orientation of the measurement object. An application program that makes a robot grip an object is one example. When the position and orientation of the object to be gripped are not known in advance, the object must be sensed on the spot, and the position and orientation of the object, or a position the robot can grip, must be recognized.
Accordingly, the seventh embodiment of the present invention includes the object recognition processing unit 108, which performs object recognition processing with the simulation result of the three-dimensional measurement as input. Any algorithm may be used for object recognition in the object recognition processing unit 108. The object recognition algorithm may take as input a three-dimensional point cloud, which treats the result of the three-dimensional measurement as a set of points in three-dimensional space, or a depth image, which represents the three-dimensional measurement result as a two-dimensional image.
Furthermore, the object recognition result is used to evaluate the recognition, such as the estimation accuracy of the recognized object's position and orientation and the estimation accuracy of the grip position. When evaluating an algorithm that recognizes the position and orientation of an object, the problem in many cases is that the true position and orientation of the object are unknown. Because the true values are unknown, even if a recognition result for the position and orientation of the object is output, it is difficult to judge its quality quantitatively. In simulation, however, the position and orientation of the object are known, so the recognition result can be evaluated quantitatively.
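For instance, with poses represented as 4x4 homogeneous matrices (an assumed convention), the recognition error against the simulation ground truth could be quantified as in the following sketch.

```python
import numpy as np

def pose_error(T_est: np.ndarray, T_true: np.ndarray):
    """Return (rotation error in radians, translation error) of an estimate."""
    dT = np.linalg.inv(T_true) @ T_est  # residual transform
    # Rotation angle recovered from the trace of the residual rotation matrix.
    cos_theta = np.clip((np.trace(dT[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.arccos(cos_theta)), float(np.linalg.norm(dT[:3, 3]))
```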
As described above, the simulation device 17 according to the seventh embodiment of the present invention makes it possible to verify object recognition using the simulation result. With this configuration, the performance of object recognition can be evaluated without actually assembling a three-dimensional measurement device. Since the position and orientation of the recognition target are known in the simulation space, the object recognition result can be compared with the true values, which makes quantitative evaluation of object recognition performance easier.
In a system in which a robot grips a recognized object, it is very important to evaluate quantitatively whether the accuracy of object recognition is sufficient, because whether the robot succeeds in gripping is strongly affected by that accuracy. However, assuming, for example, gripping from a bulk pile in which multiple objects are stacked without regularity, it is difficult to know the true positions and orientations of the piled objects. Therefore, even if a test is performed using real equipment including a three-dimensional measurement device and a robot, the exact accuracy of object recognition cannot be grasped quantitatively. According to the present invention, the true values can easily be obtained by simulation, and the accuracy of object recognition can be evaluated quantitatively.
Eighth Embodiment.
FIG. 23 is a diagram showing the functional configuration of the simulation device 18 according to the eighth embodiment of the present invention. In addition to the configuration of the simulation device 17 according to the seventh embodiment, the simulation device 18 includes an object grip evaluation unit 110.
Description of the configuration common to the simulation device 17 is omitted below, and the differences from the simulation device 17 are mainly described.
The object grip evaluation unit 110 obtains a grip evaluation result, which evaluates the probability of successfully gripping the object, based on the measurement condition information output by the measurement condition acquisition unit 101, the simulation result output by the three-dimensional measurement calculation unit 103, and the recognition result output by the object recognition processing unit 108. The object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104. The output unit 104 also outputs the grip evaluation result.
FIG. 24 is a flowchart showing the operation of the simulation device 18 shown in FIG. 23. The operations of steps S301 through S307, steps S701 through S703, and steps S801 through S803 are the same as in FIG. 22, so their description is omitted here. The differences from FIG. 22 are mainly described below.
When the object recognition result is obtained, the object grip evaluation unit 110 executes the object grip evaluation (step S901). The object grip evaluation unit 110 inputs the grip evaluation result to the output unit 104. The output unit 104 outputs the grip evaluation result (step S902). The operation of step S901 can be executed in parallel with the recognition evaluation process.
Using the object recognition result, information on where the robot hand will grip the object can be obtained. With this information, it is possible to simulate, when the robot hand is moved to the grip position and the gripping operation is executed, the positions where the hand contacts the object and the magnitude and direction of the forces arising in the robot hand and the object.
If the magnitude and direction of the forces arising in the robot hand and the object are known, whether the grip will succeed can be estimated. Consider, for example, a robot hand that is a parallel gripper with two fingers. Gripping fails when the gripped object slips out of the gap between the fingers. This situation arises when the force applied to the gripped object in the direction perpendicular to the closing direction of the parallel gripper is larger than the frictional force between the robot hand and the gripped object. Therefore, given information such as the friction coefficients of the surfaces of the robot hand and the gripped object, the weight of the gripped object, and the gripping force of the robot hand, it is possible to judge whether the grip will succeed.
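A minimal sketch of this friction-based criterion for a two-finger parallel gripper follows; the two-contact friction model and all parameter names are assumptions.

```python
def grasp_succeeds(grip_force: float, friction_mu: float,
                   perpendicular_force: float) -> bool:
    """Sketch: the object slips out when the force perpendicular to the
    closing direction exceeds the friction the two finger contacts supply."""
    max_friction = 2.0 * friction_mu * grip_force  # two contact surfaces
    return perpendicular_force <= max_friction
```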
However, the method for evaluating the probability of a successful grip is not limited to estimating the forces arising between the robot hand and the gripped object; any evaluation method may be used. The eighth embodiment of the present invention includes the object grip evaluation unit 110, which provides such a simulation function for object gripping by a robot. FIG. 25 shows an example of the output screen in the eighth embodiment; it is a diagram showing an example of the display screen 60 output by the simulation device 18 shown in FIG. 23. In addition to the display content of the display screen 30, the display screen 60 has a recognition and grip evaluation result display area 61. As a method of outputting the recognition result and the grip evaluation result, different symbols may be output depending on whether recognition or gripping succeeds or fails, the cause of a failure may be output when it fails, or some quantitative value used in the evaluation may be displayed.
As described above, according to the simulation device 18 of the eighth embodiment of the present invention, the grip success rate can be evaluated in simulation, which makes advance verification before building a robot system easier.
Building a system capable of the full sequence from object recognition to gripping with a real robot requires considerable time and cost, from installing the three-dimensional measurement device to adjusting the object recognition software and the robot's motions. Furthermore, tests in which objects are gripped repeatedly in real space also require a long time and much trial and error. According to the present invention, grip tests can be performed in simulation, so tests can be completed in a shorter period than before.
Moreover, when a grip test is performed with an actual robot system to analyze the causes of grip failures, only the bare success-or-failure outcome is obtained, with all possible failure factors mixed together, from three-dimensional measurement errors to mistakes in adjusting the robot's motion. It is therefore difficult to analyze what should be corrected to improve the grip success rate. In the present invention, the failure factors can be isolated and verified by simulation, so the causes can be identified and improvements made in a short period.
 続いて、本発明の実施の形態1~8にかかるシミュレーション装置10,12,13,14,15,16,17,18のハードウェア構成について説明する。計測条件取得部101、仮想撮影画像生成部102、三次元計測演算部103、出力部104、誤差情報取得部105、評価基準データ生成部106、計測評価部107、物体認識処理部108、認識評価部109および物体把持評価部110は、処理回路により実現される。これらの処理回路は、専用のハードウェアにより実現されてもよいし、CPU(Central Processing Unit)を用いた制御回路であってもよい。 Next, the hardware configuration of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention will be described. Measurement condition acquisition unit 101, virtual captured image generation unit 102, three-dimensional measurement calculation unit 103, output unit 104, error information acquisition unit 105, evaluation reference data generation unit 106, measurement evaluation unit 107, object recognition processing unit 108, recognition evaluation The unit 109 and the object grip evaluation unit 110 are realized by a processing circuit. These processing circuits may be realized by dedicated hardware, or may be control circuits using a CPU (Central Processing Unit).
 上記の処理回路が、専用のハードウェアにより実現される場合、これらは、図26に示す処理回路90により実現される。図26は、本発明の実施の形態1~8にかかるシミュレーション装置10,12,13,14,15,16,17,18の機能を実現するための専用のハードウェアを示す図である。処理回路90は、単一回路、複合回路、プログラム化したプロセッサ、並列プログラム化したプロセッサ、ASIC(Application Specific Integrated Circuit)、FPGA(Field Programmable Gate Array)、またはこれらを組み合わせたものである。 When the above processing circuits are realized by dedicated hardware, these are realized by the processing circuit 90 shown in FIG. FIG. 26 is a diagram showing dedicated hardware for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention. The processing circuit 90 is a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
 When the above processing circuitry is realized by a control circuit using a CPU, that control circuit is, for example, the control circuit 91 configured as shown in FIG. 27. FIG. 27 is a diagram showing the configuration of the control circuit 91 for realizing the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention. As shown in FIG. 27, the control circuit 91 includes a processor 92 and a memory 93. The processor 92 is a CPU, also called a central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor). The memory 93 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (registered trademark) (Electrically EPROM), or a magnetic disk, flexible disk, optical disk, compact disc, mini disc, or DVD (Digital Versatile Disk).
 When the above processing circuitry is realized by the control circuit 91, its functions are realized by the processor 92 reading out and executing programs stored in the memory 93 that correspond to the processing of each component. The memory 93 is also used as temporary memory for the processes executed by the processor 92.
 The functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention can also be realized with the hardware configuration shown in FIG. 28. FIG. 28 is a diagram showing an example of a hardware configuration for realizing these functions. In addition to the processor 92 and the memory 93, an input device 94 and an output device 95 are used. The input device 94 is an input interface, such as a keyboard, a mouse, or a touch sensor, that receives input operations from the user. The output device 95 is, for example, a display device capable of presenting an output screen to the user. When a touch panel is used, the display portion of the touch panel is the output device 95, and the touch sensor superimposed on the display is the input device 94.
 For example, the functions of the measurement condition acquisition unit 101 and the output unit 104 of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 may be realized by the processor 92 alone, or by the processor 92 together with the input device 94, the output device 95, or interfaces to them.
 The configurations described in the above embodiments illustrate one example of the content of the present invention; they can be combined with other known techniques, and parts of them can be omitted or modified without departing from the gist of the present invention.
 For example, the functions of the simulation devices 10, 12, 13, 14, 15, 16, 17, and 18 according to the first to eighth embodiments of the present invention may be realized on a single piece of hardware, or processed in a distributed manner across multiple pieces of hardware.
 The display screens 20, 30, 40, 50, and 60 shown above are examples, and various modifications can be made to them.
 10, 12, 13, 14, 15, 16, 17, 18 simulation device, 20, 30, 40, 50, 60 display screen, 21 processing result display area, 22 measurement condition display area, 23 list display area, 24 display content selection area, 25 execution button, 26, 42 save button, 27, 43 end button, 28 error factor setting area, 41 adjustment item display area, 44 measurement result display area, 51 evaluation reference data display area, 61 recognition and grip evaluation result display area, 90 processing circuit, 91 control circuit, 92 processor, 93 memory, 101 measurement condition acquisition unit, 102 virtual captured image generation unit, 103 three-dimensional measurement calculation unit, 104 output unit, 105 error information acquisition unit, 106 evaluation reference data generation unit, 107 measurement evaluation unit, 108 object recognition processing unit, 109 recognition evaluation unit, 110 object grip evaluation unit, 201 projection condition information acquisition unit, 202 imaging condition information acquisition unit, 203 measurement object information acquisition unit, 204 non-measurement object information acquisition unit, 301 optical reproduction image generation unit, 302 image quality degradation processing unit, 303 error factor determination unit, 401 sensor viewpoint data generation unit, 402 map generation unit, 403 image composition unit, 501 bright image generation unit, 502 dark image generation unit, 503 distance data generation unit, 601 irradiation area calculation unit, 602 irradiation map generation unit, 603 light blur reproduction unit, 604 reflection area calculation unit, 605 reflection map generation unit.

Claims (29)

  1.  A simulation device comprising:
     a measurement condition acquisition unit that acquires measurement condition information indicating measurement conditions of a three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space including the measurement target irradiated with projection light from the projection device;
     a virtual captured image generation unit that generates, based on the measurement condition information, a virtual captured image reproducing a captured image output by the imaging device;
     a three-dimensional measurement calculation unit that obtains measurement values by executing three-dimensional measurement processing that measures a three-dimensional position of a surface of the measurement target using the virtual captured image; and
     an output unit that outputs a simulation result including the measurement values.
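     By way of orientation, the four claimed units form a straightforward pipeline: conditions in, virtual image out, measurement on the virtual image, results out. The Python sketch below is purely illustrative; every function name and data field in it is a hypothetical stand-in, since the claims define no concrete API.

```python
import numpy as np

def acquire_measurement_conditions():
    # Measurement condition information: projector/camera poses, resolutions,
    # and target properties (illustrative fields only).
    return {
        "projector": {"pose": np.eye(4), "resolution": (1280, 800)},
        "camera": {"pose": np.eye(4), "resolution": (1920, 1200)},
        "target": {"mesh_file": "target.stl", "diffuse_reflectance": 0.7},
    }

def generate_virtual_captured_image(conditions):
    # Stand-in for the virtual captured image generation unit; a real
    # implementation would render the scene under each projection pattern
    # (claims 6 to 22 describe how shading and errors are reproduced).
    width, height = conditions["camera"]["resolution"]
    return np.zeros((height, width), dtype=np.uint8)

def run_three_dimensional_measurement(virtual_image):
    # Stand-in for the three-dimensional measurement calculation unit; the
    # same triangulation code used on real captured images would run here.
    return np.zeros((0, 3))  # measured XYZ points on the target surface

conditions = acquire_measurement_conditions()
virtual_image = generate_virtual_captured_image(conditions)
points = run_three_dimensional_measurement(virtual_image)
print({"num_measured_points": len(points)})  # the output unit's simulation result
```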
  2.  The simulation device according to claim 1, wherein the measurement condition information includes at least one of information indicating projection conditions of the projection device, information indicating imaging conditions of the imaging device, information indicating a shape of the measurement target, information indicating a state of the measurement target, and information indicating characteristics of the measurement target.
  3.  The simulation device according to claim 2, wherein
     the information indicating the projection conditions is information from which the position and orientation, the resolution, and the focus of the projection device can be identified,
     the information indicating the imaging conditions is information from which the position and orientation, the resolution, and the focus of the imaging device can be identified,
     the information indicating the state of the measurement target includes at least one of a position and an orientation of the measurement target, and
     the information indicating the characteristics of the measurement target includes at least one of a color, a diffuse reflectance, and a specular reflectance of the measurement target.
  4.  The simulation device according to any one of claims 1 to 3, wherein the measurement condition information further includes information indicating at least one of a position, a shape, and characteristics of an object other than the measurement target that constitutes the imaging space.
  5.  The simulation device according to any one of claims 1 to 4, wherein the measurement condition information further includes information indicating a state of ambient light.
  6.  The simulation device according to any one of claims 1 to 5, wherein the virtual captured image generation unit generates, based on the measurement condition information, the virtual captured image in which an error contained in a position indicating a boundary of shading produced by the projection light in the captured image is reproduced in pixel values.
  7.  The simulation device according to claim 6, wherein the virtual captured image generation unit generates the virtual captured image reproducing the error arising from at least one of the following factors: interreflection, in which a surface is illuminated by light reflected from the surface of an object; blur of the projection light caused by the projection device being out of focus; image distortion caused by curvature aberration of the lens of the imaging device; random noise; and image blur caused by the imaging device being out of focus.
  8.  The simulation device according to claim 6 or 7, wherein the virtual captured image generation unit includes:
     an optical reproduction image generation unit that generates, based on the measurement condition information, an optical reproduction image that reproduces the captured image by optical simulation; and
     an image quality degradation processing unit that applies degradation processing to the image quality of the optical reproduction image based on the error,
     and wherein the image after the degradation processing is used as the virtual captured image.
  9.  The simulation device according to claim 8, wherein the virtual captured image generation unit further comprises:
     an error information acquisition unit that acquires error information indicating the factors generating the error to be represented in the virtual captured image,
     wherein the optical reproduction image generation unit generates the optical reproduction image based on the measurement condition information and the error information, and
     the image quality degradation processing unit applies the degradation processing to the optical reproduction image based on the measurement condition information and the error information.
  10.  The simulation device according to claim 9, wherein the error information includes an error intensity for each type of error factor, and
     the virtual captured image generation unit determines, based on the error information, the intensity of the degradation processing applied to the optical reproduction image for each type of error factor.
  11.  The simulation device according to claim 9 or 10, wherein the error information includes an application order for each type of error factor, and
     the virtual captured image generation unit determines, based on the error information, the order in which the degradation processing is applied to the optical reproduction image for each type of error factor.
  12.  The simulation device according to any one of claims 1 to 11, wherein the virtual captured image generation unit calculates a propagation path of the light based on the measurement condition information and, when the projection light is reflected at a first surface and the reflected light is incident on a second surface, increases the luminance of a pixel containing the incident point of the reflected light on the second surface above a first luminance that does not take the influence of the reflected light into account.
  13.  The simulation device according to claim 12, wherein the virtual captured image generation unit determines the amount by which the luminance of the pixel is increased based on surface characteristics of the first surface, surface characteristics of the second surface, and an orientation of the first surface relative to the second surface.
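     Read together, claims 12 and 13 describe a per-pixel luminance correction for interreflection whose magnitude is a function of the two surfaces' reflectances and their relative orientation. A minimal sketch of one such model follows; the multiplicative form is an assumption, since the claims fix only which quantities determine the increase, not the formula.

```python
import numpy as np

def interreflection_boost(first_luminance, reflectance_first, reflectance_second, cos_between):
    # first_luminance: pixel luminance ignoring reflected light (claim 12's
    # "first luminance"). cos_between: cosine of the angle describing how
    # directly the first surface faces the second (scalar, assumed model).
    # Assumed model: the boost grows with both surfaces' reflectances and
    # with the orientation term, and is clamped to the 8-bit range.
    boost = first_luminance * reflectance_first * reflectance_second * max(cos_between, 0.0)
    return np.clip(first_luminance + boost, 0, 255)

# Example: a moderately lit pixel receives extra light from a glossy neighbor.
print(interreflection_boost(120.0, 0.6, 0.8, 0.5))  # 148.8
```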
  14.  The simulation device according to any one of claims 1 to 13, wherein the virtual captured image generation unit calculates, based on the measurement condition information, a first region that is irradiated with the projection light and a second region that is not irradiated with the projection light, and determines the luminance of a pixel within a predetermined distance of the boundary between the first region and the second region based on the distance between that pixel and the boundary.
  15.  The simulation device according to claim 14, wherein the virtual captured image generation unit sets the luminance of the pixels within the predetermined distance of the boundary to at most a maximum value, which is the luminance when the projection light is irradiated, and at least a minimum value, which is the luminance when the projection light is not irradiated.
  16.  The simulation device according to claim 15, wherein the virtual captured image generation unit brings the luminance of a pixel closer to the maximum value the closer the pixel is to the first region, and closer to the minimum value the closer the pixel is to the second region.
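     Claims 14 to 16 together amount to a luminance ramp across the boundary between the irradiated and non-irradiated regions. A minimal sketch, assuming a signed pixel distance to the boundary and a linear ramp (the claims fix only the endpoints and the monotonic behavior, not the interpolation curve):

```python
import numpy as np

def boundary_luminance(signed_distance, max_luminance, min_luminance, blur_radius):
    # signed_distance: distance in pixels from the boundary, positive toward
    # the irradiated first region, negative toward the shadowed second region.
    # Outside +/- blur_radius the luminance saturates at the lit/unlit values;
    # inside the band it ramps linearly between the minimum and the maximum.
    t = np.clip((signed_distance + blur_radius) / (2.0 * blur_radius), 0.0, 1.0)
    return min_luminance + t * (max_luminance - min_luminance)

# Example: one pixel into the blur band on the shadow side.
print(boundary_luminance(-1.0, 230.0, 20.0, 2.0))  # 72.5
```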
  17.  The simulation device according to any one of claims 1 to 16, wherein the virtual captured image generation unit comprises:
     a sensor viewpoint data generation unit that generates, using the measurement condition information, sensor viewpoint data including a first image showing the imaging space as seen from the imaging device and distance data from each of the imaging device and the projection device to objects in the imaging space; and
     a map generation unit that generates, based on the measurement condition information, an irradiation map, which is a map indicating with different numerical values the regions of the first image that are irradiated with the projection light and the regions that are not,
     and generates the virtual captured image based on the irradiation map and the first image.
  18.  The simulation device according to claim 17, wherein the first image includes a bright image, in which the entire imaging space is rendered at the brightness it has when irradiated with the projection light, and a dark image, in which the imaging space is not irradiated with the projection light, and
     the virtual captured image generation unit generates the virtual captured image by compositing the bright image and the dark image pixel by pixel based on the values of the irradiation map.
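     Claim 18's composition is, in effect, a per-pixel linear blend of the bright and dark images driven by the irradiation map. A minimal sketch, assuming the map has already been normalized to weights in [0, 1] (fractional values arise once the blur adjustment of claim 20 is applied):

```python
import numpy as np

def compose_virtual_image(bright, dark, irradiation_map):
    # bright, dark: the first-image pair rendered with and without the
    # projection light; irradiation_map: per-pixel weight, 1 = irradiated,
    # 0 = not irradiated.
    w = irradiation_map.astype(np.float32)
    return (w * bright + (1.0 - w) * dark).astype(bright.dtype)

bright = np.full((4, 4), 230, dtype=np.uint8)
dark = np.full((4, 4), 20, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.float32)
mask[:, :2] = 1.0  # left half irradiated by the projection pattern
print(compose_virtual_image(bright, dark, mask))
```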
  19.  The simulation device according to claim 18, wherein the map generation unit comprises:
     an irradiation area calculation unit that calculates, using the measurement condition information and the sensor viewpoint data, the irradiation area, which is the area of the first image irradiated with the projection light, by associating the three-dimensional positions of points irradiated with the projection light with coordinates on the first image; and
     an irradiation map generation unit that generates the irradiation map for each projection pattern indicated by the projection light, using the irradiation area and the measurement condition information.
  20.  The simulation device according to claim 19, wherein the map generation unit further comprises:
     a light blur reproduction unit that reproduces the blur of the light by adjusting the values of the irradiation map according to the degree of blur at the shading boundaries of the light,
     and determines the luminance of each pixel of the virtual captured image by weighting the luminances of the bright image and the dark image based on the adjusted values of the irradiation map and summing the weighted luminances.
  21.  The simulation device according to any one of claims 18 to 20, wherein the map generation unit comprises:
     a reflection area calculation unit that calculates, using the measurement condition information and the sensor viewpoint data, a reflection area, which is the area of the first image irradiated with reflected rays, that is, rays whose intensity and direction have changed because rays emitted from the projection device were reflected by objects in the imaging space, by associating the three-dimensional positions of points irradiated with the reflected rays with coordinates on the first image; and
     a reflection map generation unit that generates, for each projection pattern indicated by the projection light, a reflection map, which is a map indicating with different numerical values the reflection area and the areas not irradiated with the reflected rays, using the reflection area and the measurement condition information,
     and generates the virtual captured image using the reflection map, the irradiation map, and the first image.
  22.  The simulation device according to claim 21, wherein the map generation unit adjusts the values of the reflection map according to the intensity of the reflected rays and the degree of blur at the shading boundaries of the projection light, and determines the luminance of each pixel of the virtual captured image by weighting the luminances of the bright image and the dark image based on the adjusted values of the reflection map and the irradiation map and summing the weighted luminances.
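     One possible reading of claim 22, sketched below: the direct-illumination weight comes from the (blur-adjusted) irradiation map, and the reflection map adds a weaker contribution where interreflected rays land. The additive combination and the 0.3 strength are assumptions for illustration; the claim requires only that both adjusted maps weight the bright/dark blend.

```python
import numpy as np

def compose_with_reflection(bright, dark, irradiation_map, reflection_map,
                            reflection_strength=0.3):
    # Assumed combination: reflected rays brighten pixels that the direct
    # projection light misses, but more weakly than direct irradiation.
    w = np.clip(irradiation_map + reflection_strength * reflection_map, 0.0, 1.0)
    return (w * bright + (1.0 - w) * dark).astype(bright.dtype)
```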
  23.  The simulation device according to any one of claims 1 to 22, wherein the output unit outputs the simulation result further including at least one of at least part of the measurement condition information and the virtual captured image.
  24.  The simulation device according to any one of claims 1 to 23, wherein the measurement condition acquisition unit acquires a plurality of pieces of the measurement condition information, and
     the output unit outputs a list of simulation results in which the measurement values are associated with the information of one or more items, among the plurality of pieces of measurement condition information, designated as adjustment target items.
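     The result list of claim 24 can be pictured as a parameter sweep: vary one adjustment target item across several measurement condition sets and pair each value with the resulting measurement. A minimal sketch, reusing the hypothetical stand-in functions from the sketch after claim 1 (the focus field is likewise illustrative):

```python
results = []
for focus in (0.8, 0.9, 1.0, 1.1, 1.2):
    conditions = acquire_measurement_conditions()
    conditions["projector"]["focus"] = focus  # the adjustment target item
    virtual_image = generate_virtual_captured_image(conditions)
    points = run_three_dimensional_measurement(virtual_image)
    results.append({"focus": focus, "num_measured_points": len(points)})
print(results)  # list associating each adjustment value with its measurement
```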
  25.  The simulation device according to any one of claims 1 to 24, further comprising:
     an evaluation reference data generation unit that generates, using the measurement condition information, evaluation reference data serving as an evaluation reference for the simulation result; and
     a measurement evaluation unit that evaluates the simulation result using the simulation result and the evaluation reference data,
     wherein the output unit outputs the evaluation result in addition to the simulation result.
  26.  The simulation device according to any one of claims 1 to 25, further comprising:
     an object recognition processing unit that acquires, as a recognition result based on the simulation result, at least one of a position and orientation of an object in the imaging space and a grip position, which is a position at which the object can be gripped; and
     a recognition evaluation unit that acquires, using the recognition result and the measurement condition information, a recognition evaluation result including at least one of an estimation accuracy of the position and orientation and an estimation accuracy of the grip position,
     wherein the output unit further outputs the recognition evaluation result.
  27.  The simulation device according to claim 26, further comprising an object grip evaluation unit that evaluates the probability of successfully gripping an object in the imaging space using the measurement condition information and the recognition result.
  28.  A simulation method in which a simulation device that simulates three-dimensional measurement processing of a three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space including the measurement target irradiated with projection light from the projection device performs:
     a step of acquiring measurement condition information indicating measurement conditions of the three-dimensional measurement device;
     a step of generating, based on the measurement condition information, a virtual captured image reproducing a captured image output by the imaging device;
     a step of executing three-dimensional measurement processing that measures a three-dimensional position of a surface of the measurement target using the virtual captured image to obtain measurement values; and
     a step of outputting a simulation result including the measurement values.
  29.  A simulation program that causes a computer to execute:
     a step of acquiring measurement condition information indicating measurement conditions of a three-dimensional measurement device including a projection device that projects light onto a measurement target and an imaging device that captures an imaging space including the measurement target irradiated with projection light from the projection device;
     a step of generating, based on the measurement condition information, a virtual captured image reproducing a captured image output by the imaging device;
     a step of executing three-dimensional measurement processing that measures a three-dimensional position of a surface of the measurement target using the virtual captured image to obtain measurement values; and
     a step of outputting a simulation result including the measurement values.
PCT/JP2019/005156 2019-02-13 2019-02-13 Simulation device, simulation method, and simulation program WO2020165976A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2019/005156 WO2020165976A1 (en) 2019-02-13 2019-02-13 Simulation device, simulation method, and simulation program
CN201980091525.1A CN113412500B (en) 2019-02-13 2019-02-13 Simulation device and simulation method
DE112019006855.5T DE112019006855T5 (en) 2019-02-13 2019-02-13 SIMULATION DEVICE AND SIMULATION METHOD
JP2020571967A JP7094400B2 (en) 2019-02-13 2019-02-13 Simulation equipment and simulation method

Publications (1)

Publication Number Publication Date
WO2020165976A1 true WO2020165976A1 (en) 2020-08-20

Family ID: 72044404

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/005156 WO2020165976A1 (en) 2019-02-13 2019-02-13 Simulation device, simulation method, and simulation program

Country Status (4)

Country Link
JP (1) JP7094400B2 (en)
CN (1) CN113412500B (en)
DE (1) DE112019006855T5 (en)
WO (1) WO2020165976A1 (en)



Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200907764A (en) * 2007-08-01 2009-02-16 Unique Instr Co Ltd Three-dimensional virtual input and simulation apparatus
JP5198078B2 (en) * 2008-01-24 2013-05-15 株式会社日立製作所 Measuring device and measuring method
CN108369089B (en) * 2015-11-25 2020-03-24 三菱电机株式会社 3D image measuring device and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015203652A (en) * 2014-04-15 2015-11-16 キヤノン株式会社 Information processing unit and information processing method
WO2016186211A1 (en) * 2015-05-21 2016-11-24 国立大学法人 鹿児島大学 Three-dimensional measurement system, three-dimensional measurement method, and three-dimensional measurement program
JP2018144158A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot simulation device, robot simulation method, robot simulation program, computer-readable recording medium and recording device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220053936A (en) * 2020-10-23 2022-05-02 엘아이지넥스원 주식회사 Method for simulating image sensor
KR102507277B1 (en) * 2020-10-23 2023-03-07 엘아이지넥스원 주식회사 Method for simulating image sensor
CN115802014A (en) * 2021-09-09 2023-03-14 卡西欧计算机株式会社 Recording medium, setting simulation method, and setting simulation apparatus
CN114202981A (en) * 2021-12-10 2022-03-18 新疆工程学院 Simulation platform for photogrammetry experiment
CN114202981B (en) * 2021-12-10 2023-06-16 新疆工程学院 Simulation platform for photogrammetry experiments

Also Published As

Publication number Publication date
CN113412500B (en) 2023-12-29
CN113412500A (en) 2021-09-17
JP7094400B2 (en) 2022-07-01
JPWO2020165976A1 (en) 2021-09-30
DE112019006855T5 (en) 2021-11-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19915567; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020571967; Country of ref document: JP; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 19915567; Country of ref document: EP; Kind code of ref document: A1)