WO2017090111A1 - Three-dimensional image measurement device and method - Google Patents

Three-dimensional image measurement device and method

Info

Publication number
WO2017090111A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image data
sensor
pattern
pattern image
Prior art date
Application number
PCT/JP2015/083036
Other languages
French (fr)
Japanese (ja)
Inventor
Yukiyasu Domae
Makito Seki
Masahiro Shikai
Haruhisa Okuda
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to CN201580084658.8A (CN108369089B)
Priority to JP2016546864A (JP6038415B1)
Priority to PCT/JP2015/083036 (WO2017090111A1)
Priority to DE112015007146.6T (DE112015007146T5)
Publication of WO2017090111A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2536: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object using several gratings with variable grating pitch, projected on the object with the same angle of incidence
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02: Details
    • G01C 3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20224: Image subtraction

Definitions

  • The present invention relates to a three-dimensional image measurement apparatus and method that measure the three-dimensional shape of an object by projecting a pattern onto the object, imaging the reflected light, and analyzing it with a three-dimensional image measurement technique such as the spatial coding method.
  • A pattern image, which is an image of the reflected light from a measurement object onto which a predetermined pattern is projected, is obtained by photographing the reflected pattern light with a sensor such as a camera.
  • The distance from the sensor to the object can be restored by analyzing the pattern image based on the principle of triangulation.
  • Such techniques are collectively called active stereo methods, and include the spatial coding method, the structured light method, the light-section method, the spot light projection method, and the slit light projection method.
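The triangulation relation behind these active stereo methods can be sketched as a one-line calculation. This is a generic illustration, not code from the patent; the function name and the rectified camera-projector geometry (focal length in pixels, baseline in meters, disparity in pixels) are assumptions.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance Z = f * B / d for a rectified camera-projector pair:
    the farther the object, the smaller the shift (disparity) of the
    projected pattern between the projector's and camera's viewpoints."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 1000-pixel focal length, a 0.1 m baseline, and a 50-pixel disparity, the recovered distance is 2.0 m.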
  • For these methods to work, a clear difference is required between the pixel values of sensor elements that receive the reflected light of the projected pattern and those that do not.
  • These pixel values depend on various factors, such as ambient light other than the projected pattern, the diffuse reflectance of the object's surface material, the light intensity of the projected pattern, and the light-receiving sensitivity of the sensor.
  • As a method for measuring a three-dimensional image of a scene in which multiple objects with different surface states are mixed, there is a method that restores distance from a multiple-exposure pattern image obtained by combining pattern images taken at different exposure times (Patent Document 1).
  • In that method, three-dimensional image measurement is performed by the spatial coding method using a multiple-exposure pattern image synthesized from a short-exposure pattern image and a long-exposure pattern image. Because the continuity between pixels deteriorates when the dynamic range of the camera's light-receiving elements is insufficient, pixels with inaccurate values are generated; that is, erroneous measurement occurs.
  • An object of the present invention is to solve the above problems and to provide a three-dimensional image measurement apparatus capable of stable measurement with few measurement omissions, while suppressing erroneous measurement, in a scene where multiple objects with different surface states exist.
  • A three-dimensional image measurement apparatus according to the present invention generates distance image data, in which the distance from a sensor to an object is represented as pixel values, from pattern image data obtained by photographing a pattern projected by a projector onto the object.
  • It is characterized by control means for complementing pixels whose pixel values are unknown in the distance image data obtained from pattern image data captured by the sensor, using pixel values of a distance image obtained from pattern image data captured by the sensor under different photographing conditions.
  • According to this three-dimensional image measurement apparatus, stable measurement with few measurement omissions can be performed, while suppressing erroneous measurement, in a scene where multiple objects with different surface states exist.
  • FIG. 1 is a block diagram showing a hardware configuration of the three-dimensional image measurement apparatus according to the first embodiment of the present invention.
  • A camera 1 that images a three-dimensional object and outputs image data, and a projector 2 that projects a predetermined pattern of light onto the three-dimensional object, are connected to interface circuits 36 and 37 of the information processing apparatus 3, respectively.
  • The information processing apparatus 3 is a control unit such as a computer or digital computer. Specifically, it comprises a CPU (central processing unit) 31 that executes the three-dimensional image measurement process, a ROM (read-only memory) and a RAM (random access memory) that store the program and data for that process, an input device 34 (including, for example, a recognition camera and a wearable acceleration sensor) that receives the operator's operations, and a display device 35, such as a display, that presents information to the operator, all connected to one another.
  • The interface circuit 36 converts the communication data and signals exchanged between the camera 1 and the CPU 31. Similarly, the interface circuit 37 converts the command signals sent from the CPU 31 to the projector 2. Specifically, the CPU 31 outputs, via the interface circuit 37, a pattern projection command signal instructing the projector 2 to project a predetermined pattern (described in detail later). The CPU 31 also generates a shooting command signal and outputs it to the camera 1 via the interface circuit 36. In response, the camera 1 outputs pattern image data, i.e., image data obtained by photographing the predetermined pattern and its high-frequency pattern, to the CPU 31 via the interface circuit 36.
  • The three-dimensional image measurement apparatus synthesizes distance images (images in which the distance from the camera 1 to the measurement object is represented by pixel values) obtained from pattern images photographed under different photographing conditions. In doing so, pixels whose values are unknown in the distance image obtained from a pattern image photographed with a relatively small amount of received light are interpolated with pixel values of the distance image obtained from a pattern image photographed with a relatively large amount of received light. The composition method is also switched between pixels for which measurement of the high-frequency components of the pattern is highly reliable and pixels for which it is not.
  • The camera 1 is a digital camera that photographs the pattern projected onto the three-dimensional object by the projector 2 and outputs the captured image data to the information processing apparatus 3.
  • The camera 1 has a communication function for receiving a shooting command from the information processing apparatus 3 and returning the captured pattern image to it.
  • The camera 1 may be a pinhole camera, a rangefinder camera, or a view camera.
  • As long as a pattern image can be captured, a light field camera equipped with a microlens array for estimating the direction of light entering the lens may also be used.
  • The projector 2 receives a pattern projection command from the information processing apparatus 3 and irradiates the three-dimensional object with a predetermined pattern.
  • The projector 2 may be a CRT projector, a liquid crystal projector, a digital light processing projector, a reflective liquid crystal element projector, a reflective display element projector using a diffraction phenomenon, or a device combining a laser with a diffraction grating, a polygon mirror, or a drive mechanism that changes the laser irradiation position.
  • FIG. 2 is a block diagram showing a schematic function of the information processing apparatus 3 of FIG.
  • The information processing apparatus 3 comprises:
    (1) a projection pattern control unit 10 that outputs to the projector 2 a pattern projection command signal for projecting a predetermined pattern;
    (2) an image acquisition control unit 11 that outputs a shooting command signal to the camera 1 at a timing synchronized with the pattern projection command signal from the projection pattern control unit 10;
    (3) a distance restoration unit 12 that restores the distance between the camera 1 and the object as distance image data in digital form from the pattern images captured by the camera 1;
    (4) a measurement reliability evaluation unit 13 that evaluates the reliability of the distance estimate at each pixel from the distance image data restored by the distance restoration unit 12 and from the high-frequency pattern image data captured by the camera 1 (images of the patterns with the finest features among the plural patterns projected by the projector 2); and
    (5) a distance image composition unit 14 that, from the plural distance image data restored by the distance restoration unit 12 and the measurement reliability map created by the measurement reliability evaluation unit 13, synthesizes distance image data in which pixels with unknown values are complemented.
  • The projection pattern control unit 10 and the image acquisition control unit 11 are controlled using a synchronization signal, which synchronizes the timing of the pattern projection command signal and the shooting command signal. As a result, pattern image data synchronized with the projection can be obtained.
  • FIG. 3A is a diagram showing a camera image 21 when a measurement target is photographed by the three-dimensional image measurement apparatus of FIG.
  • FIG. 3B is a diagram showing a distance image 22 when a measurement object is photographed by the three-dimensional image measurement apparatus of FIG.
  • In the camera image 21, the intensity of the reflected light, which depends on the reflectance of the object's surface material, is reflected in the pixel values, so a black object appears black and a white object appears white.
  • In the distance image 22 of FIG. 3B, which corresponds to the camera image 21, the distance from the camera 1 to the object is expressed as a pixel value, so an object close to the camera 1 appears bright and a distant object appears dark.
  • FIG. 4 is a diagram showing an example of a pattern image including a positive pattern image 23 and a negative pattern image 24 used in a general spatial coding method. That is, FIG. 4 shows an image of a plurality of binary patterns used in a general spatial coding method.
  • A vertical slit pattern is projected; the irradiated portions appear bright and the non-irradiated portions appear dark.
  • Multiple slit patterns with different spatial frequency bands are projected. Depending on the slit width, a fine slit pattern is called a high-frequency pattern image and a coarse slit pattern is called a low-frequency pattern image.
  • A pair of slit patterns whose binary patterns have opposite polarity, that is, a pair of image data in which the light and dark regions of the projected pattern are inverted, is used; one of the pair is referred to as the positive pattern image 23 and the other as the negative pattern image 24.
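Such positive/negative slit-pattern pairs can be generated, for instance, with the binary-reflected Gray code commonly used in spatial coding. The patent does not specify the encoding; this NumPy sketch, with illustrative names, shows one plausible construction.

```python
import numpy as np

def gray_code_patterns(width: int, n_bits: int):
    """Return (positive, negative) stacks of vertical-stripe patterns.
    positive[k] is the k-th Gray-code bit plane (0 or 255 per column);
    negative[k] is its light/dark inversion, as in FIG. 4."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)  # binary-reflected Gray code of each column index
    shifts = np.arange(n_bits - 1, -1, -1)  # most significant bit plane first
    bits = ((gray[None, :] >> shifts[:, None]) & 1).astype(np.uint8)
    positive = bits * 255      # shape (n_bits, width)
    negative = 255 - positive  # opposite-polarity pair
    return positive, negative
```

Each row of `positive` would be tiled vertically to form one projected slit image; later bit planes have finer stripes (the high-frequency patterns).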
  • The distance restoration unit 12 restores the distance from the camera 1 to the object from the pattern images obtained by the image acquisition control unit 11. Although the restoration result can be expressed as a point cloud, i.e., a set of points on the three-dimensional object, expressing it as distance image data makes it possible to use image processing with a short processing time.
  • The measurement reliability evaluation unit 13 quantifies, for each pixel of the distance image from the camera 1 to the object restored by the distance restoration unit 12, the possibility that the pixel value is unknown or inaccurate.
  • For this purpose, the high-frequency pattern image data among the pattern images captured by the image acquisition control unit 11 is used. Because the high-frequency pattern consists of thin stripes, it tends to disappear with changes in the intensity of the ambient light or of the reflected pattern light itself. In particular, when the reflected light is strong, not only the pixels that directly receive it but also the surrounding pixels saturate, leading to erroneous measurement over a wide area. Since the high-frequency pattern is easily affected by fluctuations in the reflected light, analyzing its state is well suited to evaluating the measurement reliability.
  • The measurement reliability map R_PN is defined from the high-frequency positive pattern image data I_P^High and the high-frequency negative pattern image data I_N^High (Equation (1)).
  • In principle, each pixel differs greatly in value between the high-frequency positive and negative pattern image data; when no difference occurs, owing to ambient light or the intensity of the reflected pattern light, it means the high-frequency pattern could not be observed. That is, when the value of the measurement reliability map R_PN is 0, or below a predetermined threshold (for example, a small positive value near 0, such as 0.01), the measurement reliability can be evaluated as extremely low.
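Based on the description above and the flowchart of FIG. 10, which works from the absolute difference of the high-frequency positive and negative pattern images, a sketch of the reliability map might look like the following; the normalization to [0, 1] and the default threshold are assumptions for illustration, not values fixed by the patent.

```python
import numpy as np

def reliability_map(i_p_high, i_n_high, threshold=0.01):
    """Return 1 where the high-frequency positive/negative pair differ
    (pattern observed, measurement trustworthy) and 0 where the absolute
    difference falls below the threshold (pattern washed out by
    saturation or ambient light, reliability extremely low)."""
    diff = np.abs(i_p_high.astype(np.float64) - i_n_high.astype(np.float64))
    diff /= 255.0  # normalize 8-bit pixel values to [0, 1]
    return (diff >= threshold).astype(np.uint8)
```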
  • The distance image composition unit 14 synthesizes distance images photographed under different photographing conditions, and corrects pixels whose values are unknown under a given condition or whose values differ from the actual distance between the camera 1 and the object.
  • Here, the different shooting conditions mean that pattern images are captured while changing the amount of light received by the camera elements, and a distance image is restored from each.
  • The information processing apparatus 3 can realize different shooting conditions by changing the exposure time of the camera 1 (for example, using a first exposure time and a second exposure time longer than the first), the aperture value of the camera 1, the gain of the camera 1, the light projection intensity of the projector 2, or the light projection direction of the projector 2.
  • FIG. 5A is a diagram of an image example showing that the photographing result of the measurement object varies with the exposure time, showing the image 25 in the darkest case.
  • FIG. 5B is a diagram of an image example showing that the photographing result of the measurement object varies with the exposure time, showing the image 26 in the second darkest case.
  • FIG. 5C is a diagram of an image example showing that the photographing result of the measurement object varies with the exposure time, showing the image 27 in the third darkest case.
  • FIGS. 5A to 5D are image examples obtained by photographing the same object under different exposure conditions.
  • the image in FIG. 5A has the shortest exposure time, and the exposure time becomes longer in the order of FIGS. 5B, 5C, and 5D.
  • In an image with a short exposure time, the difference between a black object and its surroundings is difficult to distinguish; in an image with a long exposure time, the difference between a white object and its surroundings is difficult to distinguish.
  • FIG. 6A is a diagram illustrating an image example of the camera image 41 of a pattern image with a small amount of light received by pixels.
  • FIG. 6B is a diagram illustrating an image example of the distance image 42 obtained from the camera image 41 of FIG. 6A.
  • FIG. 7A is a diagram illustrating an image example of the camera image 43 of a pattern image having a relatively large amount of light received by pixels.
  • FIG. 7B is a diagram showing an image example of the distance image 44 obtained from the camera image 43 of FIG. 7A.
  • a small black circle indicates a pixel whose pixel value is unknown.
  • In a pattern image with a small amount of received light, the pixel values of a white object are easily obtained accurately, but the pixel values of a black object are likely to be unknown.
  • The texture-like portions in the distance image 44 of FIG. 7B indicate incorrect pixel values.
  • In a pattern image with a large amount of received light, the pixel values of a black object can be obtained accurately, but the pixel values of a white object tend to be inaccurate.
  • Interpolation using neighboring pixels can be applied to pixels whose values are unknown, but pixels with inaccurate values cannot be corrected by interpolation, and in actual applications serious problems such as robot collisions are likely to occur. Therefore, high-quality distance image data is obtained by synthesizing the distance images according to the following strategy.
  • The distance image composition unit 14 generates and outputs synthesized distance image data I_fusion(x, y) by combining distance images as in the following equations.
  • I_Low denotes distance image data based on a pattern image captured under a shooting condition with a small amount of received light.
  • I_High denotes distance image data based on a pattern image captured under a different shooting condition with a large amount of received light.
  • For example, if the distance image is acquired twice while changing the exposure time, the distance image data based on the short exposure corresponds to I_Low, and the distance image data based on the long exposure corresponds to I_High.
  • (x, y) means a two-dimensional position of a pixel in an image.
  • In a pattern image with a low amount of received light, the pattern tends to disappear, so the pixel values of the distance image data often cannot be calculated and remain unknown.
  • On the other hand, the influence of ambient light is also suppressed, so light reception noise (hereinafter, noise) is reduced, and the pixel values that can be calculated tend to be accurate.
  • Conversely, a pattern image with a high amount of received light is strongly affected by ambient light, so noise increases and the pixel values of the distance image data tend to be inaccurate. That is, the distance image data I_Low has little noise but many pixels whose values are unknown,
  • whereas the distance image data I_High has few unknown pixels but is noisy.
  • If inaccurate values were carried over, the synthesized distance image would also be incorrect. Therefore, the synthesized distance image data I_fusion(x, y) is generated as follows, based on the measurement reliability map R_PN obtained from the high-frequency positive pattern image data I_P^High and the high-frequency negative pattern image data I_N^High captured when the distance image data I_High was calculated.
  • Where the pixel value of the distance image data I_Low is unknown, it is complemented with the pixel value of the distance image data I_High.
  • However, when the value of the measurement reliability map R_PN is 0, the pixel value is set to 0 (pixel value unknown).
  • In this way, high-quality distance image data I_fusion(x, y) with further reduced noise can be obtained.
  • In the above, cases are classified according to whether the value of the measurement reliability map R_PN is 0, but a predetermined threshold (for example, a small positive value near 0, such as 0.01) may be defined instead, and the cases classified according to whether the value is above or below that threshold.
  • In that case, the pixel value is set to 0 when the value is below the threshold.
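The per-pixel synthesis rule described above (keep I_Low where it is known; fill unknown pixels from I_High only where R_PN is nonzero; otherwise leave them unknown) can be sketched in NumPy as follows. The array names and the convention that 0 marks an unknown pixel follow the text; everything else is illustrative.

```python
import numpy as np

def fuse_distance_images(i_low, i_high, r_pn):
    """Fuse two distance images per the strategy in the text:
    - keep I_Low pixels that have a value (low-noise measurements);
    - where I_Low is unknown (0), fill from I_High only if the
      reliability map R_PN marks the pixel as trustworthy;
    - otherwise leave the pixel unknown (0)."""
    fused = i_low.copy()
    unknown = (i_low == 0)
    fused[unknown & (r_pn != 0)] = i_high[unknown & (r_pn != 0)]
    fused[unknown & (r_pn == 0)] = 0  # explicit no-op, mirrors the "set to 0" branch
    return fused
```

For a toy row where I_Low = [5, 0, 0], I_High = [7, 8, 9], and R_PN = [1, 1, 0], the fused result is [5, 8, 0]: the known I_Low pixel survives, one gap is filled from I_High, and the unreliable gap stays unknown.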
  • FIG. 8A is a measurement result of the three-dimensional image measurement apparatus in FIG. 1 and is a diagram illustrating an example of a camera image of an object.
  • FIG. 8B is a measurement result of the three-dimensional image measurement apparatus in FIG. 1 and shows an example of an original distance image based on photographing with a relatively short exposure time.
  • FIG. 8C is a diagram showing a measurement result of the three-dimensional image measurement apparatus of FIG. 1 and showing an example of a distance image calculated by Expression (2).
  • FIG. 8D is a measurement result of the three-dimensional image measurement apparatus of FIG. 1, and is a diagram showing an example of the distance image calculated by the equations (2) and (3).
  • In each figure, the object on the left is a black object,
  • and the object on the right has a white portion and a metal portion.
  • In the original distance image of FIG. 8B, many pixels with unknown pixel values are generated.
  • In FIGS. 8C and 8D, which are calculated using Expressions (2), (3), and (4), these unknown pixels are complemented with accurate values; the result is especially remarkable for the black object.
  • In FIG. 8C, however, completely inaccurate pixel values are mixed into parts of the white object and the metal portion.
  • In FIG. 8D, obtained according to Equations (3) and (4), it can be seen that most of these are removed.
  • FIG. 9 is a flowchart showing a 3D image measurement process executed by the information processing apparatus 3 of the 3D image measurement apparatus of FIG.
  • FIG. 10 is a flowchart showing measurement reliability evaluation processing (step S9) which is a subroutine of FIG.
  • FIG. 11 is a flowchart showing distance image synthesis processing (step S10) which is a subroutine of FIG.
  • First, a pattern image is photographed by irradiating the projection pattern twice with different shooting parameters P1 and P2 (steps S1 and S5; steps S2-S3 and S6-S7).
  • The parameters correspond to changes in the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the light projection intensity of the projector 2, the light projection direction of the projector 2, the intensity of the ambient light at the time of shooting, and the direction of the ambient light at the time of shooting.
  • The shooting parameter P1 in FIG. 9 is a parameter setting with which the luminance of the captured image becomes dark overall; for example, the exposure time of the camera 1 is set short.
  • Next, the projection pattern is irradiated from the projector 2, and the pattern image is captured with the camera 1 synchronized to the projection.
  • a distance image is created from the captured pattern image.
  • Then, the parameters are set so that the luminance of the captured image becomes bright overall, and pattern projection, shooting, and distance image creation are performed again. Based on the two shootings, the measurement reliability is evaluated (step S9) and the distance images are synthesized (step S10).
  • In the measurement reliability evaluation process, the absolute difference between the high-frequency negative pattern image and the high-frequency positive pattern image is calculated, and difference image data whose pixel values are those absolute values is generated (step S11).
  • Next, one unselected pixel is taken as the selected pixel to be processed, and the following steps S13 to S15 are performed (step S12). If the absolute difference at the selected pixel is 0 (YES in step S13), the measurement reliability value of the selected pixel is set to 0 (step S14) and the process proceeds to step S16.
  • If the absolute difference at the selected pixel is not 0 (NO in step S13), the measurement reliability value of the selected pixel is set to 1 (step S15), and the process proceeds to step S16.
  • The processing from steps S12 to S15 is repeated for all pixels of the processing target image (step S16); when the measurement reliability has been calculated for all pixels (YES in step S16), a measurement reliability map is generated (step S17) and the process returns to the original routine.
  • In the distance image synthesis process, an unselected pixel is set as the selected pixel to be processed (step S21), and the following steps S22 to S26 are performed.
  • If the pixel value of the distance image data (shooting parameter P1) at the selected pixel is 0, the process proceeds to step S23; if it is not 0 (NO in step S22),
  • the pixel value of the distance image data (shooting parameter P1) is substituted into the pixel value of the synthesized distance image data (step S24), and the process proceeds to step S27.
  • In step S23, if the pixel value of the measurement reliability map is 0 (YES in step S23), 0 is substituted into the pixel value of the synthesized distance image data (step S25) and the process proceeds to step S27. If it is not 0 (NO in step S23),
  • the pixel value of the distance image data (shooting parameter P2) is substituted into the pixel value of the synthesized distance image data (step S26), and the process proceeds to step S27.
  • The processing from steps S21 to S26 is repeated for all pixels of the processing target image (step S27); when the distance image data have been combined for all pixels (YES in step S27), the synthesized distance image data is generated and output (step S28), and the process returns to the original routine.
  • That is, the synthesis is based on the distance image data obtained with shooting parameter P1, which is adjusted so that the overall luminance value is dark. If its pixel value is not 0 (0 meaning that the distance value is unknown) (NO in step S22), the pixel value of the distance image data of shooting parameter P1 is set as the pixel value of the synthesized distance image data (step S24). If the pixel value of the distance image data of shooting parameter P1 is 0 (YES in step S22), the measurement reliability map created using the image data of FIG. 4 is referred to.
  • If the pixel value of the corresponding pixel in the reliability map is not 0 (NO in step S23), the pixel value of the distance image data of shooting parameter P2 is set as the pixel value of the synthesized distance image (step S26). If the pixel value of the corresponding pixel is 0 (YES in step S23), the corresponding pixel value of the synthesized distance image is set to 0 (step S25).
  • Further, the same procedure may be applied to the synthesized distance image and to distance image data captured with another shooting parameter P3 (different from the shooting parameters P1 and P2);
  • by combining them, a distance image with even fewer missing measurements may be created.
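Extending the synthesis to a third shooting parameter P3, as suggested above, amounts to chaining the same pairwise rule across increasingly bright shots. The following sketch is illustrative (function names and the list convention are assumptions); each reliability map belongs to the brighter image it gates.

```python
import numpy as np

def fuse_pair(i_low, i_high, r_pn):
    """Same rule as the pairwise synthesis: keep known I_Low pixels, fill
    unknown ones (value 0) from I_High only where the reliability map is
    nonzero; unreliable gaps stay unknown."""
    fused = i_low.copy()
    fill = (i_low == 0) & (r_pn != 0)
    fused[fill] = i_high[fill]
    return fused

def fuse_many(distance_images, reliability_maps):
    """Chain the pairwise fusion across shooting parameters P1, P2, P3, ...
    (darkest first); reliability_maps[k] gates distance_images[k + 1]."""
    fused = distance_images[0]
    for img, r in zip(distance_images[1:], reliability_maps):
        fused = fuse_pair(fused, img, r)
    return fused
```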
  • FIG. 12 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the second embodiment of the present invention. As shown in FIG. 12, the three-dimensional image measurement apparatus according to the second embodiment differs from that of the first embodiment in FIG. 1 in the following points.
  • the information processing apparatus 3 further includes an interface circuit 38 connected to the bus 30.
  • A movable device 4, connected to the interface circuit 38, is further provided, which moves the object to be photographed, for example, in three dimensions.
  • The CPU 31 of the information processing apparatus 3 transmits a predetermined movement command signal to the movable device 4 via the interface circuit 38, and in response the movable device 4 returns a response signal (for example, an ACK signal) to the CPU 31.
  • Because the movable device 4 moves the object to be photographed, for example, in three dimensions based on the movement command signal, not only can the exposure time, aperture value, and gain of the camera 1 and the light projection intensity and direction of the projector 2 be changed, but also the intensity and direction of the ambient light at the time of shooting, so the shooting conditions can be changed more flexibly.
  • Here, ambient light is light emitted from sources other than the projector 2, and includes sunlight, electric light, thermal radiation including flames, and luminescence.
  • The intensity of the ambient light can be changed by changing the position of the light-shielding means.
  • The light-shielding means is, for example, a shade or a cover, and a movable device including an industrial robot may be used.
  • the movable device 4 is, for example, an actuator.
  • The movable device 4 itself may function as a shielding device that shields the camera and the projector from ambient light.
  • the movable device 4 may be composed of a plurality of actuators. In that case, ambient light from a plurality of light sources can be blocked.
  • the movable device 4 may be a multi-axis movable device such as a robot arm composed of a plurality of actuators.
  • a hand such as a gripper may be attached to the tip of the arm; in this case, the robot arm may change its posture so that the hand blocks the light source of the ambient light.
  • the ambient light source may be blocked by having the robot arm hold a shielding object, such as a shielding plate or translucent ground glass, and change the posture of the shielding object.
  • the direction of ambient light may be changed by holding a mirror.
  • the ambient light illumination may be turned on and off by operating a switch that controls the light source of the ambient light with a robot arm.
  • the movable device 4 may have any configuration as long as it can change the intensity and direction of the ambient light. Even a snake-shaped robot, or a pneumatic actuator that repeatedly expands and contracts, can act on the ambient light in a similar way, and can therefore obviously serve as the movable device 4.
  • the same three-dimensional image measurement process as in FIG. 9 is executed.
  • the brightness value of the image can be adjusted to become darker or brighter as a whole.
  • since the projector 2 projects visible light, it is strongly influenced by ambient light. Therefore, in a state where the ambient light is blocked by the movable device 4, or in a state where the ambient light is held constant, adjusting the shooting parameters by changing the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, and the projection direction of the projector 2 makes it possible to create a high-quality distance image with less noise.
  • FIG. 13 is a block diagram showing a hardware configuration of the three-dimensional image measurement apparatus according to the third embodiment of the present invention.
  • the three-dimensional image measurement apparatus according to the third embodiment is different from the three-dimensional image measurement apparatus according to the first embodiment in FIG. 1 in the following points.
  • the information processing apparatus 3 further includes an interface circuit 39.
  • a communication device 5 connected to the interface circuit 39 is further provided.
  • a communication device 6 connected to the communication device 5 via the communication line 9 is further provided.
  • the difference will be described in detail.
  • the interface circuit 39 of the information processing device 3 is connected to the lighting device 7 through the communication device 5, the communication line 9, and the communication device 6 to form a command signal communication path.
  • the CPU 31 of the information processing device 3 controls the operation of the lighting device 7 by transmitting a lighting command signal to the lighting device 7 via the command signal communication path; in response, the lighting device 7 transmits a corresponding response signal (for example, an ACK signal) to the CPU 31 via the command signal communication path.
  • the communication device 5 and the communication device 6 are respectively a wireless communication device or a wired communication device.
  • the wireless communication device is, for example, a wireless router device.
  • the wired communication device is a communication device having a communication port such as an Ethernet (registered trademark) port, a USB port, and a serial port.
  • the illumination device 7 can change one or both of the intensity and direction of its irradiation of the object based on the illumination command signal from the CPU 31 of the information processing device 3; it includes at least one of an intensity control unit that makes the intensity adjustable, for example by pulse-width-modulation control of the on/off time, and a movable unit that changes the irradiation direction.
  • FIG. 14 is a flowchart showing a three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus in FIG.
  • the three-dimensional image measurement process of FIG. 14 differs from the three-dimensional image measurement process of FIG. 9 in the following points.
  • the process includes a step S31 of "weakening the irradiation intensity of the illumination device 7 to at most a predetermined threshold value Th1, or setting the posture so as to achieve this".
  • in relation to step S5, the process includes a step S32 of "strengthening the irradiation intensity of the illumination device 7 to at least a predetermined threshold Th2 (> Th1), or setting the posture so as to achieve this".
  • the intensity and / or posture of the illumination apparatus 7 is changed by controlling the intensity control unit or the movable part of the illumination apparatus 7. At this time, the luminance value of the entire image of the target object becomes dark and the illumination intensity is weakened so as to be equal to or lower than a predetermined threshold value Th1, or the illumination posture is changed so as not to irradiate directly in the shooting direction of the camera. To set.
  • the intensity and / or posture of the illumination device 7 is changed by controlling the intensity control unit or the movable unit of the illumination device 7. At this time, the overall luminance value of the image of the object becomes brighter, and the intensity is increased by increasing the irradiation intensity to be equal to or higher than a predetermined threshold value Th2 (> Th1) or directly in the shooting direction of the camera 1. Change the posture so that
  • the ambient light can be controlled using the illumination device 7, and a high-quality distance image can be created.
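For illustration only, steps S31 and S32 above can be read as threshold-driven commands to the illumination device 7. The sketch below is a hypothetical rendering, not the patented control logic; the threshold values, the step labels as function arguments, and the numeric intensity scale are all assumptions:

```python
def commanded_intensity(step, current_intensity, th1=60, th2=180):
    """Hypothetical rendering of steps S31/S32: weaken irradiation to at
    most Th1 for the capture whose image should be dark overall (S31),
    strengthen it to at least Th2 (> Th1) for the bright capture (S32)."""
    if step == "S31":
        return min(current_intensity, th1)   # irradiation <= Th1
    if step == "S32":
        return max(current_intensity, th2)   # irradiation >= Th2 > Th1
    raise ValueError(f"unknown step: {step}")

print(commanded_intensity("S31", 120))  # → 60
print(commanded_intensity("S32", 120))  # → 180
```

The posture-based alternative in the text (pointing the illumination toward or away from the camera's shooting direction) would replace the intensity command with a pose command, but the same two-threshold structure applies.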
  • FIG. 15 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the fourth embodiment of the present invention.
  • the three-dimensional image measurement apparatus according to the fourth embodiment in FIG. 15 is different from the three-dimensional image measurement apparatus according to the second embodiment in FIG. 12 in the following points.
  • a movable control device 8 having a function of controlling the operations of the camera 1 and the projector 2 as well as the function of the movable device 4 is provided.
  • the difference will be described in detail.
  • the CPU 31 of the information processing device 3 transmits a predetermined movement command signal to the movable control device 8 via the interface circuit 38, and in response, the movable control device 8 returns a response signal (for example, an ACK signal) to the CPU 31.
  • the movable control device 8 changes the positions and postures of the camera 1 and the projector 2 using the imaging command signal and the projection command signal based on the movement command signal from the CPU 31 of the information processing device 3.
  • the camera 1 returns a response signal (for example, ACK signal) to the movable control device 8, and the projector 2 returns a response signal (for example, ACK signal) to the movable control device 8.
  • the movable control device 8 may have a so-called hand-eye configuration, in which the camera 1 and the projector 2 are mounted on the hand of a robot arm.
  • the movable control device 8 may be composed of a plurality of actuators so that the position and posture of the camera 1 and of the projector 2 can each be changed.
  • unknown measurement values and inaccurate measurements in the distance image can be suppressed by the distance image synthesis method proposed in the first embodiment.
  • there are also cases where the measurement value is unknown or inaccurate because of the positional relationship among the camera 1, the projector 2, and the object, such as when the light of the projector 2 cannot be observed from the camera 1 due to occlusion.
  • moving the camera 1 and the projector 2 with the movable control device 8 can further eliminate such unknown and inaccurate measurement values.
  • for example, the line-of-sight direction is maintained so as to keep observing an area where the pixel value of the measurement reliability map is low, that is, an area of low reliability.
  • this addresses measurement values that are unknown or inaccurate due both to differing surface states and to the positional relationship among the camera 1, the projector 2, and the object.
  • a high-quality distance image that efficiently suppresses both unknown and inaccurate measurement values can be generated.
  • the camera 1 is used.
  • the present invention is not limited to this; any sensor capable of capturing an image of the object, such as a camera, a CCD sensor, or another imaging sensor, may be used.
  • CPU: central processing unit
  • ROM: read-only memory
  • RAM: random access memory

Abstract

A three-dimensional image measurement device for generating distance image data in which distances from a sensor to an object are represented by pixel values from pattern image data obtained by using a sensor to photograph a pattern projected by a light projector onto an object to be photographed, wherein a control means is provided for supplementing pixels having unknown pixel values in distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photography conditions. The control means achieves different photography conditions by changing at least one from among the exposure time of the sensor, the aperture value of the sensor, the gain of the sensor, the light projection intensity of the light projector, the light projection direction of the light projector, the ambient light position at the time of photography, and the ambient light direction at the time of photography.

Description

Three-dimensional image measurement device and method
The present invention relates to a three-dimensional image measurement apparatus and method that measure the three-dimensional shape of an object by projecting a pattern onto the object, capturing the reflected light, and analyzing it using a three-dimensional image measurement method such as the spatial coding method.
A pattern image, which is an image of the reflected light from a measurement object onto which a predetermined pattern has been projected, is obtained by photographing the reflected light of the pattern with a sensor such as a camera. When the positional relationship between the projector that projects the pattern and the sensor that captures the pattern image is known, the distance from the sensor to the object can be recovered by analyzing the pattern image based on the principle of triangulation. Three-dimensional image measurement methods of this kind are generally called active stereo methods, and include the spatial coding method, the structured light method, the light-section method, the spot light projection method, and the slit light projection method.
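The triangulation step can be illustrated with a minimal rectified camera-projector sketch; the baseline and focal-length values below are illustrative assumptions, not values from this patent:

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Rectified camera-projector triangulation: the decoded pattern code
    identifies which projector column lit the pixel, and depth is
    inversely proportional to the resulting column disparity."""
    if disparity_px <= 0:
        return None  # code not decodable -> distance unknown
    return baseline_m * focal_px / disparity_px

# With a 0.1 m baseline and a focal length of 1000 px,
# a 50 px disparity corresponds to a depth of 2 m.
print(depth_from_disparity(50.0, 0.1, 1000.0))  # → 2.0
```

In a real system the decoded code, not a stereo match, supplies the disparity, which is why the pattern must be readable at every pixel to be measured.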
To measure the distance to an object with such a three-dimensional image measurement method, there must be a clear difference between the pixel values obtained when sensor elements receive the reflected light of the pattern projected onto the object and the pixel values of the other elements that do not receive the reflected light. These pixel values depend on various factors, such as ambient light other than the projected pattern, the diffuse reflectance of the surface material of the object, the light intensity of the projected pattern, and the light-receiving sensitivity of the sensor.
When a pattern is projected onto a black object with high diffuse reflectance, only weakly reflected light returns, making it difficult to measure the pattern. In this case, the amount of light received by the sensor elements must be increased; lengthening the exposure time is one way to do so. However, when the exposure time is lengthened, the image is in turn strongly influenced by ambient light. For example, when the pattern is projected onto a white object or an object with low diffuse reflectance, reflections of ambient light and of the pattern light itself return as reflected light even from portions where the pattern is not projected, and in this case, too, measuring the pattern is difficult. This is one example showing that it is difficult to measure a black object with high diffuse reflectance and a white object or an object with low diffuse reflectance at the same time; the same problem occurs even if the amount of light received by the sensor elements is increased by a method other than the exposure time.
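This trade-off can be made concrete with a toy linear-sensor model (all constants below are hypothetical): the received value grows with reflectance, incident light, and exposure time, and clips at the sensor's full-scale value, at which point the stripe contrast needed for decoding is lost.

```python
def observed_pixel(reflectance, incident_light, exposure_ms, full_scale=255):
    """Toy sensor model: linear response clipped at the full-scale value."""
    return min(full_scale, int(reflectance * incident_light * exposure_ms))

# Short exposure: the dark object barely registers the stripe...
print(observed_pixel(0.05, 100, 1))   # lit stripe on dark object  → 5
print(observed_pixel(0.05, 5, 1))     # unlit stripe on dark object → 0
# ...long exposure: both lit and unlit regions of the bright object clip,
# erasing the stripe contrast needed to decode the pattern.
print(observed_pixel(0.9, 100, 10))   # lit stripe on bright object → 255
print(observed_pixel(0.9, 50, 10))    # unlit but ambient-lit region → 255
```

No single exposure serves both objects: the setting that lifts the dark object's stripe above the noise floor pushes the bright object into saturation.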
As a method for three-dimensional image measurement of a scene in which multiple objects with different surface states coexist, there is a method that restores distance from a multiple-exposure pattern image (multiple-shutter-speed image) obtained by combining pattern images captured with different exposure times (Patent Document 1).
Japanese Patent No. 4889373
In such a three-dimensional image measurement method, three-dimensional image measurement is performed by a spatial coding method using a multiple-exposure pattern image obtained by combining a short-exposure-time pattern image and a long-exposure-time pattern image. However, because the continuity between pixels deteriorates owing to the insufficient range of the camera's light-receiving elements, pixels with inaccurate values arise; that is, erroneous measurement occurs.
An object of the present invention is to solve the above problems and to provide a three-dimensional image measurement apparatus capable of performing stable measurement, with little measurement loss, while suppressing erroneous measurement in a scene containing multiple objects with different surface states.
A three-dimensional image measurement apparatus according to one aspect of the present invention is a three-dimensional image measurement apparatus that generates, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values, and is characterized by comprising control means for supplementing pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
Therefore, according to the three-dimensional image measurement apparatus of the present invention, stable measurement with little measurement loss can be performed, while suppressing erroneous measurement, in a scene containing multiple objects with different surface states.
FIG. 1 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing the schematic functions of the information processing apparatus 3 of FIG. 1.
FIG. 3A is a diagram showing a camera image 21 obtained when a measurement object is photographed by the three-dimensional image measurement apparatus of FIG. 1.
FIG. 3B is a diagram showing a distance image 22 obtained when a measurement object is photographed by the three-dimensional image measurement apparatus of FIG. 1.
FIG. 4 is a diagram showing an example of pattern images, including a positive pattern image 23 and a negative pattern image 24, used in a general spatial coding method.
FIGS. 5A to 5D are image examples showing that the photographed result of a measurement object differs with exposure time: image 25 (darkest), image 26 (second darkest), image 27 (third darkest), and image 28 (fourth darkest, that is, lightest).
FIG. 6A is a diagram showing an example camera image 41 of a pattern image with a small amount of received light per pixel.
FIG. 6B is a diagram showing an example distance image 42 obtained from the camera image 41 of FIG. 6A.
FIG. 7A is a diagram showing an example camera image 43 of a pattern image with a comparatively large amount of received light per pixel.
FIG. 7B is a diagram showing an example distance image obtained from the camera image 43 of FIG. 7A.
FIGS. 8A to 8D show measurement results of the three-dimensional image measurement apparatus of FIG. 1: a camera image of the object, an original distance image based on photographing with a comparatively short exposure time, a distance image calculated by Expression (2), and a distance image calculated by Expressions (2) and (3), respectively.
FIG. 9 is a flowchart showing the three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus of FIG. 1.
FIG. 10 is a flowchart showing the measurement reliability evaluation process (step S9), a subroutine of FIG. 9.
FIG. 11 is a flowchart showing the distance image synthesis process (step S10), a subroutine of FIG. 9.
FIG. 12 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to Embodiment 2 of the present invention.
FIG. 13 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to Embodiment 3 of the present invention.
FIG. 14 is a flowchart showing the three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus of FIG. 13.
FIG. 15 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to Embodiment 4 of the present invention.
Embodiment 1.
FIG. 1 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to Embodiment 1 of the present invention. In FIG. 1, a camera 1 that images a three-dimensional object and outputs image data, and a projector 2 that projects a predetermined amount of light onto the three-dimensional object, are connected to interface circuits 36 and 37 of an information processing apparatus 3, respectively. The information processing apparatus 3 is constituted by control means such as a computer or digital calculator. Specifically, it comprises a central processing unit (hereinafter, CPU) 31 that executes the three-dimensional image measurement process, a read-only memory (hereinafter, ROM) 32 that stores the program of the three-dimensional image measurement process, a random access memory (hereinafter, RAM) 33 that temporarily holds input/output data and input/output signals, an input device 34 that receives the operator's operations and includes, for example, a mouse, a keyboard, a gesture-recognition camera, or a wearable acceleration sensor, and a display device 35, such as a display, that presents information to the operator; these components 31 to 37 are connected via a bus 30.
The interface circuit 36 performs conversion of the data and signals of the communication executed between the camera 1 and the CPU 31. The interface circuit 37 performs conversion of command signals from the CPU 31 to the projector 2, and the like. Specifically, the CPU 31 outputs a pattern projection command signal, described in detail later, instructing the projector 2 via the interface circuit 37 to project a predetermined pattern. The CPU 31 also generates a photographing command signal and outputs it to the camera 1 via the interface circuit 36; in response, the camera 1 outputs pattern image data, which is image data obtained by photographing the predetermined pattern, together with its high-frequency pattern image data, to the CPU 31 via the interface circuit 36.
The three-dimensional image measurement apparatus according to this embodiment combines distance images (images in which the distance from the camera 1 to the measurement object is represented by pixel values) obtained from pattern images photographed under different photographing conditions. In doing so, pixels whose pixel values are unknown in a distance image obtained from a pattern image photographed under a condition with a comparatively small amount of received light are interpolated with the pixel values of a distance image obtained from a pattern image photographed under a condition with a comparatively large amount of received light. A further feature is that the combination method is changed between pixels for which measurement of the high-frequency components of the pattern is highly reliable and pixels for which it is not.
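As a minimal sketch of this interpolation rule (an illustration only, not the patented implementation; the sentinel value 0 for "unknown" and the array shapes are assumptions):

```python
import numpy as np

def complete_distance_image(primary, secondary, unknown=0.0):
    """Fill pixels whose distance is unknown in `primary` (captured with
    a small amount of received light) with the corresponding values of
    `secondary` (captured with a larger amount of received light).
    Both distance images are assumed pixel-aligned."""
    fused = primary.copy()
    mask = (primary == unknown) & (secondary != unknown)
    fused[mask] = secondary[mask]
    return fused

# Pixel (0, 1) is unknown in the primary image and is supplemented
# from the secondary image; all other pixels keep their primary values.
primary = np.array([[1.0, 0.0], [2.0, 3.0]])
secondary = np.array([[9.0, 4.0], [0.0, 9.0]])
fused = complete_distance_image(primary, secondary)
print(fused)
```

The reliability-dependent variant described in the text would additionally consult a per-pixel reliability map before accepting a secondary value.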
The camera 1 is a digital camera for photographing the pattern that the projector 2 projects onto the three-dimensional object and outputting the captured image data to the information processing apparatus 3. However, provided that the camera 1 has a communication function for receiving a photographing command from the information processing apparatus 3 and returning the photographed pattern image to it, the camera 1 may be a pinhole camera, a rangefinder camera, or a view camera. When a pattern image can be photographed, a light-field camera equipped with a microlens array for estimating the direction of the light rays entering the lens may also be used.
The projector 2 receives a pattern projection command from the information processing apparatus 3 and irradiates the three-dimensional object with a predetermined pattern. A CRT projector, a liquid crystal projector, a digital light processing projector, a reflective liquid crystal element projector, a reflective display element projector using the diffraction phenomenon, a combination of a laser, a diffraction grating, and a polygon mirror, or a laser with a drive device that changes the laser irradiation position may be used.
FIG. 2 is a block diagram showing the schematic functions of the information processing apparatus 3 of FIG. 1. In FIG. 2, the information processing apparatus 3 comprises:
(1) a projection pattern control unit 10 that outputs to the projector 2 a pattern projection command signal for projecting a predetermined pattern;
(2) an image acquisition control unit 11 that outputs a photographing command signal to the camera 1 at a timing synchronized with the pattern projection command signal from the projection pattern control unit 10;
(3) a distance restoration unit 12 that restores, from the pattern image photographed by the camera 1, the distance between the camera 1 and the object as digital distance image data;
(4) a measurement reliability evaluation unit 13 that evaluates the reliability of the distance estimate of each pixel from the distance image data restored by the distance restoration unit 12 and the high-frequency pattern image data photographed by the camera 1 (images of the patterns, among the plural types of patterns projected by the projector 2, that have fine features); and
(5) a distance image synthesis unit 14 that, from the plural distance image data restored by the distance restoration unit 12 and the measurement reliability map of the distance image data created by the measurement reliability evaluation unit 13, combines the plural distance images so as to reduce the number of pixels whose pixel values are unknown and of pixels whose pixel values are inaccurate with respect to the actual distance between the camera and the object.
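The data flow among units (1) to (5) can be sketched as a simple pipeline. Every callback below is a placeholder for the corresponding unit, and the toy stand-ins at the end exist only so the sketch runs without hardware; none of this is the patented implementation:

```python
def measure_3d(project_pattern, capture_image, restore_distance,
               evaluate_reliability, synthesize, patterns, conditions):
    """Orchestration of units (1)-(5): per photographing condition,
    project each pattern and capture a synchronized shot (1)(2),
    restore one distance image (3), evaluate its reliability (4),
    then combine all distance images with their reliability maps (5)."""
    distance_images, reliability_maps = [], []
    for cond in conditions:
        shots = []
        for pat in patterns:
            project_pattern(pat)               # (1) projection pattern control
            shots.append(capture_image(cond))  # (2) synchronized acquisition
        d = restore_distance(shots)            # (3) distance restoration
        distance_images.append(d)
        reliability_maps.append(evaluate_reliability(d, shots))  # (4)
    return synthesize(distance_images, reliability_maps)         # (5)

# Toy stand-ins so the sketch runs without hardware:
result = measure_3d(
    project_pattern=lambda p: None,
    capture_image=lambda c: c["exposure_ms"],
    restore_distance=lambda shots: sum(shots),
    evaluate_reliability=lambda d, shots: 1.0,
    synthesize=lambda ds, rs: max(ds),
    patterns=["low_freq", "high_freq"],
    conditions=[{"exposure_ms": 1}, {"exposure_ms": 8}],
)
print(result)  # → 16
```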
The projection pattern control unit 10 and the image acquisition control unit 11 are controlled using a synchronization signal, which aligns the timing of the pattern projection command signal and the photographing command signal. Synchronized pattern image data can thereby be obtained.
FIG. 3A is a diagram showing a camera image 21 obtained when a measurement object is photographed by the three-dimensional image measurement apparatus of FIG. 1, and FIG. 3B is a diagram showing the distance image 22 obtained at the same time. In the camera image 21 of FIG. 3A, the intensity of the light reflected according to the reflectance of the object's material is reflected in the pixel values, so black objects appear black and white objects appear white. In the distance image 22 of FIG. 3B corresponding to the camera image 21, on the other hand, the distance from the camera 1 to the object is expressed as a pixel value, so objects close to the camera 1 are displayed bright and distant objects are displayed black.
FIG. 4 is a diagram showing an example of pattern images, including a positive pattern image 23 and a negative pattern image 24, used in a general spatial coding method; that is, FIG. 4 shows images of the plural binary patterns used in a general spatial coding method. Vertical slit patterns are irradiated; irradiated portions appear bright and non-irradiated portions appear dark. As shown in FIG. 4, slit patterns of several different frequency bands are projected; according to the slit width, a thin slit pattern is called a high-frequency pattern image and a thick slit pattern a low-frequency pattern image. In addition, to suppress measurement errors in the spatial coding method, pairs of slit patterns in which the binary pattern is inverted (that is, pairs of image data obtained by inverting the light and dark of the photographed pattern) are used. Here, one image of such a pair is called the positive pattern image 23 and the other the negative pattern image 24.
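The positive/negative pair makes binarization self-thresholding: each pixel of one pattern bit is classified by comparing the two shots, and pixels where the pair differs too little are marked unknown. A hedged sketch (the margin value and the -1 sentinel are assumptions):

```python
import numpy as np

def decode_bit(positive, negative, margin=10):
    """Classify one pattern bit per pixel from its positive and inverted
    (negative) shots: 1 where the positive shot is clearly brighter,
    0 where clearly darker, -1 (unknown) where the pair is too close to
    call, e.g. due to saturation or weak reflection."""
    pos = positive.astype(np.int32)
    neg = negative.astype(np.int32)
    bit = np.full(pos.shape, -1, dtype=np.int8)
    bit[pos - neg > margin] = 1
    bit[neg - pos > margin] = 0
    return bit

p = np.array([[200, 30, 120]], dtype=np.uint8)
n = np.array([[40, 180, 125]], dtype=np.uint8)
bits = decode_bit(p, n)
print(bits)  # → [[ 1  0 -1]]
```

Repeating this per frequency band and concatenating the bits yields the spatial code of each pixel; any -1 propagates to an unknown distance value.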
 The distance restoration unit 12 restores distances, including the distance from the camera 1 to the object, from the pattern images obtained by the image acquisition control unit 11. Although the distance restoration result could be expressed as a point cloud representing the three-dimensional object as a set of points, expressing it as distance image data makes it possible to exploit image processing with short processing times.
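The trade-off between the two representations can be made concrete: a distance image can always be back-projected into a point cloud when camera intrinsics are known. The following sketch assumes a pinhole camera model with hypothetical intrinsic parameters (fx, fy, cx, cy); the patent does not specify a camera model, so these names are ours.

```python
def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a distance image (2-D list, 0 = unknown pixel) into a
    list of (X, Y, Z) points using pinhole intrinsics."""
    points = []
    for y, row in enumerate(depth):
        for x, z in enumerate(row):
            if z != 0:  # skip pixels with unknown distance
                points.append(((x - cx) * z / fx, (y - cy) * z / fy, z))
    return points

# A 1x2 distance image: left pixel unknown, right pixel at distance 2.
cloud = depth_to_point_cloud([[0, 2]], 1.0, 1.0, 0.0, 0.0)
```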
 The measurement reliability evaluation unit 13 quantifies, for each pixel of the distance image from the camera 1 to the object restored by the distance restoration unit 12, the possibility that the pixel value is unknown or that an inaccurate measurement has occurred. For this purpose, the high-frequency pattern image data among the pattern images captured by the image acquisition control unit 11 is used. Because the high-frequency pattern consists of fine slits, it tends to disappear as the intensity of the ambient light or of the reflected pattern light varies. In particular, when the reflected light is strong, not only the pixels that directly receive it but also the surrounding pixels become saturated, leading to erroneous measurements over a wide area. Precisely because the high-frequency pattern is sensitive to fluctuations in the reflected light, analyzing its state is well suited to evaluating measurement reliability.
 Here, the measurement reliability map R_PN is defined by the following equation.
    R_PN(x, y) = | I_P^High(x, y) - I_N^High(x, y) |    (1)
 Here, I_P^High is the high-frequency positive pattern image data and I_N^High is the high-frequency negative pattern image data. The values of corresponding pixels in the two images should inherently differ sharply, but depending on the influence of ambient light or the intensity of the reflected pattern light, no difference may appear. This is equivalent to the high-frequency pattern not having been observed. In other words, when the value of the measurement reliability map R_PN is 0, or lower than a predetermined threshold (for example, a positive value near 0 such as 0.01), the measurement reliability can be judged to be extremely low.
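A minimal sketch of this reliability evaluation, assuming intensities normalized to [0, 1] and the illustrative threshold of 0.01 mentioned above (2-D lists stand in for image buffers; all names are ours, not the patent's):

```python
THRESHOLD = 0.01  # "a positive value near 0" per the text

def reliability_map(i_p_high, i_n_high):
    """Eq. (1): R_PN(x, y) = |I_P^High(x, y) - I_N^High(x, y)|."""
    return [[abs(p - n) for p, n in zip(rp, rn)]
            for rp, rn in zip(i_p_high, i_n_high)]

# Left pixel: pattern observed (strong contrast between the positive and
# negative images). Right pixel: both images saturated at 1.0, so the
# high-frequency pattern has vanished and reliability is near zero.
i_p = [[0.9, 1.0]]
i_n = [[0.1, 1.0]]
r_pn = reliability_map(i_p, i_n)
reliable = [[v >= THRESHOLD for v in row] for row in r_pn]
```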
 The distance image synthesis unit 14 combines distance images captured under different shooting conditions, correcting pixels whose values are unknown under a given condition and pixels whose values differ from the actual distance between the camera 1 and the object. "Different shooting conditions" means capturing the pattern images while varying the amount of light received by the camera's sensor elements and then restoring the distance images. In the apparatus configuration of FIG. 1, the information processing apparatus 3 can realize different shooting conditions by changing the exposure time of the camera 1 (for example, using a first exposure time and a second exposure time longer than the first), the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, or the projection direction of the projector 2.
 FIGS. 5A to 5D are example images showing how the photographing result for the same measurement object varies with exposure time: FIG. 5A shows the darkest image 25, FIG. 5B the second darkest image 26, FIG. 5C the third darkest image 27, and FIG. 5D the lightest image 28. That is, FIGS. 5A to 5D depict the same object photographed under different exposure times as the different shooting conditions. The image of FIG. 5A has the shortest exposure time, with the exposure time increasing in the order of FIGS. 5B, 5C, and 5D. In an image with a relatively short exposure time it is difficult to distinguish a black object from its surroundings, while in an image with a long exposure time it is difficult to distinguish a white object from its surroundings.
 FIG. 6A shows an example camera image 41 of a pattern image in which the pixels receive a small amount of light, and FIG. 6B shows an example distance image 42 obtained from the camera image 41 of FIG. 6A. FIG. 7A shows an example camera image 43 of a pattern image in which the pixels receive a relatively large amount of light, and FIG. 7B shows an example distance image obtained from the camera image 43 of FIG. 7A.
 In the distance image 42 of FIG. 6B, the small black circles indicate pixels whose values are unknown. When the exposure time is short, the pixel values of a white object are easily obtained accurately, but the pixel values of a black object tend to be unknown. The texture-like regions in the distance image 44 of FIG. 7B indicate inaccurate pixel values. When the exposure time is long, the pixel values of a black object are obtained accurately, but those of a white object tend to be inaccurate. Pixels with unknown values can be filled in by interpolation from neighboring pixels, but pixels with inaccurate values cannot; in practical applications they readily cause serious problems, such as a robot colliding with an object. Therefore, high-quality distance image data is obtained by combining the distance images according to the following strategy.
 The distance image synthesis unit 14 combines the distance images as in the following equation to generate and output synthesized distance image data I_fusion(x, y).
    I_fusion(x, y) = { I_Low(x, y),   if I_Low(x, y) ≠ 0
                     { I_High(x, y),  if I_Low(x, y) = 0    (2)
 Here, I_Low denotes distance image data based on pattern images captured under a shooting condition in which the pixels receive a low amount of light, and I_High denotes distance image data based on pattern images captured under a different condition in which they receive a high amount of light. When the distance image is acquired twice with different exposure times, the distance image data based on the short-exposure shots corresponds to I_Low and that based on the long-exposure shots corresponds to I_High. Here, (x, y) denotes the two-dimensional position of a pixel in the image.
 In a pattern image with a low amount of received light, the pattern tends to disappear, so the pixel values of the computed distance image data are often unknown. However, since the influence of ambient light is also suppressed, received-light noise (hereinafter, noise) is reduced, and the pixel values that can be computed tend to be accurate. Conversely, a pattern image with a high amount of received light is strongly affected by ambient light, so noise increases and the pixel values of the distance image data tend to be inaccurate. In short, the distance image data I_Low has little noise but many pixels with unknown values, while I_High is noisy; however, the values of pixels that are unknown in I_Low may be obtainable from I_High. Therefore, as in equation (2), by taking I_Low as the base and filling in its unknown pixels (in equation (2), I_Low(x, y) = 0 corresponds to an unknown pixel value) with the corresponding values of I_High, high-quality distance image data I_fusion(x, y) with little noise and few unknown pixels can be obtained.
 In this case, if the value in I_High of a pixel that is unknown in I_Low is itself inaccurate, the result of the synthesized distance image will also be inaccurate. Therefore, based on the measurement reliability map R_PN obtained from the high-frequency positive pattern image data I_P^High and the high-frequency negative pattern image data I_N^High acquired when computing I_High, the distance image data I_fusion(x, y) is synthesized as follows.
    I_fusion(x, y) = { I_Low(x, y),   if I_Low(x, y) ≠ 0
                     { I_High(x, y),  if I_Low(x, y) = 0 and R_PN(x, y) ≠ 0
                     { 0,             if I_Low(x, y) = 0 and R_PN(x, y) = 0    (3)
    R_PN(x, y) = | I_P^High(x, y) - I_N^High(x, y) |    (4)
 That is, as in equation (2), I_Low is taken as the base, and pixels whose value in I_Low is unknown (in equation (2), I_Low(x, y) = 0 corresponds to an unknown pixel value) are filled in with the corresponding values of I_High; however, where the value of the measurement reliability map R_PN is 0, the pixel value is set to 0 (pixel value unknown). This yields high-quality distance image data I_fusion(x, y) with even less noise.
 Note that although equation (3) distinguishes cases according to whether the value of the measurement reliability map R_PN is 0, a predetermined threshold (for example, a value near 0 such as 0.01) may instead be defined, with the cases distinguished by whether the value of R_PN is above or below that threshold; in that case the pixel value is set to 0 when the value is at or below the threshold.
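The thresholded variant of equations (3) and (4) can be sketched as follows, again with 0 marking an unknown pixel and the illustrative threshold of 0.01 (all names are ours):

```python
def fuse_with_reliability(i_low, i_high, r_pn, threshold=0.01):
    """Eq. (3), thresholded variant: use the high-exposure value only where
    the reliability map clears the threshold; otherwise leave the pixel
    unknown (0) rather than risk an inaccurate value."""
    out = []
    for row_lo, row_hi, row_r in zip(i_low, i_high, r_pn):
        out.append([lo if lo != 0
                    else (hi if r >= threshold else 0)
                    for lo, hi, r in zip(row_lo, row_hi, row_r)])
    return out

i_low = [[5, 0, 0]]
i_high = [[6, 4, 9]]
r_pn = [[1.0, 0.5, 0.0]]  # last pixel: pattern vanished -> unreliable
fused = fuse_with_reliability(i_low, i_high, r_pn)  # [[5, 4, 0]]
```

The design choice here is deliberate: a pixel left at 0 can still be interpolated later, whereas an inaccurate value silently propagated into the result cannot be recovered.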
 FIGS. 8A to 8D show measurement results of the three-dimensional image measurement apparatus of FIG. 1: FIG. 8A shows an example camera image of the objects, FIG. 8B an example of the original distance image based on shooting with a relatively short exposure time, FIG. 8C an example of the distance image computed by equation (2), and FIG. 8D an example of the distance image computed by equations (2) and (3).
 Here, in FIGS. 8A to 8D, the object on the left is black, and the object on the right has a white portion and a metal portion. In the original distance image shown in FIG. 8B, many pixels with unknown values occur, but in the results of equations (2), (3), and (4) shown in FIGS. 8C and 8D these are filled in with accurate values; the improvement is especially pronounced for the black object. In the result of FIG. 8C obtained with equation (2), entirely inaccurate pixel values are mixed into part of the white object and into the metal portion, but in the result of FIG. 8D obtained with equations (3) and (4), most of them have been removed.
 Even when equations (3) and (4) are used (FIG. 8D), pixels with unknown values may remain. In that case, they may be interpolated from neighboring pixel values; any interpolation method may be used, such as the bicubic method or the B-spline method. Furthermore, still other shooting conditions may be used, and the distance images may be combined repeatedly using equation (2), or equations (3) and (4).
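As one simple stand-in for the bicubic or B-spline interpolation mentioned above, the following sketch fills remaining unknown pixels with the mean of their known 4-neighbors in a single pass. It is a much cruder scheme than either method named in the text, shown only to make the idea concrete; the function name is ours.

```python
def fill_unknown(depth):
    """Replace unknown pixels (0) with the mean of their known 4-neighbors.
    Pixels with no known neighbor are left unchanged (a second pass could
    handle them)."""
    h, w = len(depth), len(depth[0])
    out = [row[:] for row in depth]
    for y in range(h):
        for x in range(w):
            if depth[y][x] == 0:
                nbrs = [depth[ny][nx]
                        for ny, nx in ((y - 1, x), (y + 1, x),
                                       (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w and depth[ny][nx] != 0]
                if nbrs:
                    out[y][x] = sum(nbrs) / len(nbrs)
    return out

depth = [[4, 0], [2, 2]]
filled = fill_unknown(depth)  # unknown pixel -> mean of 4 and 2 -> 3.0
```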
 FIG. 9 is a flowchart showing the three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus of FIG. 1. FIG. 10 is a flowchart showing the measurement reliability evaluation process (step S9), a subroutine of FIG. 9, and FIG. 11 is a flowchart showing the distance image synthesis process (step S10), another subroutine of FIG. 9.
 In the three-dimensional image measurement process of FIG. 9, pattern images are captured twice by projecting the patterns under different shooting parameters P1 and P2 (steps S1 and S5; steps S2-S3 and S6-S7). Here, the parameters correspond to the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, the projection direction of the projector 2, the intensity of the ambient light at the time of shooting, and changes in the direction of the ambient light at the time of shooting. The shooting parameter P1 in FIG. 9 is a parameter setting under which the luminance values of the captured image are dark overall: the exposure time of the camera 1 is set short and the gain of the camera 1 is set small. The projector 2 then projects the patterns, the camera 1 captures the pattern images in synchronization, and a distance image is created from the captured pattern images. Next, the parameters are set so that the luminance values of the captured image become bright overall, and pattern projection, shooting, and distance image creation are performed again. Based on the two shots, the measurement reliability is evaluated (step S9) and the distance images are combined (step S10).
 In the measurement reliability evaluation process of FIG. 10, difference image data is generated whose pixel values are the absolute differences between the high-frequency negative pattern image and the high-frequency positive pattern image captured with the parameter P2, which was adjusted so that the overall luminance of the object becomes bright (step S11). One unselected pixel is taken as the selected pixel to be processed (step S12), and the following steps S13 to S15 are performed. If the absolute difference of the selected pixel is 0 (YES in step S13), the measurement reliability value of that pixel is set to 0 (step S14) and the process proceeds to step S16; otherwise (NO in step S13), the measurement reliability value of the pixel is set to 1 (step S15) and the process proceeds to step S16. The processing of steps S12 to S15 is repeated for all pixels of the image to be processed (step S16), and when the measurement reliability has been computed for all pixels (YES in step S16), the measurement reliability map is generated (step S17) and the process returns to the original routine.
 In the distance image synthesis process of FIG. 11, first, one unselected pixel is taken as the selected pixel to be processed (step S21), and the following steps S22 to S26 are performed. If the pixel value of the distance image data (shooting parameter P1) is 0 (YES in step S22), the process proceeds to step S23; if it is not 0 (NO in step S22; here, "not 0" means a pixel value at or above a predetermined threshold, for example a value near 0 such as 0.01), the pixel value of the distance image data (shooting parameter P1) is assigned to the corresponding pixel of the synthesized distance image data (step S24) and the process proceeds to step S27. If the pixel value of the measurement reliability map is 0 (YES in step S23), 0 is assigned to the pixel of the synthesized distance image data (step S25) and the process proceeds to step S27; otherwise (NO in step S23), the pixel value of the distance image data (shooting parameter P2) is assigned (step S26) and the process proceeds to step S27. The processing of steps S21 to S26 is repeated for all pixels of the image to be processed (step S27), and when the synthesis result has been obtained for all pixels (YES in step S27), the synthesized distance image data is generated and output (step S28) and the process returns to the original routine.
 As described above, in the distance image synthesis process of FIG. 11, the distance image data obtained from shooting with the parameter P1, adjusted so that the overall luminance is dark, is used as the base. If a pixel value of the P1 distance image data is not 0 (0 meaning that the distance value is unknown) (NO in step S22), that value becomes the pixel value of the synthesized distance image data (step S24). If the pixel value of the P1 distance image data is 0 (YES in step S22), the measurement reliability map created using the image data of FIG. 4 is consulted: if the value of the corresponding pixel in the map is not 0 (NO in step S23), the pixel value of the P2 distance image data becomes the pixel value of the synthesized distance image (step S26); if it is 0 (YES in step S23), the corresponding pixel value of the synthesized distance image is set to 0 (step S25).
 For pixels whose value in the measurement reliability map is 0, a distance image with even fewer measurement gaps may be created by taking the synthesized distance image as the base and combining it, by the same procedure, with distance image data captured under a further shooting parameter P3 (different from P1 and P2).
 As described above, synthesized distance image data I_fusion with few unknown pixels and little noise can be obtained, and a high-quality distance image can be obtained even in environments where measurement is difficult, such as a scene in which white and black objects coexist.
Embodiment 2.
 FIG. 12 is a block diagram showing the hardware configuration of a three-dimensional image measurement apparatus according to Embodiment 2 of the present invention. The apparatus of FIG. 12 differs from the apparatus of Embodiment 1 shown in FIG. 1 in the following points.
(1) The information processing apparatus 3 further includes an interface circuit 38 connected to the bus 30.
(2) The apparatus further includes a movable device 4 connected to the interface circuit 38, which moves the object to be photographed, for example in three dimensions.
 The differences are described in detail below.
 In FIG. 12, the CPU 31 of the information processing apparatus 3 transmits a predetermined movement command signal to the movable device 4 via the interface circuit 38, and in response the movable device 4 returns a response signal (for example, an ACK signal) to the CPU 31. By moving the object to be photographed, for example in three dimensions, based on the movement command signal, the movable device 4 makes it possible to change not only the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, and the projection direction of the projector 2, but also the intensity and direction of the ambient light at the time of shooting, so that the shooting conditions can be changed more flexibly.
 Here, ambient light is light emitted from sources other than the projector 2, and includes sunlight, electric light, thermal radiation including flames, and luminescence. The intensity of the ambient light can be changed by changing the position of a shading means. The shading means is a shade such as an awning or a cover, and a movable device including an industrial robot may also be used.
 The movable device 4 is, for example, an actuator. The movable device 4 itself may function as a shielding device that blocks ambient light from reaching the camera or the projector. The movable device 4 may also consist of a plurality of actuators, in which case ambient light from a plurality of light sources can be blocked. Further, the movable device 4 may be a multi-axis device such as a robot arm composed of a plurality of actuators, and a hand such as a gripper may be attached to its tip. In this case, the robot arm may change its posture so that the hand blocks the ambient light source. Alternatively, the robot arm may grip a shielding object, such as a nearby plate or translucent frosted glass, and change its orientation to block the ambient light source; or it may grip a mirror to change the direction of the ambient light; or it may operate a switch controlling the ambient light source to turn the ambient illumination on and off.
 As described above, the movable device 4 may have any configuration as long as it can change the intensity or direction of the ambient light. A snake-type robot, or an actuator that repeatedly expands and contracts with air, can achieve the same action on the ambient light, so it is evident that these too can constitute the movable device 4.
 In this embodiment as well, the same three-dimensional image measurement process as in FIG. 9 is executed. In adjusting the shooting parameters P1 and P2 in steps S1 and S5 of FIG. 9, controlling the ambient light makes it possible to adjust the luminance values of the image so that they become darker or brighter overall. Moreover, when the projector 2 uses visible light, it is strongly affected by ambient light. Therefore, with the ambient light blocked or held constant by the movable device 4, the shooting parameters can additionally be adjusted by changing the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, or the projection direction of the projector 2, so that a higher-quality distance image with less noise can be created.
Embodiment 3.
 FIG. 13 is a block diagram showing the hardware configuration of a three-dimensional image measurement apparatus according to Embodiment 3 of the present invention. The apparatus of FIG. 13 differs from the apparatus of Embodiment 1 shown in FIG. 1 in the following points.
(1) The information processing apparatus 3 further includes an interface circuit 39.
(2) The apparatus further includes a communication device 5 connected to the interface circuit 39.
(3) The apparatus further includes a communication device 6 connected to the communication device 5 via a communication line 9.
(4) The apparatus further includes an illumination device 7 connected to the communication device 6, which illuminates the object.
 The differences are described in detail below.
 In FIG. 13, the interface circuit 39 of the information processing apparatus 3 is connected to the illumination device 7 via the communication device 5, the communication line 9, and the communication device 6, forming a command signal communication path. The CPU 31 of the information processing apparatus 3 controls the operation of the illumination device 7 by transmitting an illumination command signal over this communication path, and in response the illumination device 7 transmits a corresponding response signal (for example, an ACK signal) back to the CPU 31. Thus, as in Embodiment 2, the ambient light on the object can be controlled in this embodiment as well.
Here, the communication devices 5 and 6 are each either a wireless communication device or a wired communication device. A wireless communication device is, for example, a wireless router. A wired communication device is, for example, a device having a communication port such as an Ethernet (registered trademark) port, a USB port, or a serial port.
Based on the illumination command signal from the CPU 31 of the information processing apparatus 3, the illumination device 7 can change at least one of the intensity and the orientation of the light irradiating the object. To this end, it includes at least one of an intensity control unit that adjusts the intensity, for example by controlling the on/off time through pulse width modulation, and a movable unit that changes the direction of the emitted light.
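As a minimal sketch of the pulse-width-modulation idea mentioned above: the average illumination intensity can be adjusted by varying the fraction of each period during which the lamp is switched on. The function name, units, and parameters below are illustrative only, not from the patent.

```python
def pwm_on_off_times(target_intensity, max_intensity, period_ms):
    """Return (on_ms, off_ms) per PWM period for a desired average intensity.

    target_intensity / max_intensity gives the duty cycle; the lamp is
    switched fully on for that fraction of each period and off for the rest.
    """
    if not 0.0 <= target_intensity <= max_intensity:
        raise ValueError("target intensity out of range")
    duty = target_intensity / max_intensity
    on_ms = duty * period_ms
    return on_ms, period_ms - on_ms

# Halving the intensity corresponds to a 50% duty cycle:
on_ms, off_ms = pwm_on_off_times(50.0, 100.0, 10.0)  # -> (5.0, 5.0)
```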
FIG. 14 is a flowchart showing the three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus of FIG. 13. The process of FIG. 14 differs from that of FIG. 9 in the following points.
(1) Step S1 is replaced by step S31, which weakens the irradiation intensity of the illumination device 7 to a predetermined threshold Th1 or below, or sets the orientation accordingly.
(2) Step S5 is replaced by step S32, which strengthens the irradiation intensity of the illumination device 7 to a predetermined threshold Th2 (> Th1) or above, or sets the orientation accordingly.
These differences are described in detail below.
In step S31 of FIG. 14, the intensity and/or orientation of the illumination device 7 is changed through its intensity control unit or movable unit, based on the illumination command signal from the CPU 31 of the information processing apparatus 3. Here the overall luminance of the image of the object is darkened: the illumination intensity is weakened to the predetermined threshold Th1 or below, or the orientation of the illumination is changed so that it does not shine directly into the shooting direction of the camera.
In step S32 of FIG. 14, the intensity and/or orientation of the illumination device 7 is likewise changed through its intensity control unit or movable unit, based on the illumination command signal from the CPU 31. Here the overall luminance of the image of the object is brightened: the irradiation intensity is strengthened to the predetermined threshold Th2 (> Th1) or above, or the orientation is changed so that the illumination shines directly into the shooting direction of the camera 1, increasing the effective intensity.
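Steps S31 and S32 can be sketched as a simple command selector: the controller requests an intensity at or below Th1 for the darkened capture and at or above Th2 (> Th1) for the brightened capture. The numeric values and phase names below are illustrative assumptions, not values from the patent.

```python
TH1 = 0.2   # upper bound for the weakened illumination (step S31), normalized 0..1
TH2 = 0.8   # lower bound for the strengthened illumination (step S32)

def illumination_command(phase):
    """Return a target intensity (0..1) for the given capture phase."""
    if phase == "dark":      # step S31: weaken to Th1 or below
        return min(TH1, 0.1)
    if phase == "bright":    # step S32: strengthen to Th2 or above
        return max(TH2, 0.9)
    raise ValueError("unknown phase: " + phase)
```

In a real system these commands would be sent over the command-signal communication path to the illumination device 7; here they merely return the target value.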
As described above, according to the present embodiment, the ambient light can be controlled using the illumination device 7, and a high-quality distance image can be created.
Embodiment 4.
FIG. 15 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the fourth embodiment of the present invention. The apparatus of the fourth embodiment differs from that of the second embodiment (FIG. 12) in the following point.
(1) In place of the movable device 4, a movable control device 8 is provided that, in addition to the functions of the movable device 4, controls the operations of the camera 1 and the projector 2.
This difference is described in detail below.
Referring to FIG. 15, the CPU 31 of the information processing apparatus 3 transmits a predetermined movement command signal to the movable control device 8 via the interface circuit 38, and in response the movable control device 8 returns a response signal (for example, an ACK signal) to the CPU 31. Based on the movement command signal, the movable control device 8 changes the position and orientation of the camera 1 and the projector 2, using the imaging command signal and the projection command signal. In response, the camera 1 and the projector 2 each return a response signal (for example, an ACK signal) to the movable control device 8. The movable control device 8 may be a robot arm in a so-called hand-eye configuration, with the camera 1 and the projector 2 mounted at its end. The movable control device 8 may also comprise a plurality of actuators so that the positions and orientations of the camera 1 and the projector 2 can each be changed independently.
When the objects to be photographed mix surfaces with different states, such as black objects and white objects, the distance-image synthesis method proposed in the first embodiment suppresses unknown or inaccurate measurement values. On the other hand, unknown or inaccurate measurements can also arise from the positional relationship between the camera 1, the projector 2, and the object, for example when the light of the projector 2 is occluded as seen from the camera 1. In such cases, moving the camera 1 and the projector 2 with the movable control device 8 can further eliminate these failures. Specifically, in the measurement reliability map generated by the measurement reliability evaluation process of FIG. 10, by keeping the line of sight fixed on a low-reliability region where the map's pixel value is 0 and changing only the irradiation direction, the locations where measurement failed because the projector's light was occluded can be re-measured efficiently. Giving priority to the largest regions where the reliability map's pixel value is 0 makes this re-measurement of failed pixels even more efficient.
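The prioritization step described above amounts to finding the largest connected region of 0-valued pixels in the reliability map. A pure-Python sketch under assumed conventions (the map is a list of rows of integers, regions are 4-connected; the function name is illustrative):

```python
from collections import deque

def largest_unreliable_region(reliability):
    """Return the coordinates of the largest 4-connected region of
    pixels whose reliability value is 0 (breadth-first flood fill)."""
    h, w = len(reliability), len(reliability[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if reliability[sy][sx] != 0 or seen[sy][sx]:
                continue
            region, queue = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and \
                       not seen[ny][nx] and reliability[ny][nx] == 0:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(region) > len(best):
                best = region
    return best

rel = [[1, 0, 1],
       [1, 0, 0],
       [0, 1, 1]]
# The three connected zeros at (0,1),(1,1),(1,2) outrank the lone zero at (2,0),
# so that region would be re-measured first.
```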
As described above, according to the present embodiment, providing the movable control device 8 yields a high-quality distance image that efficiently suppresses both the unknown or inaccurate measurements caused by differing surface states and those caused by the positional relationship between the camera 1, the projector 2, and the object.
Modification.
Although the above embodiments use the camera 1, the present invention is not limited to this; any sensor capable of capturing an image of the object, such as a camera, a CCD sensor, or an imaging sensor, may be used.
 1 camera, 2 projector, 3 information processing apparatus, 4 movable device, 5 communication device, 6 communication device, 7 illumination device, 8 movable control device, 9 communication line, 10 projection pattern control unit, 11 image acquisition control unit, 12 distance restoration unit, 13 measurement reliability evaluation unit, 14 distance image synthesis unit, 15 projection plane, 21 camera image, 22 distance image, 23 positive pattern image, 24 negative pattern image, 25-28 measurement images, 30 bus, 31 central processing unit (CPU), 32 read-only memory (ROM), 33 random access memory (RAM), 34 input device, 35 display device, 36-39 interface circuits, 41 camera image, 42 distance image, 43 camera image, 44 distance image.

Claims (15)

  1.  A three-dimensional image measurement apparatus that generates, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values, the apparatus comprising:
     control means for complementing pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
  2.  The three-dimensional image measurement apparatus according to claim 1, wherein the control means realizes the different photographing conditions by changing at least one of the exposure time of the sensor, the aperture value of the sensor, the gain of the sensor, the projection intensity of the projector, the projection direction of the projector, the position of the ambient light at the time of photographing, and the direction of the ambient light at the time of photographing.
  3.  The three-dimensional image measurement apparatus according to claim 2, wherein the control means:
     generates positive pattern image data and negative pattern image data, which are a pair of pattern images photographed by the sensor under the different photographing conditions and obtained, using a spatial coding method, by inverting the light and dark portions of the pattern with respect to each other;
     generates difference image data whose pixel values are the absolute differences between the positive pattern image data and the negative pattern image data; and
     interpolates a pixel whose pixel value is unknown in the distance image obtained from the pattern image photographed by the sensor with the pixel value of the distance image obtained from the pattern image photographed by the sensor under the different conditions, only when the corresponding pixel value of the difference image data is not 0, or is higher than a predetermined threshold value.
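The completion rule of claim 3 can be sketched as follows, under assumed array conventions: a sentinel marks pixels the first capture could not measure, and a pixel from the alternate capture is trusted only where its difference image |positive - negative| exceeds the threshold (i.e. the projected pattern was actually observed there). The sentinel value and function name are illustrative, not from the patent.

```python
UNKNOWN = -1  # assumed sentinel for "distance could not be measured"

def complete_depth(depth, depth_alt, pos_alt, neg_alt, threshold=0):
    """Fill UNKNOWN pixels of `depth` from `depth_alt`, but only where the
    alternate capture's difference image |pos_alt - neg_alt| exceeds
    `threshold` (the pattern was observed, so the value is trustworthy)."""
    h, w = len(depth), len(depth[0])
    out = [row[:] for row in depth]  # do not modify the input in place
    for y in range(h):
        for x in range(w):
            if out[y][x] == UNKNOWN and \
               abs(pos_alt[y][x] - neg_alt[y][x]) > threshold:
                out[y][x] = depth_alt[y][x]
    return out

depth     = [[10, UNKNOWN], [UNKNOWN, 12]]
depth_alt = [[11, 20],      [30,      13]]
pos_alt   = [[200, 180],    [90,      210]]
neg_alt   = [[40,  60],     [90,      35]]
# Pixel (0,1): |180-60| = 120 > 0, so it is filled with 20.
# Pixel (1,0): |90-90| = 0, pattern not observed, so it stays UNKNOWN.
```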
  4.  The three-dimensional image measurement apparatus according to claim 3, wherein the generated positive pattern image and negative pattern image include a high-frequency-band pattern.
  5.  The three-dimensional image measurement apparatus according to claim 4, wherein the control means controls at least one of the position and the orientation of each of the sensor and the projector, based on a measurement reliability obtained from the difference between the generated positive pattern image and negative pattern image.
  6.  The three-dimensional image measurement apparatus according to claim 4, wherein the control means interpolates a pixel whose pixel value is unknown in a distance image obtained from a pattern image photographed with a predetermined first exposure time with the pixel value of a distance image obtained from a pattern image photographed with a second exposure time longer than the first exposure time.
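The exposure-based variant of claim 6 is a simple per-pixel fallback: values the short (first) exposure left unknown are taken from a second, longer exposure. The sentinel and helper name below are illustrative, not from the patent.

```python
UNKNOWN = None  # assumed sentinel for pixels the short exposure could not measure

def fuse_exposures(depth_short, depth_long):
    """Per-pixel fusion: prefer the short-exposure distance value, falling
    back to the long-exposure value where the short exposure failed."""
    return [
        [s if s is not UNKNOWN else l for s, l in zip(row_s, row_l)]
        for row_s, row_l in zip(depth_short, depth_long)
    ]

short = [[5, None], [None, 7]]   # e.g. dark surfaces underexposed
long_ = [[5, 6],    [8,    7]]
# -> [[5, 6], [8, 7]]
```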
  7.  The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising a movable device connected to the control means, which realizes the different photographing conditions by moving the object based on a command signal from the control means.
  8.  The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising an illumination device connected to the control means for illuminating the object, which realizes the different photographing conditions by changing the conditions under which the object is illuminated, based on a command signal from the control means.
  9.  The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising a movable control device connected to the control means, the sensor, and the projector for moving the object, which realizes the different photographing conditions by changing at least one of the positions and orientations of the sensor and the projector, based on a command signal from the control means.
  10.  A three-dimensional image measurement method for a three-dimensional image measurement apparatus comprising control means for generating, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values, the method comprising:
     a step in which the control means complements pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
  11.  The three-dimensional image measurement method according to claim 10, wherein, in the complementing step, the control means realizes the different photographing conditions by changing at least one of the exposure time of the sensor, the aperture value of the sensor, the gain of the sensor, the projection intensity of the projector, the projection direction of the projector, the position of the ambient light at the time of photographing, and the direction of the ambient light at the time of photographing.
  12.  The three-dimensional image measurement method according to claim 11, wherein, in the complementing step, the control means:
     generates positive pattern image data and negative pattern image data, which are a pair of pattern images photographed by the sensor under the different photographing conditions and obtained, using a spatial coding method, by inverting the light and dark portions of the pattern with respect to each other;
     generates difference image data whose pixel values are the absolute differences between the positive pattern image data and the negative pattern image data; and
     interpolates a pixel whose pixel value is unknown in the distance image obtained from the pattern image photographed by the sensor with the pixel value of the distance image obtained from the pattern image photographed by the sensor under the different conditions, only when the corresponding pixel value of the difference image data is not 0, or is higher than a predetermined threshold value.
  13.  The three-dimensional image measurement method according to claim 12, wherein the generated positive pattern image and negative pattern image include a high-frequency-band pattern.
  14.  The three-dimensional image measurement method according to claim 13, wherein, in the complementing step, the control means controls at least one of the position and the orientation of each of the sensor and the projector, based on a measurement reliability obtained from the difference between the generated positive pattern image and negative pattern image.
  15.  The three-dimensional image measurement method according to claim 13, wherein, in the complementing step, the control means interpolates a pixel whose pixel value is unknown in a distance image obtained from a pattern image photographed with a predetermined first exposure time with the pixel value of a distance image obtained from a pattern image photographed with a second exposure time longer than the first exposure time.
PCT/JP2015/083036 2015-11-25 2015-11-25 Three-dimensional image measurement device and method WO2017090111A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201580084658.8A CN108369089B (en) 2015-11-25 2015-11-25 3D image measuring device and method
JP2016546864A JP6038415B1 (en) 2015-11-25 2015-11-25 3D image measuring apparatus and method
PCT/JP2015/083036 WO2017090111A1 (en) 2015-11-25 2015-11-25 Three-dimensional image measurement device and method
DE112015007146.6T DE112015007146T5 (en) 2015-11-25 2015-11-25 DEVICE AND METHOD FOR THREE-DIMENSIONAL IMAGE MEASUREMENT

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/083036 WO2017090111A1 (en) 2015-11-25 2015-11-25 Three-dimensional image measurement device and method

Publications (1)

Publication Number Publication Date
WO2017090111A1 true WO2017090111A1 (en) 2017-06-01

Family

ID=57483112

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/083036 WO2017090111A1 (en) 2015-11-25 2015-11-25 Three-dimensional image measurement device and method

Country Status (4)

Country Link
JP (1) JP6038415B1 (en)
CN (1) CN108369089B (en)
DE (1) DE112015007146T5 (en)
WO (1) WO2017090111A1 (en)


Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10488192B2 (en) 2015-05-10 2019-11-26 Magik Eye Inc. Distance sensor projecting parallel patterns
JP7133554B2 (en) 2016-12-07 2022-09-08 マジック アイ インコーポレイテッド Range sensor with adjustable focus image sensor
EP3692396A4 (en) 2017-10-08 2021-07-21 Magik Eye Inc. Distance measurement using a longitudinal grid pattern
EP3692501A4 (en) 2017-10-08 2021-07-07 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
US10679076B2 (en) 2017-10-22 2020-06-09 Magik Eye Inc. Adjusting the projection system of a distance sensor to optimize a beam layout
JP7354133B2 (en) 2018-03-20 2023-10-02 マジック アイ インコーポレイテッド Camera exposure adjustment for 3D depth sensing and 2D imaging
US11062468B2 (en) 2018-03-20 2021-07-13 Magik Eye Inc. Distance measurement using projection patterns of varying densities
EP3803266A4 (en) 2018-06-06 2022-03-09 Magik Eye Inc. Distance measurement using high density projection patterns
US11475584B2 (en) 2018-08-07 2022-10-18 Magik Eye Inc. Baffles for three-dimensional sensors having spherical fields of view
JP7252755B2 (en) * 2018-12-27 2023-04-05 株式会社小糸製作所 Active sensors, object identification systems, vehicles, vehicle lighting
WO2020150131A1 (en) 2019-01-20 2020-07-23 Magik Eye Inc. Three-dimensional sensor including bandpass filter having multiple passbands
JP7028814B2 (en) * 2019-02-07 2022-03-02 ファナック株式会社 External shape recognition device, external shape recognition system and external shape recognition method
WO2020165976A1 (en) * 2019-02-13 2020-08-20 三菱電機株式会社 Simulation device, simulation method, and simulation program
WO2020197813A1 (en) 2019-03-25 2020-10-01 Magik Eye Inc. Distance measurement using high density projection patterns
WO2020231747A1 (en) 2019-05-12 2020-11-19 Magik Eye Inc. Mapping three-dimensional depth map data onto two-dimensional images
CN114503543A (en) * 2019-09-26 2022-05-13 株式会社小糸制作所 Door-controlled camera, automobile, vehicle lamp, image processing device, and image processing method
WO2021113135A1 (en) 2019-12-01 2021-06-10 Magik Eye Inc. Enhancing triangulation-based three-dimensional distance measurements with time of flight information
WO2021138139A1 (en) 2019-12-29 2021-07-08 Magik Eye Inc. Associating three-dimensional coordinates with two-dimensional feature points
WO2021138677A1 (en) 2020-01-05 2021-07-08 Magik Eye Inc. Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera
CN112272435B (en) * 2020-10-14 2023-03-14 四川长虹网络科技有限责任公司 Light control system and method for indoor photography
KR102627422B1 (en) * 2022-04-14 2024-01-18 부산대학교 산학협력단 Polarized quantum rod light emitting device using langmuir-blodgett technique and the method for manufacturing thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07225834A (en) * 1994-02-10 1995-08-22 Matsushita Electric Ind Co Ltd Picture noise detector
JPH0921620A (en) * 1995-07-05 1997-01-21 Fuji Facom Corp Method for measuring three-dimensional shape of object
JP2006275529A (en) * 2005-03-28 2006-10-12 Citizen Watch Co Ltd Three-dimensional shape measuring method and measuring device
JP2009222399A (en) * 2008-03-13 2009-10-01 Nikon Corp Image gain adjusting device and method, and three-dimensional shape measuring instrument
JP2009222418A (en) * 2008-03-13 2009-10-01 Aisin Seiki Co Ltd Uneven surface inspection apparatus
JP2009264862A (en) * 2008-04-24 2009-11-12 Panasonic Electric Works Co Ltd Three-dimensional shape measuring method and device
JP2011002416A (en) * 2009-06-22 2011-01-06 Nikon Corp Three-dimensional shape measuring device
JP4889373B2 (en) * 2006-05-24 2012-03-07 ローランドディー.ジー.株式会社 Three-dimensional shape measuring method and apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19928341C2 (en) * 1999-06-21 2002-06-20 Inb Vision Ag Method for three-dimensional optical measurement of object surfaces
JP5290233B2 (en) * 2010-04-13 2013-09-18 Ckd株式会社 Three-dimensional measuring device and substrate inspection device
JP5864950B2 (en) * 2011-08-15 2016-02-17 キヤノン株式会社 Three-dimensional measuring apparatus, three-dimensional measuring method and program


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019185624A1 (en) * 2018-03-30 2019-10-03 Koninklijke Philips N.V. System and method for 3d scanning
US10935376B2 (en) 2018-03-30 2021-03-02 Koninklijke Philips N.V. System and method for 3D scanning
JP2021512430A (en) * 2018-03-30 2021-05-13 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Systems and methods for 3D scanning
JP2020122664A (en) * 2019-01-29 2020-08-13 株式会社キーエンス Three-dimensional measurement device
JP7164451B2 (en) 2019-01-29 2022-11-01 株式会社キーエンス Three-dimensional measuring device
TWI724594B (en) * 2019-10-29 2021-04-11 鑑微科技股份有限公司 Apparatus for three-dimensional measurement
WO2024062809A1 (en) * 2022-09-21 2024-03-28 ソニーセミコンダクタソリューションズ株式会社 Optical detecting device, and optical detecting system

Also Published As

Publication number Publication date
CN108369089B (en) 2020-03-24
CN108369089A (en) 2018-08-03
JPWO2017090111A1 (en) 2017-11-24
DE112015007146T5 (en) 2018-08-02
JP6038415B1 (en) 2016-12-07

Similar Documents

Publication Publication Date Title
JP6038415B1 (en) 3D image measuring apparatus and method
EP3552180B1 (en) Distance sensor including adjustable focus imaging sensor
CN108718373B (en) Image device
EP3198852B1 (en) Image processing apparatus and control method thereof
JP5108093B2 (en) Imaging apparatus and imaging method
US10928518B2 (en) Range image generation apparatus and range image generation method
JP7371443B2 (en) 3D measuring device
CN109831660A (en) Depth image acquisition method, depth image obtaining module and electronic equipment
US11006087B2 (en) Image synthesizing device and image synthesizing method
US10713810B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
US11803982B2 (en) Image processing device and three-dimensional measuring system
JP2020144136A (en) Depth sensing systems and methods
JP6377295B2 (en) Distance measuring device and distance measuring method
US10542875B2 (en) Imaging device, endoscope apparatus, and imaging method
JP2021044710A (en) Image processing apparatus, image processing method and program
US11747135B2 (en) Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor
JP2008128771A (en) Apparatus and method for simultaneously acquiring spectroscopic information and shape information
JP7228294B2 (en) Projector control device, projector, projection system, projection method and program
US11501408B2 (en) Information processing apparatus, information processing method, and program
JP7028814B2 (en) External shape recognition device, external shape recognition system and external shape recognition method
JP2012085093A (en) Imaging device and acquisition method
US20200314310A1 (en) Moving Object Imaging Device and Moving Object Imaging Method
CN115150545B (en) Measurement system for acquiring three-dimensional measurement points
WO2021084892A1 (en) Image processing device, image processing method, image processing program, and image processing system
JP2016205928A (en) Self position calculation device and self position calculation method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016546864

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15909230

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112015007146

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15909230

Country of ref document: EP

Kind code of ref document: A1