WO2017090111A1 - Three-dimensional image measurement device and method - Google Patents
Three-dimensional image measurement device and method
- Publication number
- WO2017090111A1 (PCT/JP2015/083036)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image data
- sensor
- pattern
- pattern image
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2536—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object using several gratings with variable grating pitch, projected on the object with the same angle of incidence
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
Definitions
- The present invention relates to a three-dimensional image measurement apparatus and method that measure the three-dimensional shape of an object by projecting a pattern onto the object, imaging the reflected light, and analyzing it using a three-dimensional image measurement method such as the spatial coding method.
- A pattern image, which is an image of the reflected light from a measurement object onto which a predetermined pattern is projected, is obtained by photographing the reflected light of the pattern with a sensor such as a camera.
- the distance from the sensor to the object can be restored by analyzing the pattern image based on the principle of triangulation.
- Such techniques are collectively called active stereo methods, and include the spatial encoding method, the structured light method, the light-section method, the spot light projection method, and the slit light projection method.
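As an illustrative sketch not taken from the patent itself, the triangulation principle underlying these active stereo methods can be written as a one-line relation; the rectified camera-projector geometry and all variable names here are assumptions:

```python
def depth_from_disparity(x_cam, x_proj, baseline_mm, focal_px):
    """Generic active-stereo triangulation for a rectified camera-projector
    pair: depth is inversely proportional to the column disparity between
    where a pattern feature is seen (x_cam) and where it was projected (x_proj)."""
    disparity = x_cam - x_proj
    if disparity == 0:
        return float("inf")  # no parallax: point at infinity
    return baseline_mm * focal_px / disparity
```

For example, a feature projected from column 100 and observed at column 110, with a 50 mm baseline and a 600-pixel focal length, yields a depth of 3000 mm.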
- A clear difference is required between the pixel values of the sensor elements that receive the reflected light of the pattern projected onto the object and those of the elements that do not.
- These pixel values are determined by various factors, such as ambient light other than the projected pattern, the diffuse reflectance of the object's surface material, the light intensity of the projected pattern, and the light-receiving sensitivity of the sensor.
- As a method for measuring a three-dimensional image of a scene in which multiple objects with different surface states are mixed, there is a method of restoring distance from a multiple-shutter-speed image, that is, a pattern image obtained by combining pattern images taken at different exposure times (Patent Document 1).
- In that method, three-dimensional image measurement is performed by the spatial encoding method using a multiple-exposure pattern image obtained by synthesizing a short-exposure pattern image and a long-exposure pattern image. However, because the dynamic range of the camera's light-receiving element is insufficient, continuity between pixels deteriorates, so pixels with inaccurate values are produced; that is, erroneous measurement occurs.
- An object of the present invention is to solve the above problems and to provide a three-dimensional image measurement apparatus capable of stable measurement with few measurement omissions, while suppressing erroneous measurement, in scenes where multiple objects with different surface states are present.
- A three-dimensional image measurement apparatus according to the present invention generates, from pattern image data obtained by photographing with a sensor a pattern projected onto an object by a projector, distance image data in which the distance from the sensor to the object is represented as pixel values.
- It is characterized by control means that complements pixels whose values are unknown in the distance image data obtained from pattern image data photographed by the sensor, using the pixel values of a distance image obtained from pattern image data photographed by the sensor under different shooting conditions.
- According to this three-dimensional image measurement apparatus, stable measurement with few measurement omissions can be performed, while suppressing erroneous measurement, in scenes where multiple objects with different surface states are present.
- FIG. 1 is a block diagram showing a hardware configuration of the three-dimensional image measurement apparatus according to the first embodiment of the present invention.
- A camera 1 that images a three-dimensional object and outputs image data, and a projector 2 that projects predetermined pattern light onto the three-dimensional object, are connected to interface circuits 36 and 37 of the information processing apparatus 3, respectively.
- The information processing apparatus 3 is a control unit such as a computer or digital calculator. Specifically, it includes a central processing unit (hereinafter, CPU) 31 that executes the three-dimensional image measurement process, a read-only memory (ROM) that stores the program for the three-dimensional image measurement process, and a random access memory (RAM) used as a working area.
- An input device 34, which includes devices such as a recognition camera and a wearable acceleration sensor and receives the operator's operations, and a display device 35, such as a display, which presents information to the operator, are also connected via the bus 30.
- The interface circuit 36 converts the communication data and signals exchanged between the camera 1 and the CPU 31, and the interface circuit 37 likewise converts command signals and the like from the CPU 31 to the projector 2. Specifically, the CPU 31 outputs a pattern projection command signal, instructing projection of a predetermined pattern described in detail later, to the projector 2 via the interface circuit 37. The CPU 31 also generates a shooting command signal and outputs it to the camera 1 via the interface circuit 36. In response, the camera 1 shoots the predetermined pattern, including its high-frequency pattern, and outputs the resulting pattern image data to the CPU 31 via the interface circuit 36.
- The three-dimensional image measurement apparatus synthesizes distance images (images in which the distance from the camera 1 to the measurement object is represented by pixel values) obtained from pattern images photographed under different shooting conditions. In doing so, pixels whose values are unknown in the distance image obtained from a pattern image photographed with a relatively small amount of received light are interpolated using pixel values of the distance image obtained from a pattern image photographed with a relatively large amount of received light. The composition method is also switched between pixels for which measurement of the high-frequency pattern components is reliable and those for which it is not.
- The camera 1 is a digital camera that photographs the pattern projected onto the three-dimensional object by the projector 2 and outputs the captured image data to the information processing apparatus 3.
- The camera 1 has a communication function for receiving shooting commands from the information processing apparatus 3 and returning the shot pattern images to it.
- The camera 1 may be a pinhole camera, a rangefinder camera, or a view camera.
- As long as a pattern image can be taken, a light field camera equipped with a microlens array for estimating the direction of light entering the lens may also be used.
- The projector 2 receives a pattern projection command from the information processing apparatus 3 and irradiates the three-dimensional object with a predetermined pattern.
- The projector 2 may be a CRT projector, a liquid crystal projector, a digital light processing projector, a reflective liquid crystal element projector, a reflective display element projector using a diffraction phenomenon, or a combination of a laser with a diffraction grating, a polygon mirror, or a driving device that changes the laser irradiation position.
- FIG. 2 is a block diagram showing a schematic function of the information processing apparatus 3 of FIG.
- The information processing apparatus 3 includes: (1) a projection pattern control unit 10 that outputs a pattern projection command signal for projecting a predetermined pattern to the projector 2; (2) an image acquisition control unit 11 that outputs a shooting command signal to the camera 1 at a timing synchronized with the pattern projection command signal from the projection pattern control unit 10; (3) a distance restoration unit 12 that restores the distance between the camera 1 and the object, as digital distance image data, from the pattern images captured by the camera 1; (4) a measurement reliability evaluation unit 13 that evaluates the reliability of the distance estimate at each pixel from the distance image data restored by the distance restoration unit 12 and from the high-frequency pattern image data captured by the camera 1 (images of the patterns with the finest features among the plurality of patterns projected by the projector 2); and (5) a distance image composition unit 14 that synthesizes, from the plurality of distance image data restored by the distance restoration unit 12 and the measurement reliability maps created by the measurement reliability evaluation unit 13, distance image data in which pixels with unknown values are complemented.
- The projection pattern control unit 10 and the image acquisition control unit 11 are controlled using a synchronization signal so that the timing of the pattern projection command signal and that of the shooting command signal are synchronized. Pattern image data synchronized with the projection can thereby be obtained.
- FIG. 3A is a diagram showing a camera image 21 when a measurement target is photographed by the three-dimensional image measurement apparatus of FIG.
- FIG. 3B is a diagram showing a distance image 22 when a measurement object is photographed by the three-dimensional image measurement apparatus of FIG.
- In the camera image 21, the intensity of the light reflected according to the reflectance of the object's material appears in the pixel values, so a black object appears black and a white object appears white.
- In the distance image 22 of FIG. 3B, corresponding to the camera image 21, the distance from the camera 1 to the object is expressed as a pixel value, so objects close to the camera 1 are displayed bright and distant objects are displayed dark.
- FIG. 4 is a diagram showing an example of a pattern image including a positive pattern image 23 and a negative pattern image 24 used in a general spatial coding method. That is, FIG. 4 shows an image of a plurality of binary patterns used in a general spatial coding method.
- A vertical slit pattern is irradiated; the irradiated portions are displayed bright and the non-irradiated portions dark.
- In the spatial coding method, a plurality of slit patterns with different frequency bands are projected. Depending on the slit thickness, a pattern with thin slits is called a high-frequency pattern image and one with thick slits a low-frequency pattern image.
- Pairs of slit patterns of opposite binary polarity, that is, pairs of image data in which the light and dark of the projection pattern are inverted, are used; one of each pair is referred to as the positive pattern image 23 and the other as the negative pattern image 24.
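As a hedged sketch, not part of the patent text: the positive/negative slit-pattern pairs of a typical spatial coding setup might be generated as below. The resolution, bit count, plain binary (rather than Gray) coding, and the function name are all illustrative assumptions:

```python
import numpy as np

def make_slit_patterns(width=640, height=480, n_bits=8):
    """Generate positive/negative binary slit-pattern pairs, one pair per
    frequency band (bit plane); the last pair is the high-frequency pattern."""
    x = np.arange(width)
    patterns = []
    for bit in range(n_bits):
        # bit 0 gives the coarsest (lowest-frequency) stripes
        stripe = ((x >> (n_bits - 1 - bit)) & 1).astype(np.uint8) * 255
        positive = np.tile(stripe, (height, 1))   # bright where the code bit is 1
        negative = 255 - positive                 # polarity-inverted counterpart
        patterns.append((positive, negative))
    return patterns
```

Each projected column then carries an n_bits binary code that can be decoded from the captured image sequence.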
- The distance restoration unit 12 restores the distance from the camera 1 to the object from the pattern images obtained by the image acquisition control unit 11. Although the restoration result could be expressed as a point cloud, that is, a set of three-dimensional points on the object, expressing it as distance image data makes it possible to use image-processing operations with short processing times.
- The measurement reliability evaluation unit 13 quantifies, for each pixel of the distance image from the camera 1 to the object restored by the distance restoration unit 12, the likelihood that the pixel value is unknown or inaccurate.
- For this, the high-frequency pattern image data among the pattern images captured by the image acquisition control unit 11 is used. Because the high-frequency pattern consists of thin slits, it tends to disappear with changes in the intensity of ambient light or of the reflected pattern light itself. In particular, when the reflected light is strong, not only the pixels that directly receive it but also surrounding pixels saturate, leading to erroneous measurement over a wide area. Since the high-frequency pattern is easily affected by fluctuations in the reflected light, analyzing its state is well suited to evaluating measurement reliability.
- The measurement reliability map R_PN is defined as follows:

  R_PN(x, y) = | I_P^High(x, y) − I_N^High(x, y) |   (1)

- Here, I_P^High is the high-frequency positive pattern image data and I_N^High is the high-frequency negative pattern image data.
- The value of each pixel should inherently differ sharply between the high-frequency positive and the high-frequency negative pattern image data; when no difference occurs, owing to ambient light or the intensity of the reflected pattern light, this means the high-frequency pattern could not be observed. That is, when the value of the measurement reliability map R_PN is 0, or below a predetermined threshold (a positive value near 0, for example 0.01), the measurement reliability can be judged to be extremely low.
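The reliability map described above can be sketched as follows; this is an illustration rather than the patent's implementation, and the normalization to [0, 1] and the 0.01-style default threshold are assumptions:

```python
import numpy as np

def reliability_map(i_p_high, i_n_high, threshold=0.01):
    """Per-pixel reliability: absolute difference between the high-frequency
    positive and negative pattern images, normalized to [0, 1]."""
    diff = np.abs(i_p_high.astype(np.float64) - i_n_high.astype(np.float64)) / 255.0
    reliable = diff > threshold   # True where the high-frequency pattern was observed
    return diff, reliable
```

Where `reliable` is False, the distance estimate at that pixel is treated as untrustworthy.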
- The distance image composition unit 14 synthesizes distance images photographed under different shooting conditions, and corrects pixels whose values are unknown under a given shooting condition or whose values differ from the actual distance between the camera 1 and the object.
- Here, "different shooting conditions" means that pattern images are shot while changing the amount of light received by the camera element, and a distance image is restored from each.
- The information processing apparatus 3 can realize different shooting conditions by changing the exposure time of the camera 1 (for example, using a first exposure time and a second exposure time longer than the first), the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, or the projection direction of the projector 2.
- FIG. 5A is an example image showing that the shooting result of the measurement object varies with exposure time, showing the image 25 in the darkest case.
- FIG. 5B is a corresponding example image, showing the image 26 in the second darkest case.
- FIG. 5C is a corresponding example image, showing the image 27 in the third darkest case.
- FIGS. 5A to 5D show images obtained by shooting the same object under different exposure conditions.
- The image in FIG. 5A has the shortest exposure time, and the exposure time increases in the order of FIGS. 5B, 5C, and 5D.
- In an image with a short exposure time, a black object is hard to distinguish from its surroundings, while in an image with a long exposure time, a white object is hard to distinguish from its surroundings.
- FIG. 6A is a diagram illustrating an image example of the camera image 41 of a pattern image with a small amount of light received by pixels.
- FIG. 6B is a diagram illustrating an image example of the distance image 42 obtained from the camera image 41 of FIG. 6A.
- FIG. 7A is a diagram illustrating an image example of the camera image 43 of a pattern image having a relatively large amount of light received by pixels.
- FIG. 7B is a diagram showing an example of a distance image obtained from the camera image 43 of FIG. 7A.
- In the distance image 42, small black circles indicate pixels whose values are unknown.
- In a pattern image with a small amount of received light, the pixel values of a white object are easily obtained accurately, but those of a black object are likely to be unknown.
- Portions that look like texture in the distance image 44 of FIG. 7B indicate incorrect pixel values.
- In a pattern image with a large amount of received light, the pixel values of a black object can be obtained accurately, but those of a white object tend to be inaccurate.
- Interpolation from neighboring pixels can be applied to pixels whose values are unknown, but pixels with inaccurate values cannot be corrected by interpolation, and in actual applications serious problems such as robot collisions are likely to occur. Therefore, high-quality distance image data is obtained by synthesizing the distance images according to the following strategy.
- The distance image composition unit 14 generates and outputs synthesized distance image data I_fusion(x, y) by combining distance images as in the equations below.
- I_Low denotes distance image data based on a pattern image shot under a shooting condition with a low amount of received light.
- I_High denotes distance image data based on a pattern image shot under a different shooting condition with a high amount of received light.
- For example, when distance images are acquired twice while changing the exposure time, the distance image data based on the short-exposure shooting corresponds to I_Low, and that based on the long-exposure shooting corresponds to I_High.
- (x, y) denotes the two-dimensional position of a pixel in the image.
- In a pattern image with a low amount of received light, the pattern tends to disappear, so the pixel values of the distance image data that can be calculated tend to be unknown.
- On the other hand, the influence of ambient light is suppressed, light-reception noise (hereinafter, noise) is small, and the pixel values that can be calculated tend to be accurate.
- A pattern image with a high amount of received light is strongly affected by ambient light, so noise increases and the pixel values of the distance image data tend to be inaccurate.
- That is, the distance image data I_Low has little noise but many pixels whose values are unknown, whereas the distance image data I_High is noisy.
- If a noisy pixel value of the distance image data I_High were used as-is, the synthesized distance image would also be incorrect. Therefore, based on the measurement reliability map R_PN obtained from the high-frequency positive pattern image data I_P^High and the high-frequency negative pattern image data I_N^High acquired when I_High was calculated, the synthesized distance image data I_fusion(x, y) is generated as follows:

  I_fusion(x, y) = I_Low(x, y) if I_Low(x, y) ≠ 0, otherwise I_High(x, y)   (2)

- That is, a pixel whose value is unknown in I_Low is complemented with the pixel value of I_High. However, Equation (2) alone would copy noisy values of I_High, so when I_Low(x, y) = 0 the reliability map is consulted:

  I_fusion(x, y) = I_High(x, y)   if I_Low(x, y) = 0 and R_PN(x, y) ≠ 0   (3)
  I_fusion(x, y) = 0 (pixel value unknown)   if I_Low(x, y) = 0 and R_PN(x, y) = 0   (4)

- In this way, high-quality distance image data I_fusion(x, y) with further reduced noise can be obtained.
- Here, the cases are distinguished by whether the value of the measurement reliability map R_PN is 0 or not, but a predetermined threshold (a value near 0, for example 0.01) may instead be defined, classifying pixels by whether the value is at least the threshold and setting the pixel value to 0 when it is below the threshold.
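A minimal sketch of this synthesis rule, assuming NumPy arrays in which a pixel value of 0 means "distance unknown"; the function and argument names are illustrative, not from the patent:

```python
import numpy as np

def fuse_distance_images(i_low, i_high, r_pn, threshold=0.0):
    """Synthesize a distance image from low/high received-light distance
    images, gated by the reliability map r_pn. Pixel value 0 = unknown."""
    # Complement unknown pixels of i_low with i_high
    fused = np.where(i_low != 0, i_low, i_high)
    # Where i_low is unknown AND the reliability map is at/below threshold,
    # i_high cannot be trusted either, so mark the pixel unknown
    unreliable = (i_low == 0) & (r_pn <= threshold)
    return np.where(unreliable, 0, fused)
```

Raising `threshold` above 0 reproduces the thresholded variant described in the text.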
- FIG. 8A is a measurement result of the three-dimensional image measurement apparatus in FIG. 1 and is a diagram illustrating an example of a camera image of an object.
- FIG. 8B is a measurement result of the three-dimensional image measurement apparatus in FIG. 1 and shows an example of an original distance image based on photographing with a relatively short exposure time.
- FIG. 8C is a diagram showing a measurement result of the three-dimensional image measurement apparatus of FIG. 1 and showing an example of a distance image calculated by Expression (2).
- FIG. 8D is a measurement result of the three-dimensional image measurement apparatus of FIG. 1, and shows an example of the distance image calculated by Equations (2), (3), and (4).
- In FIG. 8A, the left object is black, and the right object has a white portion and a metal portion.
- In the original distance image of FIG. 8B, many pixels with unknown values occur.
- In FIGS. 8C and 8D, obtained using Expressions (2), (3), and (4), the unknown pixels are complemented with accurate values; the result is especially remarkable for the black object.
- In FIG. 8C, however, completely inaccurate pixel values are mixed into parts of the white object and the metal portion.
- In FIG. 8D, obtained using Equations (3) and (4), most of these inaccurate values are removed.
- FIG. 9 is a flowchart showing a 3D image measurement process executed by the information processing apparatus 3 of the 3D image measurement apparatus of FIG.
- FIG. 10 is a flowchart showing measurement reliability evaluation processing (step S9) which is a subroutine of FIG.
- FIG. 11 is a flowchart showing distance image synthesis processing (step S10) which is a subroutine of FIG.
- First, pattern images are shot by irradiating the projection pattern twice with different shooting parameters P1 and P2 (steps S1, S5; steps S2-S3, S6-S7).
- The parameters correspond to changes in the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the projection intensity of the projector 2, the projection direction of the projector 2, the intensity of ambient light at the time of shooting, and the direction of ambient light at the time of shooting.
- The shooting parameter P1 in FIG. 9 is a setting with which the luminance of the shot image becomes dark overall; for example, the exposure time of the camera 1 is set short.
- The projection pattern is irradiated from the projector 2, and the pattern image is shot with the camera 1 in synchronization.
- A distance image is created from the captured pattern image.
- Next, the parameters are set so that the luminance of the shot image becomes bright overall (P2), and pattern projection, shooting, and distance image creation are performed again. Based on the two shootings, the measurement reliability is evaluated (step S9) and the distance images are synthesized (step S10).
- In the measurement reliability evaluation process, the absolute difference between the high-frequency negative pattern image and the high-frequency positive pattern image is calculated, and difference image data whose pixel values are these absolute differences is generated (step S11).
- Next, one unselected pixel is taken as the selected pixel to be processed (step S12), and the following steps S13 to S15 are performed. If the difference absolute value of the selected pixel is 0 (YES in step S13), the measurement reliability value of the selected pixel is set to 0 (step S14), and the process proceeds to step S16.
- If the difference absolute value of the selected pixel is not 0 (NO in step S13), the measurement reliability value of the selected pixel is set to 1 (step S15), and the process proceeds to step S16.
- The processing of steps S12 to S15 is repeated for all pixels of the target image (step S16); when the measurement reliability has been calculated for all pixels (YES in step S16), the measurement reliability map is generated (step S17) and the process returns to the original routine.
- In the distance image synthesis process, an unselected pixel is taken as the selected pixel to be processed (step S21), and the following steps S22 to S26 are performed.
- If the pixel value of the distance image data of shooting parameter P1 is not 0 (NO in step S22), it is substituted into the corresponding pixel of the synthesized distance image data (step S24), and the process proceeds to step S27. If it is 0 (YES in step S22), the process proceeds to step S23.
- In step S23, if the corresponding pixel value of the measurement reliability map is 0 (YES in step S23), 0 is substituted into the pixel of the synthesized distance image data (step S25) and the process proceeds to step S27. If it is not 0 (NO in step S23), the pixel value of the distance image data of shooting parameter P2 is substituted into the pixel of the synthesized distance image data (step S26), and the process proceeds to step S27.
- The processing of steps S21 to S26 is repeated for all pixels of the target image (step S27); when all pixels have been combined (YES in step S27), the synthesized distance image data is generated and output (step S28), and the process returns to the original routine.
- In other words, the synthesis is based first on the distance image data obtained with shooting parameter P1, which is adjusted so that the overall luminance is dark. If a pixel value of that data is not 0 (0 meaning the distance is unknown) (NO in step S22), it becomes the pixel value of the synthesized distance image data (step S24). If it is 0 (YES in step S22), the measurement reliability map created using the image data of FIG. 4 is referred to.
- If the corresponding pixel of the map is not 0 (NO in step S23), the pixel value of the distance image data of shooting parameter P2 becomes the pixel value of the synthesized distance image (step S26); if it is 0 (YES in step S23), the corresponding pixel of the synthesized distance image is set to 0 (step S25).
- The same procedure may be applied again to the synthesized distance image and distance image data shot with a further shooting parameter P3 (different from the shooting parameters P1 and P2).
- Combining these images may create a distance image with even fewer missing measurements.
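As an illustrative extension, not given in the patent: the pairwise synthesis can be repeated to fold in a third or further exposure, as the text suggests for shooting parameter P3. The function names and the ordering convention (darkest exposure first) are assumptions:

```python
import numpy as np

def fuse_pair(dist_dark, dist_bright, reliability):
    """One pass of the step S22-S26 logic: prefer the darker-exposure value;
    fall back to the brighter exposure only where reliability is nonzero."""
    use_bright = (dist_dark == 0) & (reliability > 0)
    return np.where(dist_dark != 0, dist_dark,
                    np.where(use_bright, dist_bright, 0))

def fuse_exposures(dist_images, reliability_maps):
    """Iteratively combine distance images ordered from darkest to brightest
    exposure; reliability_maps[i] belongs to dist_images[i + 1]."""
    fused = dist_images[0]
    for dist, rel in zip(dist_images[1:], reliability_maps):
        fused = fuse_pair(fused, dist, rel)
    return fused
```

Each pass can only fill pixels that are still unknown, so earlier (less noisy) exposures always take precedence.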
- FIG. 12 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the second embodiment of the present invention. As shown in FIG. 12, the three-dimensional image measurement apparatus according to the second embodiment differs from that of the first embodiment in FIG. 1 in the following points.
- the information processing apparatus 3 further includes an interface circuit 38 connected to the bus 30.
- A movable device 4, connected to the interface circuit 38, which moves the object to be photographed, for example in three dimensions, is further provided.
- The CPU 31 of the information processing apparatus 3 transmits a predetermined movement command signal to the movable device 4 via the interface circuit 38, and in response the movable device 4 returns a response signal (for example, an ACK signal) to the CPU 31.
- Because the movable device 4 moves the object to be photographed, for example in three dimensions, based on the movement command signal, not only the exposure time, aperture value, and gain of the camera 1 and the projection intensity and projection direction of the projector 2, but also the intensity and direction of ambient light during shooting can be changed, so that shooting conditions can be varied more flexibly.
- Here, ambient light is light emitted from sources other than the projector 2, including sunlight, electric light, thermal radiation such as flames, and luminescence.
- The intensity of the ambient light can be changed by changing the position of a light-shielding means.
- The light-shielding means is, for example, a curtain or a cover, and a movable device such as an industrial robot may be used.
- the movable device 4 is, for example, an actuator.
- the movable device 4 itself may function as a shielding device that shields ambient light from the camera and the projector.
- the movable device 4 may be composed of a plurality of actuators. In that case, ambient light from a plurality of light sources can be blocked.
- the movable device 4 may be a multi-axis movable device such as a robot arm composed of a plurality of actuators.
- A hand such as a gripper may be attached to its tip; in that case, the robot arm can change its posture so that the hand blocks the ambient light source.
- Alternatively, the ambient light source may be blocked by having the robot arm hold a shielding object, such as a light-shielding plate or translucent ground glass, and change the posture of that object.
- The direction of the ambient light may also be changed by holding a mirror.
- the ambient light illumination may be turned on and off by operating a switch that controls the light source of the ambient light with a robot arm.
- The movable device 4 may have any configuration as long as it can change the intensity and direction of the ambient light. Even a snake-shaped robot or an actuator that repeatedly expands and contracts by air can act on the ambient light in a similar way, so such devices can obviously also serve as the movable device 4.
- the same three-dimensional image measurement process as in FIG. 9 is executed.
- the brightness of the image can thereby be adjusted to become darker or brighter as a whole.
- because the projector 2 projects visible light, it is strongly influenced by the ambient light. Therefore, with the ambient light blocked by the movable device 4, or with the ambient light held constant, a high-quality distance image with little noise can be created by adjusting the shooting parameters: the exposure time of the camera 1, the aperture value of the camera 1, the gain of the camera 1, the light projection intensity of the projector 2, and the projection direction of the projector 2.
- FIG. 13 is a block diagram showing a hardware configuration of the three-dimensional image measurement apparatus according to the third embodiment of the present invention.
- the three-dimensional image measurement apparatus according to the third embodiment is different from the three-dimensional image measurement apparatus according to the first embodiment in FIG. 1 in the following points.
- the information processing apparatus 3 further includes an interface circuit 39.
- a communication device 5 connected to the interface circuit 39 is further provided.
- a communication device 6 connected to the communication device 5 via the communication line 9 is further provided.
- the difference will be described in detail.
- the interface circuit 39 of the information processing device 3 is connected to the lighting device 7 through the communication device 5, the communication line 9, and the communication device 6 to form a command signal communication path.
- the CPU 31 of the information processing device 3 controls the operation of the lighting device 7 by transmitting a lighting command signal to it via the command signal communication path; in response, the lighting device 7 transmits a corresponding response signal (for example, an ACK signal) to the CPU 31 via the same path.
- the communication device 5 and the communication device 6 are respectively a wireless communication device or a wired communication device.
- the wireless communication device is, for example, a wireless router device.
- the wired communication device is a communication device having a communication port such as an Ethernet (registered trademark) port, a USB port, and a serial port.
- the illuminating device 7 can change one or more of the intensity and the direction of the irradiation of the object based on the illumination command signal from the CPU 31 of the information processing device 3. It includes at least one of an intensity control unit that makes the intensity adjustable, for example by pulse-width-modulation (PWM) control of the on/off time, and a movable unit that changes the direction of the illumination light.
- FIG. 14 is a flowchart showing a three-dimensional image measurement process executed by the information processing apparatus 3 of the three-dimensional image measurement apparatus in FIG.
- the three-dimensional image measurement process of FIG. 14 differs from the three-dimensional image measurement process of FIG. 9 in the following points.
- in place of step S1, the process includes step S31: "weaken the irradiation intensity of the illumination device 7 to be equal to or less than a predetermined threshold value Th1, or set the posture accordingly".
- in place of step S5, the process includes step S32: "strengthen the irradiation intensity of the illumination device 7 to be equal to or greater than a predetermined threshold value Th2 (> Th1), or set the posture accordingly".
- in step S31, the intensity and/or posture of the illumination device 7 is changed by controlling its intensity control unit or movable unit. At this time, so that the luminance of the entire image of the object becomes darker, the irradiation intensity is weakened to be equal to or less than the predetermined threshold value Th1, or the illumination posture is set so as not to irradiate directly in the shooting direction of the camera 1.
- in step S32, the intensity and/or posture of the illumination device 7 is changed by controlling its intensity control unit or movable unit. At this time, so that the luminance of the entire image of the object becomes brighter, the irradiation intensity is strengthened to be equal to or greater than the predetermined threshold value Th2 (> Th1), or the posture is changed so that the illumination irradiates directly in the shooting direction of the camera 1.
- the ambient light can be controlled using the illumination device 7, and a high-quality distance image can be created.
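The illumination control just described amounts to capturing once with the irradiation weakened to Th1 or below and once with it strengthened to Th2 or above, and then combining the two results. The following is only a schematic sketch: `set_irradiation` and `measure_depth` are stub placeholders for the device commands, and the threshold values are illustrative, none of which are specified by the patent.

```python
import numpy as np

TH1, TH2 = 20, 80   # illustrative irradiation thresholds (Th2 > Th1)

def set_irradiation(level):
    """Stub for commanding the illumination device 7 (e.g. via a PWM duty cycle)."""
    assert 0 <= level <= 100
    return level

def measure_depth(level):
    """Stub: pretend the dark shot misses one pixel that the bright shot recovers."""
    if level <= TH1:
        return np.array([[1.0, np.nan]])
    return np.array([[1.0, 0.9]])

# Step S31: weaken the illumination to Th1 or below, then measure.
dark = measure_depth(set_irradiation(TH1))
# Step S32: strengthen the illumination to Th2 or above, then measure.
bright = measure_depth(set_irradiation(TH2))
# Fill pixels that the first condition could not measure.
merged = np.where(np.isnan(dark), bright, dark)
print(merged)   # [[1.  0.9]]
```
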
- FIG. 15 is a block diagram showing a hardware configuration of the three-dimensional image measuring apparatus according to the fourth embodiment of the present invention.
- the three-dimensional image measurement apparatus according to the fourth embodiment in FIG. 15 is different from the three-dimensional image measurement apparatus according to the second embodiment in FIG. 12 in the following points.
- a movable control device 8 having a function of controlling the operations of the camera 1 and the projector 2 as well as the function of the movable device 4 is provided.
- the difference will be described in detail.
- the CPU 31 of the information processing device 3 transmits a predetermined movement command signal to the movable control device 8 via the interface circuit 38, and the movable control device 8 returns a response signal (for example, an ACK signal) to the CPU 31.
- the movable control device 8 changes the positions and postures of the camera 1 and the projector 2 using the imaging command signal and the projection command signal based on the movement command signal from the CPU 31 of the information processing device 3.
- the camera 1 returns a response signal (for example, ACK signal) to the movable control device 8, and the projector 2 returns a response signal (for example, ACK signal) to the movable control device 8.
- the movable control device 8 may have a so-called hand-eye configuration, in which the camera 1 and the projector 2 are mounted on the hand of a robot arm.
- the movable control device 8 may be composed of a plurality of actuators so that the position and the posture of the camera 1 and of the projector 2 can each be changed.
- unknown and inaccurate distance measurement values can be suppressed by the distance image synthesis method proposed in the first embodiment.
- measurement values also become unknown or inaccurate when, due to the positional relationship between the camera 1, the projector 2, and the object, the light of the projector 2 cannot be observed from the camera 1 because of occlusion.
- moving the camera 1 and the projector 2 with the movable control device 8 can further eliminate unknown measurement values and inaccurate measurements.
- the line-of-sight direction is maintained so as to keep observing the areas where the pixel values of the measurement reliability map are low, that is, where the reliability is low.
- measurement values become unknown or inaccurate both because of differing surface conditions and because of the positional relationship between the camera 1, the projector 2, and the object.
- a high-quality distance image that efficiently suppresses both unknown and inaccurate measurement values can be generated.
- in the above embodiments, the camera 1 is used; however, the present invention is not limited to this, and any sensor capable of capturing an image of the object, such as a camera, a CCD sensor, or an imaging sensor, may be used.
- CPU: Central Processing Unit
- ROM: Read-Only Memory
- RAM: Random Access Memory
Abstract
Description
A three-dimensional image measurement apparatus according to an aspect of the present invention generates, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values. The apparatus is characterized by comprising control means for complementing pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
FIG. 1 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the first embodiment of the present invention. In FIG. 1, a camera 1 that images a three-dimensional object and outputs image data, and a projector 2 that projects a predetermined amount of light onto the three-dimensional object, are connected to interface circuits 36 and 37 of an information processing device 3, respectively. The information processing device 3 is constituted by control means such as a computer or a digital calculator; specifically, it comprises a central processing unit (hereinafter, CPU) 31 that executes the three-dimensional image measurement process, a read-only memory (hereinafter, ROM) 32 that stores the program of the three-dimensional image measurement process, a random access memory (hereinafter, RAM) 33 that temporarily holds input/output data and signals, an input device 34 that accepts operator input and includes, for example, a mouse, a keyboard, a gesture recognition camera, and a wearable acceleration sensor, and a display device 35, such as a display, that presents information to the operator. These components 31 to 37 are connected via a bus 30.
FIG. 2 is a block diagram showing the schematic functions of the information processing device 3. The information processing device 3 comprises:
(1) a projection pattern control unit 10 that outputs to the projector 2 a pattern projection command signal for projecting a predetermined pattern;
(2) an image acquisition control unit 11 that outputs to the camera 1 a shooting command signal at a timing synchronized with the pattern projection command signal from the projection pattern control unit 10;
(3) a distance restoration unit 12 that restores, from the pattern image photographed by the camera 1, the distance between the camera 1 and the object as digital distance image data;
(4) a measurement reliability evaluation unit 13 that evaluates the reliability of the distance estimate of each pixel from the distance image data restored by the distance restoration unit 12 and from high-frequency pattern image data photographed by the camera 1 (images obtained by photographing, among the plural kinds of patterns projected by the projector 2, the patterns with fine features); and
(5) a distance image synthesis unit 14 that synthesizes a plurality of distance images from the plurality of distance image data restored by the distance restoration unit 12 and the measurement reliability maps of the distance image data created by the measurement reliability evaluation unit 13, so as to reduce the number of pixels whose pixel values are unknown and of pixels whose pixel values are inaccurate with respect to the actual distance between the camera and the object.
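The reliability evaluation in item (4) compares each pattern image with its light-dark inverted counterpart: where the projected pattern actually reaches the surface and is observed, the pair differs strongly. A minimal sketch, assuming NumPy image arrays and a hypothetical threshold value:

```python
import numpy as np

def reliability_mask(positive, negative, threshold=10):
    """True where the absolute difference between a pattern image and its
    inverted pair is large, i.e. where the pattern was clearly observed.
    A small difference suggests the pattern was washed out or shadowed."""
    diff = np.abs(positive.astype(int) - negative.astype(int))
    return diff > threshold

pos = np.array([[200, 120],
                [ 30,  90]], dtype=np.uint8)
neg = np.array([[ 40, 115],
                [220,  95]], dtype=np.uint8)
print(reliability_mask(pos, neg))
```

The left column shows a strong inversion (reliable), while the right column barely changes between the pair (unreliable).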
FIG. 12 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the second embodiment of the present invention. The three-dimensional image measurement apparatus according to the second embodiment differs from that according to the first embodiment in FIG. 1 in the following points.
(1) The information processing device 3 further includes an interface circuit 38 connected to the bus 30.
(2) A movable device 4, connected to the interface circuit 38, that moves the object to be photographed, for example in three dimensions, is further provided.
Hereinafter, the differences will be described in detail.
FIG. 13 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the third embodiment of the present invention. The three-dimensional image measurement apparatus according to the third embodiment differs from that according to the first embodiment in FIG. 1 in the following points.
(1) The information processing device 3 further includes an interface circuit 39.
(2) A communication device 5 connected to the interface circuit 39 is further provided.
(3) A communication device 6 connected to the communication device 5 via a communication line 9 is further provided.
(4) An illumination device 7, connected to the communication device 6, that illuminates the object is further provided.
Hereinafter, the above differences will be described in detail.
FIG. 14 is a flowchart showing the three-dimensional image measurement process executed by the information processing device 3 of the three-dimensional image measurement apparatus in FIG. 13. The process of FIG. 14 differs from the process of FIG. 9 in the following points.
(1) In place of step S1, the process includes step S31: "weaken the irradiation intensity of the illumination device 7 to be equal to or less than a predetermined threshold value Th1, or set the posture accordingly".
(2) In place of step S5, the process includes step S32: "strengthen the irradiation intensity of the illumination device 7 to be equal to or greater than a predetermined threshold value Th2 (> Th1), or set the posture accordingly".
Hereinafter, the above differences will be described in detail.
FIG. 15 is a block diagram showing the hardware configuration of the three-dimensional image measurement apparatus according to the fourth embodiment of the present invention. The three-dimensional image measurement apparatus according to the fourth embodiment differs from that according to the second embodiment in FIG. 12 in the following point.
(1) In place of the movable device 4, a movable control device 8 is provided that has the functions of the movable device 4 as well as a function of controlling the operations of the camera 1 and the projector 2.
Hereinafter, the above difference will be described in detail.
Modified example. In the above embodiments, the camera 1 is used; however, the present invention is not limited to this, and any sensor capable of capturing an image of the object, such as a camera, a CCD sensor, or an imaging sensor, may be used.
Claims (15)
- 1. A three-dimensional image measurement apparatus that generates, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values, the apparatus comprising control means for complementing pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
- 2. The three-dimensional image measurement apparatus according to claim 1, wherein the control means realizes the different photographing conditions by changing at least one of: the exposure time of the sensor, the aperture value of the sensor, the gain of the sensor, the light projection intensity of the projector, the light projection direction of the projector, the position of the ambient light at the time of shooting, and the direction of the ambient light at the time of shooting.
- 3. The three-dimensional image measurement apparatus according to claim 2, wherein the control means: generates positive pattern image data and negative pattern image data, which are a pair of image data of pattern images photographed by the sensor under the different photographing conditions, obtained by mutually inverting the light and dark of a pattern using a spatial coding method; generates difference image data whose pixel values are the absolute values of the differences between the positive pattern image data and the negative pattern image data; and, only when the pixel value of the difference image data is not 0 or is higher than a predetermined threshold value, interpolates a pixel whose pixel value is unknown in the distance image obtained from the pattern image photographed by the sensor with the pixel value of the distance image obtained from the pattern image photographed by the sensor under the different conditions.
- 4. The three-dimensional image measurement apparatus according to claim 3, wherein the generated positive pattern image and negative pattern image include a high-frequency-band pattern.
- 5. The three-dimensional image measurement apparatus according to claim 4, wherein the control means controls at least one of the position and the posture of each of the sensor and the projector based on measurement reliability obtained from the difference between the generated positive pattern image and negative pattern image.
- 6. The three-dimensional image measurement apparatus according to claim 4, wherein the control means interpolates a pixel whose pixel value is unknown in the distance image obtained from a pattern image photographed with a predetermined first exposure time with the pixel value of the distance image obtained from a pattern image photographed with a second exposure time longer than the first exposure time.
- 7. The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising a movable device, connected to the control means, that realizes the different photographing conditions by moving the object based on a command signal from the control means.
- 8. The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising an illumination device, connected to the control means, that illuminates the object and realizes the different photographing conditions by changing the illumination conditions for the object based on a command signal from the control means.
- 9. The three-dimensional image measurement apparatus according to any one of claims 1 to 6, further comprising a movable control device, connected to the control means, the sensor, and the projector, that moves the object and realizes the different photographing conditions by changing at least one of the position and the posture of the sensor and the projector based on a command signal from the control means.
- 10. A three-dimensional image measurement method for a three-dimensional image measurement apparatus having control means that generates, from pattern image data obtained by photographing with a sensor a pattern projected by a projector onto an object to be photographed, distance image data in which the distance from the sensor to the object is represented by pixel values, the method comprising a step in which the control means complements pixels whose pixel values are unknown in the distance image data obtained from pattern image data photographed by the sensor with pixel values of a distance image obtained from pattern image data photographed by the sensor under different photographing conditions.
- 11. The three-dimensional image measurement method according to claim 10, wherein, in the complementing step, the control means realizes the different photographing conditions by changing at least one of: the exposure time of the sensor, the aperture value of the sensor, the gain of the sensor, the light projection intensity of the projector, the light projection direction of the projector, the position of the ambient light at the time of shooting, and the direction of the ambient light at the time of shooting.
- 12. The three-dimensional image measurement method according to claim 11, wherein, in the complementing step, the control means: generates positive pattern image data and negative pattern image data, which are a pair of image data of pattern images photographed by the sensor under the different photographing conditions, obtained by mutually inverting the light and dark of a pattern using a spatial coding method; generates difference image data whose pixel values are the absolute values of the differences between the positive pattern image data and the negative pattern image data; and, only when the pixel value of the difference image data is not 0 or is higher than a predetermined threshold value, interpolates a pixel whose pixel value is unknown in the distance image obtained from the pattern image photographed by the sensor with the pixel value of the distance image obtained from the pattern image photographed by the sensor under the different conditions.
- 13. The three-dimensional image measurement method according to claim 12, wherein the generated positive pattern image and negative pattern image include a high-frequency-band pattern.
- 14. The three-dimensional image measurement method according to claim 13, wherein, in the complementing step, the control means controls at least one of the position and the posture of each of the sensor and the projector based on measurement reliability obtained from the difference between the generated positive pattern image and negative pattern image.
- 15. The three-dimensional image measurement method according to claim 13, wherein, in the complementing step, the control means interpolates a pixel whose pixel value is unknown in the distance image obtained from a pattern image photographed with a predetermined first exposure time with the pixel value of the distance image obtained from a pattern image photographed with a second exposure time longer than the first exposure time.
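The spatial coding method referred to in claims 3 and 12 assigns each projector column a binary code across successive projected patterns, with each pattern also projected in inverted form so that bits can be decided by comparing the pair. The following is a minimal decoding sketch under that common realization; the claims themselves do not fix the exact code or binarization rule:

```python
import numpy as np

def decode_spatial_code(pos_images, neg_images):
    """Each bit of a pixel's code is 1 where the positive pattern is
    brighter than its inverted (negative) counterpart."""
    h, w = pos_images[0].shape
    code = np.zeros((h, w), dtype=int)
    for pos, neg in zip(pos_images, neg_images):
        code = (code << 1) | (pos > neg).astype(int)
    return code

# Two bit-planes over a 1x4 image: columns receive codes 0..3 left to right.
pos = [np.array([[0, 0, 255, 255]]), np.array([[0, 255, 0, 255]])]
neg = [np.array([[255, 255, 0, 0]]), np.array([[255, 0, 255, 0]])]
print(decode_spatial_code(pos, neg))   # [[0 1 2 3]]
```

The decoded code identifies which projector column illuminated each pixel, from which triangulation yields the distance.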
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580084658.8A CN108369089B (en) | 2015-11-25 | 2015-11-25 | 3D image measuring device and method |
JP2016546864A JP6038415B1 (en) | 2015-11-25 | 2015-11-25 | 3D image measuring apparatus and method |
PCT/JP2015/083036 WO2017090111A1 (en) | 2015-11-25 | 2015-11-25 | Three-dimensional image measurement device and method |
DE112015007146.6T DE112015007146T5 (en) | 2015-11-25 | 2015-11-25 | DEVICE AND METHOD FOR THREE-DIMENSIONAL IMAGE MEASUREMENT |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/083036 WO2017090111A1 (en) | 2015-11-25 | 2015-11-25 | Three-dimensional image measurement device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017090111A1 true WO2017090111A1 (en) | 2017-06-01 |
Family
ID=57483112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/083036 WO2017090111A1 (en) | 2015-11-25 | 2015-11-25 | Three-dimensional image measurement device and method |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP6038415B1 (en) |
CN (1) | CN108369089B (en) |
DE (1) | DE112015007146T5 (en) |
WO (1) | WO2017090111A1 (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10488192B2 (en) | 2015-05-10 | 2019-11-26 | Magik Eye Inc. | Distance sensor projecting parallel patterns |
JP7133554B2 (en) | 2016-12-07 | 2022-09-08 | マジック アイ インコーポレイテッド | Range sensor with adjustable focus image sensor |
EP3692396A4 (en) | 2017-10-08 | 2021-07-21 | Magik Eye Inc. | Distance measurement using a longitudinal grid pattern |
EP3692501A4 (en) | 2017-10-08 | 2021-07-07 | Magik Eye Inc. | Calibrating a sensor system including multiple movable sensors |
US10679076B2 (en) | 2017-10-22 | 2020-06-09 | Magik Eye Inc. | Adjusting the projection system of a distance sensor to optimize a beam layout |
JP7354133B2 (en) | 2018-03-20 | 2023-10-02 | マジック アイ インコーポレイテッド | Camera exposure adjustment for 3D depth sensing and 2D imaging |
US11062468B2 (en) | 2018-03-20 | 2021-07-13 | Magik Eye Inc. | Distance measurement using projection patterns of varying densities |
EP3803266A4 (en) | 2018-06-06 | 2022-03-09 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11475584B2 (en) | 2018-08-07 | 2022-10-18 | Magik Eye Inc. | Baffles for three-dimensional sensors having spherical fields of view |
JP7252755B2 (en) * | 2018-12-27 | 2023-04-05 | 株式会社小糸製作所 | Active sensors, object identification systems, vehicles, vehicle lighting |
WO2020150131A1 (en) | 2019-01-20 | 2020-07-23 | Magik Eye Inc. | Three-dimensional sensor including bandpass filter having multiple passbands |
JP7028814B2 (en) * | 2019-02-07 | 2022-03-02 | ファナック株式会社 | External shape recognition device, external shape recognition system and external shape recognition method |
WO2020165976A1 (en) * | 2019-02-13 | 2020-08-20 | 三菱電機株式会社 | Simulation device, simulation method, and simulation program |
WO2020197813A1 (en) | 2019-03-25 | 2020-10-01 | Magik Eye Inc. | Distance measurement using high density projection patterns |
WO2020231747A1 (en) | 2019-05-12 | 2020-11-19 | Magik Eye Inc. | Mapping three-dimensional depth map data onto two-dimensional images |
CN114503543A (en) * | 2019-09-26 | 2022-05-13 | 株式会社小糸制作所 | Door-controlled camera, automobile, vehicle lamp, image processing device, and image processing method |
WO2021113135A1 (en) | 2019-12-01 | 2021-06-10 | Magik Eye Inc. | Enhancing triangulation-based three-dimensional distance measurements with time of flight information |
WO2021138139A1 (en) | 2019-12-29 | 2021-07-08 | Magik Eye Inc. | Associating three-dimensional coordinates with two-dimensional feature points |
WO2021138677A1 (en) | 2020-01-05 | 2021-07-08 | Magik Eye Inc. | Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera |
CN112272435B (en) * | 2020-10-14 | 2023-03-14 | 四川长虹网络科技有限责任公司 | Light control system and method for indoor photography |
KR102627422B1 (en) * | 2022-04-14 | 2024-01-18 | 부산대학교 산학협력단 | Polarized quantum rod light emitting device using langmuir-blodgett technique and the method for manufacturing thereof |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07225834A (en) * | 1994-02-10 | 1995-08-22 | Matsushita Electric Ind Co Ltd | Picture noise detector |
JPH0921620A (en) * | 1995-07-05 | 1997-01-21 | Fuji Facom Corp | Method for measuring three-dimensional shape of object |
JP2006275529A (en) * | 2005-03-28 | 2006-10-12 | Citizen Watch Co Ltd | Three-dimensional shape measuring method and measuring device |
JP2009222399A (en) * | 2008-03-13 | 2009-10-01 | Nikon Corp | Image gain adjusting device and method, and three-dimensional shape measuring instrument |
JP2009222418A (en) * | 2008-03-13 | 2009-10-01 | Aisin Seiki Co Ltd | Uneven surface inspection apparatus |
JP2009264862A (en) * | 2008-04-24 | 2009-11-12 | Panasonic Electric Works Co Ltd | Three-dimensional shape measuring method and device |
JP2011002416A (en) * | 2009-06-22 | 2011-01-06 | Nikon Corp | Three-dimensional shape measuring device |
JP4889373B2 (en) * | 2006-05-24 | 2012-03-07 | ローランドディー.ジー.株式会社 | Three-dimensional shape measuring method and apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19928341C2 (en) * | 1999-06-21 | 2002-06-20 | Inb Vision Ag | Method for three-dimensional optical measurement of object surfaces |
JP5290233B2 (en) * | 2010-04-13 | 2013-09-18 | Ckd株式会社 | Three-dimensional measuring device and substrate inspection device |
JP5864950B2 (en) * | 2011-08-15 | 2016-02-17 | キヤノン株式会社 | Three-dimensional measuring apparatus, three-dimensional measuring method and program |
-
2015
- 2015-11-25 DE DE112015007146.6T patent/DE112015007146T5/en active Pending
- 2015-11-25 JP JP2016546864A patent/JP6038415B1/en active Active
- 2015-11-25 CN CN201580084658.8A patent/CN108369089B/en active Active
- 2015-11-25 WO PCT/JP2015/083036 patent/WO2017090111A1/en active Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019185624A1 (en) * | 2018-03-30 | 2019-10-03 | Koninklijke Philips N.V. | System and method for 3d scanning |
US10935376B2 (en) | 2018-03-30 | 2021-03-02 | Koninklijke Philips N.V. | System and method for 3D scanning |
JP2021512430A (en) * | 2018-03-30 | 2021-05-13 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Systems and methods for 3D scanning |
JP2020122664A (en) * | 2019-01-29 | 2020-08-13 | 株式会社キーエンス | Three-dimensional measurement device |
JP7164451B2 (en) | 2019-01-29 | 2022-11-01 | 株式会社キーエンス | Three-dimensional measuring device |
TWI724594B (en) * | 2019-10-29 | 2021-04-11 | 鑑微科技股份有限公司 | Apparatus for three-dimensional measurement |
WO2024062809A1 (en) * | 2022-09-21 | 2024-03-28 | ソニーセミコンダクタソリューションズ株式会社 | Optical detecting device, and optical detecting system |
Also Published As
Publication number | Publication date |
---|---|
CN108369089B (en) | 2020-03-24 |
CN108369089A (en) | 2018-08-03 |
JPWO2017090111A1 (en) | 2017-11-24 |
DE112015007146T5 (en) | 2018-08-02 |
JP6038415B1 (en) | 2016-12-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6038415B1 (en) | 3D image measuring apparatus and method |
EP3552180B1 (en) | Distance sensor including adjustable focus imaging sensor | |
CN108718373B (en) | Image device | |
EP3198852B1 (en) | Image processing apparatus and control method thereof | |
JP5108093B2 (en) | Imaging apparatus and imaging method | |
US10928518B2 (en) | Range image generation apparatus and range image generation method | |
JP7371443B2 (en) | 3D measuring device | |
CN109831660A (en) | Depth image acquisition method, depth image obtaining module and electronic equipment | |
US11006087B2 (en) | Image synthesizing device and image synthesizing method | |
US10713810B2 (en) | Information processing apparatus, method of controlling information processing apparatus, and storage medium | |
US11803982B2 (en) | Image processing device and three-dimensional measuring system | |
JP2020144136A (en) | Depth sensing systems and methods | |
JP6377295B2 (en) | Distance measuring device and distance measuring method | |
US10542875B2 (en) | Imaging device, endoscope apparatus, and imaging method | |
JP2021044710A (en) | Image processing apparatus, image processing method and program | |
US11747135B2 (en) | Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor | |
JP2008128771A (en) | Apparatus and method for simultaneously acquiring spectroscopic information and shape information | |
JP7228294B2 (en) | Projector control device, projector, projection system, projection method and program | |
US11501408B2 (en) | Information processing apparatus, information processing method, and program | |
JP7028814B2 (en) | External shape recognition device, external shape recognition system and external shape recognition method | |
JP2012085093A (en) | Imaging device and acquisition method | |
US20200314310A1 (en) | Moving Object Imaging Device and Moving Object Imaging Method | |
CN115150545B (en) | Measurement system for acquiring three-dimensional measurement points | |
WO2021084892A1 (en) | Image processing device, image processing method, image processing program, and image processing system | |
JP2016205928A (en) | Self position calculation device and self position calculation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2016546864; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15909230; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 112015007146; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15909230; Country of ref document: EP; Kind code of ref document: A1 |