WO2013153661A1 - Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium - Google Patents


Info

Publication number
WO2013153661A1
WO2013153661A1 (PCT/JP2012/060112)
Authority
WO
WIPO (PCT)
Prior art keywords
image
measurement object
light pattern
measurement
transmission device
Prior art date
Application number
PCT/JP2012/060112
Other languages
English (en)
Japanese (ja)
Inventor
Ryoji Noguchi (野口 良司)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation
Priority to PCT/JP2012/060112 priority Critical patent/WO2013153661A1/fr
Priority to JP2014509993A priority patent/JP5795431B2/ja
Publication of WO2013153661A1 publication Critical patent/WO2013153661A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2504 Calibration devices

Definitions

  • the present invention relates to a three-dimensional measurement technique.
  • As in Patent Document 1, techniques are known for performing three-dimensional measurement of an object as a subject based on an image taken by a camera.
  • Patent Document 1 and Non-Patent Document 1 disclose a technique in which a plurality of light patterns are projected onto a measurement object, and the three-dimensional position of the measurement object is measured based on an image of the measurement object onto which these light patterns are projected.
  • Patent Document 2 discloses a technique for improving image quality in a video conference system or the like in which a person is photographed through a screen while an image is projected and displayed on that screen: a noise signal generated based on the characteristics of the screen is subtracted from the captured image.
  • When a measurement object is photographed through a transmission device, a part of the light pattern to be projected is also projected and displayed on the transmission device.
  • Depending on the characteristics of the transmission device, this can reduce the accuracy of the three-dimensional measurement. Further, when processing is performed to remove the light pattern projected and displayed on the transmission device, the luminance of the light pattern projected on the measurement object is also lowered.
  • The present invention has been made to solve the above-described problems, and its main object is to provide a three-dimensional measuring apparatus capable of suitably performing three-dimensional measurement of a measurement object even when a light pattern is projected onto the measurement object and the measurement object is photographed via a transmission device.
  • The invention according to claim 1 is a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected via a transmission device, comprising: image acquisition means for acquiring an image of the measurement object photographed through the transmission device; removal means for removing the light pattern projected and displayed on the transmission device from the image of the measurement object; and brightness correction means for detecting the display area of the measurement object irradiated with the light pattern and correcting the brightness of that display area.
  • The invention according to claim 10 is a control method executed by a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected through a transmission device, comprising: an image acquisition step of acquiring an image of the measurement object photographed through the transmission device; a removal step of removing the light pattern projected and displayed on the transmission device from the image of the measurement object; and a luminance correction step of detecting, in the image of the measurement object, the display area of the measurement object irradiated with the light pattern and correcting the luminance of that display area.
  • The invention according to claim 11 is a program executed by a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected via a transmission device, the program causing the three-dimensional measurement apparatus to function as: an image acquisition unit that acquires an image of the measurement object photographed through the transmission device; a removal unit that removes the light pattern projected and displayed on the transmission device from the image of the measurement object; and a luminance correction unit that detects, in the image of the measurement object, the display area of the measurement object irradiated with the light pattern and corrects the luminance of that display area.
  • The drawings show the functional blocks of the three-dimensional measuring device, together with the images that the projection pattern processing unit refers to and produces.
  • In one aspect, a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected via a transmission device includes: image acquisition means for acquiring an image of the measurement object photographed via the transmission device; removal means for removing the light pattern projected and displayed on the transmission device from the image of the measurement object; and brightness correction means for detecting the display area of the measurement object irradiated with the light pattern and correcting the brightness of that display area.
  • The three-dimensional measurement apparatus includes an image acquisition unit, a removal unit, and a luminance correction unit, and performs three-dimensional measurement of a measurement object onto which a light pattern is projected via a transmission device.
  • The image acquisition unit acquires an image of the measurement object photographed through the transmission device.
  • The removal unit removes the light pattern projected and displayed on the transmission device from the image of the measurement object. This prevents the transmission device from being erroneously detected as the measurement object.
  • The brightness correction unit detects the display area of the measurement object irradiated with the light pattern and corrects the brightness of that area. In this way, even though removing the light pattern projected on the transmission device darkens the display area of the measurement object irradiated with the light pattern, the three-dimensional measurement apparatus can suppress the resulting decrease in measurement accuracy.
  • In a preferred embodiment, the three-dimensional measuring apparatus detects, in the image of the measurement object, the overlapping portion of the area from which the light pattern was removed and the display area of the measurement object as the display area of the measurement object irradiated with the light pattern. The three-dimensional measuring apparatus can thereby reliably recognize the display area of the measurement object irradiated with the light pattern.
  • The luminance correction unit corrects the luminance of the display area of the measurement object irradiated with the light pattern based on the luminance before processing by the removal unit.
  • The three-dimensional measuring device thereby increases the difference between the luminance of the display area of the measurement object irradiated with the light pattern and the luminance of the display area of the measurement object not irradiated with the light pattern, so that three-dimensional measurement can be performed with high accuracy.
  • In another embodiment, the brightness correction unit corrects the brightness of the display area of the measurement object irradiated with the light pattern based on the brightness of the display area of the measurement object not irradiated with the light pattern. In this mode as well, the three-dimensional measurement apparatus increases the difference between the two luminances and can perform three-dimensional measurement with high accuracy.
  • The removal unit includes projection pattern processing means that generates an image showing the light pattern projected and displayed on the transmission device, and difference processing means that performs difference processing between the image of the measurement object and the image generated by the projection pattern processing means.
  • difference processing refers to processing for removing a component of a predetermined image from a target image, for example, processing for subtracting each pixel value of the latter image from each pixel value of the former image.
  • the removal means can preferably remove the light pattern projected and displayed on the transmission device from the image of the measurement object.
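As an illustration, the per-pixel subtraction that this difference processing performs can be sketched with NumPy; the function name and the clamp-at-zero behavior are assumptions made for the sketch, not details taken from the patent:

```python
import numpy as np

def difference_process(target: np.ndarray, removed: np.ndarray) -> np.ndarray:
    """Remove the component of `removed` from `target` by subtracting each
    pixel value of the latter from the former, clamping at 0 so that pixel
    values stay in the valid uint8 range (an assumption of this sketch)."""
    diff = target.astype(np.int16) - removed.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Toy 1x2 RGB images: the pattern is bright in the first pixel only.
target = np.array([[[200, 200, 200], [50, 50, 50]]], dtype=np.uint8)
pattern = np.array([[[150, 150, 150], [0, 0, 0]]], dtype=np.uint8)
result = difference_process(target, pattern)  # first pixel drops to 50
```

Working in a wider signed type before clamping avoids uint8 underflow where the pattern is brighter than the target pixel.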
  • The projection pattern processing unit may generate the image showing the light pattern by performing difference processing between an image of the transmission device onto which the light pattern is projected with no measurement object present, and an image of the transmission device onto which the light pattern is not projected, likewise with no measurement object present.
  • the removing unit can generate an image showing the light pattern projected and displayed on the transmission device, and can remove the light pattern projected and displayed on the transmission device from the image of the measurement object.
  • The difference processing unit may perform the difference processing upon detecting that the luminance of the light pattern projected and displayed on the transmission device in the captured image is smaller than the luminance of the light pattern in the image generated by the projection pattern processing means, which was captured in a state where the measurement object does not exist.
  • Depending on the conditions at the time of shooting, the brightness of the light pattern projected and displayed on the transmission device may differ from the brightness of the generated light pattern. Even in such a case, the three-dimensional measurement apparatus can suitably remove the light pattern projected and displayed on the transmission device from the image of the measurement object.
  • In another embodiment, the removal unit further removes, from the image of the measurement object, the light pattern projected and displayed on a background object of the measurement object, and the luminance correction unit detects the display area of the measurement object irradiated with the light pattern within the area of the image processed by the removal unit, and corrects the luminance of that display area.
  • the three-dimensional measurement apparatus can correct the luminance of the display area of the measurement object irradiated with the light pattern even when the light pattern is irradiated on the background object of the measurement object.
  • In another preferred embodiment, a three-dimensional measurement system includes the transmission device, the three-dimensional measurement apparatus described above, a projector that emits the light pattern onto the measurement object via the transmission device, and a camera that transmits an image obtained by photographing the measurement object via the transmission device to the three-dimensional measurement apparatus.
  • With this system, the three-dimensional measurement apparatus can suppress the decrease in measurement accuracy that would otherwise result from the display area of the measurement object irradiated with the light pattern becoming dark when the light pattern projected on the transmission device is removed.
  • In another aspect, a control method is executed by a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected via a transmission device, the method comprising: an image acquisition step of acquiring an image of the measurement object photographed through the transmission device; a removal step of removing the light pattern projected and displayed on the transmission device from the image of the measurement object; and a luminance correction step of detecting, in the image of the measurement object, the display area of the measurement object irradiated with the light pattern and correcting the luminance of that display area.
  • By executing this control method, the three-dimensional measurement apparatus can suppress the decrease in measurement accuracy that would otherwise result from the display area of the measurement object irradiated with the light pattern becoming dark when the light pattern projected on the transmission device is removed.
  • In another aspect, a program is executed by a three-dimensional measurement apparatus that performs three-dimensional measurement of a measurement object onto which a light pattern is projected through a transmission device, the program causing the three-dimensional measurement apparatus to function as: image acquisition means for acquiring an image of the measurement object photographed through the transmission device; removal means for removing the light pattern projected and displayed on the transmission device from the image of the measurement object; and luminance correction means for detecting, in the image of the measurement object, the display area of the measurement object irradiated with the light pattern and correcting the luminance of that display area.
  • By executing this program, the three-dimensional measurement device can suppress the decrease in measurement accuracy caused by the display area of the measurement object irradiated with the light pattern becoming dark when the light pattern projected on the transmission device is removed.
  • the program is stored in a storage medium.
  • FIG. 1 is a schematic configuration diagram of a three-dimensional measurement system according to the present embodiment.
  • the three-dimensional measurement system includes a transmission device 1, a projector 2, a camera 3, a three-dimensional measurement device 4, and a measurement object 5.
  • the transmission device 1 is a transparent device such as a transparent screen, a transparent display, or transparent glass.
  • the transmission device 1 may be a device that can switch between a transmission state in which light is transmitted and a shielding state in which light is blocked. In this case, the transmissive device 1 is in a transmissive state when the measurement object 5 is measured.
  • the projector 2 irradiates the measurement object 5 with a plurality of light patterns for three-dimensional measurement.
  • The light pattern is, for example, striped, and may be any of various patterns used in pattern projection methods for three-dimensional measurement. For example, when the three-dimensional position is measured by a spatial encoding method, the projector 2 projects, in order, striped light patterns whose light/dark stripe width is successively halved.
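As a hedged illustration of such a pattern set, the sketch below generates plain binary stripe patterns whose stripe width halves from one pattern to the next; practical spatial-encoding systems often use Gray codes instead, and the plain binary choice here is an assumption made for simplicity:

```python
import numpy as np

def stripe_patterns(width: int, n_bits: int) -> list:
    """Return n_bits one-row stripe patterns for a projector `width` pixels
    wide. Pattern k encodes bit k (MSB first) of each column's n-bit code,
    so the stripe width halves from one pattern to the next."""
    cols = np.arange(width)
    patterns = []
    for k in range(n_bits):
        bit = (cols >> (n_bits - 1 - k)) & 1   # extract bit, MSB first
        patterns.append((bit * 255).astype(np.uint8))
    return patterns

pats = stripe_patterns(8, 3)
# pats[0] has one bright half (stripe width 4); pats[2] alternates every pixel
```

Each row would be tiled vertically to form the full projected frame.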
  • the camera 3 is electrically connected to the three-dimensional measuring device 4 and transmits the captured image (also referred to as “captured image Im”) to the three-dimensional measuring device 4.
  • the camera 3 photographs the measurement object 5 irradiated with the light pattern by the projector 2 via the transmission device 1 in the transmission state.
  • The three-dimensional measuring apparatus 4 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like (not shown), and calculates the three-dimensional coordinates of the measurement object 5 based on the captured image Im acquired from the camera 3. Even when the transmissive device 1 is in the transmissive state, the light pattern emitted from the projector 2 may appear on the transmissive device 1. Therefore, the three-dimensional measurement apparatus 4 removes the light pattern projected and displayed on the transmission device 1 from the captured image Im, and acquires the three-dimensional coordinates of the measurement object 5 with high accuracy.
  • the CPU of the three-dimensional measuring device 4 functions as “image acquisition means”, “removal means”, and “luminance correction means” in the present invention.
  • FIG. 2 shows functional blocks of the three-dimensional measuring apparatus 4.
  • the three-dimensional measurement apparatus 4 includes a projection pattern processing unit 11, a difference processing unit 12, a luminance correction processing unit 13, and a three-dimensional coordinate calculation processing unit 14.
  • The projection pattern processing unit 11 generates an image showing the light pattern projected and displayed on the transmission device 1 (also referred to as the “projection pattern image Ima”) before the measurement object 5 is measured. To do so, the projection pattern processing unit 11 performs difference processing between a captured image Im in which the light pattern is projected onto the transmission device 1 in the transmissive state with no measurement object 5 present, and a captured image Im in which the light pattern is not projected, likewise with no measurement object 5 present.
  • the generation time of the captured image Im to be processed by the projection pattern processing unit 11 is also referred to as “reference time”.
  • The difference processing unit 12 generates an image obtained by removing the light pattern component from the captured image Im at the time of measurement (also referred to as the “pattern removal image Imb”). To do so, the difference processing unit 12 performs difference processing of the projection pattern image Ima obtained by the projection pattern processing unit 11 against the captured image Im at the time of measurement, in which the measurement object 5 appears.
  • The luminance correction processing unit 13 corrects the luminance of the display region of the measurement object 5 irradiated with the light pattern (also referred to as the “target region Rtag”) in the pattern removal image Imb obtained by the difference processing unit 12. As described later, when each pixel value of the projection pattern image Ima is subtracted from each pixel value of the captured image Im at the time of measurement, the luminance of the target region Rtag decreases. Taking this into consideration, the luminance correction processing unit 13 performs processing to increase the luminance of the target region Rtag in the pattern removal image Imb.
  • The three-dimensional coordinate calculation processing unit 14 measures the three-dimensional position of the measurement object 5 based on the pattern removal image Imb after luminance correction by the luminance correction processing unit 13 (also referred to as the “luminance correction image Imc”).
  • In the spatial encoding method, striped light patterns whose light/dark stripe width is successively halved are projected from the projector 2 in order; when n pattern lights are used, the measurement region is divided into 2^n sections according to the projection direction of the pattern light.
  • The three-dimensional coordinate calculation processing unit 14 then obtains distance information for every pixel by the principle of triangulation, from the relationship between the light pattern projection direction corresponding to the n-bit spatial code value uniquely assigned according to the presence or absence of each pattern light, and the measurement direction determined by the pixel position on the luminance correction image Imc.
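A minimal sketch of this decode-and-triangulate step, under assumed conventions (threshold-based binarization, MSB-first code assembly, and ray angles measured from the projector-camera baseline); a real system would substitute calibrated projector and camera geometry:

```python
import math
import numpy as np

def decode_spatial_code(captures, cth):
    """Combine n binarized captures (MSB first) into an n-bit spatial code
    per pixel; `cth` plays the role of the threshold Cth in the text."""
    code = np.zeros(captures[0].shape, dtype=np.int32)
    for img in captures:
        code = (code << 1) | (img > cth).astype(np.int32)
    return code

def triangulate_depth(proj_angle, cam_angle, baseline):
    """Depth of a point from the projector-ray and camera-ray angles
    (radians, measured from the baseline of length `baseline`):
    z = b * tan(a) * tan(b') / (tan(a) + tan(b'))."""
    ta, tb = math.tan(proj_angle), math.tan(cam_angle)
    return baseline * ta * tb / (ta + tb)

# Two 1x2 captures: the pixels decode to codes 0b01 and 0b10.
caps = [np.array([[10, 200]]), np.array([[200, 10]])]
codes = decode_spatial_code(caps, 100)
```

The spatial code of a pixel identifies which of the 2^n projection directions lit it, which fixes `proj_angle` for that pixel.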
  • Pattern removal processing Next, a process (pattern removal process) for removing the light pattern projected and displayed on the transmissive device 1 from the captured image Im generated at the time of measurement will be described.
  • the pattern removal process is executed by the projection pattern processing unit 11 and the difference processing unit 12.
  • FIG. 3A shows an example of the captured image Im at the reference time onto which the light pattern is projected (also referred to as the “reference projection image Imx”); FIG. 3B shows an example of the captured image Im at the reference time onto which the light pattern is not projected (also referred to as the “reference non-projection image Imy”); and FIG. 3C shows an example of the projection pattern image Ima.
  • the projection range 20 of the projector 2 is indicated by a broken line
  • the display area (also referred to as “projection pattern area Rp”) of the light pattern projected on the transmissive device 1 is indicated by an alternate long and short dash line.
  • The projection pattern processing unit 11 performs difference processing of the reference non-projection image Imy (see FIG. 3B), in which the light pattern is not projected and displayed on the transmission device 1, against the reference projection image Imx (see FIG. 3A). Specifically, the projection pattern processing unit 11 subtracts the RGB pixel values of each pixel of the reference non-projection image Imy from the RGB pixel values of the corresponding pixel of the reference projection image Imx. The projection pattern processing unit 11 thereby generates the projection pattern image Ima (see FIG. 3C), an image in which only the projection pattern region Rp is extracted.
  • FIG. 4A shows the captured image Im at the time of measurement, FIG. 4B shows the projection pattern image Ima generated by the projection pattern processing unit 11, and FIG. 4C shows the pattern removal image Imb.
  • The captured image Im at the time of measurement shows the measurement object 5 and the projection pattern region Rp indicating the light pattern projected on the transmission device 1. The target region Rtag, the overlapping region of the measurement object 5 and the projection pattern region Rp, is irradiated with the light pattern transmitted through the transmissive device 1.
  • The difference processing unit 12 performs difference processing of the projection pattern image Ima shown in FIG. 4(b) against the captured image Im at the time of measurement shown in FIG. 4(a), and generates the pattern removal image Imb shown in FIG. 4(c).
  • the difference processing unit 12 generates the pattern removal image Imb by subtracting the RGB component indicated by each pixel of the projection pattern image Ima from the RGB component of each pixel of the captured image Im.
  • In the pattern removal image Imb, the light pattern remains only in the target region Rtag, and the light pattern projected on the transmission device 1 has been removed.
  • the difference processing unit 12 can remove the light pattern projected and displayed on the transmission device 1 and generate the pattern removal image Imb in which only the measurement object 5 is irradiated with the light pattern.
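Putting the two difference-processing stages together, the pattern removal process described above (the reference captures yielding Ima, then the measurement-time capture minus Ima yielding Imb) might look like this; the clamp at zero and the function names are assumptions of the sketch:

```python
import numpy as np

def subtract(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Per-pixel RGB subtraction, clamped to the valid uint8 range."""
    return np.clip(a.astype(np.int16) - b.astype(np.int16), 0, 255).astype(np.uint8)

def pattern_removal(im_meas, imx, imy):
    """imx: reference capture with the light pattern projected (no object);
    imy: reference capture without the pattern; im_meas: measurement-time
    capture. Returns the projection pattern image Ima and the pattern
    removal image Imb."""
    ima = subtract(imx, imy)       # pattern shown on the transmission device
    imb = subtract(im_meas, ima)   # remove that pattern from the capture
    return ima, imb

# Toy 1x2 example: the pattern lights up pixel 0 of the device.
imy = np.full((1, 2, 3), 20, dtype=np.uint8)
imx = imy.copy(); imx[0, 0] = 120
im_meas = np.array([[[180, 180, 180], [60, 60, 60]]], dtype=np.uint8)
ima, imb = pattern_removal(im_meas, imx, imy)
```

Pixel 0 of Imb loses exactly the pattern component (180 − 100 = 80), while pixel 1 passes through unchanged.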
  • Compared with the case where the pattern removal process is not performed, the three-dimensional measurement apparatus 4 can therefore perform three-dimensional measurement of the measurement object 5 without erroneously measuring the transmissive device 1, even when the measurement object 5 is located away from the transmissive device 1.
  • FIG. 5A shows the pattern removal image Imb, FIG. 5B shows the reference non-projection image Imy, and FIG. 5C shows the image “Imb2” obtained by removing the display other than the measurement object 5 from the pattern removal image Imb.
  • The luminance correction processing unit 13 performs difference processing of the reference non-projection image Imy shown in FIG. 5B against the pattern removal image Imb shown in FIG. 5A. The luminance correction processing unit 13 thereby generates the image Imb2 (see FIG. 5C), in which the display other than the measurement object 5, such as the transmission device 1, has been removed from the pattern removal image Imb.
  • Next, the luminance correction processing unit 13 generates an image “Imb3” by extracting only the target region Rtag from the image Imb2. This process is described with reference to FIG. 6: FIG. 6A shows the image Imb2, FIG. 6B shows the projection pattern image Ima, and FIG. 6C shows the image Imb3.
  • The luminance correction processing unit 13 generates the image Imb3 (see FIG. 6C) by setting to “0” the pixel values of all pixels of the image Imb2 (see FIG. 6A) other than those overlapping the projection pattern region Rp of the projection pattern image Ima (see FIG. 6B). Then, the luminance correction processing unit 13 specifies the target region Rtag by binarizing the image Imb3.
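This masking-and-binarization step for locating the target region Rtag can be sketched as follows; the two threshold values are illustrative assumptions:

```python
import numpy as np

def extract_target_region(imb2, ima, rp_thresh=10, bin_thresh=30):
    """Zero every pixel of Imb2 that does not overlap the projection pattern
    region Rp of Ima, then binarize the result to obtain a boolean mask of
    the target region Rtag."""
    rp_mask = ima.mean(axis=2) > rp_thresh       # projection pattern region Rp
    imb3 = imb2.copy()
    imb3[~rp_mask] = 0                           # zero pixels outside Rp
    rtag_mask = imb3.mean(axis=2) > bin_thresh   # binarization
    return imb3, rtag_mask

# Toy example: the pattern covers the left pixel; the object is bright at both.
imb2 = np.full((1, 2, 3), 90, dtype=np.uint8)
ima = np.zeros((1, 2, 3), dtype=np.uint8); ima[0, 0] = 100
imb3, rtag = extract_target_region(imb2, ima)
```

Only the pixel inside Rp survives the masking, so the binarized mask marks exactly the overlap of the object and the pattern.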
  • the brightness correction processing unit 13 generates a brightness correction image Imc in which the brightness of the target region Rtag of the pattern removal image Imb is increased by a predetermined value or a predetermined rate.
  • Here, the luminance correction processing unit 13 sets the predetermined value or the predetermined rate so that the luminance of the target region Rtag in the luminance correction image Imc is equal to or higher than the luminance of the target region Rtag in the captured image Im at the time of measurement. The luminance correction processing unit 13 thereby improves the calculation accuracy of the three-dimensional coordinate position of the measurement object 5.
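The correction itself amounts to applying a predetermined rate (a gain) and/or a predetermined value (an offset) only inside the target region Rtag; the particular gain used below is an arbitrary illustration:

```python
import numpy as np

def correct_luminance(imb, rtag_mask, gain=1.0, offset=0):
    """Increase the luminance of the target region Rtag of the pattern
    removal image Imb by a predetermined rate (`gain`) and/or value
    (`offset`), clipping the result to the uint8 range."""
    imc = imb.astype(np.float32)
    imc[rtag_mask] = imc[rtag_mask] * gain + offset
    return np.clip(imc, 0, 255).astype(np.uint8)

imb = np.full((1, 2, 3), 80, dtype=np.uint8)
rtag = np.array([[True, False]])
imc = correct_luminance(imb, rtag, gain=1.5)   # Rtag pixel 80 -> 120
```

In practice the gain or offset would be chosen so that the corrected Rtag luminance is at least the measurement-time luminance, as the text requires.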
  • The three-dimensional coordinate calculation processing unit 14 determines whether or not the light pattern is projected onto each pixel of the image in which each light pattern is projected and displayed on the measurement object 5, by referring to the luminance of the pixel. In this case, the three-dimensional coordinate calculation processing unit 14 sets a threshold value for determining whether or not the light pattern is projected (also referred to as the “threshold value Cth”), and makes the determination for each pixel.
  • The larger the luminance difference between pixels with and without the projected light pattern, the higher the three-dimensional coordinate calculation processing unit 14 can set the threshold value Cth; this suppresses the erroneous determination described above, so that the area corresponding to each striped light pattern can be determined correctly.
  • the luminance correction processing unit 13 sets the luminance of the target region Rtag of the luminance correction image Imc so as to be equal to or higher than the luminance at the time of actual light pattern projection. Thereby, the brightness correction processing unit 13 can improve the calculation accuracy of the three-dimensional coordinates of the measurement object 5.
  • By executing the above-described processing, the luminance correction processing unit 13 can generate a luminance correction image Imc in which the light pattern is projected on the measurement object 5 while the influence of the transmission device 1 is eliminated, regardless of the brightness of the shooting environment of the camera 3, the distance between the projector 2 and the measurement object 5, or the type of the measurement object 5. This is described supplementarily below.
  • the luminance correction processing unit 13 increases the luminance of the target region Rtag based on the luminance correction processing described above.
  • the luminance correction processing unit 13 can set the threshold value Cth to a larger value as compared with the case where the luminance correction processing is not performed, and can improve the accuracy of the determination processing based on the threshold value Cth. Therefore, the three-dimensional measurement apparatus 4 can suppress a decrease in the accuracy of the three-dimensional measurement even when the shooting environment of the camera 3 is bright.
  • the luminance correction processing unit 13 increases the luminance difference based on the presence or absence of the projection of the light pattern by increasing the luminance of the target region Rtag based on the luminance correction processing described above.
  • the luminance correction processing unit 13 can set the threshold value Cth to a larger value as compared with the case where the luminance correction processing is not performed, and can improve the accuracy of the determination processing based on the threshold value Cth. Therefore, the three-dimensional measuring device 4 can suppress a decrease in the accuracy of the three-dimensional measurement even when the distance between the projector 2 and the measurement object 5 is long.
  • The luminance of the portion irradiated with the light pattern may differ depending on whether the optical characteristics of the measurement object 5 are diffusive or reflective.
  • With a threshold value Cth common to all measurement objects 5, the accuracy of determining whether the light pattern is irradiated on the measurement object 5 based on the threshold value Cth therefore decreases, and the accuracy of the three-dimensional measurement decreases.
  • the luminance correction processing unit 13 increases the luminance difference based on the presence or absence of the projection of the light pattern by increasing the luminance of the target region Rtag based on the luminance correction processing described above.
  • the luminance correction processing unit 13 can set the threshold value Cth to a larger value as compared with the case where the luminance correction processing is not performed, and can improve the accuracy of the determination processing based on the threshold value Cth. Therefore, the three-dimensional measurement apparatus 4 can suppress a decrease in the accuracy of the three-dimensional measurement even when the three-dimensional measurement of the measurement object 5 having different optical characteristics is performed.
  • Therefore, the three-dimensional measurement device 4 needs to perform different processing depending on the optical characteristics of the background object. This processing is described in detail in the next section.
  • the brightness of the projection pattern region Rp varies depending on the optical characteristics of the object that is the background of the measurement object 5.
  • the brightness of the projection pattern region Rp is smaller for a background having reflective optical characteristics (for example, a whiteboard) than for a background having diffuse optical characteristics (for example, cloth or white paper).
  • the three-dimensional measurement apparatus 4 changes the process to be executed based on the magnitude relationship between the luminance of the light pattern of the transmissive device 1 at the reference time and the luminance of the light pattern of the transmissive device 1 at the time of measurement.
  • each of the above-described luminance magnitude relationships is described below on a case-by-case basis.
  • for example, when the average luminance of the projection pattern region Rp in the projection pattern image Ima is larger than the average luminance of the corresponding region of the captured image Im at the time of measurement, the three-dimensional measurement device 4 determines that the luminance of the light pattern of the transmissive device 1 at the reference time is higher than the luminance of the light pattern of the transmissive device 1 at the time of measurement. Similarly, when the average luminance of the projection pattern region Rp in the projection pattern image Ima is smaller than the average luminance of the corresponding region of the captured image Im at the time of measurement, the three-dimensional measurement device 4 determines that the luminance of the light pattern of the transmissive device 1 at the reference time is lower than that at the time of measurement.
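The branch selection just described can be sketched as follows (Python/NumPy). The boolean-mask representation of the projection pattern region Rp and the toy values are assumptions; the specification defines the comparison only in terms of average luminance over Rp:

```python
import numpy as np

def reference_brighter(ima, im_meas, rp_mask):
    """True when the light pattern in the projection pattern image Ima is,
    on average over the projection pattern region Rp, brighter than the
    same region of the captured image Im at the time of measurement."""
    return ima[rp_mask].mean() > im_meas[rp_mask].mean()

# Toy 4x4 images; Rp occupies the left half.
rp = np.zeros((4, 4), dtype=bool)
rp[:, :2] = True
ima = np.full((4, 4), 200.0)      # reference-time pattern is bright
im_meas = np.full((4, 4), 120.0)  # dimmer at measurement time
```

The result of this comparison would then select which of the two processing paths described below is executed.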
  • when the luminance of the light pattern of the transmissive device 1 at the reference time is higher than at the time of measurement, the pattern removal image Imb is generated by subtracting the projection pattern image Ima from each pixel value of the captured image Im at the time of measurement, and the projection pattern region Rp does not remain in the pattern removal image Imb as in FIG. Therefore, in this case, the three-dimensional measurement device 4 executes the pattern removal process and the luminance correction process described above.
  • on the other hand, when the luminance of the light pattern of the transmissive device 1 at the reference time is lower than at the time of measurement, simply subtracting the projection pattern image Ima from each pixel value of the captured image Im at the time of measurement leaves the projection pattern region Rp in the pattern removal image Imb, unlike the case of FIG. 4C. Therefore, in this case, the three-dimensional measurement device 4 corrects the projection pattern image Ima as described below before executing the pattern removal process and the luminance correction process.
  • specifically, the difference processing unit 12 first performs correction to increase the luminance of each pixel in the projection pattern region Rp of the projection pattern image Ima by a predetermined value or at a predetermined rate. The difference processing unit 12 then generates the pattern removal image Imb by subtracting each pixel value of the corrected projection pattern image Ima from the captured image Im at the time of measurement.
  • here, the predetermined value and the predetermined rate are set so that the luminance of each pixel in the projection pattern region Rp of the projection pattern image Ima becomes equal to or higher than the luminance of the corresponding pixel in the projection pattern region Rp of the captured image Im at the time of measurement.
  • the difference processing unit 12 can suitably generate the pattern removal image Imb from which the projection pattern region Rp has been removed. Then, the luminance correction processing unit 13 executes luminance correction processing based on the pattern removal image Imb generated by the difference processing unit 12.
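A minimal sketch of this pre-subtraction boost (Python/NumPy). The clipping to [0, 255], the use of `np.maximum` to enforce the "equal to or higher" condition, and the function and parameter names are assumptions not spelled out in the text:

```python
import numpy as np

def remove_pattern(im_meas, ima, rp_mask, rate=1.5):
    """Raise the projection pattern region Rp of Ima so it is at least as
    bright as in the measurement-time image, then subtract so that Rp is
    removed from the resulting pattern removal image Imb."""
    ima_corr = ima.astype(np.float64).copy()
    boosted = ima_corr[rp_mask] * rate  # boost by a predetermined rate
    ima_corr[rp_mask] = np.maximum(boosted, im_meas[rp_mask])
    return np.clip(im_meas.astype(np.float64) - ima_corr, 0.0, 255.0)

rp = np.zeros((4, 4), dtype=bool)
rp[:, :2] = True
ima = np.where(rp, 100.0, 0.0)       # reference pattern, dimmer than at measurement
im_meas = np.where(rp, 140.0, 30.0)  # pattern brighter at measurement time
imb = remove_pattern(im_meas, ima, rp)
```

Because the boosted Ima is at least as bright as the measurement-time pattern over Rp, the subtraction drives every Rp pixel to zero while leaving the rest of the image untouched.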
  • the luminance correction processing unit 13 may correct the luminance of the target region Rtag based on the luminance of the pixels of the measurement object 5 outside the target region Rtag in the captured image Im at the time of measurement. Specifically, the luminance correction processing unit 13 identifies the pixel with the highest luminance among the pixels of the measurement object 5 outside the target region Rtag, and sets the luminance of the target region Rtag to be higher than that luminance. This also allows the luminance correction processing unit 13 to execute the determination process based on the threshold value Cth with high accuracy.
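This variant might look like the following sketch (Python/NumPy). The fixed margin above the peak luminance is a hypothetical choice; the text only requires the target region to be set higher than the brightest non-target pixel of the object:

```python
import numpy as np

def correct_target_region(imb, rtag_mask, object_mask, margin=10.0):
    """Raise the target region Rtag above the highest-luminance pixel of the
    measurement object outside Rtag (margin is a hypothetical offset)."""
    out = imb.astype(np.float64).copy()
    others = object_mask & ~rtag_mask
    peak = out[others].max()
    out[rtag_mask] = np.maximum(out[rtag_mask], peak + margin)
    return out

imb = np.array([[50.0, 80.0],
                [60.0, 90.0]])
obj = np.ones((2, 2), dtype=bool)   # the whole toy image is the object here
rtag = np.zeros((2, 2), dtype=bool)
rtag[:, 0] = True                   # Rtag is the left column
corrected = correct_target_region(imb, rtag, obj)
```

Every Rtag pixel ends up above the brightest non-Rtag object pixel (90 here), so a threshold Cth between the two groups separates them reliably.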
  • (Modification 2) In the embodiment described above, the three-dimensional measurement device 4 generates the pattern removal image Imb from the captured image Im at the time of measurement by the pattern removal process, then recognizes the target region Rtag in the luminance correction process and raises the luminance of the target region Rtag of the pattern removal image Imb.
  • the processing procedure to which the present invention is applicable is not limited to this.
  • for example, in the luminance correction process, the three-dimensional measurement device 4 first recognizes the target region Rtag and then lowers the luminance of the target region Rtag of the projection pattern image Ima. Then, in the pattern removal process, the three-dimensional measurement device 4 generates the pattern removal image Imb by subtracting the luminance-reduced projection pattern image Ima from the captured image Im at the time of measurement. In this case, the luminance of the target region Rtag of the pattern removal image Imb has already been raised by the amount by which the luminance of the target region Rtag of the projection pattern image Ima was lowered. Therefore, the three-dimensional coordinate calculation processing unit 14 performs the three-dimensional measurement process directly on this pattern removal image Imb.
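The order-swapped procedure of Modification 2 can be sketched as follows (Python/NumPy; the reduction factor and the function name are hypothetical):

```python
import numpy as np

def pattern_removal_mod2(im_meas, ima, rtag_mask, reduction=0.5):
    """Modification 2: lower the luminance of the target region Rtag in the
    projection pattern image Ima first; the subsequent subtraction then
    leaves Rtag correspondingly brighter in the pattern removal image Imb."""
    ima_low = ima.astype(np.float64).copy()
    ima_low[rtag_mask] *= reduction
    return np.clip(im_meas.astype(np.float64) - ima_low, 0.0, 255.0)

ima = np.full((2, 2), 100.0)
im_meas = np.full((2, 2), 180.0)
rtag = np.zeros((2, 2), dtype=bool)
rtag[:, 0] = True
imb = pattern_removal_mod2(im_meas, ima, rtag)
```

Subtracting the reduced value (50 instead of 100) in Rtag leaves those pixels 50 units brighter than the rest, which is exactly the post-subtraction boost of the main embodiment applied in advance.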
  • in this way, by recognizing the target region Rtag, which is the projection region of the light pattern on the measurement object 5, the three-dimensional measurement device 4 can increase the luminance of the target region Rtag and improve the three-dimensional measurement accuracy.
  • when the projection pattern processing unit 11 generates the projection pattern image Ima by subtracting the reference-time non-projection image Imy from the reference-time projection image Imx, the projection pattern image Ima shows, in addition to the projection pattern region Rp, the portion of the background object illuminated by the irradiation of the light pattern. Therefore, in this case, when the difference processing unit 12 subtracts the projection pattern image Ima from the captured image Im at the time of measurement to generate the pattern removal image Imb, there is a problem that the luminance of the illuminated portion of the background object is also lowered in the pattern removal image Imb, in addition to the projection pattern region Rp.
  • in this case, as in the embodiment described above, the luminance correction processing unit 13 determines the overlapping portion of the pattern removal image Imb generated by the difference processing unit 12 and the projection pattern image Ima as the target region Rtag, and increases the luminance of the target region Rtag in the pattern removal image Imb. In this way, the luminance correction processing unit 13 can increase the luminance not only of the portion where the measurement object 5 overlaps the projection pattern region Rp but also of the portion of the measurement object 5 illuminated by the irradiation of the light pattern.
  • accordingly, the three-dimensional measurement device 4 can appropriately adjust the luminance of the portion of the measurement object 5 irradiated with the light pattern even when the background object is photographed brightly due to the irradiation of the light pattern.
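One way to realize the overlap-based recognition of the target region Rtag is sketched below (Python/NumPy). The binarization thresholds are assumptions, since the text does not specify how the overlap of Imb and Ima is detected:

```python
import numpy as np

def target_region(imb, ima, obj_thresh=20.0, pat_thresh=20.0):
    """Rtag: pixels where the measurement object remains visible in the
    pattern removal image Imb AND the light pattern is present in the
    projection pattern image Ima."""
    return (imb > obj_thresh) & (ima > pat_thresh)

imb = np.array([[50.0, 0.0],
                [50.0, 0.0]])   # object occupies the left column of Imb
ima = np.array([[100.0, 100.0],
                [0.0, 0.0]])    # pattern occupies the top row of Ima
rtag = target_region(imb, ima)
```

Only the pixel where both masks agree, the top-left corner of this toy example, is recognized as Rtag, and its luminance would then be raised as described above.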
  • the projection pattern processing unit 11 acquires the reference-time projection image Imx from the camera 3 for each light pattern, and generates and stores the projection pattern image Ima for each light pattern. Instead, the projection pattern processing unit 11 may generate and store the other projection pattern images Ima (also referred to as "other pattern images") based on a projection pattern image Ima on which a predetermined light pattern is displayed (also referred to as the "reference pattern image").
  • the projection pattern processing unit 11 recognizes the position of the projection pattern region Rp to be displayed on each of the other pattern images by storing in advance information on the shape and arrangement of each light pattern in the projection range 20. Then, the projection pattern processing unit 11 sets each pixel value of the projection pattern region Rp to be displayed on each of the other pattern images based on the pixel value of the projection pattern region Rp displayed on the reference pattern image.
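A sketch of generating another pattern image from the reference pattern image (Python/NumPy). Using the mean pixel value of the reference region is an assumption; the text only says the pixel values of the other pattern images are set "based on" the reference pattern image, and the mask arguments stand in for the stored shape/arrangement information:

```python
import numpy as np

def other_pattern_image(ref_image, ref_rp_mask, other_rp_mask):
    """Paint the stored placement of another light pattern (other_rp_mask)
    with a representative pixel value taken from the projection pattern
    region Rp of the reference pattern image."""
    value = ref_image[ref_rp_mask].mean()
    out = np.zeros_like(ref_image, dtype=np.float64)
    out[other_rp_mask] = value
    return out

ref = np.array([[200.0, 200.0],
                [0.0, 0.0]])
ref_mask = np.zeros((2, 2), dtype=bool)
ref_mask[0, :] = True      # Rp of the reference pattern: top row
other_mask = np.zeros((2, 2), dtype=bool)
other_mask[:, 0] = True    # stored placement of another pattern: left column
other_img = other_pattern_image(ref, ref_mask, other_mask)
```

This avoids capturing a separate reference-time projection image Imx for every light pattern, which is the stated benefit of the modification.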
  • in the embodiment described above, the projection pattern processing unit 11 generates the projection pattern image Ima based on the reference-time projection image Imx and the reference-time non-projection image Imy, which are captured in advance while the measurement object 5 is not in the imaging range; that is, the reference time and the measurement time are different times. Instead, the projection pattern processing unit 11 may generate the projection pattern image Ima based on the captured image Im at the time of measurement.
  • in this case, the projection pattern processing unit 11 recognizes, from the captured image Im at the time of measurement, a part of the projection pattern region Rp that does not overlap the measurement object 5, for example a stripe of the light pattern formed at the left or right end of the image. The projection pattern processing unit 11 then generates the projection pattern image Ima based on the pixel values of the recognized part.
  • here, the projection pattern processing unit 11 stores in advance information on the shape and arrangement of each light pattern in the projection range 20, and recognizes the positions and ranges of the other stripes of the light pattern based on this information. This eliminates the need for the camera 3 to capture the reference-time projection image Imx and the reference-time non-projection image Imy in advance before the measurement object 5 is measured.
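This modification can be sketched as follows (Python/NumPy). Sampling the mean luminance of the one visible stripe and painting it onto the stored positions of all stripes is an assumed realization; the masks stand in for the stored shape/arrangement information:

```python
import numpy as np

def ima_from_stripe(im_meas, visible_stripe_mask, all_stripes_mask):
    """Estimate the projection pattern image from the measurement-time image:
    sample a stripe not occluded by the measurement object (e.g. at the image
    edge) and paint its luminance onto the stored positions of all stripes."""
    stripe_value = im_meas[visible_stripe_mask].mean()
    ima = np.zeros_like(im_meas, dtype=np.float64)
    ima[all_stripes_mask] = stripe_value
    return ima

im_meas = np.full((4, 4), 20.0)
im_meas[:, 0] = 150.0   # stripe at the left edge, not occluded
im_meas[:, 2] = 60.0    # stripe partly occluded by the measurement object
visible = np.zeros((4, 4), dtype=bool)
visible[:, 0] = True
stripes = np.zeros((4, 4), dtype=bool)
stripes[:, 0] = True
stripes[:, 2] = True
ima = ima_from_stripe(im_meas, visible, stripes)
```

The occluded stripe is reconstructed from the visible one, so no separate reference-time capture is needed.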
  • the three-dimensional measurement apparatus 4 performs a determination process based on the threshold value Cth based on the luminance determined by the RGB components. Instead, the three-dimensional measurement device 4 may perform a determination process using the threshold value Cth based on at least one value of RGB.
  • the projector 2 may be a projector that irradiates an infrared light pattern, and the camera 3 may be an infrared camera.
  • the three-dimensional measurement apparatus 4 performs the same processing as in the above-described embodiment based on the captured image Im transmitted from the infrared camera.
  • (Modification 6) In the three-dimensional measurement system shown in FIG. 1, one transmission device 1 is installed between the projector 2 and the camera 3 on one side and the measurement object 5 on the other. Instead, a plurality of transmission devices 1 may be installed between the projector 2 and the camera 3 and the measurement object 5.
  • the projector 2 may emit light for displaying information on the transmissive device 1 (also referred to as “display light”) and light constituting a light pattern (also referred to as “pattern light”).
  • the transmissive device 1 is, for example, a display that can be switched between a transmissive state and a shielding state: it is in the shielding state while display light is emitted from the projector 2, and in the transmissive state while pattern light is emitted from the projector 2.
  • the three-dimensional measurement system appropriately controls the display rate and the like, so that a human can visually recognize the image formed by the display light while three-dimensional measurement of the measurement object 5, such as a hand, remains possible.
  • the projector 2 may be realized by two projectors: a projector that emits display light and a projector that emits pattern light.
  • the present invention can be suitably applied to non-contact input devices, display devices, and other devices that perform three-dimensional measurement.

Abstract

According to the present invention, a three-dimensional measurement device comprises an image acquisition means, a removal means, and a luminance correction means, and performs three-dimensional measurement of a measurement object onto which a light pattern is projected through a transmission device. The image acquisition means acquires an image of the measurement object captured through the transmission device. The removal means removes, from the image of the measurement object, the light pattern projected and displayed by the transmission device. The luminance correction means detects the display region of the measurement object onto which the light pattern has been projected, and corrects the luminance of that display region.
PCT/JP2012/060112 2012-04-13 2012-04-13 Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium WO2013153661A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2012/060112 WO2013153661A1 (fr) 2012-04-13 2012-04-13 Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium
JP2014509993A JP5795431B2 (ja) 2012-04-13 2012-04-13 Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/060112 WO2013153661A1 (fr) 2012-04-13 2012-04-13 Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium

Publications (1)

Publication Number Publication Date
WO2013153661A1 true WO2013153661A1 (fr) 2013-10-17

Family

ID=49327267

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/060112 WO2013153661A1 (fr) 2012-04-13 2012-04-13 Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium

Country Status (2)

Country Link
JP (1) JP5795431B2 (fr)
WO (1) WO2013153661A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08110211A (ja) * 1994-10-11 1996-04-30 Teruaki Yogo Imaging sensor for a three-dimensional shape measuring device
JPH11355742A (ja) * 1998-06-05 1999-12-24 Denso Corp Image input/output system
JP2000321035A (ja) * 1999-05-14 2000-11-24 Mitsubishi Electric Corp Detection device
JP2005291839A (ja) * 2004-03-31 2005-10-20 Brother Ind Ltd Projection device and three-dimensional shape detection device
JP2009019941A (ja) * 2007-07-11 2009-01-29 Nikon Corp Shape measurement method
JP2011122869A (ja) * 2009-12-09 2011-06-23 Seiko Epson Corp Optical position detection device and projection display device


Also Published As

Publication number Publication date
JPWO2013153661A1 (ja) 2015-12-17
JP5795431B2 (ja) 2015-10-14

Similar Documents

Publication Publication Date Title
JP5882631B2 (ja) Three-dimensional position measurement device, three-dimensional position measurement method, and computer program
JP6075122B2 (ja) System, image projection device, information processing device, information processing method, and program
US9857166B2 (en) Information processing apparatus and method for measuring a target object
KR101461068B1 (ko) Three-dimensional measurement device, three-dimensional measurement method, and storage medium
JP6061616B2 (ja) Measurement device, control method therefor, and program
JP2015158887A5 (fr)
US10078907B2 (en) Distance measurement apparatus, distance measurement method, and storage medium
JP6351201B2 (ja) Distance measurement device and method
JP2015127668A (ja) Measurement device, system, and program
JP6553412B2 (ja) Inspection system
CN108885098B (zh) Distance measurement device and distance measurement method
JP2016223911A (ja) Inspection system
JP2014115108A (ja) Distance measurement device
JP2015158885A5 (fr)
US20120218294A1 (en) Projection-type image display apparatus
US10348983B2 (en) Non-transitory storage medium encoded with computer readable image processing program, information processing system, information processing apparatus, and image processing method for determining a position of a subject in an obtained infrared image
JP2009019942A (ja) Shape measurement method, program, and shape measuring device
JP2008232837A (ja) Defect enhancement and defect detection method, device, and program
JP5795431B2 (ja) Three-dimensional measurement device, three-dimensional measurement system, control method, program, and storage medium
JP6927294B2 (ja) Measurement device, measurement method, and measurement program
WO2013175595A1 (fr) Three-dimensional (3D) surface measurement device, control method, program, and recording medium
TW201428558A (zh) Optical mouse device and method used in an optical mouse device
JP6552230B2 (ja) Measurement device
JP2014035294A (ja) Information acquisition device and object detection device
JP2017191082A (ja) Bright spot image acquisition device and bright spot image acquisition method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12874212

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014509993

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12874212

Country of ref document: EP

Kind code of ref document: A1