WO2016084323A1 - Distance image generation device, distance image generation method, and distance image generation program - Google Patents

Distance image generation device, distance image generation method, and distance image generation program Download PDF

Info

Publication number
WO2016084323A1
WO2016084323A1 (PCT/JP2015/005656)
Authority
WO
WIPO (PCT)
Prior art keywords
light
distance
distance image
unit
light emission
Prior art date
Application number
PCT/JP2015/005656
Other languages
French (fr)
Japanese (ja)
Inventor
Toshiaki Shinohara (篠原 利章)
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2016084323A1 publication Critical patent/WO2016084323A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present invention relates to a distance image generation device, a distance image generation method, and a distance image generation program.
  • a distance image camera is known as one of the three-dimensional measuring devices that perform non-contact measurement of the distance to a three-dimensional object using infrared rays.
  • The distance image camera, for example, irradiates an object with light from an infrared LED (Light Emitting Diode), measures the time until the light is reflected and received, and converts that time into a distance.
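The round-trip-time-to-distance conversion described in this paragraph can be sketched as follows. This is a minimal illustration of the TOF principle; the function name and units are ours, not the patent's:

```python
# Speed of light in a vacuum, in metres per second.
C = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time of the emitted light into a
    distance: the light travels to the object and back, so halve it."""
    return C * round_trip_seconds / 2.0

# A round trip of about 10 ns corresponds to a distance of roughly 1.5 m.
```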
  • The distance image generation device of the present invention includes a light emitting unit that emits light toward an object, a light emission amount control unit that controls the amount of light emitted by the light emitting unit, a light receiving unit that receives the reflected light returned from the object in response to light emitted with the controlled light emission amount, a distance image generation unit that generates a plurality of distance images based on integrated values of the received light amounts obtained for the different light emission amounts, and a unit that synthesizes the plurality of distance images.
  • The distance image generation method of the present invention is a distance image generation method in a distance image generation device, and includes a step of controlling a light emission amount for emitting light to an object, a step of emitting light to the object with the controlled light emission amount and receiving the reflected light, and a step of generating a plurality of distance images from the integrated received light amounts and synthesizing them.
  • the distance image generation program of the present invention is a program for causing a computer to execute each step of the distance image generation method.
  • the accuracy of the distance between the imaging device indicated by the distance image and the imaging object can be improved.
  • FIG. 1 is a schematic diagram illustrating an arrangement example of an imaging device and an object in the embodiment and an example of a device around the imaging device.
  • FIG. 2 is a block diagram illustrating a configuration example of the imaging apparatus according to the embodiment.
  • FIG. 3 is a timing chart for explaining the electronic shutter control in the embodiment.
  • FIG. 4A is a schematic diagram illustrating an example of irradiation light and reflected light for each distance range in the embodiment.
  • FIG. 4B is a schematic diagram illustrating an example of irradiation light and reflected light for each distance range in the embodiment.
  • FIG. 4C is a schematic diagram illustrating an example of irradiation light and reflected light for each distance range in the embodiment.
  • FIG. 5 is a flowchart illustrating an operation example of the imaging apparatus according to the embodiment.
  • FIG. 6 is a flowchart illustrating an operation example of the imaging apparatus according to the embodiment (continuation of FIG. 5).
  • FIG. 7A is a schematic diagram illustrating an example of a distance image when light is emitted at a pulse rate corresponding to each distance range in the embodiment.
  • FIG. 7B is a schematic diagram illustrating an example of a distance image when light is emitted at a pulse rate corresponding to each distance range in the embodiment.
  • FIG. 7C is a schematic diagram illustrating an example of a distance image when light is emitted at a pulse rate corresponding to each distance range in the embodiment.
  • FIG. 8 is a schematic diagram illustrating an example of a composite distance image in the embodiment.
  • The present invention has been made in view of the above circumstances, and provides a distance image generation device, a distance image generation method, and a distance image generation program that can improve the accuracy of the distance, indicated by the distance image, between the imaging device and the object.
  • When the distance between the distance image camera and the object is appropriate, the amount of reflected light received is appropriate and the distance can be measured.
  • When the object is large, for example a package of about 30 cm or more, the amount of reflected light received differs between one end and the other end of the object. If priority is given to reflection at one end of the object, the amount of reflected light at the other end becomes insufficient; if priority is given to reflection at the other end, the amount of reflected light at the first end may become excessive. In either case, the accuracy of the distance indicated by the distance image may decrease.
  • When the distance between the distance image camera and the object is too short, the amount of reflected light received is excessive, and the accuracy of the distance indicated by the distance image may be reduced.
  • When the distance between the distance image camera and the object is too long, the amount of reflected light received is insufficient, and the accuracy of the distance indicated by the distance image may be reduced.
  • Hereinafter, a distance image generation device capable of improving the accuracy of the distance, indicated by the distance image, between the imaging device and the imaging object will be described.
  • FIG. 1 is a schematic diagram illustrating an arrangement example of the imaging device 10 and the measurement target 20 in the first embodiment.
  • an apparatus connected to the imaging apparatus 10 is also illustrated.
  • the imaging device 10 is installed at a predetermined height from an indoor ceiling or floor, for example.
  • The imaging apparatus 10 irradiates the object 20 with irradiation light L1 (for example, IR (InfraRed) light), receives reflected light L2 (for example, IR light) reflected by the object 20, and generates a distance image from the amount of light received over a predetermined period. That is, the imaging device 10 measures the distance between the imaging device 10 and the target 20 (each target point on the target 20).
  • the target point is a partial area or point of the target object 20.
  • The target object 20 is arranged at a predetermined position.
  • a plurality of imaging devices 10 for measuring the object 20 may be provided. Moreover, the imaging device 10 may be installed outdoors. Note that the distance between the imaging device 10 and a predetermined position where the object 20 is disposed is determined in advance.
  • the imaging device 10 may be connected to the network 7.
  • a recorder 9, an information processing device 8, and a monitor 11 are connected to the network 7.
  • the recorder 9 receives and accumulates various data (for example, a captured image, a distance map, and a composite distance image described later) from the imaging device 10.
  • the information processing apparatus 8 accesses the data stored in the recorder 9 and performs various processes (for example, image processing), editing, summing up, or searching.
  • the monitor 11 displays, for example, a captured image, a distance image, and a combined distance image. Thereby, the user can confirm a captured image, a distance image, and a synthetic distance image, for example.
  • the imaging device 10 may have the functions of the recorder 9, the information processing device 8, and the monitor 11.
  • the network 7 may be, for example, any one of a LAN (Local Area Network), the Internet, a public telephone line network, a dedicated telephone line, a mobile phone line network, or a combination of a plurality of types of networks.
  • The object 20 includes, for example, articles stored in rectangular parallelepiped corrugated cardboard boxes conveyed by a courier, articles of other shapes (for example, the cylindrical or prismatic shapes shown in FIG. 1), golf bags, suitcases, cable drums, buckets, caskets, wooden frames, people, and cars. That is, the object 20 includes a wide variety of articles having various shapes and sizes.
  • an imaging system including the imaging device 10, the recorder 9, the information processing device 8, the monitor 11, and the like may be combined with an existing system (for example, a distribution system). Thereby, a distance image can be acquired smoothly in the distribution process.
  • FIG. 2 is a block diagram illustrating a configuration example of the imaging apparatus 10.
  • The imaging device 10 can capture both an image for confirming the appearance of the object (for example, a visible image or an IR image) and an image for measuring the distance between the imaging device 10 and each target point on the object 20 (for example, a distance image).
  • the distance image is generated by, for example, the TOF (Time Of Flight) method.
  • the imaging device 10 may be a stereo type imaging device including two optical systems that capture images with visible light.
  • The imaging device 10 includes a light emitting element unit 31, a light emission / light reception control / drive unit 32, a light receiving element unit 33, a light reception timing control unit 34, a timing generation unit 35, an A/D conversion unit 36, a separation unit 37, a distance image signal processing unit 38, a visible / IR image signal processing unit 39, an association processing unit 40, a storage unit 41, a CPU (Central Processing Unit) 42, and a communication unit 43.
  • the light emitting element unit 31 emits infrared light (IR light) toward the imaging range of the distance image.
  • the timing and period of infrared light emission are controlled by a light emission drive signal generated by the timing generation unit 35.
  • the light emission / light reception control / drive unit 32 drives the light emitting element unit 31 according to the light emission drive signal from the timing generation unit 35.
  • the light emission / light reception control / drive unit 32 drives the light reception timing control unit 34 in accordance with the electronic shutter window signal from the timing generation unit 35.
  • the electronic shutter is, for example, a CCD (Charge Coupled Device) global shutter, an optical shutter, or the like, but is not limited thereto.
  • The light receiving element unit 33 includes, for each pixel, red (R), green (G), and blue (B) light receiving elements that receive visible light via the imaging optical system 33A, and an IR light receiving element that receives infrared light.
  • the IR light receiving element is provided for generating a distance image.
  • the IR light receiving element receives infrared light emitted from the light emitting element unit 31 and reflected by the object 20 (see FIG. 1).
  • the photoelectric conversion element includes, for example, a CCD or a CMOS (Complementary Metal Oxide Semiconductor).
  • the light reception timing control unit 34 performs electronic shutter control for generating a distance image.
  • the distance image indicates the distance to the target point of the target object 20 that reflects the infrared light emitted from the light emitting element unit 31.
  • The electronic shutter, that is, the timing and period of photoelectric conversion, is controlled by an electronic shutter window signal generated by the timing generator 35.
  • the timing generator 35 generates a light emission drive signal and an electronic shutter window signal.
  • the pulse rate of the light emission drive signal and the electronic shutter window signal can be changed. This pulse rate changes corresponding to the distance range, and is changed according to the amount of light necessary for generating the distance image.
  • the distance range is a range of distance for the imaging device 10 to generate a distance image.
  • The timing generation unit 35 holds, in advance, information on the correspondence between distance ranges and pulse rates in a pulse setting table or the like. When a target point lies within the set distance range, the integrated value of the received light amount can therefore be obtained without excess or deficiency.
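A pulse setting table of the kind described here might be sketched as below. The distance ranges and pulse rates are the example values given later for D1 to D3 (0.6-1.2 m at 1000 pulses/s, 1.2-1.8 m at 2000 pulses/s, 1.8-2.4 m at 4000 pulses/s); the data structure itself is our assumption:

```python
# Hypothetical pulse setting table: (lower bound m, upper bound m) -> pulses/s.
PULSE_SETTING_TABLE = {
    (0.6, 1.2): 1000,  # distance range D1: relatively low pulse rate
    (1.2, 1.8): 2000,  # distance range D2: medium pulse rate
    (1.8, 2.4): 4000,  # distance range D3: relatively high pulse rate
}

def pulse_rate_for(distance_range: tuple[float, float]) -> int:
    """Look up the pulse rate that yields an integrated received light
    amount without excess or deficiency for the given distance range."""
    return PULSE_SETTING_TABLE[distance_range]
```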
  • the pulse rate is set to a value capable of effectively measuring reflected light having a reflectance of 5% to 95% from the target point of the target object 20.
  • the pulse rate is changed in each frame.
  • Even when the pulse rate is changed, the amount of light per pulse is not changed.
  • The A/D converter 36 converts into digital signals the electrical (analog) signals of red light, green light, blue light, and infrared light output from the light reception timing control unit 34, together with the integrated value of infrared light obtained by electronic shutter control.
  • the digital signal converted by the A / D conversion unit 36 is output to the separation unit 37.
  • The separation unit 37 separates, from the digital signal supplied by the A/D conversion unit 36, the digital signal of the integrated value of infrared light obtained by electronic shutter control, and outputs that integrated value to the distance image signal processing unit 38. The digital signals of red light, green light, blue light, and infrared light are output to the visible / IR image signal processing unit 39.
  • the distance image signal processing unit 38 calculates an integrated value of infrared light by electronic shutter control for all pixels, and generates a distance image based on the calculated integrated value.
  • This distance image corresponds to information in which the distance to the target point of the target object 20 captured by each pixel is defined.
  • The distance image signal processing unit 38 may also perform distance calculation and other processing (for example, various filter processes, signal processing, and distance or inclination correction).
  • A pixel whose target point is farther away may be displayed as a gray image with lower density, and a pixel whose target point is closer as a gray image with higher density.
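One way to realize the gray rendering described above (farther target point, lighter gray; closer, darker) is a simple linear mapping. The 0.6-2.4 m span reuses the example distance ranges given later; the 8-bit encoding is our assumption:

```python
def distance_to_gray(d: float, d_min: float = 0.6, d_max: float = 2.4) -> int:
    """Map a per-pixel distance to an 8-bit gray level, with closer
    target points rendered darker (0) and farther ones lighter (255)."""
    t = (min(max(d, d_min), d_max) - d_min) / (d_max - d_min)  # clamp, normalize
    return round(t * 255)
```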
  • the distance image signal processing unit 38 generates a plurality of distance images using pulse rates of different light emission drive signals.
  • The distance image signal processing unit 38 holds a synthesis flag indicating whether or not to generate a composite distance image, and, according to the synthesis flag, generates a composite distance image by synthesizing a plurality of distance images.
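The patent does not spell out the synthesis rule, but a per-pixel merge of the per-range distance images is one plausible sketch (here `None` marks a pixel for which a given range produced no valid reading; both the function and that convention are our assumptions):

```python
def synthesize(distance_images):
    """Merge distance images generated at different pulse rates into one
    composite image, keeping the first valid reading found per pixel."""
    height, width = len(distance_images[0]), len(distance_images[0][0])
    composite = [[None] * width for _ in range(height)]
    for image in distance_images:
        for y in range(height):
            for x in range(width):
                if composite[y][x] is None and image[y][x] is not None:
                    composite[y][x] = image[y][x]
    return composite
```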
  • The visible / IR image signal processing unit 39 generates a visible image for monitoring from the digital signals of red light, green light, and blue light, and an infrared image for monitoring (hereinafter, "IR image") from the digital signal of infrared light.
  • The association processing unit 40 associates the distance image generated by the distance image signal processing unit 38 with the visible image and the IR image generated by the visible / IR image signal processing unit 39, using association information.
  • the association information includes, for example, information on the imaging time, imaging position, and angle of view of each of the distance image, visible image, and IR image.
  • Because the imaging device 10 includes the association processing unit 40, image processing of the distance image, the visible image, and the IR image can be linked, and images captured and generated by the same imaging device 10 at the same time can be displayed simultaneously.
  • the monitoring image (for example, visible image, IR image) and distance image are generated by, for example, one light receiving element unit 33 (sensor), but may be generated by different light receiving element units 33 (compound eyes).
  • the light receiving element portion 33 for red light, green light, and blue light may be different from the light receiving element portion 33 for infrared light.
  • optical systems for introducing light into the light receiving element portion 33 for red light, green light, and blue light and the light receiving element portion 33 for infrared light may be provided independently.
  • the association processing unit 40 performs image correction on the center position of the image obtained by the plurality of light receiving element units 33 arranged at different positions and the range captured by each light receiving element unit 33.
  • By combining visible images, IR images, and distance images (each including still images or moving images) that have different characteristics, the association processing unit 40 can, for example, improve the accuracy of detection, various measurements, and tracking of the object 20.
  • the storage unit 41 includes, for example, an SSD (Solid State Drive), an HDD (Hard Disk Drive), various memories, and an SD card.
  • the storage unit 41 may be an external storage device provided separately from the imaging device 10.
  • The storage unit 41 stores and accumulates, for example, the distance image generated by the distance image signal processing unit 38 and the visible image and IR image generated by the visible / IR image signal processing unit 39. For example, the distance image, visible image, and IR image captured and generated by the same imaging device 10 at the same time are associated by the association information generated by the association processing unit 40, and are then stored and accumulated in the storage unit 41.
  • the distance meta information generated by the CPU 42 may be stored in the storage unit 41 together with the associated distance image, visible image, and IR image.
  • the distance meta information includes, for example, information on the size and shape of the captured object 20.
  • This makes it easy for the information processing apparatus 8 of the imaging system to, for example, search for a specific package, determine whether it is deformed, and track it.
  • The CPU 42 is formed by, for example, an SoC (System on a Chip).
  • the CPU 42 operates as a control unit of each block of the video imaging unit 30A or the distance image generation unit 30B, and each block of the distance image signal processing unit 38, the association processing unit 40, the storage unit 41, and the communication unit 43.
  • The software program with which the CPU 42 controls each block may be stored in the aforementioned storage unit 41, or may be stored in another storage unit (not shown).
  • the CPU 42 reads out a software program from the storage unit 41 and controls each block.
  • the CPU 42 may generate the distance meta information described above.
  • Processors such as an FPGA (Field Programmable Gate Array), an LSI (Large Scale Integration) chip, the CPU 42 of the imaging apparatus 10, and a DSP (Digital Signal Processor) are also included among the computers that execute the program.
  • the communication unit 43 communicates with the information processing apparatus 8 and the recorder 9 via the network 7.
  • the communication unit 43 performs communication using, for example, a wireless LAN or a wide area wireless communication network (for example, a mobile phone network).
  • the one-dot chain line portion including the light receiving element unit 33, the light receiving timing control unit 34, the A / D conversion unit 36, the separation unit 37, and the visible / IR image signal processing unit 39 is the video imaging unit 30A.
  • The dotted line portion including a light emitting element unit 31, a light emission / light reception control / drive unit 32, a light receiving element unit 33, a light reception timing control unit 34, a timing generation unit 35, an A/D conversion unit 36, a separation unit 37, and a distance image signal processing unit 38 is the distance image generation unit 30B.
  • the components excluding the light emitting element unit 31 and the light receiving element unit 33 correspond to the distance image sensor.
  • The light emitting element unit 31, the light emission / light reception control / drive unit 32, the light receiving element unit 33, the light reception timing control unit 34, the timing generation unit 35, the A/D conversion unit 36, and the separation unit 37 are shared by the video imaging unit 30A and the distance image generation unit 30B. This common use simplifies the configuration (for example, the hardware configuration) of the imaging device 10.
  • the image capturing unit 30A captures an image via the image capturing optical system 33A that is an image optical system.
  • the distance image generation unit 30B captures an image through the imaging optical system 33A, which is a distance image optical system, and generates a distance image. That is, the imaging optical system and the distance image optical system of the imaging apparatus 10 are made common to form one imaging optical system 33A. Note that the image optical system and the distance image optical system may be provided separately.
  • FIG. 3 is a timing chart for explaining an example of the electronic shutter control.
  • the light emission drive signal is, for example, a pulse wave, and is repeatedly driven (HIGH: light emission) and stopped (LOW: extinguished) at a constant cycle.
  • The emitted quantity of infrared light from the light emitting element unit 31 does not rise and fall instantaneously in step with the light emission drive signal, but rises and falls smoothly.
  • the electronic shutter window signal is, for example, a pulse wave, and is repeatedly driven (HIGH) and stopped (LOW) at the same cycle as the light emission drive signal.
  • the light emission drive signal and the electronic shutter window signal may have the same phase or may be slightly shifted.
  • the electronic shutter window signal may be slightly delayed from the light emission drive signal.
  • As a result, the received light amount of infrared light shown in FIG. 3 is obtained.
  • When the time from when the light emitting element unit 31 emits infrared light until the reflected light is received by each pixel of the light receiving element unit 33 is long, the amount of reflected light that can be received during the HIGH period of the electronic shutter window signal is reduced. That is, when the subject portion (target point) captured by a pixel is far from the imaging device 10, the amount of reflected light received while the electronic shutter window signal is HIGH is reduced.
  • The distance to the target point captured by each pixel is measured from the magnitude of the integrated value of the amount of infrared light received in that pixel while the electronic shutter window signal is HIGH (that is, from the luminance value of each pixel).
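The relationship just described (longer round trip, less light inside the shutter window) can be modeled as the overlap between the reflected pulse and the window. The rectangular-pulse idealization is our simplification, not the patent's:

```python
def received_fraction(delay: float, pulse_width: float, window: float) -> float:
    """Fraction of a rectangular reflected pulse [delay, delay + pulse_width]
    that falls inside the electronic shutter window [0, window]."""
    overlap = min(window, delay + pulse_width) - max(0.0, delay)
    return max(0.0, overlap) / pulse_width

# The later the reflection arrives (the farther the target point),
# the smaller the integrated amount received during the HIGH period.
```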
  • the photoelectric conversion element converts an integrated value of the amount of received infrared light into an electrical signal. This electrical signal corresponds to the distance to the target point captured by each pixel. That is, the integrated value of the infrared light of each pixel is equivalent to distance information indicating the distance to the target point captured by each pixel.
  • Infrared light emission by the light emitting element unit 31 and infrared light photoelectric conversion (integration of the received light amount) by the light receiving timing control unit 34 may be performed a plurality of times to generate one distance image.
  • The distance image signal processing unit 38 may obtain the luminance value of each pixel for generating the distance image by averaging the integrated values of infrared light obtained from the plurality of emissions and receptions, or by adopting their intermediate value.
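The averaging and intermediate-value alternatives mentioned here, for reducing the integrated values from repeated emissions to one per-pixel luminance, could look like the following (function and parameter names are illustrative):

```python
from statistics import mean, median

def pixel_luminance(integrated_values: list[float], use_median: bool = False) -> float:
    """Reduce the infrared integrated values from several emission/reception
    cycles to one luminance value, by mean or by median (intermediate value)."""
    return median(integrated_values) if use_median else mean(integrated_values)
```

The median variant is more robust when one emission/reception cycle is corrupted by an outlier, e.g. a transient reflection.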
  • FIG. 4A shows the irradiation light L11 and the reflected light L12 when the distance range is the distance range D1.
  • FIG. 4B shows the irradiation light L21 and the reflected light L22 when the distance range is the distance range D2.
  • FIG. 4C shows the irradiation light L31 and the reflected light L32 when the distance range is the distance range D3.
  • In FIGS. 4A to 4C, the installation environment of the imaging device 10 and the object 20 is drawn with the imaging device 10 at the top and the object 20 at the bottom.
  • Although the number of distance ranges is three here, it may be two, or four or more.
  • In the distance range D1, the distance from the imaging device 10 to the target point 25 is, for example, 0.6 m to 1.2 m, and the target point 25 is included in the upper part of the target object 20 in FIG. 4A.
  • In the distance range D2, the distance from the imaging device 10 to the target point 25 is, for example, 1.2 m to 1.8 m, and the target point 25 is included in the middle of the target object 20 in FIG. 4B.
  • In the distance range D3, the distance from the imaging device 10 to the target point 25 is, for example, 1.8 m to 2.4 m, and the target point 25 is included in the lower part of the target object 20 in FIG. 4C.
  • When the reflected light from the target point 25 in the distance range D1 is to be obtained without excess or deficiency, the timing generator 35 generates a light emission drive signal having a relatively low pulse rate (for example, 1000 pulses/second).
  • the light emitting element unit 31 emits the irradiation light L11 with a relatively small amount of light according to the pulse rate.
  • the light receiving element unit 33 receives the reflected light L12 reflected by the target point 25 with respect to the irradiation light L11.
  • the imaging device 10 can differentiate the distance in each pixel even in a distance range where the distance from the imaging device 10 is relatively close, and can generate a distance image with high accuracy.
  • When the reflected light from the target point 25 in the distance range D2 is to be obtained without excess or deficiency, the timing generator 35 generates a light emission drive signal with a medium pulse rate (for example, 2000 pulses/second).
  • the light emitting element unit 31 emits the irradiation light L21 with a medium amount of light according to the pulse rate.
  • the light receiving element unit 33 receives the reflected light L22 reflected by the target point 25 with respect to the irradiation light L21.
  • the imaging device 10 can differentiate the distance in each pixel even in a distance range where the distance from the imaging device 10 is a medium distance, and can generate a distance image with high accuracy.
  • When the reflected light from the target point 25 in the distance range D3 is to be obtained without excess or deficiency, the timing generator 35 generates a light emission drive signal having a relatively high pulse rate (for example, 4000 pulses/second).
  • the light emitting element unit 31 emits the irradiation light L31 with a relatively large amount of light according to the pulse rate.
  • the light receiving element unit 33 receives the reflected light L32 reflected by the target point 25 with respect to the irradiation light L31.
  • the imaging device 10 can differentiate the distance in each pixel even in a distance range where the distance from the imaging device 10 is relatively long, and can generate a distance image with high accuracy.
  • FIG. 5 and FIG. 6 are flowcharts showing an operation example of the imaging apparatus 10.
  • In FIGS. 5 and 6, the number of distance ranges and the number of generated image frames are exemplified as "3", but the present invention is not limited to this.
  • The distance image signal processing unit 38 determines whether the synthesis flag is turned on (S11). The synthesis flag is set on or off, for example, via the operation unit of the information processing apparatus 8: the operation unit accepts an operation for turning the synthesis flag on (for example, the user touches an icon on the screen that switches the synthesis flag on and off), and this operation information is sent to the imaging device 10 via the network 7. The distance image signal processing unit 38 sets the synthesis flag to on based on the received operation information.
  • variable n indicates the set number of distance ranges and corresponds to the number of derived image frames.
  • the variable n may be a number other than “3”.
  • the distance image signal processing unit 38 sets a distance range D1 as a distance range for generating a distance image (see FIG. 4A).
  • the distance image signal processing unit 38 sets the pulse rate of the light emission drive signal corresponding to the distance range D1 with reference to the pulse setting table or the like (S13).
  • the distance range is set as a different range, for example, every 30 cm.
  • the timing generation unit 35 sets a high period and a low period of the electronic shutter window signal according to the pulse rate, and generates an electronic shutter window signal. Even in this case, the timing generator 35 generates the electronic shutter window signal so as to repeat driving (HIGH) and stopping (LOW) at the same cycle as the light emission driving signal regardless of the set pulse rate. Further, the light emission drive signal and the electronic shutter window signal may have the same phase or may be slightly shifted. For example, the electronic shutter window signal may be slightly delayed from the light emission drive signal.
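The timing relationship described above can be sketched in code. The following minimal illustration is not from the patent; the sample counts, the period, and the two-sample delay are made-up values. It generates a light emission drive signal and an electronic shutter window signal that repeat HIGH (1) and LOW (0) with the same period, with the shutter window slightly delayed behind the drive signal:

```python
def make_signal(num_samples, period, high_samples, delay=0):
    """Return 0/1 samples that are HIGH for `high_samples` at the start of
    each period, with the whole waveform shifted right by `delay` samples."""
    samples = []
    for i in range(num_samples):
        t = i - delay
        # HIGH only once the shifted waveform has started and we are in
        # the first `high_samples` ticks of the current period
        samples.append(1 if t >= 0 and (t % period) < high_samples else 0)
    return samples

# same period for both signals; the shutter window lags by 2 samples
emission = make_signal(num_samples=20, period=10, high_samples=5)
shutter = make_signal(num_samples=20, period=10, high_samples=5, delay=2)
```

Changing the pulse rate corresponds to changing `period` for both signals together, so drive and shutter stay locked to the same cycle.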
  • the light emitting element unit 31 emits infrared light to the object 20 in accordance with the set pulse rate according to the light emission drive signal (S14).
  • the light receiving element unit 33 receives infrared light as reflected light from the object 20 (S15).
  • the distance image signal processing unit 38 calculates an integrated value of the received infrared light for each pixel, and generates a distance image based on the calculated integrated value (S16).
  • the distance image signal processing unit 38 stores the generated distance image of the distance range D1 in the storage unit 41 (S17). Thereby, a distance image for one frame is generated and stored.
  • the distance image signal processing unit 38 determines whether or not the variable n is “1” (S18). If the variable n is not “1”, the variable n is decremented (S19), and the process proceeds to S13.
  • the distance image signal processing unit 38 then sets the distance range D2, sets the pulse rate of the light emission drive signal corresponding to the distance range D2 as described above, and generates the distance image corresponding to this pulse rate. Likewise, the distance image signal processing unit 38 sets the distance range D3, sets the pulse rate of the light emission drive signal corresponding to the distance range D3, and generates the distance image corresponding to that pulse rate.
  • the distance image signal processing unit 38 acquires the distance images P1 to P3 corresponding to the distance ranges D1 to D3 from the storage unit 41, and combines the distance images P1 to P3 to generate a combined distance image P0 (S20).
  • the generated composite distance image P0 is output by an arbitrary method (S21).
  • the storage unit 41 may accumulate the synthesized distance image P0, the communication unit 43 may transmit the synthesized distance image P0 to an external device, or the monitor 11 may display the synthesized distance image P0.
  • the composite distance image P0 can be used on the cloud by being transmitted to the external device.
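The flow described above (S11 to S21 in FIGS. 5 and 6) can be sketched as follows. This is an illustrative outline only, not the patent's implementation: the pulse-rate table, the placeholder capture function, and the per-pixel combination rule (a simple per-pixel average here) are all assumptions.

```python
# hypothetical pulse rates for the three distance ranges, near to far
PULSE_RATE_TABLE = {"D1": 100, "D2": 200, "D3": 400}

def capture_distance_image(pulse_rate):
    """Placeholder: emit at `pulse_rate`, integrate received light per pixel,
    and return a distance image (here, a flat 2x2 image tagged with the rate)."""
    return [[pulse_rate, pulse_rate], [pulse_rate, pulse_rate]]

def generate(synthesis_flag):
    if not synthesis_flag:
        # S31-S36: a single, user-selected range (D2 chosen as an example)
        return capture_distance_image(PULSE_RATE_TABLE["D2"])
    # S12-S19: generate and store one distance image per range
    stored = [capture_distance_image(PULSE_RATE_TABLE[r]) for r in ("D1", "D2", "D3")]
    # S20: combine per pixel (simple average as one possible statistic)
    rows, cols = len(stored[0]), len(stored[0][0])
    return [[sum(img[y][x] for img in stored) / len(stored) for x in range(cols)]
            for y in range(rows)]
```

With the flag off, the function returns the single-range image; with the flag on, it returns the combined image P0.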
  • the distance image signal processing unit 38 sets a distance range based on, for example, a predetermined operation (S31). For example, a desired distance range for generating a distance image is received by the operation unit of the information processing device 8, and this operation information is sent to the imaging device 10 via the network 7.
  • the distance image signal processing unit 38 sets a distance range (for example, one of the distance ranges D1 to D3) based on the operation information received by the imaging device 10.
  • the operation unit of the information processing apparatus 8 receives, for example, an operation of sliding a slide bar on the screen or an operation of touching a button by the user (see FIGS. 7A to 7C). These slide bars and buttons are for setting a distance range.
  • the distance image signal processing unit 38 refers to the pulse setting table or the like and sets the pulse rate of the light emission drive signal corresponding to the set distance range (S32).
  • the timing generation unit 35 sets a high period and a low period of the electronic shutter window signal according to the pulse rate, and generates an electronic shutter window signal. Even in this case, the timing generator 35 generates the electronic shutter window signal so as to repeat driving (HIGH) and stopping (LOW) at the same cycle as the light emission driving signal regardless of the set pulse rate. Further, the light emission drive signal and the electronic shutter window signal may have the same phase or may be slightly shifted. For example, the electronic shutter window signal may be slightly delayed from the light emission drive signal.
  • the light emitting element unit 31 emits infrared light to the object 20 according to the set pulse rate in accordance with the light emission drive signal (S33).
  • the light receiving element unit 33 receives infrared light as reflected light from the object 20 (S34).
  • the distance image signal processing unit 38 calculates an integrated value of the received infrared light for each pixel, and generates a distance image based on the calculated integrated value (S35).
  • the generated distance image is output by an arbitrary method (S36).
  • the storage unit 41 may accumulate the distance image
  • the communication unit 43 may transmit the distance image to an external device
  • the monitor 11 may display the distance image.
  • the distance image can be used on the cloud by being transmitted to the external device.
  • FIGS. 7A to 7C are schematic diagrams showing examples of the distance images P1 to P3 generated when light is emitted at the pulse rates corresponding to the distance ranges D1 to D3.
  • a conical object is assumed as the object 20 as in FIG.
  • the pulse rate is set so that the reflected light from the target point 25 in the distance range D1 becomes an appropriate amount for generating the distance image.
  • an integrated value of the amount of received light is suitably obtained around the upper portion 71 of the conical object.
  • when the synthesis flag is set to ON, an operation for setting the distance range D1 is not particularly required.
  • the corresponding pulse rates may be set in the order of the distance ranges D1, D2, and D3, and the distance image may be generated.
  • when the synthesis flag is set to OFF, for example, the user operates the slide bar 75 on the screen via the operation unit of the information processing device 8 to set it to the short distance position 75a.
  • the pulse rate is set so that the reflected light from the target point 25 in the distance range D2 becomes an appropriate amount for generating the distance image.
  • an integrated value of the amount of received light is suitably obtained for the periphery of the middle part 72 of the conical object.
  • when the synthesis flag is set to ON, an operation for setting the distance range D2 is not particularly required.
  • the corresponding pulse rates may be set in the order of the distance ranges D1, D2, and D3, and the distance image may be generated.
  • when the synthesis flag is set to OFF, for example, the user operates the slide bar 75 on the screen via the operation unit of the information processing device 8 to set it to the middle distance position 75b.
  • the pulse rate is set so that the reflected light from the target point 25 in the distance range D3 becomes an appropriate amount for generating the distance image.
  • an integrated value of the amount of received light is suitably obtained around the lower portion 73 of the conical object.
  • when the synthesis flag is set to ON, an operation for setting the distance range D3 is not particularly required.
  • the corresponding pulse rates may be set in the order of the distance ranges D1, D2, and D3, and the distance image may be generated.
  • when the synthesis flag is set to OFF, for example, the user operates the slide bar 75 on the screen via the operation unit of the information processing device 8 to set it to the long distance position 75c.
  • FIG. 8 is a schematic diagram showing an example of the composite distance image P0.
  • the distance image signal processing unit 38 performs, for example, statistical processing using the pixel values of the distance images P1 to P3 for each pixel and combines them to generate a combined distance image P0.
  • as the statistical processing, for example, a sum, a simple average, a median, a weighted average, a maximum, or a minimum of the pixel values of the distance images P1 to P3 is derived for each pixel.
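As an illustration of the per-pixel statistics listed above, the following sketch combines three small distance images using either the median or the simple average. The 2x2 pixel values are made up for illustration and are not from the patent:

```python
from statistics import mean, median

# hypothetical distance images; each range resolves a different region well
P1 = [[10, 80], [80, 80]]
P2 = [[90, 20], [90, 90]]
P3 = [[70, 70], [30, 70]]

def combine(images, stat):
    """Apply `stat` to the values of each pixel position across all images."""
    rows, cols = len(images[0]), len(images[0][0])
    return [[stat([img[y][x] for img in images]) for x in range(cols)]
            for y in range(rows)]

P0_median = combine([P1, P2, P3], median)  # robust to one outlier per pixel
P0_mean = combine([P1, P2, P3], mean)      # simple average variant
```

A weighted average, maximum, or minimum would drop in the same way by passing a different function for `stat`.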
  • the combined distance image P0 includes the upper portion 71 of the object 20, whose distance information is suitably shown in the distance image P1, the middle portion 72 of the object 20, whose distance information is suitably shown in the distance image P2, and the lower portion 73 of the object 20, whose distance information is suitably shown in the distance image P3.
  • in each of the distance ranges D1 to D3, the distance information of each pixel is drawn with the same pattern, but each pattern merely indicates the range in which distance information is suitably obtained; in practice, the distance information differs from pixel to pixel.
  • the imaging device 10 lowers the pulse rate when the distance between the imaging device 10 and the target point 25 of the object 20 is short, and raises the pulse rate when that distance is long. That is, the imaging device 10 changes the light emission method used to generate the distance image. This prevents the integrated value of the received light amount from becoming insufficient for the emitted light and allows it to be adjusted appropriately. The present embodiment can be applied, for example, both when the distance between the imaging device 10 and the object 20 is excessively long or excessively short, and when the object 20 is large (for example, 2 m to 3 m).
  • the imaging device 10 can expand the range over which the integrated value of the received light amount can be measured appropriately (effectively) by controlling the pulse rate of the light emission. The distance (dynamic range) between the imaging device 10 and the target point 25 of the object 20 over which the integrated value can be measured effectively is therefore expanded. Moreover, the imaging device 10 can compensate for the lack of sensitivity in distance image generation that would occur with a single pulse rate.
  • since the distance between the imaging device 10 and the target point 25 of the object 20 can be derived for each pixel, the 3D (three-dimensional) shape of the object 20 can be measured. The device can therefore be applied to, for example, a 3D printer or a 3D scanner.
  • the accuracy of the measured distance can be further improved by combining the image analysis result of the visible image with the distance image.
  • the larger the surface area of each pixel corresponding to the light receiving element (for example, a CCD) of the light receiving element unit 33, the larger the amount of light that can be received and the larger the charge capacity.
  • the charge capacity determines the dynamic range.
  • when the resolution is increased, the number of pixels increases, so the surface area of each pixel decreases, the amount of light that can be received decreases, and the charge capacity decreases.
  • with the imaging device 10, it is possible to suitably compensate for the shortage of dynamic range caused by increasing the resolution.
  • in the above embodiment, the timing generation unit 35 changes the light emission amount corresponding to the distance ranges D1 to D3 by changing the pulse rate.
  • the light emission amount may be changed by another method.
  • the timing generation unit 35 may control the light emission amount by controlling the pulse width of the light emission drive signal. In this case, the larger the pulse width, the larger the light emission amount, and the smaller the pulse width, the smaller the light emission amount.
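The pulse-width alternative noted above can be illustrated as follows. The microsecond values are hypothetical; the point is only that, at a fixed period, the energy emitted per cycle scales with the ON fraction (duty cycle) of the drive signal:

```python
def relative_emission(pulse_width_us, period_us):
    """Fraction of each drive period during which the emitter is ON.
    Per cycle, the emitted energy scales with this duty cycle."""
    if not 0 < pulse_width_us <= period_us:
        raise ValueError("pulse width must be within (0, period]")
    return pulse_width_us / period_us

# a wider pulse yields a larger emission amount, a narrower one a smaller amount
assert relative_emission(5, 100) < relative_emission(20, 100)
```

Under this scheme the pulse rate (the period) can stay fixed while only the width changes, which is the distinction the text draws from the pulse-rate approach.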
  • in the above embodiment, the imaging device 10 is exemplified as being installed on an indoor ceiling or the like, but it may also be installed on an indoor wall or the like.
  • the imaging device 10 may be installed at a predetermined height using a pedestal or the like.
  • the imaging device 10 may be installed on a camera mount of a forklift so that it is separated by a predetermined distance from the object 20 placed on the forklift platform.
  • the three-dimensional size of the article can be grasped while the article as the object 20 is being transported using the forklift.
  • the information processing apparatus 8 can easily track from the delivery source to the delivery destination of the article.
  • the imaging device 10 can speed up the transportation of the article and improve the ease of traceability.
  • when the distance between the imaging device 10 and the target point 25 of the object 20 is set to a predetermined distance at the time of measurement, the imaging device 10 need not be fixed and may be portable.
  • the imaging device 10 may be applied to a portable transaction terminal device (for example, a payment terminal device). Thereby, the size of the article can be measured with high accuracy in various places.
  • the information processing device 8 may acquire a plurality of distance images from the imaging device 10 and generate a combined distance image.
  • in the above embodiment, the case of a single imaging device 10 is shown, but a plurality of imaging devices 10 may be provided. Any one of the imaging devices 10 may combine a plurality of distance images, or a plurality of combined distance images, generated by the plurality of imaging devices 10 to generate a new combined distance image.
  • as described above, the imaging device 10 includes the light emitting element unit 31 that emits light to the object 20, the light emission / light reception control / drive unit 32 that controls the light emission amount, and the light receiving element unit 33 that receives reflected light from the object with respect to light emitted at the controlled light emission amount. The imaging device 10 further includes the distance image signal processing unit 38, which generates a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts and combines the plurality of distance images to generate a combined distance image.
  • the imaging device 10 is an example of a distance image generation device.
  • the light emitting element unit 31 is an example of a light emitting unit.
  • the light emission / light reception control / drive unit 32 is an example of a light emission amount control unit.
  • the light receiving element unit 33 is an example of a light receiving unit.
  • the distance image signal processing unit 38 is an example of a distance image generation unit and an image synthesis unit.
  • the light emission / light reception control / drive unit 32 may control the pulse rate of the light emission drive signal. As a result, a plurality of distance images corresponding to various distance ranges can be generated, which is a basis for generating the composite distance image.
  • the light emission / light reception control / drive unit 32 may control the pulse width of the light emission drive signal. Thereby, it is possible to generate a plurality of distance images corresponding to various distance ranges as a basis for generating the composite distance image.
  • the imaging device 10 may include a communication unit 43 that acquires operation information from the information processing device 8, which receives operation inputs, and the distance image signal processing unit 38 may generate a combined distance image when the operation information is acquired. Thereby, whether or not to generate a combined distance image can be selected by a user operation.
  • the information processing device 8 is an example of an operation device.
  • the communication unit 43 is an example of an operation information acquisition unit.
  • the light emitting element unit 31 may emit invisible light. Thereby, compared with the case where a distance image is generated using visible light, the edge of the object 20 is emphasized, and the accuracy of the distance indicated by the combined distance image can be improved.
  • Invisible light is, for example, infrared rays or ultraviolet rays.
  • the distance image generation method in the imaging device 10 of the above embodiment includes a step of controlling a light emission amount for emitting light to the object 20, a step of emitting light to the object 20 at the controlled light emission amount, a step of receiving reflected light from the object 20 with respect to the emitted light, a step of generating a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts, and a step of combining the plurality of distance images to generate a combined distance image.
  • the distance image generation program of the above embodiment is a program for causing a computer to execute each step of the distance image generation method in the imaging apparatus 10.
  • the present invention is useful for a distance image generation device, a distance image generation method, a distance image generation program, and the like that can improve the accuracy of the distance, indicated by a distance image, between the imaging device and the imaging object.
  • 7 Network, 8 Information processing device, 9 Recorder, 10 Imaging device, 11 Monitor, 30A Video imaging unit, 30B Distance image generation unit, 31 Light emitting element unit, 32 Light emission / light reception control / drive unit, 33 Light receiving element unit, 33A Imaging optical system, 34 Light reception timing control unit, 35 Timing generation unit, 36 A/D conversion unit, 37 Separation unit, 38 Distance image signal processing unit, 39 Visible / IR image signal processing unit, 40 Association processing unit, 41 Storage unit, 42 CPU, 43 Communication unit, 44 Communication path

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A distance image generation device is provided with: a light emission unit for emitting light toward an object, a light emission amount control unit for controlling the amount of light emitted by the light emission unit, a light reception unit for receiving light emitted at the controlled light emission amount and reflected by the object, a distance image generation unit for generating a plurality of distance images on the basis of the integrated values of a plurality of light reception amounts obtained from different light emission amounts, and an image combination unit for combining the plurality of distance images and generating a combined distance image.

Description

Distance image generation device, distance image generation method, and distance image generation program
The present invention relates to a distance image generation device, a distance image generation method, and a distance image generation program.
Conventionally, a distance image camera is known as one type of three-dimensional measuring instrument that measures, without contact, the distance to a three-dimensional object using infrared light. The distance image camera, for example, irradiates an object with light from an infrared LED (Light Emitting Diode), measures the time until the light is reflected back and received, and converts that time into a distance.
Conventionally, an object dimension measuring apparatus that measures the dimensions of an object using a distance image camera is known (for example, see Patent Document 1).
JP 2011-196860 A
The distance image generation device of the present invention includes: a light emitting unit that emits light toward an object; a light emission amount control unit that controls the amount of light emitted by the light emitting unit; a light receiving unit that receives reflected light from the object with respect to light emitted at the controlled light emission amount; a distance image generation unit that generates a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts; and an image combination unit that combines the plurality of distance images to generate a combined distance image.
The distance image generation method of the present invention is a distance image generation method in a distance image generation device, comprising: a step of controlling a light emission amount for emitting light to an object; a step of emitting light to the object at the controlled light emission amount; a step of receiving reflected light from the object with respect to the emitted light; a step of generating a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts; and a step of combining the plurality of distance images to generate a combined distance image.
The distance image generation program of the present invention is a program for causing a computer to execute each step of the above distance image generation method.
According to the present invention, the accuracy of the distance, indicated by a distance image, between the imaging device and the imaging object can be improved.
FIG. 1 is a schematic diagram illustrating an arrangement example of the imaging device and an object in the embodiment, and an example of devices around the imaging device.
FIG. 2 is a block diagram illustrating a configuration example of the imaging device according to the embodiment.
FIG. 3 is a timing chart for explaining electronic shutter control in the embodiment.
FIG. 4A is a schematic diagram showing an example of irradiation light and reflected light corresponding to each distance range in the embodiment.
FIG. 4B is a schematic diagram showing an example of irradiation light and reflected light corresponding to each distance range in the embodiment.
FIG. 4C is a schematic diagram showing an example of irradiation light and reflected light corresponding to each distance range in the embodiment.
FIG. 5 is a flowchart illustrating an operation example of the imaging device according to the embodiment.
FIG. 6 is a flowchart illustrating an operation example of the imaging device according to the embodiment (continuation of FIG. 5).
FIG. 7A is a schematic diagram showing an example of a distance image generated when light is emitted at the pulse rate corresponding to each distance range in the embodiment.
FIG. 7B is a schematic diagram showing an example of a distance image generated when light is emitted at the pulse rate corresponding to each distance range in the embodiment.
FIG. 7C is a schematic diagram showing an example of a distance image generated when light is emitted at the pulse rate corresponding to each distance range in the embodiment.
FIG. 8 is a schematic diagram showing an example of a combined distance image in the embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
(Background to obtaining one embodiment of the present invention)
Prior to the description of the embodiments of the present invention, problems in the conventional apparatus will be briefly described. In the object dimension measuring device described in Patent Document 1, the accuracy of the distance, indicated by the distance image, between the imaging device and the object was insufficient.
The present invention has been made in view of the above circumstances, and provides a distance image generation device, a distance image generation method, and a distance image generation program that can improve the accuracy of the distance, indicated by a distance image, between the imaging device and the object.
In a conventional object dimension measuring apparatus, when the dimensions of the object are small, for example a package of about 20 cm to 30 cm, the amount of reflected light received is adequate and the distance can be measured. On the other hand, when the dimensions of the object are large, for example a package of about 30 cm or more, the amount of reflected light received differs between one end and the other end of the object. Therefore, when priority is given to the reflection at one end of the object, the amount of reflected light at the other end becomes insufficient, and when priority is given to the reflection at the other end, the amount of reflected light at the one end becomes excessive, so that the accuracy of the distance indicated by the distance image may decrease.
Also, when the distance between the distance image camera and the object is too short, the amount of reflected light received becomes excessive, and the accuracy of the distance indicated by the distance image may decrease. When that distance is too long, the amount of reflected light received becomes insufficient, and the accuracy of the distance indicated by the distance image may likewise decrease.
Hereinafter, a distance image generation device, a distance image generation method, and a distance image generation program that can improve the accuracy of the distance, indicated by a distance image, between the imaging device and the imaging object will be described.
(First embodiment)
FIG. 1 is a schematic diagram illustrating an arrangement example of the imaging device 10 and the object 20 to be measured in the first embodiment. FIG. 1 also illustrates devices connected to the imaging device 10.
As shown in FIG. 1, the imaging device 10 is installed, for example, at a predetermined height from an indoor ceiling or floor. The imaging device 10 irradiates the object 20 with irradiation light L1 (for example, IR (InfraRed) light), receives the reflected light L2 (for example, IR light) reflected by the object 20, and generates a distance image based on the amount of light received over a predetermined period. That is, the imaging device 10 measures the distance between the imaging device 10 and the object 20 (each target point on the object 20). A target point is a partial region or point of the object 20.
The object 20 is placed, for example, at a predetermined position facing the imaging device 10. After the distance measurement of one object 20 is completed, for example, another object 20 is placed at the predetermined position and its distance to the imaging device 10 is measured. By measuring this distance, the dimensions of the object 20 can be calculated.
A plurality of imaging devices 10 for measuring the object 20 may be provided, and the imaging device 10 may also be installed outdoors. Note that the distance between the imaging device 10 and the predetermined position where the object 20 is placed is determined in advance.
The imaging device 10 may be connected to a network 7. For example, a recorder 9, an information processing device 8, and a monitor 11 are connected to the network 7. The recorder 9 receives and accumulates various data from the imaging device 10 (for example, captured images, distance images (depth maps), and combined distance images described later). The information processing device 8 accesses the data accumulated in the recorder 9 and performs various kinds of processing (for example, image processing), editing, aggregation, or retrieval. The monitor 11 displays, for example, captured images, distance images, and combined distance images, allowing the user to check them. Note that the imaging device 10 may itself have the functions of the recorder 9, the information processing device 8, and the monitor 11.
The network 7 may be, for example, any of a LAN (Local Area Network), the Internet, a public telephone network, a dedicated telephone line, or a mobile telephone network, or a combination of several of these types of networks.
The object 20 includes, for example, an article stored in a rectangular parallelepiped cardboard box transported by courier or the like, articles of other shapes (for example, the cylindrical or prismatic shapes shown in FIG. 1), golf bags, suitcases, drums wound with cable, buckets, bags, wooden crates, people, and cars. That is, the object 20 broadly includes articles of various shapes and sizes.
Furthermore, an imaging system including the imaging device 10, the recorder 9, the information processing device 8, the monitor 11, and the like may be combined with an existing system (for example, a distribution system). This allows distance images to be acquired smoothly in the distribution process.
FIG. 2 is a block diagram showing a configuration example of the imaging device 10. The imaging device 10 can capture both images for checking the appearance of an object (for example, visible images and IR images) and images for measuring the distance between the imaging device 10 and each target point on the object 20 (for example, distance images).

The distance image is generated by, for example, the TOF (Time Of Flight) method. Alternatively, the imaging device 10 may be a stereo imaging device including two optical systems that capture images with visible light.

The imaging device 10 includes a light emitting element unit 31, a light emission/reception control and drive unit 32, a light receiving element unit 33, a light reception timing control unit 34, a timing generation unit 35, an A/D conversion unit 36, a separation unit 37, a distance image signal processing unit 38, a visible/IR image signal processing unit 39, an association processing unit 40, a storage unit 41, a CPU (Central Processing Unit) 42, and a communication unit 43.
The light emitting element unit 31 emits infrared light (IR light) toward the imaging range of the distance image. The timing and duration of the infrared emission are controlled by a light emission drive signal generated by the timing generation unit 35.

The light emission/reception control and drive unit 32 drives the light emitting element unit 31 according to the light emission drive signal from the timing generation unit 35, and drives the light reception timing control unit 34 according to the electronic shutter window signal from the timing generation unit 35. The electronic shutter is, for example, a CCD (Charge Coupled Device) global shutter or an optical shutter, but is not limited to these.

The light receiving element unit 33 includes, for each pixel, light receiving elements for the red (R), green (G), and blue (B) light received through the imaging optical system 33A, and an IR light receiving element that receives infrared light. The IR light receiving element is provided to generate the distance image; it receives the infrared light emitted from the light emitting element unit 31 and reflected by the object 20 (see FIG. 1).

The red, green, blue, and infrared light received by the light receiving element unit 33 is converted into electric signals and output by photoelectric conversion elements provided in the light receiving element unit 33. The photoelectric conversion elements include, for example, a CCD or a CMOS (Complementary Metal Oxide Semiconductor).
The light reception timing control unit 34 performs the electronic shutter control used to generate the distance image. The distance image indicates the distance to each target point on the object 20 that reflects the infrared light emitted by the light emitting element unit 31. The electronic shutter, that is, the timing and duration of photoelectric conversion, is controlled by the electronic shutter window signal generated by the timing generation unit 35.

The timing generation unit 35 generates the light emission drive signal and the electronic shutter window signal. The pulse rate of these signals is variable: it changes according to the distance range and is adjusted to provide the amount of light required to generate the distance image. The distance range is the range of distances over which the imaging device 10 generates a distance image. The timing generation unit 35 holds the correspondence between distance ranges and pulse rates in advance, for example in a pulse setting table. When a target point lies within the set distance range, the integrated value of the received light amount is obtained without excess or deficiency.
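A minimal sketch of the pulse setting table described above may help; this is not from the patent itself, and the table layout and function name are assumptions for illustration. The range bounds and pulse rates follow the examples given later for distance ranges D1 to D3.

```python
# Hypothetical pulse setting table such as the timing generation unit 35
# might hold: each distance range maps to the light-emission pulse rate
# used for that range (values taken from the D1-D3 examples).

PULSE_SETTING_TABLE = [
    # (near limit [m], far limit [m], pulse rate [pulses/s])
    (0.6, 1.2, 1000),  # D1: close range, low rate, small light amount
    (1.2, 1.8, 2000),  # D2: middle range, medium rate
    (1.8, 2.4, 4000),  # D3: far range, high rate, large light amount
]

def pulse_rate_for_range(range_index: int) -> int:
    """Return the pulse rate configured for the given distance range."""
    _, _, rate = PULSE_SETTING_TABLE[range_index]
    return rate
```

In a real device the table would be calibrated to the emitter power and sensor sensitivity; the point of the sketch is only the range-to-rate lookup.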
For example, the pulse rate is set to a value at which reflected light with a reflectance of 5% to 95% from a target point on the object 20 can be measured effectively. When distance images are generated, the pulse rate is changed, for example, for each frame. Note that when the pulse rate is changed, the amount of light per pulse is not changed.

The A/D conversion unit 36 converts the electric signals (analog signals) output from the light reception timing control unit 34 — the red light, green light, blue light, infrared light, and the integrated value of infrared light obtained by the electronic shutter control — into digital signals. The digital signals converted by the A/D conversion unit 36 are output to the separation unit 37.

The separation unit 37 separates the digital signal of the infrared-light integrated value obtained by the electronic shutter control from the digital signals supplied by the A/D conversion unit 36, and outputs that integrated value to the distance image signal processing unit 38. The digital signals of red, green, blue, and infrared light are output to the visible/IR image signal processing unit 39.

The distance image signal processing unit 38 calculates the integrated value of infrared light obtained by the electronic shutter control for every pixel, and generates a distance image based on the calculated integrated values. The distance image corresponds to information defining the distance to the target point on the object 20 captured by each pixel. When the distance image is generated, distance calculation and other processing (for example, various filtering, signal processing, and distance or tilt correction) are performed.
When the distance image is displayed on a display unit (for example, the monitor 11), pixels whose target points are farther from the imaging device 10 may be displayed with lower density in a grayscale image, and pixels whose target points are closer to the imaging device 10 may be displayed with higher density.
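The near-dark, far-light rendering just described can be sketched as follows; the 8-bit output range, the linear mapping, and the clamp bounds are assumptions for illustration, not part of the patent.

```python
# Hypothetical rendering helper: map a pixel's distance to a grayscale
# density so that nearer target points get higher density (255) and
# farther points lower density (0), as in the display example above.

def distance_to_density(distance_m, near=0.6, far=2.4):
    """Linearly map a distance in [near, far] metres to density 255..0."""
    d = min(max(distance_m, near), far)            # clamp into the range
    return round(255 * (far - d) / (far - near))   # near -> 255, far -> 0
```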
The distance image signal processing unit 38 also generates a plurality of distance images using light emission drive signals with different pulse rates. The distance image signal processing unit 38 holds a composition flag indicating whether to generate a composite distance image and, according to the flag, combines the plurality of distance images into a composite distance image.

The visible/IR image signal processing unit 39 generates a visible image for monitoring from the red, green, and blue digital signals, and an infrared image for monitoring (hereinafter, "IR image") from the infrared digital signal. The IR image is also effective for monitoring dark areas that arise, for example, where indoor lighting does not reach.

The association processing unit 40 uses association information to associate the distance image generated by the distance image signal processing unit 38 with the visible image and IR image generated by the visible/IR image signal processing unit 39. The association information includes, for example, the capture time, capture position, and angle of view of each of the distance image, visible image, and IR image.

By including the association processing unit 40, the imaging device 10 can, for example, coordinate the image processing of the distance image, visible image, and IR image, and simultaneously display those images captured and generated by the same imaging device 10 at the same time.

The monitoring images (for example, visible and IR images) and the distance image are generated by, for example, a single light receiving element unit 33 (sensor), but they may instead be generated by different light receiving element units 33 (compound eyes). For example, the light receiving element unit 33 for red, green, and blue light may differ from the light receiving element unit 33 for infrared light. Optical systems that guide light into the RGB light receiving element unit 33 and into the infrared light receiving element unit 33 may also be provided independently. In that case, the association processing unit 40, for example, corrects the image center positions and imaging ranges of the images obtained by the multiple light receiving element units 33 arranged at different positions.

In either case, the association processing unit 40 can use any one of the visible image, IR image, and distance image (each including still images or moving images), which have different characteristics, or a combination of several of them, to improve, for example, the accuracy of detection, various measurements, and tracking of the object 20.
The storage unit 41 is composed of, for example, an SSD (Solid State Drive), an HDD (Hard Disk Drive), various memories, or an SD card. The storage unit 41 may be an external storage device provided separately from the imaging device 10.

The storage unit 41 stores and accumulates, for example, the distance images generated by the distance image signal processing unit 38 and the visible and IR images generated by the visible/IR image signal processing unit 39. For example, a distance image, visible image, and IR image captured and generated by the same imaging device 10 at the same time are associated by the association information generated by the association processing unit 40 and then stored and accumulated in the storage unit 41.

When these images are accumulated in the storage unit 41, distance meta-information generated by the CPU 42 may be stored in the storage unit 41 together with the associated distance image, visible image, and IR image. The distance meta-information includes, for example, information on the size and shape of the captured object 20.

Because the storage unit 41 holds the distance meta-information of the object 20 together with the distance image, visible image, and IR image, the information processing device 8 of the imaging system, for example, can easily search for a specific package, determine whether it has been deformed, and track it.
The CPU 42 (central processing unit) is formed by, for example, an SOC (System on a Chip). The CPU 42 operates as the control unit for the blocks constituting the video imaging unit 30A and the distance image generation unit 30B, and for the distance image signal processing unit 38, the association processing unit 40, the storage unit 41, and the communication unit 43. The software program with which the CPU 42 controls these blocks may be stored in the storage unit 41 described above, or in another storage unit (not shown). The CPU 42 reads the software program from, for example, the storage unit 41 and controls each block. The CPU 42 may also generate the distance meta-information described above. Note that processors such as an FPGA (Field Programmable Gate Array), an LSI (Large Scale Integration), the CPU 42 of the imaging device 10, and a DSP (Digital Signal Processor) are also included among the computers that execute the program.

The communication unit 43 communicates with the information processing device 8 and the recorder 9 via the network 7, using, for example, a wireless LAN or a wide-area wireless network (for example, a mobile phone network).

The portion enclosed by the dash-dot line, including the light receiving element unit 33, the light reception timing control unit 34, the A/D conversion unit 36, the separation unit 37, and the visible/IR image signal processing unit 39, is the video imaging unit 30A. The portion enclosed by the dotted line, including the light emitting element unit 31, the light emission/reception control and drive unit 32, the light receiving element unit 33, the light reception timing control unit 34, the timing generation unit 35, the A/D conversion unit 36, the separation unit 37, and the distance image signal processing unit 38, is the distance image generation unit 30B. Among the components of the distance image generation unit 30B, those other than the light emitting element unit 31 and the light receiving element unit 33 correspond to the distance image sensor.

The light emitting element unit 31, the light emission/reception control and drive unit 32, the light receiving element unit 33, the light reception timing control unit 34, the timing generation unit 35, the A/D conversion unit 36, and the separation unit 37 are used by both the video imaging unit 30A and the distance image generation unit 30B. Sharing these components simplifies the configuration (for example, the hardware configuration) of the imaging device 10.

The video imaging unit 30A captures images via the imaging optical system 33A serving as the video optical system. The distance image generation unit 30B captures images via the imaging optical system 33A serving as the distance image optical system and generates distance images. That is, the video optical system and the distance image optical system of the imaging device 10 are shared as the single imaging optical system 33A. The video optical system and the distance image optical system may instead be provided separately.

In FIG. 2, it suffices that at least the distance image processing is performed; the components related to visible and IR image processing and to the association of the images may be omitted.
FIG. 3 is a timing chart illustrating an example of the electronic shutter control. As shown in FIG. 3, the light emission drive signal is, for example, a pulse wave that repeats driving (HIGH: emission) and stopping (LOW: off) at a constant period. The amount of infrared light emitted by the light emitting element unit 31 does not rise and fall exactly as the light emission drive signal does; its rise and fall are gradual.

The electronic shutter window signal is, for example, a pulse wave that repeats driving (HIGH) and stopping (LOW) at the same period as the light emission drive signal. The light emission drive signal and the electronic shutter window signal may be in phase or slightly offset; for example, the electronic shutter window signal may lag slightly behind the light emission drive signal.

By driving the light emitting element unit 31 and the light reception timing control unit 34 with the light emission drive signal and the electronic shutter window signal, respectively, the received infrared light amount shown in FIG. 3 is obtained.

When the time from the emission of infrared light by the light emitting element unit 31 to the reception of its reflection at each pixel of the light receiving element unit 33 is long, the amount of reflected light that can be received while the electronic shutter window signal is HIGH is small. That is, when the part of the subject (target point) captured by a pixel is far from the imaging device 10, the amount of reflected light received during the HIGH period of the electronic shutter window signal is small.

Conversely, when the time from the emission of infrared light to the reception of its reflection at each pixel of the light receiving element unit 33 is short, the amount of reflected light received while the electronic shutter window signal is HIGH is large. That is, when the target point captured by a pixel is close to the imaging device 10, the amount of reflected light received during the HIGH period of the electronic shutter window signal is large.

Therefore, the distance to the target point captured by each pixel is measured from the magnitude of the integrated amount of infrared light received while the electronic shutter window signal is HIGH (that is, the luminance value of the pixel) — in other words, according to that integrated value. The photoelectric conversion element converts the integrated amount of received infrared light into an electric signal, which corresponds to the distance to the target point captured by the pixel. The integrated infrared value of each pixel is thus equivalent to distance information indicating the distance to the target point captured by that pixel.
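The gated-shutter principle just described can be sketched numerically; this is an illustrative model under simplifying assumptions (a rectangular pulse fully overlapping the window at zero distance, and a known full-overlap integral), not the patent's exact formula, and the function and parameter names are hypothetical.

```python
# Illustrative gated-TOF model: the fraction of the emitted pulse that
# falls inside the shutter window shrinks linearly with the round-trip
# delay, so a normalized integral can be inverted into a distance.

C = 299_792_458.0  # speed of light [m/s]

def distance_from_integral(integral, full_integral, window_s):
    """Estimate distance from a pixel's gated integral.

    integral      : integrated received light for the pixel
    full_integral : integral a zero-distance target would produce
    window_s      : shutter window (= pulse) duration in seconds
    """
    overlap = integral / full_integral          # 1.0 near .. 0.0 far
    round_trip = (1.0 - overlap) * window_s     # lost overlap = delay
    return C * round_trip / 2.0                 # one-way distance
```

With a 20 ns window, a pixel whose integral is half the full value corresponds to a 10 ns round trip, about 1.5 m one way.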
As shown in FIG. 3, the infrared emission by the light emitting element unit 31 and the photoelectric conversion of infrared light (integration of the received light amount) by the light reception timing control unit 34 may be performed multiple times to generate one distance image. In this case, the distance image signal processing unit 38 may obtain the luminance value of each pixel for generating the distance image by, for example, averaging the integrated infrared values obtained over the multiple emissions and receptions, or by adopting an intermediate (median) value of those integrated values.
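The multi-exposure reduction step above might look like the following sketch; the function name and the choice of reducers are assumptions, shown only to make the mean-versus-intermediate-value option concrete.

```python
# Reduce the per-cycle integrals gathered for one pixel to a single
# luminance value, either by averaging or by taking the median
# (an "intermediate value", robust to a single outlier cycle).

from statistics import mean, median

def pixel_luminance(samples, mode="mean"):
    """Reduce per-cycle light integrals to one luminance value."""
    if mode == "mean":
        return mean(samples)
    if mode == "median":
        return median(samples)
    raise ValueError(f"unknown mode: {mode}")
```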
Next, the relationship among the distance range, the pulse rate of the light emission drive signal, and the irradiated and reflected light will be described.

FIG. 4A shows the irradiation light L11 and reflected light L12 when the distance range is D1. FIG. 4B shows the irradiation light L21 and reflected light L22 when the distance range is D2. FIG. 4C shows the irradiation light L31 and reflected light L32 when the distance range is D3. In FIGS. 4A to 4C, the installation environment of the imaging device 10 and the object 20 is viewed with the imaging device 10 at the top and the object 20 at the bottom. Although three distance ranges are used here, there may be two, or four or more.

In distance range D1, the distance from the imaging device 10 to the target point 25 is, for example, 0.6 m to 1.2 m; in FIG. 4A, the target point 25 is on the upper part of the object 20. In distance range D2, the distance from the imaging device 10 to the target point 25 is, for example, 1.2 m to 1.8 m; in FIG. 4B, the target point 25 is on the middle part of the object 20. In distance range D3, the distance from the imaging device 10 to the target point 25 is, for example, 1.8 m to 2.4 m; in FIG. 4C, the target point 25 is on the lower part of the object 20.
To obtain just enough reflected light from a target point 25 in distance range D1, the timing generation unit 35 generates a light emission drive signal with a relatively low pulse rate (for example, 1000 pulses/second). The light emitting element unit 31 emits the irradiation light L11 with a relatively small amount of light according to this pulse rate. The light receiving element unit 33 receives the reflected light L12 produced when the irradiation light L11 is reflected by the target point 25.

This prevents the integrated value of the infrared light amount received by the light receiving element unit 33 from becoming excessive. Therefore, even in a distance range relatively close to the imaging device 10, the imaging device 10 can differentiate the distances at individual pixels and generate a distance image with high accuracy.

To obtain just enough reflected light from a target point 25 in distance range D2, the timing generation unit 35 generates a light emission drive signal with a medium pulse rate (for example, 2000 pulses/second). The light emitting element unit 31 emits the irradiation light L21 with a medium amount of light according to this pulse rate. The light receiving element unit 33 receives the reflected light L22 produced when the irradiation light L21 is reflected by the target point 25.

This prevents the integrated value of the received light amount from becoming either excessive or insufficient. Therefore, even in a distance range at a medium distance from the imaging device 10, the imaging device 10 can differentiate the distances at individual pixels and generate a distance image with high accuracy.

To obtain just enough reflected light from a target point 25 in distance range D3, the timing generation unit 35 generates a light emission drive signal with a relatively high pulse rate (for example, 4000 pulses/second). The light emitting element unit 31 emits the irradiation light L31 with a relatively large amount of light according to this pulse rate. The light receiving element unit 33 receives the reflected light L32 produced when the irradiation light L31 is reflected by the target point 25.

This prevents the integrated value of the received light amount from becoming insufficient. Therefore, even in a distance range relatively far from the imaging device 10, the imaging device 10 can differentiate the distances at individual pixels and generate a distance image with high accuracy.
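The excessive/insufficient-integral concern running through the D1 to D3 cases can be made concrete with a small validity check; the 12-bit-style thresholds and names below are purely illustrative assumptions, not values from the patent.

```python
# Hypothetical per-pixel check: an integral near zero (too little
# reflected light) or near saturation (too much) cannot differentiate
# distances reliably, which is why a different distance range / pulse
# rate would be used for such pixels.

ADC_MIN_VALID = 64      # below this: integration too small (assumed)
ADC_MAX_VALID = 4000    # above this: integration saturated (assumed)

def integral_is_usable(integral: int) -> bool:
    """True if the integrated light amount is neither too small nor too large."""
    return ADC_MIN_VALID <= integral <= ADC_MAX_VALID
```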
Next, an operation example of the imaging device 10 will be described.

FIGS. 5 and 6 are flowcharts showing an operation example of the imaging device 10. Here, the number of distance ranges and the number of generated image frames are both three as an example, but the numbers are not limited to this.
The distance image signal processing unit 38 determines whether the composition flag is on (S11). The composition flag is set on or off via, for example, the operation unit of the information processing device 8. For example, the operation unit of the information processing device 8 accepts an operation to turn on the composition flag, and this operation information is sent to the imaging device 10 via the network 7. The distance image signal processing unit 38 sets the composition flag to on based on the setting information received by the imaging device 10. The operation unit of the information processing device 8 accepts, for example, a user operation of touching an on-screen icon that toggles the composition flag on and off.

When the composition flag is on, the variable n is set to 3 (S12). The variable n indicates the number of distance ranges to be set and corresponds to the number of image frames to be derived. The variable n may be a number other than 3.

The distance image signal processing unit 38 sets distance range D1 as the distance range for generating a distance image (see FIG. 4A). Referring to the pulse setting table or the like, the distance image signal processing unit 38 sets the pulse rate of the light emission drive signal corresponding to distance range D1 (S13). The distance ranges are set as distinct ranges, for example, every 30 cm.

Once the pulse rate is set, the timing generation unit 35 sets the HIGH and LOW periods of the electronic shutter window signal according to the pulse rate and generates the electronic shutter window signal. Regardless of the set pulse rate, the timing generation unit 35 generates the electronic shutter window signal so that it repeats driving (HIGH) and stopping (LOW) at the same period as the light emission drive signal. The light emission drive signal and the electronic shutter window signal may be in phase or slightly offset; for example, the electronic shutter window signal may lag slightly behind the light emission drive signal.
 The light emitting element unit 31 emits infrared light toward the object 20 at the set pulse rate in accordance with the light emission drive signal (S14). The light receiving element unit 33 receives the infrared light reflected from the object 20 (S15).

 The distance image signal processing unit 38 calculates, for each pixel, the integrated value of the received infrared light and generates a distance image based on the calculated integrated values (S16). It then stores the generated distance image for distance range D1 in the storage unit 41 (S17). One frame of the distance image is thereby generated and stored.
 The distance image signal processing unit 38 determines whether the variable n is 1 (S18). If n is not 1, n is decremented (S19) and the process returns to S13.

 When the process returns to S13, the distance image signal processing unit 38, in the same manner as above, sets distance range D2, sets the pulse rate of the light emission drive signal corresponding to distance range D2, and generates a distance image at that pulse rate. It likewise sets distance range D3, sets the pulse rate of the light emission drive signal corresponding to distance range D3, and generates a distance image at that pulse rate.
 When distance range D3 has been set, n is 1 at S18, so the distance image signal processing unit 38 retrieves the distance images P1 to P3 corresponding to distance ranges D1 to D3 from the storage unit 41 and combines them to generate a composite distance image P0 (S20).
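The flow from S12 through S20 amounts to a loop over the configured distance ranges followed by a compositing step. The sketch below is illustrative only; `capture` and `composite` are hypothetical stand-ins for the emit/receive/integrate stages (S14–S16) and the synthesis step (S20), not interfaces of the device:

```python
# Illustrative sketch of steps S12-S20 (synthesis flag on).
def generate_composite_distance_image(capture, composite,
                                      ranges=("D1", "D2", "D3")):
    stored = []                          # plays the role of storage unit 41 (S17)
    for distance_range in ranges:        # S13: set the range and its pulse rate
        frame = capture(distance_range)  # S14-S16: emit, receive, integrate
        stored.append(frame)
    return composite(stored)             # S20: combine P1..P3 into P0

# Dummy stand-ins: each "frame" is just a label, and compositing concatenates.
p0 = generate_composite_distance_image(
    capture=lambda r: f"P-{r}",
    composite=lambda frames: "+".join(frames),
)
# p0 == "P-D1+P-D2+P-D3"
```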
 The generated composite distance image P0 is output by any suitable method (S21). For example, the storage unit 41 may accumulate the composite distance image P0, the communication unit 43 may transmit it to an external device, or the monitor 11 may display it. By transmitting it to an external device, the composite distance image P0 can also be used in the cloud.
 When the synthesis flag is off at S11, the distance image signal processing unit 38 sets a distance range based on, for example, a predetermined operation (S31). For example, the operation unit of the information processing device 8 accepts a desired distance range for which a distance image is to be generated, and this operation information is sent to the imaging device 10 via the network 7. The distance image signal processing unit 38 sets a distance range (for example, distance range D1, D2, or D3) based on the operation information received by the imaging device 10. The operation unit of the information processing device 8 accepts, for example, a user operation of sliding an on-screen slide bar or touching a button (see FIGS. 7A to 7C). The slide bar and buttons are for setting the distance range.
 The distance image signal processing unit 38 refers to the pulse setting table or the like and sets the pulse rate of the light emission drive signal corresponding to the set distance range (S32).
 When the pulse rate is set, the timing generation unit 35 sets the High and Low periods of the electronic shutter window signal according to the pulse rate and generates the electronic shutter window signal. In this case as well, the timing generation unit 35 generates the electronic shutter window signal so that it repeats driving (HIGH) and stopping (LOW) with the same period as the light emission drive signal, regardless of the pulse rate that is set. The light emission drive signal and the electronic shutter window signal may be in phase or slightly offset; for example, the electronic shutter window signal may lag slightly behind the light emission drive signal.
 The light emitting element unit 31 emits infrared light toward the object 20 at the set pulse rate in accordance with the light emission drive signal (S33). The light receiving element unit 33 receives the infrared light reflected from the object 20 (S34).

 The distance image signal processing unit 38 calculates, for each pixel, the integrated value of the received infrared light and generates a distance image based on the calculated integrated values (S35).

 The generated distance image is output by any suitable method (S36). For example, the storage unit 41 may accumulate the distance image, the communication unit 43 may transmit it to an external device, or the monitor 11 may display it. By transmitting it to an external device, the distance image can also be used in the cloud.
 Next, examples of distance images will be described.

 FIGS. 7A to 7C are schematic diagrams showing examples of the distance images P1 to P3 generated when light is emitted at the pulse rates corresponding to distance ranges D1 to D3. In FIGS. 7A to 7C, a conical object is assumed as the object 20, as in FIG. 1.
 When the distance image P1 is generated, as shown in FIG. 4A, the pulse rate is set so that the reflected light from a target point 25 within distance range D1 is of an appropriate amount for generating a distance image. As a result, as shown in FIG. 7A, a suitable integrated value of the received light amount is obtained around the upper portion 71 of the conical object.

 When the synthesis flag is set on, no particular operation is required to set distance range D1. For example, the corresponding pulse rates may be set in the order of distance ranges D1, D2, and D3, and a distance image generated for each. When the synthesis flag is set off, the user, for example, operates the on-screen slide bar 75 via the operation unit of the information processing device 8 and moves it to the short-distance position 75a.
 When the distance image P2 is generated, as shown in FIG. 4B, the pulse rate is set so that the reflected light from a target point 25 within distance range D2 is of an appropriate amount for generating a distance image. As a result, as shown in FIG. 7B, a suitable integrated value of the received light amount is obtained around the middle portion 72 of the conical object.

 When the synthesis flag is set on, no particular operation is required to set distance range D2. For example, the corresponding pulse rates may be set in the order of distance ranges D1, D2, and D3, and a distance image generated for each. When the synthesis flag is set off, the user, for example, operates the on-screen slide bar 75 via the operation unit of the information processing device 8 and moves it to the middle-distance position 75b.
 When the distance image P3 is generated, as shown in FIG. 4C, the pulse rate is set so that the reflected light from a target point 25 within distance range D3 is of an appropriate amount for generating a distance image. As a result, as shown in FIG. 7C, a suitable integrated value of the received light amount is obtained around the lower portion 73 of the conical object.

 When the synthesis flag is set on, no particular operation is required to set distance range D3. For example, the corresponding pulse rates may be set in the order of distance ranges D1, D2, and D3, and a distance image generated for each. When the synthesis flag is set off, the user, for example, operates the on-screen slide bar 75 via the operation unit of the information processing device 8 and moves it to the long-distance position 75c.
 FIG. 8 is a schematic diagram showing an example of the composite distance image P0.

 The distance image signal processing unit 38 combines the distance images P1 to P3, for example, by statistically processing their pixel values on a per-pixel basis, to generate the composite distance image P0. The statistical processing derives, for example, the sum, simple average, median, weighted average, maximum, or minimum of the pixel values of the distance images P1 to P3. In FIG. 8, the composite distance image P0 includes the upper portion 71 of the object 20, whose distance information is well captured in the distance image P1; the middle portion 72 of the object 20, well captured in the distance image P2; and the lower portion 73 of the object 20, well captured in the distance image P3.
 Note that in FIGS. 7A to 7C and FIG. 8, the distance information of every pixel within each of the distance ranges D1 to D3 is drawn with the same pattern; the pattern indicates the region over which distance information is suitably obtained, and in practice the distance information differs from pixel to pixel.
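The per-pixel statistical combination described above can be sketched in pure Python; the reducer passed as `stat` corresponds to whichever statistic (sum, simple average, median, weighted average, maximum, minimum) is chosen, and the tiny 2×2 frames are made-up values for illustration:

```python
from statistics import median

def composite_distance_image(frames, stat=median):
    """Combine equally sized 2-D distance images pixel by pixel.

    frames: list of 2-D lists (e.g. P1..P3); stat: reducer applied to the
    per-pixel values across frames (median, mean, max, min, sum, ...).
    """
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[stat([f[r][c] for f in frames]) for c in range(cols)]
            for r in range(rows)]

# Made-up 2x2 frames: each frame measures one region well (small value) and
# saturates elsewhere (9.0); a min-composite keeps each well-measured pixel.
p1 = [[1.0, 9.0], [9.0, 9.0]]
p2 = [[9.0, 2.0], [9.0, 9.0]]
p3 = [[9.0, 9.0], [3.0, 3.0]]
p0 = composite_distance_image([p1, p2, p3], stat=min)
# p0 == [[1.0, 2.0], [3.0, 3.0]]
```

The same function accepts `median` or `mean` where a robust or averaged composite is preferred over the minimum.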
 Thus, with the imaging device 10, the pulse rate is lowered when the distance between the imaging device 10 and the target point 25 of the object 20 is short, and raised when that distance is long. In other words, the imaging device 10 changes the light emission method used for generating distance images. This prevents the integrated value of the received light amount from becoming excessive or insufficient relative to the emitted light and allows it to be adjusted appropriately. The present embodiment is applicable both when the distance between the imaging device 10 and the object 20 is excessively long or excessively short, and when the object 20 is large (for example, 2 m to 3 m).

 Furthermore, by controlling the pulse rate of the light emission, the imaging device 10 can expand the range over which the integrated value of the received light amount can be measured appropriately (effectively). The distance (dynamic range) between the imaging device 10 and the target point 25 of the object 20 over which that integrated value can be measured effectively is thereby extended. The imaging device 10 also compensates for the lack of sensitivity in distance image generation that a single pulse rate would entail.
 In addition, since the distance between the imaging device 10 and the target point 25 of the object 20 can be derived at each pixel, the 3D (three-dimensional) shape of the object 20 can be measured, making the device applicable to, for example, a 3D printer or a 3D scanner.

 Furthermore, combining the distance image with image analysis results from a visible image can further improve the accuracy of the measured distance.
 Note that the larger the surface area of a pixel corresponding to a light receiving element (for example, a CCD) of the light receiving element unit 33, the greater the amount of light it can receive and the larger its charge capacity. The charge capacity determines the dynamic range described above. Conversely, increasing the resolution increases the number of pixels, which reduces the surface area per pixel, the amount of light that can be received, and the charge capacity. Demands for higher-resolution distance images are expected to grow, making the trade-off between dynamic range and resolution increasingly apparent. Even in that case, the imaging device 10 can suitably compensate for the shortage of dynamic range caused by increasing the resolution.
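The trade-off can be made concrete with a first-order scaling argument: for a fixed sensor size, increasing linear resolution by a factor s divides per-pixel surface area, and with it charge capacity, by s². This idealized model (ignoring fill factor and readout overhead) is an assumption for illustration, not a figure from the embodiment:

```python
def relative_charge_capacity(resolution_scale):
    """Per-pixel charge capacity relative to the original sensor, assuming a
    fixed die size and capacity proportional to pixel surface area."""
    return 1.0 / resolution_scale ** 2

# Doubling resolution along each axis quarters per-pixel area, and with it
# the charge capacity that sets the dynamic-range headroom.
assert relative_charge_capacity(2) == 0.25
```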
 Note that the present invention is not limited to the configuration of the above embodiment, and any configuration may be used as long as it achieves the functions set out in the claims or the functions of the configuration of this embodiment.
 In the above embodiment, the timing generation unit 35 varies the light emission amount corresponding to distance ranges D1 to D3 by changing the pulse rate, but the light emission amount may be varied by other methods. For example, the timing generation unit 35 may control the light emission amount by controlling the pulse width of the light emission drive signal. In that case, the larger the pulse width, the larger the light emission amount, and the smaller the pulse width, the smaller the light emission amount.
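Either knob works because, to first order, the optical energy delivered per frame scales linearly with both the pulse rate and the pulse width. The linear model below is an illustrative assumption, not a characterization of the actual light emitting element:

```python
def emitted_energy_per_frame(pulse_rate_hz, pulse_width_s,
                             frame_time_s, peak_power_w):
    """First-order model: energy = pulses per frame x on-time per pulse x peak power."""
    pulses_per_frame = pulse_rate_hz * frame_time_s
    return pulses_per_frame * pulse_width_s * peak_power_w

base = emitted_energy_per_frame(1e6, 50e-9, 1 / 30, 1.0)
# Doubling the pulse rate or doubling the pulse width both double the energy.
assert abs(emitted_energy_per_frame(2e6, 50e-9, 1 / 30, 1.0) / base - 2.0) < 1e-9
assert abs(emitted_energy_per_frame(1e6, 100e-9, 1 / 30, 1.0) / base - 2.0) < 1e-9
```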
 In the above embodiment, the imaging device 10 is described as being installed indoors, but it may be installed on an indoor wall or the like, or at a predetermined height using a pedestal or the like. For example, the object 20 may be placed on the load platform of a forklift, and the imaging device 10 may be mounted on a camera mounting portion of the forklift provided at a predetermined distance from the object 20. This makes it possible to grasp the three-dimensional size of an article serving as the object 20 while it is being transported by the forklift. It is also possible to check whether the shape of an article has changed during transport by various vehicles (for example, forklifts, trucks, and other vehicles) and to accumulate the results. The information processing device 8 can thereby, for example, easily track an article from its shipping origin to its delivery destination. In this way, the imaging device 10 can speed up the transport of articles and improve the ease of traceability.

 Also, if the distance between the imaging device 10 and the target point 25 of the object 20 is set to a predetermined distance when measuring the distance to the object 20, the imaging device 10 need not be fixed and may be portable. For example, the imaging device 10 may be applied to a portable transaction terminal device (for example, a payment terminal device). This allows the size of an article to be measured with high accuracy in various locations.
 Some of the components of the imaging device 10 in the above embodiment may be provided separately from the imaging device 10. For example, the information processing device 8 may acquire a plurality of distance images from the imaging device 10 and generate the composite distance image.

 Although the above embodiment shows a single imaging device 10, there may be a plurality of imaging devices 10. One of the imaging devices 10 may combine a plurality of distance images or a plurality of composite distance images generated by the plurality of imaging devices 10 to generate a new composite distance image.
 As described above, the imaging device 10 includes a light emitting element unit 31 that emits light toward the object 20, a light emission/light reception control and drive unit 32 that controls the light emission amount, and a light receiving element unit 33 that receives light reflected from the object in response to light emitted with the controlled light emission amount. The imaging device 10 also includes a distance image signal processing unit 38 that generates a plurality of distance images based on the integrated values of the received light amounts obtained for different light emission amounts, and combines the plurality of distance images to generate a composite distance image.

 The imaging device 10 is an example of a distance image generation device. The light emitting element unit 31 is an example of a light emitting unit. The light emission/light reception control and drive unit 32 is an example of a light emission amount control unit. The light receiving element unit 33 is an example of a light receiving unit. The distance image signal processing unit 38 is an example of a distance image generation unit and of an image synthesis unit.

 This yields distance information from a composite distance image that depends little on the distance between the imaging device 10 and the target point 25, improving the accuracy of the distance, indicated by the distance image (composite distance image), between the imaging device and the imaged object.
 The light emission/light reception control and drive unit 32 may also control the pulse rate of the light emission drive signal. This makes it possible to generate the plurality of distance images, corresponding to various distance ranges, from which the composite distance image is generated.

 The light emission/light reception control and drive unit 32 may also control the pulse width of the light emission drive signal. This likewise makes it possible to generate the plurality of distance images, corresponding to various distance ranges, from which the composite distance image is generated.
 The imaging device 10 also includes a communication unit 43 that acquires operation information from the information processing device 8, which accepts operation input; when the operation information is acquired, the distance image signal processing unit 38 may generate the composite distance image. Whether to generate a composite distance image can thus be selected by user operation. The information processing device 8 is an example of an operation device, and the communication unit 43 is an example of an operation information acquisition unit.

 The light emitting element unit 31 may also emit invisible light. Compared with generating a distance image using visible light, this emphasizes the edges of the object 20 and improves the accuracy of the distance indicated by the composite distance image. Invisible light is, for example, infrared or ultraviolet light.
 The distance image generation method in the imaging device 10 includes: a step of controlling a light emission amount for emitting light toward the object 20; a step of emitting light toward the object 20 with the controlled light emission amount; a step of receiving light reflected from the object 20 in response to the emitted light; a step of generating a plurality of distance images based on the integrated values of the received light amounts obtained for different light emission amounts; and a step of combining the plurality of distance images to generate a composite distance image.

 This yields distance information from a composite distance image that depends little on the distance between the imaging device 10 and the target point 25, improving the accuracy of the distance, indicated by the distance image (composite distance image), between the imaging device and the imaged object.

 The distance image generation program of the above embodiment is a program for causing a computer to execute each step of the distance image generation method in the imaging device 10.

 This likewise yields distance information from a composite distance image that depends little on the distance between the imaging device 10 and the target point 25, improving the accuracy of the distance, indicated by the distance image (composite distance image), between the imaging device and the imaged object.
 The present invention is useful for a distance image generation device, a distance image generation method, a distance image generation program, and the like that can improve the accuracy of the distance, indicated by a distance image, between an imaging device and an imaged object.
7 Network
8 Information processing device
9 Recorder
10 Imaging device
11 Monitor
30A Video imaging unit
30B Distance image generation unit
31 Light emitting element unit
32 Light emission/light reception control and drive unit
33 Light receiving element unit
33A Imaging optical system
34 Light reception timing control unit
35 Timing generation unit
36 A/D conversion unit
37 Separation unit
38 Distance image signal processing unit
39 Visible/IR image signal processing unit
40 Association processing unit
41 Storage unit
42 CPU
43 Communication unit
44 Communication path

Claims (7)

  1.  A distance image generation device comprising:
      a light emitting unit that emits light toward an object;
      a light emission amount control unit that controls a light emission amount of the light emitting unit;
      a light receiving unit that receives reflected light from the object with respect to light emitted with the controlled light emission amount;
      a distance image generation unit that generates a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts; and
      an image synthesis unit that combines the plurality of distance images to generate a composite distance image.
  2.  The distance image generation device according to claim 1, wherein the light emission amount control unit controls a pulse rate of light emission pulses emitted by the light emitting unit.
  3.  The distance image generation device according to claim 1, wherein the light emission amount control unit controls a pulse width of light emission pulses emitted by the light emitting unit.
  4.  The distance image generation device according to any one of claims 1 to 3, further comprising:
      an operation information acquisition unit that acquires operation information from an operation device that accepts operation input,
      wherein the image synthesis unit generates the composite distance image when the operation information is acquired.
  5.  The distance image generation device according to any one of claims 1 to 3, wherein the light emitting unit emits invisible light.
  6.  A distance image generation method in a distance image generation device, comprising:
      controlling a light emission amount for emitting light toward an object;
      emitting light toward the object with the controlled light emission amount;
      receiving reflected light from the object with respect to the emitted light;
      generating a plurality of distance images based on integrated values of a plurality of received light amounts obtained for different light emission amounts; and
      combining the plurality of distance images to generate a composite distance image.
  7.  A distance image generation program for causing a computer to execute each step of the distance image generation method according to claim 6.
PCT/JP2015/005656 2014-11-27 2015-11-12 Distance image generation device, distance image generation method, and distance image generation program WO2016084323A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-240483 2014-11-27
JP2014240483A JP2016102697A (en) 2014-11-27 2014-11-27 Device, method, and program for creating distance image

Publications (1)

Publication Number Publication Date
WO2016084323A1 true WO2016084323A1 (en) 2016-06-02

Family

ID=56073919

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/005656 WO2016084323A1 (en) 2014-11-27 2015-11-12 Distance image generation device, distance image generation method, and distance image generation program

Country Status (2)

Country Link
JP (1) JP2016102697A (en)
WO (1) WO2016084323A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017213052A1 (en) * 2016-06-08 2017-12-14 パナソニックIpマネジメント株式会社 Ranging system and ranging method
CN108139484A (en) * 2016-08-30 2018-06-08 索尼半导体解决方案公司 The control method of range unit and range unit
WO2018180391A1 (en) 2017-03-30 2018-10-04 パナソニックIpマネジメント株式会社 Image recognition device and distance image generation method
CN109425864A (en) * 2017-09-04 2019-03-05 日立乐金光科技株式会社 3 dimension distance-measuring devices
CN110766727A (en) * 2019-10-22 2020-02-07 歌尔股份有限公司 Depth module brightness calibration method and device, readable storage medium and depth camera

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112995616A (en) 2016-06-03 2021-06-18 麦克赛尔株式会社 Image pickup apparatus and image pickup system
JP6819376B2 (en) * 2017-03-14 2021-01-27 オムロン株式会社 Displacement measuring device
JP7220343B2 (en) * 2018-02-26 2023-02-10 パナソニックIpマネジメント株式会社 Image processing device
JP7135846B2 (en) * 2018-12-27 2022-09-13 株式会社デンソー Object detection device and object detection method
US20230179841A1 (en) * 2019-03-11 2023-06-08 Koito Manufacturing Co., Ltd. Gating camera
JP7290485B6 (en) * 2019-06-26 2023-06-30 株式会社 日立産業制御ソリューションズ Range image generator
JP2021092420A (en) * 2019-12-09 2021-06-17 アイホン株式会社 Distance measurement camera system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5744809A (en) * 1980-08-28 1982-03-13 Sankusu:Kk Distance measuring apparatus
JP2007139489A (en) * 2005-11-16 2007-06-07 Sunx Ltd Displacement sensor
JP2012168049A (en) * 2011-02-15 2012-09-06 Stanley Electric Co Ltd Distance image generation device and method
JP2013048792A (en) * 2011-08-31 2013-03-14 Fujifilm Corp Endoscopic device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017213052A1 (en) * 2016-06-08 2017-12-14 パナソニックIpマネジメント株式会社 Ranging system and ranging method
JPWO2017213052A1 (en) * 2016-06-08 2019-04-04 パナソニックIpマネジメント株式会社 Ranging system and ranging method
CN108139484A * 2016-08-30 2018-06-08 索尼半导体解决方案公司 Distance measuring device and control method of distance measuring device
CN108139484B (en) * 2016-08-30 2023-03-31 索尼半导体解决方案公司 Distance measuring device and method for controlling distance measuring device
WO2018180391A1 (en) 2017-03-30 2018-10-04 パナソニックIpマネジメント株式会社 Image recognition device and distance image generation method
US11467285B2 (en) 2017-03-30 2022-10-11 Panasonic Intellectual Property Management Co., Ltd. Image recognition device and distance image generation method
CN109425864A * 2017-09-04 2019-03-05 日立乐金光科技株式会社 Three-dimensional distance measuring device
CN110766727A (en) * 2019-10-22 2020-02-07 歌尔股份有限公司 Depth module brightness calibration method and device, readable storage medium and depth camera

Also Published As

Publication number Publication date
JP2016102697A (en) 2016-06-02

Similar Documents

Publication Publication Date Title
WO2016084323A1 (en) Distance image generation device, distance image generation method, and distance image generation program
CN105190426B (en) Time-of-flight sensor binning
US10467933B2 (en) Display device and image displaying method therefor
US9294754B2 (en) High dynamic range and depth of field depth camera
TWI512270B (en) Optical distance measurement system with dynamic exposure time
US20190075257A1 (en) System and Methods for Depth Imaging using Conventional CCD Image Sensors
US10616561B2 (en) Method and apparatus for generating a 3-D image
JP6302414B2 (en) Motion sensor device having a plurality of light sources
US8730302B2 (en) Method and system for enhancing 3D effects for 3D video rendering
JP6635382B2 (en) Image output device, image output method, and image output system
US20100046802A1 (en) Distance estimation apparatus, distance estimation method, storage medium storing program, integrated circuit, and camera
WO2016206004A1 (en) Photographing device and method for acquiring depth information
JP2018119942A (en) Imaging device, method of monitoring the same, and program
US9699377B2 (en) Depth detecting apparatus and method, and gesture detecting apparatus and gesture detecting method
CN105245790A (en) Light filling method, device and mobile terminal
JP2010175435A (en) Three-dimensional information detecting apparatus and three-dimensional information detecting method
KR20150019926A (en) Distance detecting apparatus capable of deriving distance information having changed space resolution
US20150317516A1 (en) Method and system for remote controlling
JP2010190675A (en) Distance image sensor system and method of generating distance image
US9686522B2 (en) Display apparatus capable of seamlessly displaying a plurality of projection images on screen
JPWO2017056776A1 (en) Projector device with distance image acquisition device and projection method
CN106534633A (en) Combined photographing system, mobile terminal and image processing method
JPWO2018225517A1 (en) Information processing apparatus and method
TW201617639A (en) Optical distance measurement system and method
JP6042674B2 (en) Image projection device with 3D information acquisition function

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 15863621; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 15863621; Country of ref document: EP; Kind code of ref document: A1)